## Prerequisites

- AWS account with Bedrock access
- Python 3.8+
- Cognee 0.2.0+
## Setup

### 1. Install LiteLLM Proxy

Use LiteLLM Proxy (not the SDK) for this integration. The proxy runs as a standalone server that exposes Bedrock models behind an OpenAI-compatible API, which is what Cognee connects to.

For detailed setup instructions, refer to the official LiteLLM Bedrock tutorial.
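In short, the proxy ships as an optional extra of the `litellm` package:

```bash
pip install 'litellm[proxy]'
```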
### 2. Configure LiteLLM Proxy

Create a `config.yaml` file:
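A minimal example, assuming Claude 3.5 Sonnet in `us-east-1`; the `bedrock-claude` alias is an arbitrary name of your choosing, and is what Cognee will request:

```yaml
model_list:
  - model_name: bedrock-claude          # alias that Cognee will request
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
      aws_region_name: us-east-1        # region where Bedrock access is enabled

litellm_settings:
  drop_params: true
```

AWS credentials are picked up from the standard credential chain (environment variables, `~/.aws/credentials`, or an instance role).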
The `drop_params: true` setting is important for proper Bedrock integration: it silently drops OpenAI-style parameters that Bedrock models do not accept, instead of failing the request.

### 3. Start LiteLLM Proxy
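Run the proxy with the config you just created:

```bash
litellm --config config.yaml
```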
The proxy listens on `http://localhost:4000` by default.
### 4. Configure Cognee

Create a `.env` file:
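A sketch of the relevant settings, assuming Cognee's standard `LLM_*` environment variables and the `bedrock-claude` alias from the proxy config above; exact variable names may differ slightly between Cognee versions:

```env
LLM_PROVIDER="openai"                 # talk to the proxy as an OpenAI-compatible endpoint
LLM_MODEL="bedrock-claude"            # the model_name alias from config.yaml
LLM_ENDPOINT="http://localhost:4000"  # LiteLLM proxy address
LLM_API_KEY="dummy-key"               # placeholder; the proxy accepts any key unless configured otherwise
```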
Set `LLM_PROVIDER="openai"`: LiteLLM exposes Bedrock models through an OpenAI-compatible API, so Cognee talks to the proxy using its OpenAI provider.

### 5. Install Cognee
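Install Cognee from PyPI:

```bash
pip install cognee
```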
## Usage Example
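A minimal end-to-end sketch, assuming the proxy is running and the `.env` above is in place; the exact `cognee.search` signature (positional search type vs. the `query_text` keyword) varies between Cognee versions:

```python
import asyncio

import cognee


async def main():
    # Ingest a document; Cognee reads its LLM settings from the .env file.
    await cognee.add(
        "Amazon Bedrock provides access to foundation models through a single API."
    )

    # Build the knowledge graph; extraction calls are routed
    # through the LiteLLM proxy to the Bedrock model.
    await cognee.cognify()

    # Query the graph.
    results = await cognee.search(query_text="What does Amazon Bedrock provide?")
    for result in results:
        print(result)


if __name__ == "__main__":
    asyncio.run(main())
```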
## Supported Bedrock Models

- Anthropic Claude: `bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0`
- Amazon Titan: `bedrock/amazon.titan-text-express-v1`
- Cohere Command: `bedrock/cohere.command-text-v14`
- AI21 Jurassic: `bedrock/ai21.j2-ultra-v1`
## Troubleshooting

### Common Issues
- Authentication Errors: Verify your AWS credentials and Bedrock permissions
- Model Not Found: Ensure the model name in `config.yaml` matches the Bedrock model ID exactly
- Connection Issues: Check that the LiteLLM proxy is running on the correct port (4000 by default)