Use AWS Bedrock LLMs with Cognee through LiteLLM proxy
Create a `config.yaml` file for the LiteLLM proxy. The `drop_params: true` setting is important for proper Bedrock integration, because Bedrock models reject OpenAI-specific parameters they do not support. Once started, the proxy serves an OpenAI-compatible API at http://localhost:4000 by default.
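A minimal `config.yaml` sketch, assuming a Claude model on Bedrock; the `model_name` alias (`bedrock-claude` here) is arbitrary and is the name clients will request through the proxy:

```yaml
model_list:
  - model_name: bedrock-claude        # alias that Cognee will request
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0

litellm_settings:
  drop_params: true   # drop OpenAI-only params that Bedrock does not accept
```

Start the proxy with `litellm --config config.yaml`.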
Then configure Cognee's `.env` file to treat the proxy as an OpenAI-compatible endpoint:

LLM_PROVIDER = "openai"
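A sketch of the relevant `.env` entries; the `LLM_MODEL`, `LLM_ENDPOINT`, and `LLM_API_KEY` names and values below are illustrative assumptions about a typical Cognee-plus-proxy setup (the API key is unused by the proxy unless you configure one, but the variable usually must be non-empty):

```
LLM_PROVIDER = "openai"
LLM_MODEL = "bedrock-claude"            # alias defined in config.yaml
LLM_ENDPOINT = "http://localhost:4000"  # the LiteLLM proxy
LLM_API_KEY = "anything"
```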
LiteLLM works with this format for Bedrock models:

- bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
- bedrock/amazon.titan-text-express-v1
- bedrock/cohere.command-text-v14
- bedrock/ai21.j2-ultra-v1
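The `bedrock/<provider>.<model>` naming above is regular enough to parse. A small illustrative helper (not part of Cognee or LiteLLM) that extracts the upstream provider from such an identifier:

```python
def bedrock_provider(model_id: str) -> str:
    """Return the upstream provider name (e.g. 'anthropic') encoded in a
    LiteLLM-style Bedrock model identifier such as
    'bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0'."""
    prefix, _, rest = model_id.partition("/")
    if prefix != "bedrock" or not rest:
        raise ValueError(f"not a LiteLLM Bedrock model id: {model_id!r}")
    # The provider is the segment before the first dot of the model id.
    return rest.split(".", 1)[0]
```

This can be handy for routing or logging when a deployment mixes several Bedrock providers behind one proxy.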