Replies: 2 comments 1 reply
The default LLM is …
John - thanks, but this should not force me to provide an API key when my config asks for a different embedding model. It would be great to have this requirement removed...
I am trying to make mem0 work with a Bedrock embedding model. First issue: it demands an OpenAI key, even though my config specifies an Amazon Bedrock model.
The code looks like this:
```python
from mem0 import Memory

config = {
    "embedder": {
        "provider": "aws_bedrock",
        "config": {
            "model": "amazon.titan-embed-text-v1"
        }
    }
}

m = Memory.from_config(config)
```
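One likely explanation, hinted at in the reply above, is that mem0 falls back to a default LLM provider (OpenAI) when no `llm` block is given, and that fallback is what demands the key. A possible workaround is to configure the LLM explicitly as well. This is only a sketch: the `aws_bedrock` LLM provider name and the Claude model ID below are assumptions, not verified against the mem0 documentation for your version.

```python
# Sketch: configure both the embedder and the LLM so mem0 does not
# fall back to its OpenAI default. The "aws_bedrock" LLM provider name
# and the model ID are assumptions -- check the mem0 docs for the
# exact values supported by your version.
config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            # Hypothetical Bedrock model ID; substitute the one you use.
            "model": "anthropic.claude-3-sonnet-20240229-v1:0"
        }
    },
    "embedder": {
        "provider": "aws_bedrock",
        "config": {
            "model": "amazon.titan-embed-text-v1"
        }
    }
}

# m = Memory.from_config(config)  # requires mem0 installed and AWS credentials
```

If this works, it would confirm that the OpenAI key requirement comes from the default LLM, not from the embedder.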