Describe the bug
The reported metadata states that contextWindow is 1024, which is wrong for deepseek-chat.
LLM {
  model: 'deepseek-chat',
  temperature: 0.1,
  topP: 1,
  maxTokens: undefined,
  contextWindow: 1024,
  tokenizer: 'cl100k_base',
  structuredOutput: true
}
To Reproduce
Initialize the DeepSeek LLM and log its metadata:
// Import paths may vary by llamaindex version; DeepSeekLLM may live in a
// separate provider package in newer releases.
import { Settings, DeepSeekLLM } from "llamaindex";

Settings.llm = new DeepSeekLLM({
  apiKey: config.deepSeekApiKey, // config is the reporter's own settings object
  model: "deepseek-chat",
  timeout: 30000, // 30 second timeout
  maxRetries: 2,
  temperature: 0.1,
});
console.log(Settings.llm.metadata);
Expected behavior
contextWindow should be 64K (65536) for deepseek-chat.
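A quick way to check the expectation, as a sketch that reuses the Settings.llm instance from the repro above; 65536 is the expected 64K value, not something the library currently reports:

const { contextWindow } = Settings.llm.metadata;
if (contextWindow !== 65536) {
  // Fails today: the library reports 1024 instead of the 64K window.
  throw new Error(`expected contextWindow 65536 for deepseek-chat, got ${contextWindow}`);
}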
- OS: [e.g. macOS, Linux]
- JS Runtime / Framework / Bundler (select all applicable)
- Node.js
- Deno
- Bun
- Next.js
- ESBuild
- Rollup
- Webpack
- Turbopack
- Vite
- Waku
- Edge Runtime
- AWS Lambda
- Cloudflare Worker
- Others (please elaborate on this)
- Version [e.g. 22]
Additional context
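For reference, a minimal sketch of the kind of model-to-context-window mapping that would produce the expected value. The names below are hypothetical and not taken from the llamaindex source:

// Hypothetical model -> context window table (names are illustrative only).
const DEEPSEEK_CONTEXT_WINDOWS: Record<string, number> = {
  "deepseek-chat": 65536, // 64K
};

// Resolve a model's window, falling back to a conservative default.
function contextWindowFor(model: string): number {
  return DEEPSEEK_CONTEXT_WINDOWS[model] ?? 1024;
}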