
Conversation

@PeriniM PeriniM commented Jan 12, 2025

  • Fixed the 1024 max-tokens bug for Ollama models
  • Added new structured output support for Ollama models
  • Exposed the browser backend through the config loader_kwargs
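The three changes above can be pictured as a single graph configuration. This is a minimal, hypothetical sketch based only on the PR description; the exact key names (`model_tokens`, `format`, `loader_kwargs`, `backend`) and values are assumptions, not a confirmed ScrapeGraphAI API reference.

```python
# Hypothetical ScrapeGraphAI-style config illustrating this PR's changes.
# Key names are assumptions drawn from the PR summary, not confirmed API.
graph_config = {
    "llm": {
        "model": "ollama/llama3",
        # The context window was previously capped at 1024 tokens for
        # Ollama models; a larger value reflects the max-tokens fix.
        "model_tokens": 8192,
        # Structured output: ask the Ollama model to emit JSON.
        "format": "json",
    },
    # The browser backend is now exposed via loader_kwargs.
    "loader_kwargs": {
        "backend": "playwright",
    },
}

print(graph_config["llm"]["model_tokens"])       # no longer forced to 1024
print(graph_config["loader_kwargs"]["backend"])  # browser backend is configurable
```

A config like this would then be passed to a graph constructor (e.g. a scraping pipeline) in the usual ScrapeGraphAI style; the point here is only which settings became configurable in this PR.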

@PeriniM PeriniM linked an issue Jan 12, 2025 that may be closed by this pull request
@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Jan 12, 2025
@PeriniM PeriniM changed the base branch from main to pre/beta January 12, 2025 11:48

Dependency Review

✅ No vulnerabilities or license issues or OpenSSF Scorecard issues found.

OpenSSF Scorecard

Package | Version | Score | Details

Scanned Files

  • pyproject.toml

@PeriniM PeriniM merged commit a523df0 into pre/beta Jan 12, 2025
6 checks passed
@dosubot dosubot bot added bug Something isn't working enhancement New feature or request labels Jan 12, 2025

🎉 This PR is included in version 1.35.1-beta.1 🎉

The release is available on:

Your semantic-release bot 📦🚀


codebeaver-ai bot commented Jan 12, 2025

I added some unit tests (average coverage improvement: +59.28%). Check it out here to merge it!


🎉 This PR is included in version 1.36.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

@VinciGit00 VinciGit00 deleted the 856-token-indices-sequence-length branch January 26, 2025 14:15
Labels

  • bug: Something isn't working
  • enhancement: New feature or request
  • released on @dev
  • released on @stable
  • size:L: This PR changes 100-499 lines, ignoring generated files.
Development

Successfully merging this pull request may close these issues.

  • Token indices sequence length