fix: allow custom LLM clients to skip modelApiKey validation in BrowserBase #966


Open · wants to merge 1 commit into `main`
Conversation

gingeekrishna
This PR fixes issue #899 by updating `api.ts` to allow custom LLM clients (such as AWS Bedrock) to skip the `modelApiKey` validation in the BrowserBase environment. The API key requirement is now enforced only for built-in providers (OpenAI, Anthropic, Google, AISDK), so custom LLM clients can run without an API key. This resolves the error reported in #899 and supports broader LLM integration.

changeset-bot bot commented Aug 14, 2025

⚠️ No Changeset found

Latest commit: bb30164

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.


@greptile-apps bot (Contributor) left a comment

Greptile Summary

This PR modifies the API key validation logic in lib/api.ts to resolve issue #899, which prevented custom LLM clients like AWS Bedrock from working in the BrowserBase environment. The change updates the StagehandAPI.init() method to conditionally validate API keys based on the provider type.

The implementation works by parsing the provider name from the modelName parameter (e.g., "openai/gpt-4o" -> "openai") and checking if it matches one of the built-in providers: OpenAI, Anthropic, Google, or AISDK. Only these built-in providers now require a modelApiKey to be provided. Custom LLM clients that don't use these providers can skip the validation entirely, allowing them to handle authentication through their own mechanisms.
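The conditional check described above can be sketched as follows. This is an illustrative reconstruction, not the actual `api.ts` source: the names `BUILT_IN_PROVIDERS` and `validateModelApiKey` are hypothetical, and only the parsing-and-gating logic mirrors what the summary describes.

```typescript
// Providers that ship with Stagehand and require an API key.
const BUILT_IN_PROVIDERS = ["openai", "anthropic", "google", "aisdk"];

// Hypothetical helper mirroring the validation described in this PR:
// built-in providers must supply modelApiKey; custom clients may omit it.
function validateModelApiKey(modelName: string, modelApiKey?: string): string {
  // "openai/gpt-4o" -> "openai"; a bare model name yields itself.
  const provider = modelName.split("/")[0].toLowerCase();

  if (BUILT_IN_PROVIDERS.includes(provider) && !modelApiKey) {
    throw new Error(
      `modelApiKey is required for built-in provider "${provider}"`,
    );
  }

  // Custom clients (e.g. AWS Bedrock) authenticate through their own
  // mechanisms, so the key defaults to an empty string.
  return modelApiKey ?? "";
}
```

Under these assumptions, `validateModelApiKey("bedrock/claude-3")` returns an empty string without throwing, while `validateModelApiKey("openai/gpt-4o")` with no key throws.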

This change maintains backward compatibility with existing integrations while expanding support for custom LLM providers. The solution is elegant because it leverages the existing modelName parameter format to infer provider type without requiring additional configuration parameters. For custom clients, the modelApiKey defaults to an empty string when not provided, which aligns with how these clients typically handle authentication internally.

The fix integrates well with the existing Stagehand architecture, which already supports custom LLM clients through the modelClientOptions configuration. This change simply removes the artificial barrier that was preventing these custom clients from working in the BrowserBase environment.

Confidence score: 4/5

  • This PR is safe to merge with minimal risk as it relaxes validation for a specific use case without compromising security
  • Score reflects well-targeted fix that addresses the core issue without breaking existing functionality for built-in providers
  • Pay close attention to the provider name parsing logic and ensure custom clients handle authentication properly

1 file reviewed, no comments

