Use any LLM with Stagehand
Understand web pages, plan actions, and interact with complex interfaces using models from Google, OpenAI, Anthropic, xAI, DeepSeek, Perplexity, Azure, Ollama, or any other provider supported by the Vercel AI SDK.
Set your API key in `.env` and Stagehand handles the rest. No explicit configuration is needed.
Get started with Google Gemini (recommended for speed and cost):
```typescript
import { Stagehand } from "@browserbasehq/stagehand";

const stagehand = new Stagehand({
  env: "BROWSERBASE",
  model: "google/gemini-2.5-flash", // API key auto-loads from GOOGLE_GENERATIVE_AI_API_KEY in your .env
});

await stagehand.init();
```
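Model identifiers follow a `provider/model` convention, as in the snippet above: the prefix selects the provider, and Stagehand reads the matching API key variable from your environment. A rough sketch of that convention (the mapping below is illustrative and covers only a few providers; it is not Stagehand's internal table):

```typescript
// Illustrative: split a "provider/model" id the way the examples in this
// guide use it. Verify env-var names against the provider docs.
const ENV_KEYS: Record<string, string> = {
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
};

function parseModelId(id: string): { provider: string; model: string } {
  // Use indexOf, not split: some model names contain their own slashes
  // (e.g. "togetherai/Qwen/Qwen3-...").
  const slash = id.indexOf("/");
  if (slash === -1) throw new Error(`expected "provider/model", got "${id}"`);
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

const { provider, model } = parseModelId("google/gemini-2.5-flash");
console.log(provider, model, ENV_KEYS[provider]);
// → google gemini-2.5-flash GOOGLE_GENERATIVE_AI_API_KEY
```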
Use any model from the supported providers below. The setup is identical to the snippet above; only the `model` string and the matching API key variable in `.env` change.

| Provider | Example model | API key env variable |
| --- | --- | --- |
| Google | `google/gemini-2.5-flash` | `GOOGLE_GENERATIVE_AI_API_KEY` |
| Anthropic | `anthropic/claude-haiku-4-5` | `ANTHROPIC_API_KEY` |
| OpenAI | `openai/gpt-5` | `OPENAI_API_KEY` |
| Azure | `azure/gpt-5` | `AZURE_API_KEY` |
| Cerebras | `cerebras/llama-4-scout` | `CEREBRAS_API_KEY` |
| DeepSeek | `deepseek/deepseek-chat` | `DEEPSEEK_API_KEY` |
| Groq | `groq/llama-3.1-8b-instant` | `GROQ_API_KEY` |
| Mistral | `mistral/codestral-2508` | `MISTRAL_API_KEY` |
| Ollama | `ollama/llama3.2` | None required |
| Perplexity | `perplexity/sonar-reasoning` | `PERPLEXITY_API_KEY` |
| TogetherAI | `togetherai/Qwen/Qwen3-235B-A22B-Instruct-2507-tput` | `TOGETHER_AI_API_KEY` |
| xAI | `xai/grok-4-fast-reasoning` | `XAI_API_KEY` |
Amazon Bedrock, Cohere, all first-class models, and any model from the Vercel AI SDK are supported. Use this configuration for custom endpoints or custom retry and caching logic. We'll use Amazon Bedrock and Google as examples below.
Amazon Bedrock
1. Install dependencies
Install the Vercel AI SDK for your provider.
```shell
# npm
npm install @ai-sdk/amazon-bedrock

# pnpm
pnpm add @ai-sdk/amazon-bedrock

# yarn
yarn add @ai-sdk/amazon-bedrock

# bun
bun add @ai-sdk/amazon-bedrock
```
2. Import, create provider, and create client
```typescript
import { createAmazonBedrock } from "@ai-sdk/amazon-bedrock";
import { AISdkClient } from "@browserbasehq/stagehand";

const bedrockProvider = createAmazonBedrock({
  region: "us-east-1",
  accessKeyId: "xxxxxxxxx",
  secretAccessKey: "xxxxxxxxx",
  sessionToken: "xxxxxxxxx",
});

const bedrockClient = new AISdkClient({
  model: bedrockProvider("amazon/nova-pro-latest"),
});
```
3. Pass client to Stagehand
```typescript
const stagehand = new Stagehand({
  env: "BROWSERBASE",
  llmClient: bedrockClient,
});

await stagehand.init();
```
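The provider example above hard-codes credentials for brevity; in practice you would read them from the environment. A minimal sketch, assuming the standard AWS SDK variable names (`AWS_REGION`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `AWS_SESSION_TOKEN`):

```typescript
// Illustrative sketch: load Bedrock credentials from standard AWS env
// variables instead of hard-coding them, failing fast when one is missing.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required env var: ${name}`);
  return value;
}

function loadBedrockConfig() {
  return {
    region: process.env.AWS_REGION ?? "us-east-1",
    accessKeyId: requireEnv("AWS_ACCESS_KEY_ID"),
    secretAccessKey: requireEnv("AWS_SECRET_ACCESS_KEY"),
    sessionToken: process.env.AWS_SESSION_TOKEN, // optional, for temporary credentials
  };
}

// Then: const bedrockProvider = createAmazonBedrock(loadBedrockConfig());
```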
Google
1. Install dependencies
Install the Vercel AI SDK for your provider.
```shell
# npm
npm install @ai-sdk/google

# pnpm
pnpm add @ai-sdk/google

# yarn
yarn add @ai-sdk/google

# bun
bun add @ai-sdk/google
```
2. Import, create provider, and create client
```typescript
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { AISdkClient } from "@browserbasehq/stagehand";

const googleProvider = createGoogleGenerativeAI({
  apiKey: process.env.GEMINI_API_KEY,
});

const googleClient = new AISdkClient({
  // The provider instance takes the bare model id, without the "google/" prefix.
  model: googleProvider("gemini-2.5-flash"),
});
```
3. Pass client to Stagehand
```typescript
const stagehand = new Stagehand({
  env: "BROWSERBASE",
  llmClient: googleClient,
});

await stagehand.init();
```
All Providers
To implement a custom model, follow the steps for the provider you are using; see the Amazon Bedrock and Google examples above. All supported providers and models are listed in the Vercel AI SDK documentation.
1. Install dependencies
Install the Vercel AI SDK for your provider.
2. Import, create provider, and create client
```typescript
// Template: substitute your provider's package, factory function, and model name.
import { createProvider } from "@ai-sdk/provider";
import { AISdkClient } from "@browserbasehq/stagehand";

const provider = createProvider({
  apiKey: "xxxxxxxxx",
});

const providerClient = new AISdkClient({
  model: provider("model/name"),
});
```
3. Pass client to Stagehand
```typescript
const stagehand = new Stagehand({
  env: "BROWSERBASE",
  llmClient: providerClient,
});

await stagehand.init();
```
Default

By default, the Stagehand agent uses the same model passed to Stagehand. All models (first-class and custom) are supported. Here's an example with Gemini:
```typescript
const stagehand = new Stagehand({
  env: "BROWSERBASE",
  model: "google/gemini-2.5-flash", // GOOGLE_GENERATIVE_AI_API_KEY is auto-loaded from .env
  // ... other Stagehand options
});

// The agent will use google/gemini-2.5-flash
const agent = stagehand.agent();
```
Override (with CUA support)

The Stagehand agent also accepts a `model` parameter, which takes any first-class model, including computer use agent (CUA) models. This is useful when you want the agent to use a different model than the one passed to Stagehand.
To use a CUA model, you must pass `mode: "cua"` to the `agent()` method. If a non-CUA model is used with this mode, whether specified in Stagehand or overridden in `agent()`, an error will be thrown.
Deprecation Notice: The cua: true option is deprecated and will be removed in a future version. Use mode: "cua" instead.
Google CUA
```typescript
const agent = stagehand.agent({
  mode: "cua",
  model: "google/gemini-2.5-computer-use-preview-10-2025", // GOOGLE_GENERATIVE_AI_API_KEY is auto-loaded from .env
  // ... other agent options
});
```

Anthropic CUA
```typescript
const agent = stagehand.agent({
  mode: "cua",
  model: "anthropic/claude-3-7-sonnet-latest", // ANTHROPIC_API_KEY is auto-loaded from .env
  // ... other agent options
});
```

OpenAI CUA
```typescript
const agent = stagehand.agent({
  mode: "cua",
  model: "openai/computer-use-preview", // OPENAI_API_KEY is auto-loaded from .env
  // ... other agent options
});
```

Example First Class Model
```typescript
const agent = stagehand.agent({
  model: "google/gemini-2.5-pro", // GOOGLE_GENERATIVE_AI_API_KEY is auto-loaded from .env
  // ... other agent options
});
```
All Supported CUA Models

| Provider | Model |
| --- | --- |
| Google | `google/gemini-2.5-computer-use-preview-10-2025` |
| Anthropic | `anthropic/claude-3-7-sonnet-latest` |
| Anthropic | `anthropic/claude-haiku-4-5-20251001` |
| Anthropic | `anthropic/claude-sonnet-4-20250514` |
| Anthropic | `anthropic/claude-sonnet-4-5-20250929` |
| OpenAI | `openai/computer-use-preview` |
| OpenAI | `openai/computer-use-preview-2025-03-11` |
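A guard like the following can reject an unsupported CUA model before `agent()` fails at runtime. This is a hand-maintained sketch, not a Stagehand API; the list mirrors the table above and must be kept in sync with the docs:

```typescript
// Illustrative: validate a model id against the supported CUA models listed
// above, so a misconfigured agent fails with a clear message up front.
const SUPPORTED_CUA_MODELS = new Set([
  "google/gemini-2.5-computer-use-preview-10-2025",
  "anthropic/claude-3-7-sonnet-latest",
  "anthropic/claude-haiku-4-5-20251001",
  "anthropic/claude-sonnet-4-20250514",
  "anthropic/claude-sonnet-4-5-20250929",
  "openai/computer-use-preview",
  "openai/computer-use-preview-2025-03-11",
]);

function assertCuaModel(model: string): void {
  if (!SUPPORTED_CUA_MODELS.has(model)) {
    throw new Error(`${model} is not a supported CUA model`);
  }
}

// Usage, before constructing the agent:
// assertCuaModel("google/gemini-2.5-computer-use-preview-10-2025");
// const agent = stagehand.agent({ mode: "cua", model: "..." });
```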
For overriding the agent API key, using a corporate proxy, adding provider-specific options, or other advanced use cases, the agent model can also take the form of an object. To learn more, see the Agent Reference.