A multi-provider Computer Use Agent (CUA) template for Kernel. Supports Anthropic, OpenAI, and Google Gemini as interchangeable backends with automatic fallback.
Install dependencies:

```shell
uv sync
```

Copy the example env file and add your API keys:
```shell
cp .env.example .env
```

Set `CUA_PROVIDER` to your preferred provider and add the matching API key:
| Provider | Env var for key | Model used |
|---|---|---|
| `anthropic` | `ANTHROPIC_API_KEY` | `claude-sonnet-4-6` |
| `openai` | `OPENAI_API_KEY` | `gpt-5.4` |
| `gemini` | `GOOGLE_API_KEY` | `gemini-2.5-computer-use-preview-10-2025` |
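The table above can be mirrored as a simple lookup when wiring configuration. This is a hypothetical sketch, not the template's actual code; the function and mapping names are illustrative:

```python
import os

# Hypothetical mapping mirroring the provider table; the template's real
# configuration lives in providers/ and may be shaped differently.
PROVIDERS = {
    "anthropic": ("ANTHROPIC_API_KEY", "claude-sonnet-4-6"),
    "openai": ("OPENAI_API_KEY", "gpt-5.4"),
    "gemini": ("GOOGLE_API_KEY", "gemini-2.5-computer-use-preview-10-2025"),
}

def resolve_provider(name: str) -> tuple[str, str]:
    """Return (api_key, model) for a provider, failing fast if unconfigured."""
    env_var, model = PROVIDERS[name]
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set for provider {name!r}")
    return key, model
```

Failing fast on a missing key surfaces configuration errors at startup rather than mid-task.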
Deploy the app, then invoke it:

```shell
kernel deploy main.py --env-file .env
kernel invoke python-cua cua-task --payload '{"query": "Go to https://news.ycombinator.com and get the top 5 stories"}'
```

Set `CUA_FALLBACK_PROVIDERS` to automatically try another provider if the primary fails:
```
CUA_PROVIDER=anthropic
CUA_FALLBACK_PROVIDERS=openai,gemini
```

This will try Anthropic first, then OpenAI, then Gemini. Only providers with valid API keys are used.
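The ordering logic described above might look like the following sketch. The real implementation lives in `providers/__init__.py` and may differ; the helper and mapping names here are assumptions:

```python
import os

# Hypothetical env-var mapping; mirrors the provider table in this README.
API_KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}

def provider_order() -> list[str]:
    """Primary provider first, then fallbacks, skipping any without an API key."""
    primary = os.environ.get("CUA_PROVIDER", "anthropic")
    fallbacks = os.environ.get("CUA_FALLBACK_PROVIDERS", "")
    candidates = [primary] + [p.strip() for p in fallbacks.split(",") if p.strip()]
    # Keep only providers whose key env var is actually set.
    return [p for p in candidates if os.environ.get(API_KEY_VARS.get(p, ""))]
```

With `CUA_PROVIDER=anthropic` and `CUA_FALLBACK_PROVIDERS=openai,gemini`, but no `GOOGLE_API_KEY` set, this yields `["anthropic", "openai"]`.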
Pass `record_replay: true` in the payload to capture a video replay of the browser session:
```shell
kernel invoke python-cua cua-task --payload '{"query": "Navigate to example.com", "record_replay": true}'
```

The response will include a `replay_url` you can open in your browser.
- `main.py` — Kernel app entrypoint
- `session.py` — Browser session lifecycle with replay support
- `providers/`
  - `__init__.py` — Provider factory and fallback logic
  - `anthropic.py` — Anthropic Claude adapter
  - `openai.py` — OpenAI GPT adapter
  - `gemini.py` — Google Gemini adapter
Each provider adapter is self-contained. To customize a provider's behavior (system prompt, model, tool handling), edit the corresponding file in `providers/`.
To add a new provider, create a new file that implements the `CuaProvider` protocol and register it in `providers/__init__.py`.
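A new adapter might be structured as follows. This is a sketch under assumptions: the actual `CuaProvider` protocol in this template may have different methods and arguments, and the registry pattern shown here is one plausible shape for `providers/__init__.py`, not its real contents:

```python
from typing import Protocol

class CuaProvider(Protocol):
    """Hypothetical shape of the provider protocol; check the template's source."""
    name: str

    def run_task(self, query: str) -> str:
        """Drive the browser session to complete the task and return a result."""
        ...

# Illustrative registry so the factory can look providers up by name.
REGISTRY: dict[str, type] = {}

def register(name: str):
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@register("echo")
class EchoProvider:
    """Toy provider used only to demonstrate the registration pattern."""
    name = "echo"

    def run_task(self, query: str) -> str:
        return f"echo: {query}"
```

A `Protocol` keeps adapters decoupled: a new backend only needs matching method signatures, no shared base class.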