Manage Providers and Credentials#
AI agents typically need credentials to access external services: an API key for the AI model provider, a token for GitHub or GitLab, and so on. OpenShell manages these credentials as first-class entities called providers.
Create and manage providers that supply credentials to sandboxes.
Create a Provider#
Providers can be created from local environment variables or with explicit credential values.
From Local Credentials#
The fastest way to create a provider is to let the CLI discover credentials from your shell environment:
$ openshell provider create --name my-claude --type claude --from-existing
This reads ANTHROPIC_API_KEY or CLAUDE_API_KEY from your current environment and stores the value in the provider.
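The lookup amounts to a simple environment fallback. The preference order below is an assumption; this page only says the CLI reads one of the two variables:

```shell
# Assumed fallback order: prefer ANTHROPIC_API_KEY, else CLAUDE_API_KEY.
unset ANTHROPIC_API_KEY
CLAUDE_API_KEY=sk-fallback
key="${ANTHROPIC_API_KEY:-$CLAUDE_API_KEY}"
echo "$key"    # prints sk-fallback
```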
With Explicit Credentials#
Supply a credential value directly:
$ openshell provider create --name my-api --type generic --credential API_KEY=sk-abc123
Bare Key Form#
Pass a key name without a value to read the value from the environment variable of that name:
$ openshell provider create --name my-api --type generic --credential API_KEY
This looks up the current value of $API_KEY in your shell and stores it.
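The two --credential forms might be resolved along these lines; the parsing logic is an assumption, only the observable behavior is documented above:

```shell
# KEY=value is stored literally; a bare KEY is looked up in the environment.
resolve_credential() {
  case "$1" in
    *=*) printf '%s\n' "$1" ;;                        # explicit value
    *)   printf '%s=%s\n' "$1" "$(printenv "$1")" ;;  # bare key: read from env
  esac
}

export API_KEY=sk-abc123
resolve_credential API_KEY       # prints API_KEY=sk-abc123
resolve_credential TOKEN=t-999   # prints TOKEN=t-999
```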
Manage Providers#
List, inspect, update, and delete providers from the active cluster.
List all providers:
$ openshell provider list
Inspect a provider:
$ openshell provider get my-claude
Update a provider’s credentials:
$ openshell provider update my-claude --type claude --from-existing
Delete a provider:
$ openshell provider delete my-claude
Attach Providers to Sandboxes#
Pass one or more --provider flags when creating a sandbox:
$ openshell sandbox create --provider my-claude --provider my-github -- claude
Each --provider flag attaches one provider. The sandbox receives all
credentials from every attached provider at runtime.
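The merge can be pictured in plain shell. The credential names are illustrative, and how OpenShell actually injects them into the sandbox is internal:

```shell
# Two hypothetical providers, each contributing one credential.
claude_creds='ANTHROPIC_API_KEY=sk-ant-123'
github_creds='GITHUB_TOKEN=ghp-456'

# The sandbox environment receives the union of all attached providers.
env -i $claude_creds $github_creds sh -c 'echo "$ANTHROPIC_API_KEY $GITHUB_TOKEN"'
# prints: sk-ant-123 ghp-456
```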
Warning
Providers cannot be added to a running sandbox. If you need to attach an additional provider, delete the sandbox and recreate it with all required providers specified.
Auto-Discovery Shortcut#
When the trailing command in openshell sandbox create is a recognized tool name (claude, codex, or opencode), the CLI auto-creates the required
provider from your local credentials if one does not already exist. You do not
need to create the provider separately:
$ openshell sandbox create -- claude
This detects claude as a known tool, finds your ANTHROPIC_API_KEY, creates
a provider, attaches it to the sandbox, and launches Claude Code.
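The detection step can be sketched as a lookup table. Only the claude-to-ANTHROPIC_API_KEY mapping is documented on this page, so the function below covers just that case:

```shell
# Map a recognized tool name to the credential the CLI searches for.
required_var_for() {
  case "$1" in
    claude) echo ANTHROPIC_API_KEY ;;
    codex|opencode) echo "" ;;   # recognized tools; mappings not shown on this page
    *) return 1 ;;               # unknown tool: no auto-discovery
  esac
}

required_var_for claude   # prints ANTHROPIC_API_KEY
```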
Supported Provider Types#
The following provider types are supported.
| Type | Environment Variables Injected | Typical Use |
|---|---|---|
| claude | ANTHROPIC_API_KEY | Claude Code, Anthropic API |
| codex | | OpenAI Codex |
| generic | User-defined | Any service with custom credentials |
| github | | GitHub API, … |
| gitlab | | GitLab API, … |
| nvidia | | NVIDIA API Catalog |
| openai | OPENAI_API_KEY, OPENAI_BASE_URL | Any OpenAI-compatible endpoint. Set OPENAI_BASE_URL to the endpoint's base URL. |
| opencode | | opencode tool |
Tip
Use the generic type for any service not listed above. You define the
environment variable names and values yourself with --credential.
Supported Inference Providers#
The following providers have been tested with inference.local. Any provider that exposes an OpenAI-compatible API works with the openai type. Set --config OPENAI_BASE_URL to the provider’s base URL and --credential OPENAI_API_KEY to your API key.
| Provider | Name | Type | Base URL | API Key Variable |
|---|---|---|---|---|
| NVIDIA API Catalog | | | | |
| Anthropic | | | | |
| Baseten | | | | |
| Bitdeer AI | | | | |
| Deepinfra | | | | |
| Ollama (local) | | | | |
| LM Studio (local) | | | | |
Refer to your provider’s documentation for the correct base URL, available models, and API key setup. To configure inference routing, refer to Configure Inference Routing.
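For example, a provider for an OpenAI-compatible endpoint might be created like this. The provider name and URL are placeholders, and the exact --config syntax is inferred from the description above:

```shell
$ openshell provider create --name my-inference --type openai \
    --config OPENAI_BASE_URL=https://example.com/v1 \
    --credential OPENAI_API_KEY
```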
Next Steps#
Explore related topics:
To control what the agent can access, refer to Customize Sandbox Policies.
To use a pre-built environment, refer to the Community Sandboxes catalog.
To view the complete field reference for the policy YAML, refer to the Policy Schema Reference.