Credential injection keeps API keys out of the sandboxed process entirely. The agent sees a local HTTP URL and the proxy handles authentication transparently.

How It Works

The proxy acts as a reverse proxy for configured credential routes. The agent sends plain HTTP to localhost:<port>/<service>/... and the proxy:
  1. Strips the service prefix
  2. Injects the real credential as an HTTP header
  3. Forwards to the upstream over TLS
  4. Streams the response back
Agent sends:  POST http://127.0.0.1:PORT/openai/v1/chat/completions
Proxy sends:  POST https://api.openai.com/v1/chat/completions
              Authorization: Bearer sk-... (injected from keystore)
The agent never sees the API key. Even if the agent is compromised, it cannot extract credentials from its own environment or memory. Streaming responses (SSE for chat completions, MCP Streamable HTTP, A2A JSON-RPC) are forwarded without buffering.
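Steps 1-3 can be sketched in a few lines of Python. This is an illustrative simplification, not the proxy's actual implementation: the route table here is hypothetical and hard-codes two services, whereas the real routes come from network-policy.json.

```python
from typing import Dict, Tuple

# Simplified route table (the real routes are defined in network-policy.json).
ROUTES = {
    "openai":    ("https://api.openai.com",    "Authorization", "Bearer {}"),
    "anthropic": ("https://api.anthropic.com", "x-api-key",     "{}"),
}

def rewrite(path: str, headers: Dict[str, str], keystore: Dict[str, str]) -> Tuple[str, Dict[str, str]]:
    """Strip the service prefix, inject the credential, build the upstream URL."""
    service, _, rest = path.lstrip("/").partition("/")
    upstream, header, fmt = ROUTES[service]
    out = dict(headers)
    out[header] = fmt.format(keystore[service])  # credential comes from the keystore, never the agent
    return f"{upstream}/{rest}", out

url, hdrs = rewrite("/openai/v1/chat/completions", {}, {"openai": "sk-test"})
# url  -> "https://api.openai.com/v1/chat/completions"
# hdrs -> {"Authorization": "Bearer sk-test"}
```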

Quick Start

# 1. Store credentials in the system keystore
security add-generic-password -s "nono" -a "openai_api_key" -w "sk-..."  # macOS

# 2. Run with credential injection
nono run --allow-cwd --network-profile claude-code --proxy-credential openai -- my-agent
The proxy sets OPENAI_BASE_URL=http://127.0.0.1:<port>/openai in the child’s environment. Most LLM SDKs respect this variable and redirect API calls through the proxy automatically.
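The redirect relies on ordinary SDK behavior: the client reads the base-URL variable before falling back to its default. Schematically (not the SDK's actual code; the port is assumed for illustration):

```python
import os

# nono exports this into the child's environment; the port 45123 is assumed here.
os.environ["OPENAI_BASE_URL"] = "http://127.0.0.1:45123/openai"

# Simplified version of how an LLM SDK resolves its base URL.
base_url = os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")
request_url = base_url.rstrip("/") + "/chat/completions"
# request_url -> "http://127.0.0.1:45123/openai/chat/completions"
```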

Storing Credentials

Credentials are stored in the system keyring under the service name nono. The username/account corresponds to the credential_key defined in network-policy.json:
CLI Service    Keyring Username                Keyring Service
openai         openai_api_key                  nono
anthropic      anthropic_api_key               nono
gemini         gemini_api_key                  nono
google-ai      google_generative_ai_api_key    nono

macOS

security add-generic-password -s "nono" -a "openai_api_key" -w "sk-..."
security add-generic-password -s "nono" -a "anthropic_api_key" -w "sk-ant-..."
security add-generic-password -s "nono" -a "gemini_api_key" -w "your-gemini-key"

Linux

The keyring crate uses service, username, and target attributes. You must use these exact attribute names:
echo -n "sk-..." | secret-tool store --label="nono: openai_api_key" \
    service nono username openai_api_key target default

echo -n "sk-ant-..." | secret-tool store --label="nono: anthropic_api_key" \
    service nono username anthropic_api_key target default

echo -n "your-gemini-key" | secret-tool store --label="nono: gemini_api_key" \
    service nono username gemini_api_key target default
Important: Use username (not account) as the attribute name. The target default attribute is required for the keyring crate to find the entry.

Environment Variables

When credential routes are configured, the proxy sets SDK-specific base URL environment variables:
Route        Environment Variable    Value
openai       OPENAI_BASE_URL         http://127.0.0.1:<port>/openai
anthropic    ANTHROPIC_BASE_URL      http://127.0.0.1:<port>/anthropic
Most LLM SDKs (OpenAI Python, Anthropic Python, etc.) respect these variables and redirect API calls through the proxy automatically.

Credential Route Configuration

The built-in network-policy.json defines default credential routes:
Service      Upstream                                     Header           Format
openai       https://api.openai.com/v1                    Authorization    Bearer
anthropic    https://api.anthropic.com                    x-api-key        -
gemini       https://generativelanguage.googleapis.com    x-goog-api-key   -
google-ai    https://generativelanguage.googleapis.com    x-goog-api-key   -
Note: OpenAI’s upstream includes /v1 because the OpenAI SDK expects the base URL to include the version prefix. Anthropic’s SDK adds /v1/messages automatically, so its upstream is the root URL.
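Schematically, the two SDK conventions compose with the proxied base URLs like this (illustrative; port 45123 is assumed):

```python
# OpenAI-style clients append only the endpoint, so the upstream must carry /v1:
openai_url = "http://127.0.0.1:45123/openai" + "/chat/completions"
# -> proxied to https://api.openai.com/v1/chat/completions

# Anthropic-style clients add the versioned path themselves, so the upstream is the root:
anthropic_url = "http://127.0.0.1:45123/anthropic" + "/v1/messages"
# -> proxied to https://api.anthropic.com/v1/messages
```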

Using Credentials in Profiles

User profiles can specify which credential services to enable in the network section:
{
  "meta": { "name": "my-agent" },
  "filesystem": {
    "allow": ["$WORKDIR"]
  },
  "network": {
    "network_profile": "claude-code",
    "proxy_credentials": ["openai", "anthropic"]
  }
}

Custom Credential Definitions

For APIs not covered by the built-in services, you can define custom credentials in your profile. This lets you use --proxy-credential with any API while keeping credentials out of the sandbox.
{
  "meta": { "name": "my-agent" },
  "network": {
    "network_profile": "minimal",
    "proxy_credentials": ["openai", "telegram"],
    "custom_credentials": {
      "telegram": {
        "upstream": "https://api.telegram.org",
        "credential_key": "telegram_bot_token",
        "inject_header": "Authorization",
        "credential_format": "Bearer {}"
      }
    }
  }
}
Field              Required     Default        Description
upstream           Yes          -              Upstream URL to proxy requests to (must be HTTPS, or HTTP for localhost only)
credential_key     Yes          -              Keystore account name (alphanumeric and underscores only)
inject_mode        No           header         Credential injection mode: header, url_path, query_param, or basic_auth
inject_header      No           Authorization  HTTP header to inject the credential into (used with header and basic_auth modes)
credential_format  No           Bearer {}      Format string for the credential value ({} is replaced with the credential)
path_pattern       Conditional  -              Required for url_path mode. URL path pattern with {} placeholder (e.g., /bot{}/)
path_replacement   No           -              Optional replacement pattern for url_path mode (e.g., /v2/bot{}/)
query_param_name   Conditional  -              Required for query_param mode. Query parameter name for credential injection (e.g., key or api_key)
Important: Use underscores, not hyphens, in credential names (the keys in custom_credentials). The credential name is used to generate environment variables like TELEGRAM_BASE_URL. Shell variable names cannot contain hyphens, so my-api would create MY-API_BASE_URL which cannot be referenced as $MY-API_BASE_URL in shell scripts. Use my_api instead.
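The naming constraint can be checked up front. A small sketch (hypothetical helper, not part of nono):

```python
import re

def base_url_var(credential_name: str) -> str:
    """Derive the <NAME>_BASE_URL variable name for a custom credential."""
    # Hyphens would yield a variable name the shell cannot reference, so reject them.
    if not re.fullmatch(r"[a-z0-9_]+", credential_name):
        raise ValueError(f"{credential_name!r}: use underscores, not hyphens")
    return credential_name.upper() + "_BASE_URL"

# base_url_var("telegram") -> "TELEGRAM_BASE_URL"
# base_url_var("my-api")   -> ValueError
```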

Injection Modes

Custom credentials support multiple injection patterns to accommodate different API authentication schemes:
Header Mode (default)
Injects the credential as an HTTP header with optional formatting. This is the most common authentication pattern.
{
  "custom_credentials": {
    "telegram": {
      "upstream": "https://api.telegram.org",
      "credential_key": "telegram_bot_token",
      "inject_mode": "header",
      "inject_header": "Authorization",
      "credential_format": "Bearer {}"
    }
  }
}
URL Path Mode
Replaces a phantom token in the URL path with the real credential. Useful for APIs like Telegram Bot API that embed authentication tokens in the path (e.g., /bot{token}/method).
{
  "custom_credentials": {
    "telegram": {
      "upstream": "https://api.telegram.org",
      "credential_key": "telegram_bot_token",
      "inject_mode": "url_path",
      "path_pattern": "/bot{}/",
      "path_replacement": "/bot{}/"
    }
  }
}
The agent sends requests with a phantom token:
POST http://127.0.0.1:PORT/telegram/bot<NONO_PROXY_TOKEN>/sendMessage
The proxy validates the phantom token matches the session token, then replaces it with the real credential:
POST https://api.telegram.org/bot<REAL_TOKEN>/sendMessage
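The validate-and-substitute step might look like this in outline (illustrative Python, not the proxy's actual implementation):

```python
import hmac
import re

def inject_path(path: str, pattern: str, real_token: str, session_token: str) -> str:
    """Find the phantom token via path_pattern, verify it, substitute the real credential."""
    # Turn a pattern like "/bot{}/" into a regex that captures the phantom token.
    rx = re.escape(pattern).replace(r"\{\}", "([^/]+)")
    m = re.search(rx, path)
    if m is None or not hmac.compare_digest(m.group(1), session_token):
        raise PermissionError("401 Unauthorized: missing or invalid phantom token")
    return path[:m.start(1)] + real_token + path[m.end(1):]

# inject_path("/botPHANTOM/sendMessage", "/bot{}/", "123:real", "PHANTOM")
#   -> "/bot123:real/sendMessage"
```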
Query Parameter Mode
Adds or replaces a query parameter with the credential value. Common for APIs that use URL query parameters for authentication (e.g., Google Maps API).
{
  "custom_credentials": {
    "google_maps": {
      "upstream": "https://maps.googleapis.com",
      "credential_key": "google_maps_api_key",
      "inject_mode": "query_param",
      "query_param_name": "key"
    }
  }
}
The agent sends requests with a phantom token in the query parameter:
GET http://127.0.0.1:PORT/google_maps/maps/api/geocode/json?key=<NONO_PROXY_TOKEN>&address=...
The proxy validates the phantom token, then replaces it with the real credential:
GET https://maps.googleapis.com/maps/api/geocode/json?key=<REAL_API_KEY>&address=...
The credential value is URL-encoded automatically.
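In outline, the substitution for query-parameter mode (illustrative sketch; here urlencode handles the URL-encoding):

```python
import hmac
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def inject_query(url: str, param: str, real_key: str, session_token: str) -> str:
    """Verify the phantom token in the named query parameter, then swap in the real key."""
    parts = urlsplit(url)
    q = dict(parse_qsl(parts.query))
    if not hmac.compare_digest(q.get(param, ""), session_token):
        raise PermissionError("401 Unauthorized: missing or invalid phantom token")
    q[param] = real_key  # urlencode() percent-encodes the value
    return urlunsplit(parts._replace(query=urlencode(q)))
```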
Basic Auth Mode
Injects a Base64-encoded Basic Authentication header. The credential value should be stored in username:password format in the keystore.
{
  "custom_credentials": {
    "private_api": {
      "upstream": "https://api.example.com",
      "credential_key": "example_basic_auth",
      "inject_mode": "basic_auth"
    }
  }
}
Store the credential in username:password format:
# macOS
security add-generic-password -s "nono" -a "example_basic_auth" -w "myuser:mypassword"

# Linux
echo -n "myuser:mypassword" | secret-tool store --label="nono: example_basic_auth" \
    service nono username example_basic_auth target default
The proxy automatically Base64-encodes the credential and injects it as Authorization: Basic <encoded>.
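The encoding step is standard Basic authentication (RFC 7617); for reference, a sketch:

```python
import base64

def basic_auth_header(credential: str) -> str:
    """Base64-encode a 'username:password' keystore value into an Authorization header value."""
    return "Basic " + base64.b64encode(credential.encode()).decode()

# basic_auth_header("myuser:mypassword") -> "Basic bXl1c2VyOm15cGFzc3dvcmQ="
```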

Phantom Token Validation

For url_path and query_param modes, the agent must include the session token (NONO_PROXY_TOKEN) as a placeholder in the request. The proxy validates this phantom token before replacing it with the real credential. Invalid or missing phantom tokens result in HTTP 401 Unauthorized responses.

Store the credential in the system keystore:
# macOS
security add-generic-password -s "nono" -a "telegram_bot_token" -w "your-bot-token"

# Linux
echo -n "your-bot-token" | secret-tool store --label="nono: telegram_bot_token" \
    service nono username telegram_bot_token target default
Then run with the custom credential:
nono run --profile my-agent --proxy-credential telegram -- my-bot
Custom credentials can also override built-in services. For example, to route OpenAI requests through a custom proxy:
{
  "network": {
    "custom_credentials": {
      "openai": {
        "upstream": "https://my-openai-proxy.example.com/v1",
        "credential_key": "my_openai_key",
        "inject_header": "Authorization",
        "credential_format": "Bearer {}"
      }
    }
  }
}

Security Validation

Custom credentials are validated at startup:
  • Upstream URL must be HTTPS (HTTP is only allowed for localhost, 127.0.0.1, or ::1)
  • Credential key must be alphanumeric (letters, numbers, and underscores only)
Invalid configurations will fail with a clear error message before the sandbox is applied.
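Those two checks amount to roughly the following (illustrative Python, not nono's actual code):

```python
import re
from urllib.parse import urlsplit

def validate_custom_credential(upstream: str, credential_key: str) -> None:
    """Reject non-HTTPS upstreams (except localhost) and malformed credential keys."""
    u = urlsplit(upstream)
    is_local = u.hostname in ("localhost", "127.0.0.1", "::1")
    if u.scheme != "https" and not (u.scheme == "http" and is_local):
        raise ValueError(f"upstream must be HTTPS: {upstream!r}")
    if not re.fullmatch(r"[A-Za-z0-9_]+", credential_key):
        raise ValueError(f"credential_key must be alphanumeric/underscore: {credential_key!r}")
```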

Session Token Authentication

Reverse proxy requests are authenticated using an X-Nono-Token header containing the session token. The proxy generates a unique 256-bit token per session and passes it to the child via the NONO_PROXY_TOKEN environment variable. Every request to a credential route must include this header — requests without a valid token are rejected with 407 Proxy Authentication Required. This prevents other localhost processes from accessing the credential injection routes.
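A sketch of the check (the proxy's implementation details are not shown here; a constant-time comparison is assumed):

```python
import hmac
import secrets

SESSION_TOKEN = secrets.token_hex(32)  # 256 bits, generated once per session

def is_authorized(headers: dict) -> bool:
    """Accept a request only if X-Nono-Token matches the session token."""
    return hmac.compare_digest(headers.get("X-Nono-Token", ""), SESSION_TOKEN)

# is_authorized({"X-Nono-Token": SESSION_TOKEN}) -> True
# is_authorized({})                              -> False (407 to the client)
```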

Security Properties

  • Credentials never enter the sandbox - The agent process has no access to API keys, even through environment variables or memory
  • Session token isolation - Reverse proxy routes require X-Nono-Token authentication; CONNECT tunnels use Proxy-Authorization
  • Keystore-backed storage - Credentials are loaded from the OS keyring (Keychain on macOS, Secret Service on Linux), not from plaintext files
  • Zeroized in memory - Credential values are stored in Zeroizing<String> and wiped from memory on drop
  • Session-scoped - Credentials are loaded once at proxy startup and never written to disk or logged
  • Header stripping - The proxy strips any Authorization or x-api-key headers from the agent’s request before injecting the real credential, preventing the agent from overriding the injected value
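The header-stripping rule from the last bullet, in outline (illustrative, not the proxy's actual code):

```python
def sanitize_headers(agent_headers: dict, inject_header: str, value: str) -> dict:
    """Drop agent-supplied auth headers, then set the injected credential."""
    cleaned = {k: v for k, v in agent_headers.items()
               if k.lower() not in ("authorization", "x-api-key")}
    cleaned[inject_header] = value
    return cleaned

# sanitize_headers({"Authorization": "Bearer fake", "Accept": "*/*"}, "Authorization", "Bearer real")
#   -> {"Accept": "*/*", "Authorization": "Bearer real"}
```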

Audit Logging

Reverse proxy requests are logged with the service name and status code, but credential values are never logged:
ALLOW REVERSE openai POST /v1/chat/completions -> 200
ALLOW REVERSE anthropic POST /v1/messages -> 200
