A ready-to-run example is included at the end of this page.
The LLMProfileStore class provides a centralized mechanism for managing LLM configurations. Define a profile once, reuse it everywhere — across scripts, sessions, and even machines.

Benefits

  • Persistence: Saves model parameters (model name, temperature, max tokens, and optionally API keys) as JSON files on disk.
  • Reusability: Import a defined profile into any script or session with a single identifier.
  • Portability: Simplifies the synchronization of model configurations across different machines or deployment environments.

How It Works

1. Create a Store

The store manages a directory of JSON profile files. By default it uses ~/.openhands/profiles, but you can point it anywhere.
from openhands.sdk import LLMProfileStore

# Default location: ~/.openhands/profiles
store = LLMProfileStore()

# Or bring your own directory
store = LLMProfileStore(base_dir="./my-profiles")
2. Save a Profile

Got an LLM configured just right? Save it for later.
from pydantic import SecretStr
from openhands.sdk import LLM, LLMProfileStore

fast_llm = LLM(
    usage_id="fast",
    model="anthropic/claude-sonnet-4-5-20250929",
    api_key=SecretStr("sk-..."),
    temperature=0.0,
)

store = LLMProfileStore()
store.save("fast", fast_llm)
API keys are excluded by default for security. Pass include_secrets=True to the save method if you wish to persist them; otherwise, they will be read from the environment at load time.
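Under the hood, the idea is simple: write the profile as JSON minus any secret fields, then fall back to the environment when loading. The sketch below illustrates that behavior with plain `json` — the function names (`save_profile`, `load_profile`) and the `SECRET_FIELDS` set are illustrative stand-ins, not the library's internals.

```python
import json
import os
import tempfile

# Hypothetical profile dict standing in for an LLM configuration.
profile = {
    "model": "anthropic/claude-sonnet-4-5-20250929",
    "temperature": 0.0,
    "api_key": "sk-secret",
}

SECRET_FIELDS = {"api_key"}


def save_profile(path, profile, include_secrets=False):
    # Drop secret fields unless the caller explicitly opts in.
    data = {
        k: v for k, v in profile.items()
        if include_secrets or k not in SECRET_FIELDS
    }
    with open(path, "w") as f:
        json.dump(data, f)


def load_profile(path):
    with open(path) as f:
        data = json.load(f)
    # Secrets that were not persisted fall back to the environment.
    if "api_key" not in data:
        data["api_key"] = os.getenv("LLM_API_KEY")
    return data


path = os.path.join(tempfile.mkdtemp(), "fast.json")
save_profile(path, profile)  # api_key is stripped by default
os.environ["LLM_API_KEY"] = "sk-from-env"
restored = load_profile(path)
print(restored["api_key"])  # → sk-from-env
```

The same trade-off applies to the real store: keeping secrets out of the file is safer, at the cost of needing the environment variable set wherever the profile is loaded.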
3. Load a Profile

Next time you need that LLM, just load it:
# Same model, ready to go.
llm = store.load("fast")
4. List and Clean Up

See what you’ve got, delete what you don’t need:
print(store.list())   # ['fast.json', 'creative.json']

store.delete("creative")
print(store.list())   # ['fast.json']

Good to Know

Profile names must be simple filenames (no slashes, no dots at the start).
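A minimal sketch of what such a name check might look like — the `is_valid_profile_name` helper is hypothetical, not part of the SDK:

```python
def is_valid_profile_name(name: str) -> bool:
    """Plain filename only: non-empty, no path separators, no leading dot."""
    return (
        bool(name)
        and "/" not in name
        and "\\" not in name
        and not name.startswith(".")
    )


print(is_valid_profile_name("fast"))     # → True
print(is_valid_profile_name("../evil"))  # → False
print(is_valid_profile_name(".hidden"))  # → False
```

This kind of restriction keeps profile files confined to the store's directory and prevents path-traversal names like `../evil`.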

Ready-to-run Example

This example is available on GitHub: examples/01_standalone_sdk/37_llm_profile_store.py
"""Example: Using LLMProfileStore to save and reuse LLM configurations.

LLMProfileStore persists LLM configurations as JSON files, so you can define
a profile once and reload it across sessions without repeating setup code.
"""

import os
import tempfile

from pydantic import SecretStr

from openhands.sdk import LLM, LLMProfileStore


# Use a temporary directory so this example doesn't pollute your home folder.
# In real usage you can omit base_dir to use the default (~/.openhands/profiles).
store = LLMProfileStore(base_dir=tempfile.mkdtemp())


# 1. Create two LLM profiles with different usage

api_key = os.getenv("LLM_API_KEY")
assert api_key is not None, "LLM_API_KEY environment variable is not set."
base_url = os.getenv("LLM_BASE_URL")
model = os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929")

fast_llm = LLM(
    usage_id="fast",
    model=model,
    api_key=SecretStr(api_key),
    base_url=base_url,
    temperature=0.0,
)

creative_llm = LLM(
    usage_id="creative",
    model=model,
    api_key=SecretStr(api_key),
    base_url=base_url,
    temperature=0.9,
)

# 2. Save profiles

# Note that secrets are excluded by default for safety.
store.save("fast", fast_llm)
store.save("creative", creative_llm)

# To persist the API key as well, pass `include_secrets=True`:
# store.save("fast", fast_llm, include_secrets=True)

# 3. List available persisted profiles

print(f"Stored profiles: {store.list()}")

# 4. Load a profile

loaded = store.load("fast")
assert isinstance(loaded, LLM)
print(
    "Loaded profile. "
    f"usage: {loaded.usage_id}, "
    f"model: {loaded.model}, "
    f"temperature: {loaded.temperature}."
)

# 5. Delete a profile

store.delete("creative")
print(f"After deletion: {store.list()}")

print("EXAMPLE_COST: 0")
You can run the example code as-is.
The model name should follow the LiteLLM convention: provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o). The LLM_API_KEY should be the API key for your chosen provider.
ChatGPT Plus/Pro subscribers: You can use LLM.subscription_login() to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the LLM Subscriptions guide for details.
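If you need to inspect a LiteLLM-style name programmatically, it splits on the first `/` into provider and model. A small illustrative helper (not part of the SDK):

```python
def split_model_name(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style name into (provider, model_name)."""
    provider, _, name = model.partition("/")
    return provider, name


print(split_model_name("anthropic/claude-sonnet-4-5-20250929"))
# → ('anthropic', 'claude-sonnet-4-5-20250929')
print(split_model_name("openai/gpt-4o"))
# → ('openai', 'gpt-4o')
```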

Mid-Conversation Model Switching

You can use a saved profile to switch the active model on a running conversation between turns. This is useful when you want to start with one model, then switch to another for later user messages while keeping the same conversation history and combined usage metrics.
examples/01_standalone_sdk/44_model_switching_in_convo.py
"""Mid-conversation model switching.

Usage:
    uv run examples/01_standalone_sdk/44_model_switching_in_convo.py
"""

import os

from openhands.sdk import LLM, Agent, LocalConversation, Tool
from openhands.sdk.llm.llm_profile_store import LLMProfileStore
from openhands.tools.terminal import TerminalTool


LLM_API_KEY = os.getenv("LLM_API_KEY")
assert LLM_API_KEY is not None, "LLM_API_KEY environment variable is not set."
store = LLMProfileStore()

store.save(
    "gpt",
    LLM(model="openhands/gpt-5.2", api_key=LLM_API_KEY),
    include_secrets=True,
)

agent = Agent(
    llm=LLM(
        model=os.getenv("LLM_MODEL", "openhands/claude-sonnet-4-5-20250929"),
        api_key=LLM_API_KEY,
    ),
    tools=[Tool(name=TerminalTool.name)],
)
conversation = LocalConversation(agent=agent, workspace=os.getcwd())

# Send a message with the default model
conversation.send_message("Say hello in one sentence.")
conversation.run()

# Switch to a different model and send another message
conversation.switch_profile("gpt")
print(f"Switched to: {conversation.agent.llm.model}")

conversation.send_message("Say goodbye in one sentence.")
conversation.run()

# Print metrics per model
for usage_id, metrics in conversation.state.stats.usage_to_metrics.items():
    print(f"  [{usage_id}] cost=${metrics.accumulated_cost:.6f}")

combined = conversation.state.stats.get_combined_metrics()
print(f"Total cost: ${combined.accumulated_cost:.6f}")
print(f"EXAMPLE_COST: {combined.accumulated_cost}")

store.delete("gpt")
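Conceptually, the combined metrics are just the per-usage metrics summed. A plain-Python sketch of that aggregation — the dict below stands in for `usage_to_metrics`; in the real SDK the entries are metric objects, not dicts, and the costs shown are made up:

```python
# Hypothetical per-usage metrics after a two-model conversation.
usage_to_metrics = {
    "agent": {"accumulated_cost": 0.012},  # default model's turns
    "gpt": {"accumulated_cost": 0.034},    # turns after switch_profile("gpt")
}

# Combined metrics sum the per-model accumulators.
total_cost = sum(m["accumulated_cost"] for m in usage_to_metrics.values())
print(f"Total cost: ${total_cost:.6f}")  # → Total cost: $0.046000
```

Because each model keeps its own usage bucket, you can attribute cost per model while still reporting one combined total for the conversation.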

Next Steps