Model Providers

What are Model Providers?

A model provider is a service or platform that hosts and serves large language models through an API. The Strands Agents SDK abstracts away the complexity of working with different providers, offering a unified interface that makes it easy to switch between models or use multiple providers in the same application.

Supported Providers

The following model providers are supported by the Strands Agents SDK; availability varies between Python and TypeScript:

  • Custom Providers
  • Amazon Bedrock
  • Amazon Nova
  • OpenAI
  • Anthropic
  • Gemini
  • LiteLLM
  • llama.cpp
  • LlamaAPI
  • MistralAI
  • Ollama
  • SageMaker
  • Writer
  • Cohere
  • CLOVA Studio
  • FireworksAI

Getting Started

Installation

Most providers are available as optional dependencies. Install the provider you need:

# Python: install with a specific provider
pip install 'strands-agents[bedrock]'
pip install 'strands-agents[openai]'
pip install 'strands-agents[anthropic]'

# Python: or install with all providers
pip install 'strands-agents[all]'

# TypeScript: core SDK includes BedrockModel by default
npm install @strands-agents/sdk

# TypeScript: to use OpenAI, install the openai package
npm install openai

Note: In the TypeScript SDK, all model providers except Bedrock are listed as optional dependencies, so npm will attempt to install them automatically but won't fail if they're unavailable; you can also install them explicitly when needed. In Python, providers are opt-in extras and must be requested at install time (for example, pip install 'strands-agents[openai]').
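Because the Python extras only pull in a provider's client library when you ask for it, a quick runtime check can confirm whether a given provider package is importable before you construct a model. A minimal sketch; the package names `openai` and `anthropic` are assumed to match what the extras install:

```python
import importlib.util

def provider_available(module_name: str) -> bool:
    """Return True if the provider's client library can be imported."""
    return importlib.util.find_spec(module_name) is not None

# Assumed client package names pulled in by the corresponding extras
for pkg in ("openai", "anthropic"):
    status = "installed" if provider_available(pkg) else "missing"
    print(f"{pkg}: {status}")
```

This lets an application fail fast with a clear message instead of hitting an ImportError deep inside model construction.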

Basic Usage

Each provider follows a similar pattern for initialization and usage. Models are interchangeable: you can switch providers simply by changing the model instance:

from strands import Agent
from strands.models.bedrock import BedrockModel
from strands.models.openai import OpenAIModel

# Use Bedrock
bedrock_model = BedrockModel(
    model_id="anthropic.claude-sonnet-4-20250514-v1:0"
)
agent = Agent(model=bedrock_model)
response = agent("What can you help me with?")

# Alternatively, use OpenAI by just switching model provider
openai_model = OpenAIModel(
    client_args={"api_key": "<KEY>"},
    model_id="gpt-4o"
)
agent = Agent(model=openai_model)
response = agent("What can you help me with?")

import { Agent } from '@strands-agents/sdk'
import { BedrockModel } from '@strands-agents/sdk/models/bedrock'
import { OpenAIModel } from '@strands-agents/sdk/models/openai'

// Use Bedrock
const bedrockModel = new BedrockModel({
  modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
})
let agent = new Agent({ model: bedrockModel })
let response = await agent.invoke('What can you help me with?')

// Alternatively, use OpenAI by just switching model provider
const openaiModel = new OpenAIModel({
  apiKey: process.env.OPENAI_API_KEY,
  modelId: 'gpt-4o',
})
agent = new Agent({ model: openaiModel })
response = await agent.invoke('What can you help me with?')

Next Steps

Explore Model Providers

  • Amazon Bedrock - Default provider with wide model selection, enterprise features, and full Python/TypeScript support
  • OpenAI - GPT models with streaming support
  • Custom Providers - Build your own model integration
  • Anthropic - Direct Claude API access (Python only)