
Which Models Does CrewAI Support?

🤖 Protocol‑Agnostic Architecture

CrewAI implements open standards (Model Context Protocol – MCP and Agent2Agent – A2A), so you can:

  • Swap models without rewriting orchestration logic

  • Bridge between agents built on different LLMs

  • Extend to custom or fine‑tuned checkpoints

No matter where your model lives—AWS, OpenAI, private data center—CrewAI simply passes inputs/outputs over MCP or A2A.
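The "swap models without rewriting orchestration logic" idea can be sketched in plain Python as a minimal client interface. The names below (ModelClient, EchoClient, run_task) are illustrative stand-ins, not CrewAI's actual API:

```python
from typing import Protocol

class ModelClient(Protocol):
    """Anything that turns a prompt into text; the transport
    (MCP, A2A, plain HTTPS) is an implementation detail."""
    def generate(self, prompt: str) -> str: ...

class EchoClient:
    """Stand-in for a real provider client (e.g. OpenAI or Bedrock)."""
    def __init__(self, model_id: str):
        self.model_id = model_id

    def generate(self, prompt: str) -> str:
        # A real client would call the remote model here.
        return f"[{self.model_id}] {prompt}"

def run_task(client: ModelClient, prompt: str) -> str:
    # Orchestration depends only on the interface, so swapping
    # providers never touches this function.
    return client.generate(prompt)

print(run_task(EchoClient("gpt-4"), "hello"))
print(run_task(EchoClient("claude-3"), "hello"))
```

Because the orchestration function only sees the `generate` interface, switching from one provider to another is a one-line change at the call site.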

🧩 Supported Foundation Models

You can target any model by specifying its endpoint or ID:

  • Amazon Bedrock Models

    • Amazon Titan (Text & Embeddings)

    • AI21 Labs Jurassic‑2

    • Anthropic Claude 2 & 3

    • Cohere Command

    • Meta Llama 2

    • Mistral AI

  • OpenAI Models

    • GPT‑4, GPT‑4 Turbo

    • GPT‑3.5, Codex

  • Other Cloud & Open‑Source Models

    • Google Gemini & PaLM

    • Stability AI’s Stable Diffusion XL (vision)

    • Bloom, RedPajama, MPT series

    • Your own fine‑tuned checkpoints (via custom MCP client)
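One way to organize these targets is a small registry that maps a friendly alias to a provider and the model identifier you would pass to that provider's API. The Bedrock IDs below follow Amazon's published naming scheme but should be verified against the current Bedrock model catalog:

```python
# Illustrative registry: alias -> (provider, model identifier).
MODEL_REGISTRY = {
    "titan":    ("bedrock", "amazon.titan-text-express-v1"),
    "claude-3": ("bedrock", "anthropic.claude-3-sonnet-20240229-v1:0"),
    "llama-2":  ("bedrock", "meta.llama2-13b-chat-v1"),
    "gpt-4":    ("openai",  "gpt-4"),
}

def resolve(alias: str) -> tuple:
    """Look up (provider, model_id) for a human-friendly alias."""
    try:
        return MODEL_REGISTRY[alias]
    except KeyError:
        raise ValueError(f"Unknown model alias: {alias}") from None

print(resolve("claude-3"))
```

Keeping the provider/ID mapping in one place means the rest of your code can refer to models by alias and stay unchanged when you retarget a task.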

🛠️ Example: Switching Between Models

from crewai.protocols import MCPClient
from crewai import Agent, Crew

# Instantiate two MCP clients for different providers
openai_client = MCPClient(
    endpoint="https://api.openai.com/v1",
    api_key="OPENAI_KEY"  # placeholder: supply your OpenAI API key
)

anthropic_client = MCPClient(
    endpoint="https://api.anthropic.com/v1",
    api_key="ANTHROPIC_KEY"  # placeholder: supply your Anthropic API key
)

class DynamicAgent(Agent):
    def __init__(self, client):
        super().__init__()
        self.client = client

    def run(self, prompt):
        # Send prompt to the chosen model
        return self.client.generate(prompt)

# Create a Crew that uses both OpenAI GPT-4 and Anthropic Claude
crew = Crew([
    DynamicAgent(openai_client),
    DynamicAgent(anthropic_client),
])

responses = crew.run("Summarize the benefits of CrewAI.")
print(responses)

In this example, you’re orchestrating a hybrid Crew that leverages GPT‑4 for creativity and Claude for safety checks, without changing any orchestration code.

🚀 Getting Started

  1. Choose your model(s): Pick any foundation model’s API endpoint or Bedrock ID.

  2. Configure MCPClient or A2A transport: Supply credentials and endpoint.

  3. Attach to agents: Pass the client into your Agent class.

  4. Run your Crew or Flow: CrewAI handles the rest—routing inputs, aggregating outputs, and managing sessions.
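The four steps above can be sketched end-to-end with stub classes. This is pure Python that mimics the route-and-aggregate behavior described in step 4, not CrewAI's real internals; every name here is a placeholder:

```python
class StubClient:
    """Step 2 stand-in: a configured transport client."""
    def __init__(self, endpoint: str, api_key: str):
        self.endpoint = endpoint
        self.api_key = api_key

    def generate(self, prompt: str) -> str:
        # A real client would POST the prompt to self.endpoint.
        return f"{self.endpoint} -> {prompt}"

class StubAgent:
    """Step 3: an agent that owns one client."""
    def __init__(self, client: StubClient):
        self.client = client

    def run(self, prompt: str) -> str:
        return self.client.generate(prompt)

class StubCrew:
    """Step 4: route one prompt to every agent, aggregate outputs."""
    def __init__(self, agents):
        self.agents = agents

    def run(self, prompt: str) -> list:
        return [agent.run(prompt) for agent in self.agents]

crew = StubCrew([
    StubAgent(StubClient("https://api.openai.com/v1", "OPENAI_KEY")),
    StubAgent(StubClient("https://api.anthropic.com/v1", "ANTHROPIC_KEY")),
])
print(crew.run("Summarize the benefits of CrewAI."))
```

The crew loop never inspects which provider sits behind each agent; that separation is what makes the stack swappable.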

With CrewAI’s model‑agnostic design, you’re free to innovate, compare, and evolve your AI stack without lock‑in.