Official OpenAI integration — access GPT-4o, DALL-E, Whisper, and Embeddings from any MCP client
The official OpenAI MCP server lets you use OpenAI's full model suite as tools within any MCP-compatible AI client, including Claude, Cursor, and Windsurf.

Features:

- Chat completions with any GPT model (GPT-4o, o1, o3-mini)
- DALL-E 3 image generation from text prompts
- Whisper audio transcription
- Text embeddings for semantic search
- Moderation API for content safety checks
- Function/tool calling passthrough
- Streaming response support
- Batch API for high-volume requests

Cross-model use cases:

- Use Claude as an orchestrator while delegating image generation to DALL-E
- Use GPT-4o for tasks where it outperforms other models
- Use Whisper for voice transcription within a Claude workflow
- Generate embeddings for RAG pipelines directly from MCP

Full OpenAI API compatibility: all models and endpoints are supported.
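Under the hood, MCP clients invoke these features as named tools over JSON-RPC 2.0, typically via stdio. A minimal sketch of the `tools/call` request a client might send; the tool name `chat_completion` and its argument shape are illustrative assumptions, not this server's documented schema:

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    # MCP uses JSON-RPC 2.0; "tools/call" invokes a named tool with
    # a JSON object of arguments. The tool name here is hypothetical.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

msg = make_tool_call(1, "chat_completion", {"model": "gpt-4o", "prompt": "Hello"})
print(json.dumps(msg))
```

The MCP client (Claude, Cursor, etc.) builds and sends these messages for you; the sketch is only to show what "exposing OpenAI models as tools" means on the wire.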
Quick start with the OpenAI Agents SDK, whose `MCPServerStdio` takes its launch spec as a single `params` dict:

```python
from agents import Agent
from agents.mcp import MCPServerStdio

# Launch the server over stdio via npx.
mcp_server = MCPServerStdio(
    params={
        "command": "npx",
        "args": ["-y", "@openai/mcp"],
        "env": {"OPENAI_API_KEY": "sk-proj-..."},  # prefer reading this from your environment
    },
)

agent = Agent(
    name="My Agent",
    model="gpt-4o",
    mcp_servers=[mcp_server],
)
```

Connect the server before running the agent (for example with `async with mcp_server:`) so its tools are discovered and available.

If you maintain OpenAI MCP, add this badge to your README to show it's verified on CuratedMCP:
[Verified on CuratedMCP](https://curatedmcp.com/marketplace/openai-mcp)