Official OpenAI integration — access GPT-4o, DALL-E, Whisper, and Embeddings from any MCP client
Runs in an isolated sandbox · 5 free calls per hour · keys never stored
The official OpenAI MCP server lets you use OpenAI's full model suite as tools within any MCP-compatible AI client, including Claude, Cursor, and Windsurf.

Features:

- Chat completions with any GPT model (GPT-4o, o1, o3-mini)
- DALL-E 3 image generation from text prompts
- Whisper audio transcription
- Text embeddings for semantic search
- Moderation API for content safety checks
- Function/tool calling passthrough
- Streaming response support
- Batch API for high-volume requests

Cross-model use cases:

- Use Claude as the orchestrator while delegating image generation to DALL-E
- Use GPT-4o for tasks where it outperforms other models
- Transcribe voice input with Whisper inside a Claude workflow
- Generate embeddings for RAG pipelines directly from MCP

Full OpenAI API compatibility: all models and endpoints are supported.
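To make the tool-delegation flow concrete, here is a minimal sketch of the JSON-RPC 2.0 `tools/call` request an MCP client sends when it hands a chat completion off to a server like this one. The tool name `chat_completion` and its argument names are assumptions for illustration, not this server's documented schema; consult the server's tool listing for the real names.

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 'tools/call' request, the envelope MCP uses
    for invoking a server-side tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical example: an orchestrating client (e.g. Claude) delegating
# a completion to GPT-4o. "chat_completion" is an assumed tool name.
request = build_tool_call(
    1,
    "chat_completion",
    {"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]},
)
print(json.dumps(request, indent=2))
```

The same envelope carries every cross-model use case above; only `params.name` and `params.arguments` change (e.g. an assumed `generate_image` tool with a DALL-E prompt, or an embeddings tool with input text).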