Access 500k+ AI models, datasets, and Spaces on Hugging Face directly from your AI agent
The Hugging Face MCP server gives AI agents direct access to the world's largest AI model repository: search models, run inference, access datasets, and manage Spaces.

Features:

- Search 500,000+ models by task, framework, and performance
- Run inference on any model via the Inference API
- Access 150,000+ datasets with previews and schema inspection
- Manage and deploy Hugging Face Spaces
- View model cards, benchmarks, and community discussions
- Compare model performance across leaderboards
- Download model weights and tokenizer configs
- Access the Open LLM Leaderboard results

Use cases:

- "Find the best open-source model for sentiment analysis under 1B parameters"
- "Run this text through the Mistral-7B model and return the output"
- "What datasets are available for training a code completion model?"
- "Deploy this Gradio Space and get me the URL"

Essential for AI/ML engineers building with open-source models.
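To make the first use case concrete, here is a minimal sketch of the kind of request a model search ultimately issues against the public Hugging Face Hub REST API (`/api/models`). The helper function name is hypothetical; the `filter`, `sort`, and `limit` parameters are part of the documented Hub API.

```python
from urllib.parse import urlencode

def build_model_search_url(task: str, sort: str = "downloads", limit: int = 5) -> str:
    """Build a Hugging Face Hub model-search URL (illustrative helper).

    Uses the public /api/models endpoint; `filter` narrows results to a
    task tag such as "text-classification" (i.e. sentiment analysis).
    """
    params = {
        "filter": task,   # task tag to filter on
        "sort": sort,     # rank results, e.g. by downloads
        "limit": limit,   # cap the number of results returned
    }
    return "https://huggingface.co/api/models?" + urlencode(params)
```

For example, `build_model_search_url("text-classification")` yields a URL that lists the most-downloaded sentiment-analysis-capable models; the MCP server wraps this kind of query behind a tool call so the agent never constructs URLs itself.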
```python
from agents import Agent
from agents.mcp import MCPServerStdio

# Launch the Hugging Face MCP server as a local subprocess via npx.
# The OpenAI Agents SDK takes the stdio launch spec as a `params` mapping.
mcp_server = MCPServerStdio(
    params={
        "command": "npx",
        "args": ["-y", "@huggingface/mcp"],
        "env": {
            "HF_TOKEN": "hf_...",  # your Hugging Face access token
        },
    },
)

# Attach the server so its tools are available to the agent.
agent = Agent(
    name="My Agent",
    model="gpt-4o",
    mcp_servers=[mcp_server],
)
```

If you maintain Hugging Face MCP, add this badge to your README to show it's verified on CuratedMCP:
[](https://curatedmcp.com/marketplace/hugging-face-mcp)