Access 500k+ AI models, datasets, and Spaces on Hugging Face directly from your AI agent
Runs in an isolated sandbox · 5 free calls per hour · keys never stored
The Hugging Face MCP server gives AI agents direct access to the world's largest AI model repository: search models, run inference, access datasets, and manage Spaces.

Features:
- Search 500,000+ models by task, framework, and performance
- Run inference on any model via the Inference API
- Access 150,000+ datasets with previews and schema inspection
- Manage and deploy Hugging Face Spaces
- View model cards, benchmarks, and community discussions
- Compare model performance across leaderboards
- Download model weights and tokenizer configs
- Access the Open LLM Leaderboard results

Use cases:
- "Find the best open-source model for sentiment analysis under 1B parameters"
- "Run this text through the Mistral-7B model and return the output"
- "What datasets are available for training a code completion model?"
- "Deploy this Gradio Space and get me the URL"

Essential for AI/ML engineers building with open-source models.
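Wiring an agent up to a remote MCP server like this usually comes down to a single entry in the client's MCP configuration. A minimal sketch is below; the endpoint URL, server key, and `HF_TOKEN` variable are illustrative assumptions, so check the server's own docs for the exact values your client expects.

```json
{
  "mcpServers": {
    "hugging-face": {
      "url": "https://huggingface.co/mcp",
      "headers": {
        "Authorization": "Bearer ${HF_TOKEN}"
      }
    }
  }
}
```

Passing the token through an environment variable reference (rather than pasting it into the file) keeps credentials out of config files checked into version control.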
Like what you see?
Host Hugging Face MCP always-on for $9/mo — no config files, no restarts