🚀 LM Studio MCP Server
Enable models to collaborate and query each other
What this enables:
- Query multiple models concurrently (see the sketch after this list)
- Models can call other models for help
- True async operations with connection pooling
- Self-referential AI capabilities
- Highly reliable tool use with 14B-parameter and larger models
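As an illustration of the concurrent-query pattern, the sketch below fans out two prompts against LM Studio's OpenAI-compatible endpoint (default http://localhost:1234) and shares a single aiohttp session so connections are pooled. The model identifiers are placeholders for whatever models you have loaded; this is only a client-side usage sketch, not the MCP server's internal implementation.

```python
import asyncio
import aiohttp

# LM Studio's default OpenAI-compatible chat endpoint
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

async def query_model(session: aiohttp.ClientSession, model: str, prompt: str) -> str:
    """Send one chat completion request to a locally loaded model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    async with session.post(LM_STUDIO_URL, json=payload) as resp:
        data = await resp.json()
        return data["choices"][0]["message"]["content"]

async def main() -> None:
    # One shared ClientSession reuses connections (pooling) across requests.
    async with aiohttp.ClientSession() as session:
        # Model names are placeholders; substitute the models loaded in LM Studio.
        tasks = [
            query_model(session, "qwen2.5-14b-instruct",
                        "Summarize the MCP protocol in one sentence."),
            query_model(session, "llama-3.1-8b-instruct",
                        "Name two benefits of connection pooling."),
        ]
        for result in await asyncio.gather(*tasks):
            print(result)

asyncio.run(main())
```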
Configuration to be installed:
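The exact entry depends on your local installation. A minimal sketch of the shape of the entry, assuming a Python-based server script (the server name `lmstudio-mcp` and the script path are placeholders to replace with your local values):

```json
{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "python",
      "args": ["/path/to/lmstudio_mcp_server.py"]
    }
  }
}
```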
Alternative Installation Methods:
Method 1: Direct Deeplink
Method 2: Manual Configuration
Add the configuration shown above to the mcpServers section of ~/.lmstudio/mcp.json
⚠️ Security Notice:
This MCP server runs from your local installation and will be able to execute Python code and make network requests to your LM Studio instance.