cherry-studio
By CherryHQ
Cross-platform desktop client for multiple LLM providers
April 22, 2025
What is this MCP
Cherry Studio's MCP (Model Context Protocol) support lets the desktop client integrate and manage multiple LLM providers through a single, unified interface.
How to use this MCP
MCP servers are configured from within the Cherry Studio desktop application, letting users connect cloud LLM services (OpenAI, Gemini, Anthropic, etc.) and local models (Ollama, LM Studio) through a single protocol.
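As an illustrative sketch only (the exact settings screen and field names depend on your Cherry Studio version), stdio-based MCP servers are typically registered with a command, arguments, and optional environment variables, in the JSON shape shared by most MCP clients. The server name and directory path below are placeholders:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"],
      "env": {}
    }
  }
}
```

Once registered, the server's tools become available to assistants and conversations inside the app.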
What this MCP can be used for
Through MCP, Cherry Studio supports multi-model conversations, custom AI assistant creation, document processing, and external tool integration, while preserving conversation context across different LLM providers and platforms.
Repository Info
Stars: 43419
Forks: 4074
Watchers: 43419
Last Updated: Today
