# LLM-MCP (LLM Platform Model Context Protocol)
LLM-MCP is the orchestration and agent registry server for the Acme AI Platform. It provides a unified protocol for tool registration, vector operations, and model management. This documentation hub contains comprehensive guides, references, and examples for working with the LLM-MCP system.
## Table of Contents
## Documentation Standards
- Documentation Standards - Style guide and standards for all documentation
- Documentation Cross References - Mapping of relationships between docs
- Documentation Progress Summary - Current progress and status
## Quick Start
Get started with LLM-MCP in minutes:
- Install dependencies:

  ```bash
  npm install
  ```

- Start the server:

  ```bash
  npm run start
  ```

- Access the OpenAPI explorer:
## Architecture
LLM-MCP integrates with other platform components through REST APIs and standard protocols:

```mermaid
graph TD;
    A[Drupal llm_platform_mcp UI] -->|OpenAPI, REST| B(llm-mcp Server);
    B -->|OpenAPI, REST| C(GitLab CI/CD);
    C -->|Status API| D[Drupal UI: Deploy Status];
```
Key architectural components:
- Agent actions in Drupal trigger llm-mcp endpoints and GitLab pipelines
- OpenAPI contracts are validated in CI/CD
- Pipeline/job status is surfaced in the UI
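As an illustration of the second point, an OpenAPI contract-validation job in GitLab CI might look like the following sketch using the Spectral linter listed in the references. The job name, stage, image, and spec path are assumptions, not the project's actual pipeline configuration:

```yaml
# Hypothetical .gitlab-ci.yml job: lint the OpenAPI contract with Spectral.
# The spec path (openapi.yaml) and node image tag are illustrative assumptions.
validate-openapi:
  stage: test
  image: node:20
  script:
    - npx @stoplight/spectral-cli lint openapi.yaml
```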
## Documentation Sections
The documentation is organized into these main sections:
### API & Usage
- API Documentation - API references and specifications
- Examples - Usage scenarios and code examples
- Basic Usage Examples - Comprehensive code examples for common operations
- Integration - Integration with other platform components
### Architecture & Design
- Architecture Documentation - Architecture overview and design
- MCP Configuration - MCP server configuration
- Advanced Configuration - Comprehensive configuration documentation
- Performance Optimization - Performance tuning and optimization guides
- Security Best Practices - Security hardening and best practices
- Custom Tool Development - Developing and deploying custom tools
### Guides & Setup
- Getting Started Guide - Comprehensive step-by-step guide for new users
- Tool Registration Guide - Detailed guide for registering and using tools
- Vector Operations Guide - Detailed guide for vector storage and retrieval
- AI Streaming Integration - Guide for real-time AI response streaming
- AI Streaming & Vector Integration - Comprehensive guide for using AI streaming, vector operations, and function calling
- Function Calling Integration - Guide for integrating function calling capabilities
- Setup Guide - Detailed installation and setup
- Configuration Guide - Configuration options
- Cursor Directory Integration - Integration with Cursor Directory
### Operations & Deployment
- Deployment Documentation - Deployment instructions and environments
- Monitoring and Logging - Monitoring, logging, and observability guides
- Troubleshooting - Troubleshooting guides
### Reference
- Reference Documentation - Reference materials and specifications
- Changelog - Release notes and version history
## Common Use Cases
LLM-MCP supports these common use cases:
- Tool Registration and Discovery: Register custom tools and make them available for AI agents
- Vector Operations: Store, retrieve, and search vector embeddings for semantic search
- AI Streaming: Real-time AI response streaming with metrics tracking
- Function Calling: Register and execute functions with AI models
- Retrieval Augmented Generation (RAG): Implement RAG workflows with vector operations
- Multi-Provider Support: Abstract multiple AI providers behind a unified interface
- Agent Orchestration: Coordinate multiple AI agents and tools
- GitLab CI/CD Integration: Automate deployment and validation through pipelines
- Drupal Integration: Connect LLM-MCP with Drupal-based UIs
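To give a feel for the vector-operations use case above, here is a minimal in-memory sketch of cosine-similarity search over embeddings. In LLM-MCP these operations run server-side behind the REST API; the types and function names below are illustrative only and do not reflect the actual LLM-MCP API:

```typescript
// Minimal in-memory sketch of semantic search over vector embeddings.
// In LLM-MCP this runs server-side; all names here are illustrative.
type Doc = { id: string; embedding: number[] };

// Cosine similarity: dot product of the vectors divided by the
// product of their magnitudes.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the top-k documents most similar to the query embedding.
function search(docs: Doc[], query: number[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) =>
      cosineSimilarity(y.embedding, query) - cosineSimilarity(x.embedding, query))
    .slice(0, k);
}

const docs: Doc[] = [
  { id: "a", embedding: [1, 0, 0] },
  { id: "b", embedding: [0, 1, 0] },
  { id: "c", embedding: [0.9, 0.1, 0] },
];
console.log(search(docs, [1, 0, 0], 2).map(d => d.id)); // [ 'a', 'c' ]
```

A RAG workflow builds on the same primitive: store chunk embeddings, retrieve the top-k matches for a query, and pass them to the model as context.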
## Contributing
To contribute to LLM-MCP documentation:
- See README.md for contribution guidelines
- Follow the Documentation Standards
- See changelog.md for release notes
## References
- OpenAPI Spec - OpenAPI specification for LLM-MCP endpoints
- GitLab API Docs - GitLab API documentation
- Spectral Linter - OpenAPI linting tool
- Redocly - OpenAPI documentation generation
For questions or contributions, see the Acme AI Platform documentation hub or contact the platform team.