Getting Started with LLM-MCP
This guide provides a step-by-step introduction to get you up and running with the LLM Platform Model Context Protocol (LLM-MCP). By the end, you'll understand how to install, configure, and use LLM-MCP for AI tool orchestration and model management.
Table of Contents
- What is LLM-MCP?
- Prerequisites
- Installation
- Quick Start
- Basic Usage
- Integration with Other Services
- Vector Storage Operations
- Next Steps
- Troubleshooting
- Support and Resources
What is LLM-MCP?
LLM-MCP (LLM Platform Model Context Protocol) is a unified protocol and platform for:
- Tool Registration: Register and discover AI tools across services
- Tool Execution: Execute tools through a standardized protocol
- Model Management: Register and manage AI models
- Vector Operations: Store and retrieve vector embeddings
- Agent Workflows: Coordinate multi-agent AI workflows
Prerequisites
Before getting started with LLM-MCP, ensure you have:
- Node.js: v18.x or later
- npm: v9.x or later
- Docker: (optional) for containerized deployment
- Redis: v6.x or later, for caching and message queuing
- MongoDB: v5.x or later, or Qdrant (latest version), for vector storage
Installation
Method 1: NPM Installation
# Create a new project directory
mkdir my-llm-mcp-project
cd my-llm-mcp-project
# Initialize a new Node.js project
npm init -y
# Install LLM-MCP
npm install @bluefly/llm-mcp
# Install peer dependencies
npm install @bluefly/bfllm @bluefly/bfapi
Method 2: Docker Installation
# Pull the LLM-MCP Docker image
docker pull bluefly/llm-mcp:latest
# Run the container
docker run -p 3001:3001 \
-e NODE_ENV=production \
-e REDIS_URL=redis://redis:6379 \
-e MONGODB_URI=mongodb://mongodb:27017/llm-mcp \
bluefly/llm-mcp:latest
Method 3: Clone and Build
# Clone the repository
git clone https://github.com/bluefly/llm-mcp.git
cd llm-mcp
# Install dependencies
npm install
# Build the project
npm run build
# Start the service
npm start
Quick Start
1. Initialize LLM-MCP
Create a file named index.js with the following content:
const { BfmcpServer } = require('@bluefly/llm-mcp');
// Create a new LLM-MCP server instance
const server = new BfmcpServer({
port: 3001,
redis: {
host: 'localhost',
port: 6379
},
storage: {
type: 'mongodb', // or 'qdrant'
uri: 'mongodb://localhost:27017/llm-mcp'
}
});
// Start the server
server.start()
.then(() => console.log('LLM-MCP Server running on port 3001'))
.catch(err => console.error('Failed to start LLM-MCP server:', err));
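For long-running deployments you will usually want to stop the server cleanly as well. A minimal sketch, assuming the server instance exposes a stop() method (confirm the exact method name in the API Reference):

// Gracefully stop the server on Ctrl+C
process.on('SIGINT', async () => {
  console.log('Shutting down LLM-MCP server...');
  await server.stop(); // assumed shutdown method; verify against the API Reference
  process.exit(0);
});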
2. Run Your LLM-MCP Server
node index.js
3. Test the Installation
You can test if your LLM-MCP server is running correctly by checking the health endpoint:
curl http://localhost:3001/health
You should receive a response like:
{
"status": "ok",
"version": "1.0.0",
"services": {
"redis": "connected",
"storage": "connected",
"toolRegistry": "active"
}
}
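You can run the same check programmatically. The sketch below relies only on the /health endpoint shown above and the global fetch available in Node.js 18+:

// health-check.js - verify the LLM-MCP server is reachable
async function checkHealth() {
  const res = await fetch('http://localhost:3001/health');
  if (!res.ok) throw new Error(`Health check failed with HTTP ${res.status}`);
  const body = await res.json();
  console.log('Server status:', body.status, '| services:', body.services);
}

checkHealth().catch(err => console.error(err.message));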
Basic Usage
Registering a Tool
const { BfmcpClient } = require('@bluefly/llm-mcp/client');
// Connect to LLM-MCP server
const client = new BfmcpClient({
serverUrl: 'http://localhost:3001'
});
// Register a tool
const toolDefinition = {
id: "weather_lookup",
name: "Weather Lookup",
description: "Get current weather for a location",
version: "1.0.0",
input_schema: {
type: "object",
properties: {
location: {
type: "string",
description: "City and country/state"
}
},
required: ["location"]
},
output_schema: {
type: "object",
properties: {
temperature: {
type: "number",
description: "Current temperature in Celsius"
},
conditions: {
type: "string",
description: "Weather conditions description"
}
}
},
function: async ({ location }) => {
// Implementation of the weather lookup
// ...
return { temperature: 22.5, conditions: "Partly cloudy" };
}
};
// Register the tool
await client.registerTool(toolDefinition);
console.log(`Tool ${toolDefinition.id} registered successfully`);
Executing a Tool
// Execute the tool
const result = await client.executeTool("weather_lookup", {
location: "New York, NY"
});
console.log("Weather result:", result);
// Output: Weather result: { temperature: 22.5, conditions: "Partly cloudy" }
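Tool execution can fail, for example when the tool id is unknown or the input violates the tool's input_schema, so production code should handle errors. A defensive sketch; the exact error shape thrown by the client is an assumption, so inspect the errors your version actually produces:

try {
  const result = await client.executeTool("weather_lookup", {
    location: "New York, NY"
  });
  console.log("Weather result:", result);
} catch (err) {
  // Illustrative handling; the error's fields depend on the client version
  console.error("Tool execution failed:", err.message);
}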
Using the gRPC API
For high-performance applications, LLM-MCP provides a gRPC API:
const { BfmcpGrpcClient } = require('@bluefly/llm-mcp/grpc');
// Create a gRPC client
const grpcClient = new BfmcpGrpcClient('localhost:3002');
// Execute a tool via gRPC
const result = await grpcClient.executeTool("weather_lookup", {
location: "Seattle, WA"
});
console.log("Weather result:", result);
Integration with Other Services
Integrating with BFLLM
LLM-MCP works seamlessly with BFLLM for AI model inference:
const { BfllmClient } = require('@bluefly/bfllm');
const { BfmcpClient } = require('@bluefly/llm-mcp/client');
// Initialize clients
const llmClient = new BfllmClient('http://localhost:3002');
const mcpClient = new BfmcpClient({ serverUrl: 'http://localhost:3001' });
// Register BFLLM as a tool provider
await mcpClient.registerProvider({
id: "bfllm_provider",
name: "BFLLM Provider",
description: "Provides access to language models",
endpoint: "http://localhost:3002"
});
// Now BFLLM models are available as tools through LLM-MCP
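Once the provider is registered, its models can be invoked through the same tool interface as any other tool. The tool id and parameters below are hypothetical placeholders; list the registered tools on your server (or consult the BFLLM documentation) for the real identifiers:

// Hypothetical example: execute a BFLLM-backed model as a standard tool
const completion = await mcpClient.executeTool("bfllm_provider.text_generation", {
  prompt: "Summarize the Model Context Protocol in one sentence."
});
console.log("Model output:", completion);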
Integrating with Drupal
For Drupal integration, use the provided module:
# From your Drupal root
composer require bluefly/llm_platform_recipe
# Enable the MCP module
drush en llm_mcp
Configure the connection in your Drupal settings:
// settings.php
$settings['llm_mcp.settings'] = [
'endpoint' => 'http://localhost:3001',
'api_key' => 'your-api-key',
];
Vector Storage Operations
LLM-MCP provides built-in vector storage and retrieval:
// Store a vector
await client.storeVector({
id: "doc-123",
vector: [0.1, 0.2, 0.3, ...], // embedding vector
metadata: {
title: "Important Document",
content: "This is an important document about...",
tags: ["important", "document"]
}
});
// Retrieve similar vectors
const results = await client.findSimilarVectors({
vector: [0.15, 0.22, 0.28, ...],
limit: 5,
scoreThreshold: 0.75
});
console.log("Similar documents:", results);
Next Steps
Now that you have LLM-MCP up and running, here are some next steps:
- Explore the API Reference to learn about all available endpoints and operations
- Review the Architecture Overview to understand the system design
- Configure advanced settings using the Advanced Configuration Guide
- Set up monitoring and observability with the Monitoring and Logging Guide
- Develop custom tools by following the Custom Tool Development Guide
- Optimize performance with the Performance Tuning Guide
- Secure your deployment using Security Best Practices
For more detailed information, see the full Documentation Index.
Troubleshooting
Common Issues
Connection Refused
If you see a "Connection refused" error:
- Ensure the LLM-MCP server is running (a quick probe script is sketched after this list)
- Check if the port (default: 3001) is available
- Verify firewall settings allow connections to the port
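A quick way to tell "server not running" apart from "port blocked" is to probe the health endpoint directly. A minimal sketch using the global fetch in Node.js 18+:

// probe.js - connectivity check for the LLM-MCP server
const url = process.env.MCP_URL || 'http://localhost:3001/health';

fetch(url)
  .then(res => console.log(`Reached ${url}: HTTP ${res.status}`))
  .catch(err => console.error(`Could not reach ${url}:`, err.cause?.code || err.message));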
Authentication Failed
If you encounter authentication errors:
- Check that your API key is correctly configured (see the client sketch after this list)
- Ensure your client has the proper credentials
- Verify the authentication settings in your configuration
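The examples in this guide connect without credentials. If your deployment enforces authentication, the key must be supplied by the client; the apiKey option below is an assumption, mirroring the api_key setting shown in the Drupal configuration, so confirm the exact option name in the API Reference:

// Hypothetical: supply the API key when constructing the client
// (option name assumed; verify against your version's API Reference)
const client = new BfmcpClient({
  serverUrl: 'http://localhost:3001',
  apiKey: process.env.LLM_MCP_API_KEY
});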
Storage Connection Issues
If you experience storage connection problems:
- Verify that Redis is running and accessible
- Check your MongoDB or Qdrant connection string
- Ensure database users have proper permissions
For more help, please refer to our Troubleshooting Guide or open an issue on our GitHub repository.
Support and Resources
- GitHub Repository: https://github.com/bluefly/llm-mcp
- Documentation: https://docs.bluefly.ai/llm-mcp
- Community Forum: https://community.bluefly.ai/categories/llm-mcp
- Issue Tracker: https://github.com/bluefly/llm-mcp/issues
See Also
- Tool Registration Guide - Detailed guide for registering and managing tools
- Vector Operations Guide - Comprehensive guide to vector storage and retrieval
- Basic Usage Examples - Code examples for common LLM-MCP operations
- Troubleshooting Guide - Solutions to common problems
- API Reference - Complete API documentation