LLM-MCP Drupal Integration Guide

This guide provides detailed instructions for integrating the LLM Platform Model Context Protocol (LLM-MCP) with Drupal sites and applications. LLM-MCP enables Drupal to leverage advanced AI capabilities, vector storage, tool orchestration, and model management through a standardized protocol.

Overview

New in 2025.06: Enhanced AI Streaming integration provides real-time AI response streaming with metrics tracking. See the AI Streaming Integration Guide for details.

Drupal integration with LLM-MCP allows your Drupal site to:

  1. Access AI Models: Interact with language models through a standardized API
  2. Store and Search Vectors: Enable semantic search and recommendation features
  3. Register and Execute Tools: Extend Drupal with AI-powered tools
  4. Orchestrate Complex AI Workflows: Coordinate multi-step AI processes
  5. Manage Model Resources: Access model information and capabilities
  6. Stream AI Responses: Real-time response streaming with metrics tracking

This integration is implemented through Drupal modules that connect to the LLM-MCP server via REST and gRPC APIs.

Architecture

The integration architecture connects Drupal to the LLM-MCP server, which in turn communicates with other services like BFLLM and BFAPI:

graph TD
subgraph "Drupal"
DrupalCore["Drupal Core"]
LLMModule["LLM Module"]
MCPModule["MCP Client Module"]
VectorModule["Vector Storage Module"]
EntityModule["Entity Embeddings Module"]
end

subgraph "LLM-MCP"
MCPServer["MCP Server"]
ToolRegistry["Tool Registry"]
VectorService["Vector Service"]
ResourceRegistry["Resource Registry"]
end

subgraph "AI Services"
BFLLM["BFLLM"]
BFAPI["BFAPI"]
CustomTools["Custom Tools"]
end

DrupalCore --> LLMModule
DrupalCore --> MCPModule
DrupalCore --> VectorModule
DrupalCore --> EntityModule

LLMModule --> MCPModule
VectorModule --> MCPModule
EntityModule --> MCPModule

MCPModule --> MCPServer

MCPServer --> ToolRegistry
MCPServer --> VectorService
MCPServer --> ResourceRegistry

ToolRegistry --> BFLLM
ToolRegistry --> BFAPI
ToolRegistry --> CustomTools

VectorService --> BFLLM

Prerequisites

Before integrating LLM-MCP with Drupal, ensure you have:

  1. Drupal 10.x or 11.x: A working Drupal installation
  2. LLM-MCP Server: A running LLM-MCP server (version 1.0.0+)
  3. PHP 8.1+: Required for modern Drupal installations
  4. Composer: For installing required packages
  5. API Key: Authentication credentials for the LLM-MCP server

Installation

1. Install Required Drupal Modules

Use Composer to install the required modules:

# Navigate to your Drupal project root
cd /path/to/drupal

# Install the LLM Platform Recipe (includes all required modules)
composer require bluefly/llm_platform_recipe

# Or install individual modules
composer require bluefly/llm
composer require bluefly/llm_mcp
composer require bluefly/llm_vector

2. Enable the Modules

Enable the modules through Drush or the Drupal admin interface:

# Using Drush
drush en llm llm_mcp llm_vector -y

# Or enable the complete platform recipe
drush recipe:apply llm_platform

3. Configure Connection Settings

Configure the connection to your LLM-MCP server:

Using the Admin Interface

  1. Navigate to /admin/config/llm/settings
  2. Enter your LLM-MCP server URL and API key
  3. Save the configuration

Using Drush

# Set the MCP server URL
drush config-set llm_mcp.settings server_url "http://your-mcp-server:3001" -y

# Set the API key
drush config-set llm_mcp.settings api_key "your-api-key" -y

# Enable vector storage integration
drush config-set llm_vector.settings enabled true -y

Using settings.php

For production environments, add configuration overrides to your settings.php file. Drupal applies module configuration overrides through the $config array (not $settings):

// settings.php
$config['llm_mcp.settings']['server_url'] = 'http://your-mcp-server:3001';
$config['llm_mcp.settings']['api_key'] = 'your-api-key';
$config['llm_mcp.settings']['connection_timeout'] = 30;
$config['llm_mcp.settings']['retry_attempts'] = 3;
$config['llm_mcp.settings']['logging_enabled'] = TRUE;

$config['llm_vector.settings']['enabled'] = TRUE;
$config['llm_vector.settings']['collection'] = 'drupal_content';
$config['llm_vector.settings']['dimensions'] = 1536;
$config['llm_vector.settings']['similarity'] = 'cosine';

Basic Usage Examples

Connecting to the LLM-MCP Server

The MCP client service is available through Drupal's service container:

// Get the MCP client service
$mcp_client = \Drupal::service('llm_mcp.client');

// Check connection status
if ($mcp_client->isConnected()) {
\Drupal::messenger()->addStatus(t('Connected to MCP server'));
} else {
\Drupal::messenger()->addError(t('Failed to connect to MCP server'));
}

Listing Available AI Models

Retrieve available AI models from the LLM-MCP server:

// Get the model service
$model_service = \Drupal::service('llm.model_service');

// List available models
$models = $model_service->getAvailableModels();

// Display model information
foreach ($models as $model) {
  echo "Model: {$model['id']}\n";
  echo "Provider: {$model['provider']}\n";
  echo "Description: {$model['description']}\n";
  echo "Max Tokens: {$model['max_tokens']}\n";
  echo "\n";
}

Generating Text with an AI Model

Use an AI model to generate text:

// Get the completion service
$completion_service = \Drupal::service('llm.completion_service');

// Define generation parameters
$params = [
'model' => 'gpt-4',
'prompt' => 'Write a brief summary of quantum computing:',
'max_tokens' => 150,
'temperature' => 0.7,
];

try {
// Generate text
$result = $completion_service->generateCompletion($params);

// Display result
echo "Generated text: {$result['text']}\n";
} catch (\Exception $e) {
echo "Error: {$e->getMessage()}\n";
}

Vector Storage Integration

LLM-MCP provides vector storage capabilities that can be used with Drupal entities.

Storing Entity Embeddings

Store vector embeddings for Drupal entities:

// Get the vector storage service
$vector_service = \Drupal::service('llm_vector.storage');

// Get a node entity
$node = \Drupal\node\Entity\Node::load(123);

// Generate embedding for the node
$embedding_service = \Drupal::service('llm_vector.embedding');
$embedding = $embedding_service->generateEmbedding([
'text' => $node->getTitle() . ' ' . $node->get('body')->value,
'model' => 'text-embedding-ada-002',
]);

// Store the embedding
$vector_service->storeVector([
'id' => 'node:' . $node->id(),
'vector' => $embedding['vector'],
'metadata' => [
'entity_type' => 'node',
'bundle' => $node->bundle(),
'title' => $node->getTitle(),
'created' => $node->getCreatedTime(),
'status' => $node->isPublished(),
],
'collection' => 'drupal_content',
]);

Semantic Search with Vector Embeddings

Implement semantic search for Drupal content:

// Get the vector search service
$vector_search = \Drupal::service('llm_vector.search');

// Generate embedding for search query
$embedding_service = \Drupal::service('llm_vector.embedding');
$query_embedding = $embedding_service->generateEmbedding([
'text' => 'artificial intelligence applications',
'model' => 'text-embedding-ada-002',
]);

// Search for similar content
$results = $vector_search->findSimilarVectors([
'vector' => $query_embedding['vector'],
'collection' => 'drupal_content',
'limit' => 5,
'scoreThreshold' => 0.75,
'filter' => [
'status' => true, // Only published content
'bundle' => ['article', 'page'], // Only specific content types
],
]);

// Process search results
foreach ($results as $result) {
  $entity_id = str_replace('node:', '', $result['id']);
  $node = \Drupal\node\Entity\Node::load($entity_id);

  if ($node) {
    echo "Match: {$node->getTitle()} (Score: {$result['score']})\n";
  }
}
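For intuition about the scoreThreshold used above: with cosine similarity, each result's score is the cosine of the angle between the query embedding and a stored embedding (1.0 means identical direction, 0.0 means unrelated). The sketch below shows the computation in plain JavaScript; the real calculation happens inside the vector service:

```javascript
// Cosine similarity between two equal-length vectors:
// dot(a, b) / (|a| * |b|). Scores near 1.0 mean the embeddings
// point in almost the same direction, i.e. semantically similar content.
function cosineSimilarity(a, b) {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

A scoreThreshold of 0.75 therefore keeps only results whose embeddings are strongly aligned with the query vector.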

Automated Entity Embedding with Hooks

Implement automatic embedding generation for nodes:

/**
* Implements hook_entity_insert().
*/
function my_module_entity_insert(\Drupal\Core\Entity\EntityInterface $entity) {
_my_module_update_entity_embedding($entity);
}

/**
* Implements hook_entity_update().
*/
function my_module_entity_update(\Drupal\Core\Entity\EntityInterface $entity) {
_my_module_update_entity_embedding($entity);
}

/**
* Updates entity embedding in vector storage.
*/
function _my_module_update_entity_embedding(\Drupal\Core\Entity\EntityInterface $entity) {
// Only process nodes
if ($entity->getEntityTypeId() !== 'node') {
return;
}

// Only process specific content types
if (!in_array($entity->bundle(), ['article', 'page'])) {
return;
}

// Get services
$vector_service = \Drupal::service('llm_vector.storage');
$embedding_service = \Drupal::service('llm_vector.embedding');

// Generate content text
$text = $entity->getTitle() . ' ';
if ($entity->hasField('body') && !$entity->get('body')->isEmpty()) {
$text .= strip_tags($entity->get('body')->value);
}

// Generate embedding
try {
$embedding = $embedding_service->generateEmbedding([
'text' => $text,
'model' => 'text-embedding-ada-002',
]);

// Store vector
$vector_service->storeVector([
'id' => 'node:' . $entity->id(),
'vector' => $embedding['vector'],
'metadata' => [
'entity_type' => 'node',
'bundle' => $entity->bundle(),
'title' => $entity->getTitle(),
'created' => $entity->getCreatedTime(),
'changed' => $entity->getChangedTime(),
'status' => $entity->isPublished(),
],
'collection' => 'drupal_content',
]);

\Drupal::logger('my_module')->notice('Updated embedding for node @id', [
'@id' => $entity->id(),
]);
}
catch (\Exception $e) {
\Drupal::logger('my_module')->error('Failed to update embedding for node @id: @message', [
'@id' => $entity->id(),
'@message' => $e->getMessage(),
]);
}
}

Tool Registration and Execution

LLM-MCP allows registering and executing tools that can be used by AI models.

Registering a Custom Drupal Tool

Register a custom tool from Drupal:

// Get the tool registry service
$tool_registry = \Drupal::service('llm_mcp.tool_registry');

// Define a custom tool
$tool_definition = [
'id' => 'drupal_content_search',
'name' => 'Drupal Content Search',
'description' => 'Search for content in the Drupal site',
'version' => '1.0.0',
'category' => 'drupal',
'tags' => ['content', 'search', 'drupal'],
'input_schema' => [
'type' => 'object',
'properties' => [
'query' => [
'type' => 'string',
'description' => 'Search query',
],
'content_type' => [
'type' => 'string',
'description' => 'Content type to search (optional)',
],
'limit' => [
'type' => 'integer',
'description' => 'Maximum number of results',
'default' => 10,
],
],
'required' => ['query'],
],
'output_schema' => [
'type' => 'object',
'properties' => [
'results' => [
'type' => 'array',
'items' => [
'type' => 'object',
'properties' => [
'id' => ['type' => 'string'],
'title' => ['type' => 'string'],
'url' => ['type' => 'string'],
'summary' => ['type' => 'string'],
'content_type' => ['type' => 'string'],
],
],
],
'count' => ['type' => 'integer'],
],
],
// Custom endpoint for remote execution
'endpoint' => 'https://your-drupal-site.com/api/content-search',
'auth_config' => [
'type' => 'api_key',
'header_name' => 'X-API-Key',
'key' => 'your-api-key',
],
];

// Register the tool
try {
$tool_registry->registerTool($tool_definition);
\Drupal::messenger()->addStatus(t('Tool registered successfully'));
} catch (\Exception $e) {
\Drupal::messenger()->addError(t('Failed to register tool: @message', [
'@message' => $e->getMessage(),
]));
}
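Since the MCP server validates tool calls against input_schema, it can help to pre-check arguments on the caller's side before dispatching. The helper below is a hypothetical sketch (not part of the llm_mcp API) that checks required properties and primitive types only; it is not a full JSON Schema validator:

```javascript
// Minimal pre-flight check of tool arguments against the declared
// input schema: required properties plus primitive type matching.
function checkToolInput(schema, input) {
  const errors = [];
  for (const name of schema.required || []) {
    if (!(name in input)) {
      errors.push(`missing required property: ${name}`);
    }
  }
  // Map JSON Schema primitive types to JavaScript typeof results.
  const typeMap = { string: 'string', integer: 'number', number: 'number', boolean: 'boolean' };
  for (const [name, spec] of Object.entries(schema.properties || {})) {
    if (name in input && typeMap[spec.type] && typeof input[name] !== typeMap[spec.type]) {
      errors.push(`${name}: expected ${spec.type}, got ${typeof input[name]}`);
    }
  }
  return errors;
}

// Mirror of the drupal_content_search input schema defined above.
const schema = {
  required: ['query'],
  properties: {
    query: { type: 'string' },
    content_type: { type: 'string' },
    limit: { type: 'integer' },
  },
};

console.log(checkToolInput(schema, { query: 'ai', limit: 5 })); // []
console.log(checkToolInput(schema, { limit: 'five' })); // missing query + wrong limit type
```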

Executing a Tool

Execute a registered tool from Drupal:

// Get the tool execution service
$tool_executor = \Drupal::service('llm_mcp.tool_executor');

// Execute the tool
try {
$result = $tool_executor->executeTool('drupal_content_search', [
'query' => 'artificial intelligence',
'content_type' => 'article',
'limit' => 5,
]);

// Process results
foreach ($result['results'] as $item) {
  echo "Found: {$item['title']} ({$item['url']})\n";
  echo "Summary: {$item['summary']}\n\n";
}

echo "Total results: {$result['count']}\n";
} catch (\Exception $e) {
echo "Error executing tool: {$e->getMessage()}\n";
}

Advanced Integration Features

Implementing Drupal REST Endpoints for Tool Execution

Create a custom REST endpoint for tool execution:

/**
* Implements hook_rest_resource_alter().
*/
function my_module_rest_resource_alter(&$definitions) {
$definitions['my_module:content_search'] = [
'id' => 'my_module:content_search',
'plugin_id' => 'my_module_content_search',
'label' => 'Content Search',
'description' => 'Search for content',
'uri_paths' => [
'canonical' => '/api/content-search',
],
];
}

/**
* Content search REST resource plugin.
*/
class ContentSearchResource extends ResourceBase {

/**
* Responds to POST requests.
*
* @param array $data
* Request data.
*
* @return \Drupal\rest\ModifiedResourceResponse
* The response.
*/
public function post(array $data) {
// Validate request data
if (empty($data['query'])) {
return new ModifiedResourceResponse(['error' => 'Query parameter is required'], 400);
}

// Set default limit
$limit = isset($data['limit']) ? min(intval($data['limit']), 50) : 10;

// Prepare query
$query = \Drupal::entityQuery('node')
->accessCheck(TRUE)
->condition('status', 1);

// Add content type filter if provided
if (!empty($data['content_type'])) {
$query->condition('type', $data['content_type']);
}

// Add a simple keyword condition on title and body.
// (A full-text Search API index would give better relevance ranking.)
$keyword_group = $query->orConditionGroup()
  ->condition('title', $data['query'], 'CONTAINS')
  ->condition('body.value', $data['query'], 'CONTAINS');
$query->condition($keyword_group);

// Execute query
$query->range(0, $limit);
$entity_ids = $query->execute();

// Load nodes
$nodes = \Drupal\node\Entity\Node::loadMultiple($entity_ids);

// Prepare results
$results = [];
foreach ($nodes as $node) {
$results[] = [
'id' => $node->id(),
'title' => $node->getTitle(),
'url' => $node->toUrl()->setAbsolute()->toString(),
'summary' => !$node->get('body')->isEmpty()
? text_summary(strip_tags($node->get('body')->value), NULL, 200)
: '',
'content_type' => $node->bundle(),
];
}

// Return response
return new ModifiedResourceResponse([
'results' => $results,
'count' => count($results),
]);
}
}

Batch Entity Embedding Updates

Create a batch operation to update entity embeddings:

/**
* Implements hook_cron().
*
* Updates entity embeddings periodically.
*/
function my_module_cron() {
// Check if we need to update embeddings
$last_update = \Drupal::state()->get('my_module.last_embedding_update', 0);
$now = \Drupal::time()->getRequestTime();

// Update embeddings once a day
if ($now - $last_update > 86400) {
// Queue batch operation
$operations = [];

// Get content types to embed
$content_types = ['article', 'page', 'blog'];

foreach ($content_types as $content_type) {
// Query for entities of this type
$query = \Drupal::entityQuery('node')
->condition('type', $content_type)
->condition('status', 1)
->accessCheck(FALSE);

$entity_ids = $query->execute();

// Add batch operations in chunks
$chunks = array_chunk($entity_ids, 10);
foreach ($chunks as $chunk) {
$operations[] = [
'my_module_batch_update_embeddings',
[$chunk, $content_type],
];
}
}

// Create and set the batch
$batch = [
'title' => t('Updating content embeddings'),
'operations' => $operations,
'finished' => 'my_module_batch_update_embeddings_finished',
'file' => \Drupal::service('extension.list.module')->getPath('my_module') . '/my_module.batch.inc',
];

batch_set($batch);

// Process batch through cron
$batch =& batch_get();
$batch['progressive'] = FALSE;
batch_process();

// Update last run time
\Drupal::state()->set('my_module.last_embedding_update', $now);
}
}

/**
* Batch operation callback for updating embeddings.
*/
function my_module_batch_update_embeddings($entity_ids, $content_type, &$context) {
$vector_service = \Drupal::service('llm_vector.storage');
$embedding_service = \Drupal::service('llm_vector.embedding');

// Load entities
$nodes = \Drupal\node\Entity\Node::loadMultiple($entity_ids);

foreach ($nodes as $node) {
// Generate content text
$text = $node->getTitle() . ' ';
if ($node->hasField('body') && !$node->get('body')->isEmpty()) {
$text .= strip_tags($node->get('body')->value);
}

try {
// Generate embedding
$embedding = $embedding_service->generateEmbedding([
'text' => $text,
'model' => 'text-embedding-ada-002',
]);

// Store vector
$vector_service->storeVector([
'id' => 'node:' . $node->id(),
'vector' => $embedding['vector'],
'metadata' => [
'entity_type' => 'node',
'bundle' => $node->bundle(),
'title' => $node->getTitle(),
'created' => $node->getCreatedTime(),
'changed' => $node->getChangedTime(),
'status' => $node->isPublished(),
],
'collection' => 'drupal_content',
]);

// Update progress information
if (!isset($context['results']['updated'])) {
$context['results']['updated'] = 0;
}
$context['results']['updated']++;
}
catch (\Exception $e) {
// Log error
\Drupal::logger('my_module')->error('Failed to update embedding for node @id: @message', [
'@id' => $node->id(),
'@message' => $e->getMessage(),
]);

// Update error count
if (!isset($context['results']['errors'])) {
$context['results']['errors'] = 0;
}
$context['results']['errors']++;
}

// Update progress message
$context['message'] = t('Updating embeddings for @type content (@current of @total)', [
  '@type' => $content_type,
  '@current' => ($context['results']['updated'] ?? 0) + ($context['results']['errors'] ?? 0),
  '@total' => count($entity_ids),
]);
}
}

/**
* Batch finished callback.
*/
function my_module_batch_update_embeddings_finished($success, $results, $operations) {
if ($success) {
$message = t('Successfully updated @count embeddings.', [
'@count' => $results['updated'],
]);

if (!empty($results['errors'])) {
$message .= ' ' . t('@error_count embeddings failed to update.', [
'@error_count' => $results['errors'],
]);
}

\Drupal::messenger()->addStatus($message);
}
else {
\Drupal::messenger()->addError(t('An error occurred while updating embeddings.'));
}
}

Creating a Content Recommendation Block

Create a block that shows content recommendations based on the current page:

/**
* Implements a content recommendation block.
*/
class ContentRecommendationBlock extends BlockBase {

/**
* {@inheritdoc}
*/
public function build() {
$node = \Drupal::routeMatch()->getParameter('node');

// Only show recommendations on node pages
if (!$node) {
return [];
}

// Get vector services
$vector_service = \Drupal::service('llm_vector.storage');
$vector_search = \Drupal::service('llm_vector.search');
$embedding_service = \Drupal::service('llm_vector.embedding');

try {
// Check if this node has an embedding
$vector_data = $vector_service->getVector([
'id' => 'node:' . $node->id(),
'collection' => 'drupal_content',
]);

if (empty($vector_data)) {
// Generate embedding if not found
$text = $node->getTitle() . ' ';
if ($node->hasField('body') && !$node->get('body')->isEmpty()) {
$text .= strip_tags($node->get('body')->value);
}

$embedding = $embedding_service->generateEmbedding([
'text' => $text,
'model' => 'text-embedding-ada-002',
]);

$vector = $embedding['vector'];
} else {
$vector = $vector_data['vector'];
}

// Find similar content
$results = $vector_search->findSimilarVectors([
'vector' => $vector,
'collection' => 'drupal_content',
'limit' => 5,
'scoreThreshold' => 0.75,
'filter' => [
'status' => true,
'entity_type' => 'node',
// Exclude current node
'id' => ['$ne' => 'node:' . $node->id()],
],
]);

// Prepare recommendation items
$items = [];
foreach ($results as $result) {
$entity_id = str_replace('node:', '', $result['id']);
$recommended_node = \Drupal\node\Entity\Node::load($entity_id);

if ($recommended_node && $recommended_node->access('view')) {
$items[] = [
'#type' => 'link',
'#title' => $recommended_node->getTitle(),
'#url' => $recommended_node->toUrl(),
'#suffix' => '<div class="recommendation-score">Relevance: ' .
number_format($result['score'] * 100, 0) . '%</div>',
];
}
}

// Return block content
return [
'#theme' => 'item_list',
'#items' => $items,
'#title' => $this->t('Related Content'),
'#empty' => $this->t('No related content found'),
'#cache' => [
'contexts' => ['route'],
'tags' => ['node:' . $node->id()],
'max-age' => 3600, // Cache for 1 hour
],
];
}
catch (\Exception $e) {
\Drupal::logger('content_recommendation')->error('Error generating recommendations: @message', [
'@message' => $e->getMessage(),
]);

return [];
}
}
}

Configuration Forms

Create a configuration form for the MCP integration:

/**
* MCP settings form.
*/
class MCPSettingsForm extends ConfigFormBase {

/**
* {@inheritdoc}
*/
public function getFormId() {
return 'llm_mcp_settings_form';
}

/**
* {@inheritdoc}
*/
protected function getEditableConfigNames() {
return ['llm_mcp.settings'];
}

/**
* {@inheritdoc}
*/
public function buildForm(array $form, FormStateInterface $form_state) {
$config = $this->config('llm_mcp.settings');

$form['connection'] = [
'#type' => 'fieldset',
'#title' => $this->t('Connection Settings'),
];

$form['connection']['server_url'] = [
'#type' => 'url',
'#title' => $this->t('MCP Server URL'),
'#description' => $this->t('URL of the LLM-MCP server (e.g., http://localhost:3001)'),
'#default_value' => $config->get('server_url'),
'#required' => TRUE,
];

$form['connection']['api_key'] = [
'#type' => 'textfield',
'#title' => $this->t('API Key'),
'#description' => $this->t('Authentication key for the LLM-MCP server'),
'#default_value' => $config->get('api_key'),
'#required' => TRUE,
];

$form['connection']['connection_timeout'] = [
'#type' => 'number',
'#title' => $this->t('Connection Timeout'),
'#description' => $this->t('Timeout in seconds for MCP server connections'),
'#default_value' => $config->get('connection_timeout') ?: 30,
'#min' => 1,
'#max' => 120,
];

$form['vector'] = [
'#type' => 'fieldset',
'#title' => $this->t('Vector Storage Settings'),
];

$form['vector']['vector_enabled'] = [
'#type' => 'checkbox',
'#title' => $this->t('Enable Vector Storage'),
'#description' => $this->t('Enable vector storage and search capabilities'),
'#default_value' => $config->get('vector_enabled') ?: FALSE,
];

$form['vector']['vector_collection'] = [
'#type' => 'textfield',
'#title' => $this->t('Default Collection'),
'#description' => $this->t('Default collection for storing vectors'),
'#default_value' => $config->get('vector_collection') ?: 'drupal_content',
'#states' => [
'visible' => [
':input[name="vector_enabled"]' => ['checked' => TRUE],
],
],
];

$form['advanced'] = [
'#type' => 'details',
'#title' => $this->t('Advanced Settings'),
'#open' => FALSE,
];

$form['advanced']['logging_enabled'] = [
'#type' => 'checkbox',
'#title' => $this->t('Enable Logging'),
'#description' => $this->t('Log MCP API calls for debugging'),
'#default_value' => $config->get('logging_enabled') ?: FALSE,
];

$form['advanced']['retry_attempts'] = [
'#type' => 'number',
'#title' => $this->t('Retry Attempts'),
'#description' => $this->t('Number of retry attempts for failed requests'),
'#default_value' => $config->get('retry_attempts') ?: 3,
'#min' => 0,
'#max' => 10,
];

$form['test'] = [
'#type' => 'details',
'#title' => $this->t('Connection Test'),
'#open' => TRUE,
];

$form['test']['test_connection'] = [
'#type' => 'button',
'#value' => $this->t('Test Connection'),
'#ajax' => [
'callback' => [$this, 'testConnectionCallback'],
'wrapper' => 'connection-test-result',
],
];

$form['test']['connection_result'] = [
'#type' => 'markup',
'#markup' => '<div id="connection-test-result"></div>',
];

return parent::buildForm($form, $form_state);
}

/**
* Ajax callback for connection test.
*/
public function testConnectionCallback(array &$form, FormStateInterface $form_state) {
$server_url = $form_state->getValue('server_url');
$api_key = $form_state->getValue('api_key');

// Create a temporary client
$client = new BfmcpClient($server_url, $api_key);

try {
$result = $client->testConnection();

if ($result['status'] === 'ok') {
$element = [
'#type' => 'html_tag',
'#tag' => 'div',
'#value' => $this->t('Connection successful! Server version: @version', [
'@version' => $result['version'],
]),
'#attributes' => ['class' => ['messages', 'messages--status']],
];
} else {
$element = [
'#type' => 'html_tag',
'#tag' => 'div',
'#value' => $this->t('Connection established but server reported issues: @status', [
'@status' => $result['message'],
]),
'#attributes' => ['class' => ['messages', 'messages--warning']],
];
}
} catch (\Exception $e) {
$element = [
'#type' => 'html_tag',
'#tag' => 'div',
'#value' => $this->t('Connection failed: @message', [
'@message' => $e->getMessage(),
]),
'#attributes' => ['class' => ['messages', 'messages--error']],
];
}

return $element;
}

/**
* {@inheritdoc}
*/
public function submitForm(array &$form, FormStateInterface $form_state) {
$this->config('llm_mcp.settings')
->set('server_url', $form_state->getValue('server_url'))
->set('api_key', $form_state->getValue('api_key'))
->set('connection_timeout', $form_state->getValue('connection_timeout'))
->set('vector_enabled', $form_state->getValue('vector_enabled'))
->set('vector_collection', $form_state->getValue('vector_collection'))
->set('logging_enabled', $form_state->getValue('logging_enabled'))
->set('retry_attempts', $form_state->getValue('retry_attempts'))
->save();

parent::submitForm($form, $form_state);
}
}

Troubleshooting

Common Integration Issues

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Connection failed | Incorrect server URL or API key | Verify the URL and API key in your configuration |
| Connection timeout | Server is not responding | Check if the LLM-MCP server is running and accessible |
| Authentication failed | Invalid API key | Ensure your API key is correct and has proper permissions |
| Tool registration failed | Malformed tool definition | Validate your tool definition against the schema |
| Vector storage failed | Vector service unavailable | Check if the vector service is enabled in LLM-MCP |
| "Module not found" error | Missing dependencies | Ensure all required modules are installed and enabled |

Checking MCP Client Logs

Enable logging to troubleshoot connection issues:

// Enable logging via a configuration override in settings.php
$config['llm_mcp.settings']['logging_enabled'] = TRUE;

// Check logs
\Drupal::logger('llm_mcp')->notice('Connecting to MCP server...');

Check the Drupal log for MCP-related messages:

drush watchdog:show --severity=error --type=llm_mcp

Testing MCP Connection

Use Drush to test the MCP connection:

# Run a custom Drush command to test connection
drush llm:test-mcp-connection

# Check MCP server health
drush php:eval "print_r(\Drupal::service('llm_mcp.client')->checkHealth());"

Performance Optimization

Caching MCP Responses

Implement caching for MCP responses:

/**
* A service that caches MCP responses.
*/
class CachedMcpClient implements MCPClientInterface {

/**
* The MCP client service.
*
* @var \Drupal\llm_mcp\MCPClientInterface
*/
protected $mcpClient;

/**
* The cache backend.
*
* @var \Drupal\Core\Cache\CacheBackendInterface
*/
protected $cacheBackend;

/**
* Constructor.
*/
public function __construct(MCPClientInterface $mcp_client, CacheBackendInterface $cache_backend) {
$this->mcpClient = $mcp_client;
$this->cacheBackend = $cache_backend;
}

/**
* {@inheritdoc}
*/
public function listTools($options = []) {
$cache_key = 'llm_mcp:tools:' . md5(serialize($options));
$cache = $this->cacheBackend->get($cache_key);

if ($cache) {
return $cache->data;
}

$tools = $this->mcpClient->listTools($options);

// Cache for 1 hour
$this->cacheBackend->set($cache_key, $tools, time() + 3600);

return $tools;
}

/**
* {@inheritdoc}
*/
public function executeTool($tool_id, $params = [], $options = []) {
// Don't cache tool execution by default
return $this->mcpClient->executeTool($tool_id, $params, $options);
}

/**
* {@inheritdoc}
*/
public function findSimilarVectors($params = []) {
// Only cache vector searches with specific options
if (!empty($params['cache']) && $params['cache'] === TRUE) {
$cache_key = 'llm_mcp:vectors:' . md5(serialize($params));
$cache = $this->cacheBackend->get($cache_key);

if ($cache) {
return $cache->data;
}

$results = $this->mcpClient->findSimilarVectors($params);

// Cache time based on params
$cache_time = !empty($params['cache_time']) ? $params['cache_time'] : 1800;
$this->cacheBackend->set($cache_key, $results, time() + $cache_time);

return $results;
}

return $this->mcpClient->findSimilarVectors($params);
}
}

Register the cached client service:

# llm_mcp.services.yml
services:
  llm_mcp.client.uncached:
    class: Drupal\llm_mcp\Client\McpClient
    arguments: ['@config.factory', '@http_client', '@logger.factory']

  llm_mcp.client:
    class: Drupal\llm_mcp\Client\CachedMcpClient
    arguments: ['@llm_mcp.client.uncached', '@cache.default']

Batch Processing for Vector Operations

Use batch processing for large vector operations:

/**
* Process vectors in batches to avoid memory issues.
*/
function my_module_process_vectors_in_batches($collection, $batch_size = 100) {
$vector_service = \Drupal::service('llm_vector.storage');

// Get total count
$total = $vector_service->countVectors(['collection' => $collection]);

// Create batch
$operations = [];
for ($offset = 0; $offset < $total; $offset += $batch_size) {
$operations[] = [
'my_module_process_vector_batch',
[$collection, $offset, $batch_size],
];
}

$batch = [
'title' => t('Processing vectors'),
'operations' => $operations,
'finished' => 'my_module_process_vectors_finished',
];

batch_set($batch);
}

/**
* Process a single batch of vectors.
*/
function my_module_process_vector_batch($collection, $offset, $limit, &$context) {
$vector_service = \Drupal::service('llm_vector.storage');

// Get vectors for this batch
$vectors = $vector_service->listVectors([
'collection' => $collection,
'offset' => $offset,
'limit' => $limit,
]);

// Process each vector
foreach ($vectors as $vector) {
// Do something with the vector
// ...

// Update progress
$context['results'][] = $vector['id'];
$context['message'] = t('Processed vector @id', ['@id' => $vector['id']]);
}

// Update progress
if (!isset($context['sandbox']['progress'])) {
$context['sandbox']['progress'] = 0;
}
$context['sandbox']['progress'] += count($vectors);

// Provide progress percentage
if (!isset($context['sandbox']['total'])) {
$context['sandbox']['total'] = $vector_service->countVectors(['collection' => $collection]);
}
$context['finished'] = $context['sandbox']['progress'] / $context['sandbox']['total'];
}

AI Streaming Integration

Streaming AI Responses from Drupal

Use the new AI streaming capabilities to stream responses in real-time with metrics tracking:

/**
* Implements a streaming controller for AI chat.
*/
class AiStreamingController extends ControllerBase {

/**
* The streaming service.
*
* @var \Drupal\llm_mcp\StreamingService
*/
protected $streamingService;

/**
* The MCP client.
*
* @var \Drupal\llm_mcp\MCPClientInterface
*/
protected $mcpClient;

/**
* Constructor.
*/
public function __construct(
StreamingService $streaming_service,
MCPClientInterface $mcp_client
) {
$this->streamingService = $streaming_service;
$this->mcpClient = $mcp_client;
}

/**
* {@inheritdoc}
*/
public static function create(ContainerInterface $container) {
return new static(
$container->get('llm_mcp.streaming_service'),
$container->get('llm_mcp.client')
);
}

/**
* Streams an AI chat response.
*
* @return \Symfony\Component\HttpFoundation\StreamedResponse
* The streamed response.
*/
public function streamChat(Request $request) {
// Get request data
$data = json_decode($request->getContent(), TRUE);
$messages = $data['messages'] ?? [];
$options = $data['options'] ?? [];

// Create streaming response
return $this->streamingService->createStreamingResponse(function ($chunkHandler) use ($messages, $options) {
try {
// Initialize state
$chunkHandler(json_encode([
'status' => 'initializing',
]));

// Execute MCP tool with streaming enabled
$this->mcpClient->executeToolStreamed('ai_chat', [
'messages' => $messages,
'model' => $options['model'] ?? 'gpt-4',
], function ($chunk, $chunkInfo) use ($chunkHandler) {
// Forward chunk to client with metrics
$chunkHandler(json_encode([
'text' => $chunk,
'status' => 'streaming',
'metrics' => $chunkInfo['metrics'] ?? [],
]));
});

// Finalize stream
$chunkHandler(json_encode([
'status' => 'complete',
]));
} catch (\Exception $e) {
// Handle errors
$chunkHandler(json_encode([
'status' => 'error',
'error' => $e->getMessage(),
]));
}
}, [
'contentType' => 'application/json',
]);
}
}
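The streamChat() method must be exposed at the URL the chat form's JavaScript posts to. A sketch of the routing entry, assuming the controller lives in a custom my_module; the route name matches the Url::fromRoute('my_module.stream_chat') call used later in this guide, while the path, namespace, and permission are illustrative:

```yaml
# my_module.routing.yml (path, namespace, and permission are illustrative)
my_module.stream_chat:
  path: '/my-module/stream-chat'
  defaults:
    _controller: '\Drupal\my_module\Controller\AiStreamingController::streamChat'
  methods: [POST]
  requirements:
    _permission: 'access content'
```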

Creating a Streaming Chat Form in Drupal

Implement a form that uses streaming for a better user experience:

/**
* Implements a streaming chat form.
*/
class StreamingChatForm extends FormBase {

/**
* {@inheritdoc}
*/
public function getFormId() {
return 'streaming_chat_form';
}

/**
* {@inheritdoc}
*/
public function buildForm(array $form, FormStateInterface $form_state) {
$form['#attached']['library'][] = 'my_module/streaming-chat';

$form['chat_history'] = [
'#type' => 'container',
'#attributes' => [
'id' => 'chat-history',
'class' => ['chat-history'],
],
];

$form['message'] = [
'#type' => 'textarea',
'#title' => $this->t('Your message'),
'#attributes' => [
'id' => 'chat-input',
'class' => ['chat-input'],
],
];

$form['actions'] = [
'#type' => 'actions',
];

$form['actions']['submit'] = [
'#type' => 'button',
'#value' => $this->t('Send'),
'#attributes' => [
'id' => 'chat-submit',
'class' => ['chat-submit'],
],
];

$form['#attributes']['id'] = 'streaming-chat-form';
$form['#attributes']['data-streaming-url'] = Url::fromRoute('my_module.stream_chat')->toString();

return $form;
}

/**
* {@inheritdoc}
*/
public function submitForm(array &$form, FormStateInterface $form_state) {
// The form is submitted via JavaScript; nothing to do server-side.
}
}
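The form attaches a my_module/streaming-chat library, which must be declared in the module's libraries file. A sketch, with an illustrative JS file path; note that the behavior's .once() call below assumes the jQuery Once shim (Drupal 9) — on Drupal 10+, depend on core/once and use once('streaming-chat', ...) instead:

```yaml
# my_module.libraries.yml (file path illustrative)
streaming-chat:
  js:
    js/streaming-chat.js: {}
  dependencies:
    - core/jquery
    - core/drupal
    - core/jquery.once
```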

Add JavaScript to handle streaming:

(function ($, Drupal) {
Drupal.behaviors.streamingChat = {
attach: function (context, settings) {
const form = $('#streaming-chat-form', context);
const chatHistory = $('#chat-history', context);
const chatInput = $('#chat-input', context);
const chatSubmit = $('#chat-submit', context);
const streamingUrl = form.data('streaming-url');

// Store conversation history
const messages = [];

// Handle form submission
form.once('streaming-chat').on('submit', function (e) {
e.preventDefault();

const userMessage = chatInput.val().trim();
if (!userMessage) return;

// Add user message to UI; .text() keeps user input from being parsed as HTML
chatHistory.append(
$('<div class="chat-message user-message"></div>').text(userMessage)
);

// Add to conversation history
messages.push({
role: 'user',
content: userMessage
});

// Clear input
chatInput.val('');

// Disable form during streaming
chatInput.prop('disabled', true);
chatSubmit.prop('disabled', true);

// Add assistant message container
const assistantMsgEl = $(
`<div class="chat-message assistant-message"><div class="typing-indicator">●●●</div></div>`
);
chatHistory.append(assistantMsgEl);
chatHistory.scrollTop(chatHistory[0].scrollHeight);

// POST the conversation and consume the streamed response body
fetch(streamingUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({
messages: messages,
options: {
model: 'gpt-4'
}
})
}).then(response => {
const reader = response.body.getReader();
const decoder = new TextDecoder();
let buffer = '';
// Accumulated assistant text. Declared outside processStream so it persists
// across reads; re-declaring it per chunk would reset the message each time.
let assistantResponse = '';

// Replace typing indicator with actual content
assistantMsgEl.empty();

function processStream({ done, value }) {
if (done) {
// Re-enable form when stream ends
chatInput.prop('disabled', false);
chatSubmit.prop('disabled', false);
chatInput.focus();
return;
}

// Decode and process chunks
buffer += decoder.decode(value, { stream: true });

// Process complete JSON objects
const lines = buffer.split('\n');
buffer = lines.pop(); // Keep last incomplete line in buffer

for (const line of lines) {
if (!line.trim()) continue;

try {
const data = JSON.parse(line);

if (data.status === 'error') {
assistantMsgEl.html(`<div class="error">Error: ${Drupal.checkPlain(data.error)}</div>`);
} else if (data.text) {
assistantResponse += data.text;
// Render as plain text; sanitize before using .html() if you need markup.
assistantMsgEl.text(assistantResponse);

// Update metrics if available
if (data.metrics) {
Drupal.behaviors.streamingChat.updateMetrics(data.metrics);
}
}
} catch (e) {
console.error('Error parsing streaming data:', e, line);
}
}

// Scroll to bottom
chatHistory.scrollTop(chatHistory[0].scrollHeight);

// Continue reading
return reader.read().then(processStream);
}

// Start reading the stream
return reader.read().then(processStream);
}).catch(error => {
assistantMsgEl.html(`<div class="error">Error: ${Drupal.checkPlain(error.message)}</div>`);
chatInput.prop('disabled', false);
chatSubmit.prop('disabled', false);
});
});
},

updateMetrics: function(metrics) {
// Optionally display metrics in UI
console.log('AI Metrics:', metrics);
}
};
})(jQuery, Drupal);
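The newline-delimited JSON buffering inside the fetch handler can be factored into a small pure function. A sketch (function name is mine) that mirrors the handler's logic — complete lines are parsed, blank lines are skipped, and the trailing partial line is carried over to the next read:

```javascript
// Split a streaming buffer into complete newline-delimited JSON records.
// Returns the parsed objects plus the incomplete tail to keep buffered.
function drainNdjsonBuffer(buffer) {
  const lines = buffer.split('\n');
  const rest = lines.pop(); // last element may be an incomplete record
  const records = [];
  for (const line of lines) {
    if (!line.trim()) continue;
    try {
      records.push(JSON.parse(line));
    } catch (e) {
      // Skip malformed lines rather than aborting the stream.
    }
  }
  return { records, rest };
}
```

Each read then becomes: append the decoded chunk to the buffer, call drainNdjsonBuffer(), process the records, and keep rest as the new buffer.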

Tracking AI Usage Metrics in Drupal

Implement a dashboard to track AI usage metrics:

/**
* Implements a controller for AI usage metrics.
*/
class AiMetricsController extends ControllerBase {

/**
* Displays AI usage metrics.
*
* @return array
* Render array.
*/
public function displayMetrics() {
// Get metrics service
$metrics_service = \Drupal::service('llm_mcp.metrics_service');

// Get metrics for different time periods
$today = $metrics_service->getMetrics([
'period' => 'day',
]);

$month = $metrics_service->getMetrics([
'period' => 'month',
]);

$providers = $metrics_service->getMetrics([
'group_by' => 'provider',
]);

$operations = $metrics_service->getMetrics([
'group_by' => 'operation',
]);

// Prepare data for charts
$usage_chart_data = $metrics_service->getTimeSeriesMetrics([
'period' => 'month',
'interval' => 'day',
'metric' => 'tokens',
]);

// Render page
return [
'#theme' => 'ai_metrics_dashboard',
'#today' => [
'tokens' => $today['tokens'] ?? 0,
'cost' => $today['cost'] ?? 0,
'requests' => $today['requests'] ?? 0,
'streaming' => $today['streaming'] ?? 0,
],
'#month' => [
'tokens' => $month['tokens'] ?? 0,
'cost' => $month['cost'] ?? 0,
'requests' => $month['requests'] ?? 0,
'streaming' => $month['streaming'] ?? 0,
],
'#providers' => $providers,
'#operations' => $operations,
'#usage_chart' => [
'labels' => array_column($usage_chart_data, 'date'),
'data' => array_column($usage_chart_data, 'value'),
],
'#attached' => [
'library' => [
'my_module/ai-metrics',
],
],
];
}
}
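The '#theme' => 'ai_metrics_dashboard' render array above needs a matching hook_theme() entry. A minimal sketch, with variable names mirroring the render array; the template file name is an assumption:

```php
/**
 * Implements hook_theme().
 */
function my_module_theme($existing, $type, $theme, $path) {
  return [
    // Rendered by templates/ai-metrics-dashboard.html.twig (illustrative path).
    'ai_metrics_dashboard' => [
      'variables' => [
        'today' => [],
        'month' => [],
        'providers' => [],
        'operations' => [],
        'usage_chart' => [],
      ],
    ],
  ];
}
```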

Create a metrics service to track and aggregate AI usage:

/**
* Service for tracking AI metrics.
*/
class AiMetricsService {

/**
* The database connection.
*
* @var \Drupal\Core\Database\Connection
*/
protected $database;

/**
* Constructor.
*/
public function __construct(Connection $database) {
$this->database = $database;
}

/**
* Logs metrics for an AI operation.
*
* @param array $metrics
* The metrics to log.
*/
public function logMetrics(array $metrics) {
$this->database->insert('ai_usage_metrics')
->fields([
'timestamp' => \Drupal::time()->getRequestTime(),
'provider' => $metrics['provider'] ?? '',
'operation' => $metrics['operation'] ?? '',
'tokens' => $metrics['tokensUsed'] ?? 0,
'latency' => $metrics['latency'] ?? 0,
'cost' => $metrics['cost'] ?? 0,
'streaming' => !empty($metrics['streaming']),
'uid' => \Drupal::currentUser()->id(),
'data' => json_encode($metrics),
])
->execute();
}

/**
* Gets aggregated metrics.
*
* @param array $options
* Options for filtering and grouping.
*
* @return array
* The metrics.
*/
public function getMetrics(array $options = []) {
$query = $this->database->select('ai_usage_metrics', 'm');
$query->addExpression('SUM(tokens)', 'tokens');
$query->addExpression('SUM(cost)', 'cost');
$query->addExpression('COUNT(*)', 'requests');
$query->addExpression('SUM(streaming)', 'streaming');
$query->addExpression('AVG(latency)', 'avg_latency');

// Apply period filter
if (!empty($options['period'])) {

switch ($options['period']) {
case 'day':
$query->condition('timestamp', strtotime('today'), '>=');
break;
case 'week':
$query->condition('timestamp', strtotime('-1 week'), '>=');
break;
case 'month':
$query->condition('timestamp', strtotime('-1 month'), '>=');
break;
}
}

// Apply provider filter
if (!empty($options['provider'])) {
$query->condition('provider', $options['provider']);
}

// Apply operation filter
if (!empty($options['operation'])) {
$query->condition('operation', $options['operation']);
}

// Apply user filter
if (!empty($options['uid'])) {
$query->condition('uid', $options['uid']);
}

// Grouped queries return one row per group
if (!empty($options['group_by'])) {
$query->groupBy($options['group_by']);
$query->addField('m', $options['group_by']);
return $query->execute()->fetchAll(\PDO::FETCH_ASSOC);
}

// Ungrouped queries aggregate to a single row, as the callers above expect
return $query->execute()->fetchAssoc();
}

/**
* Gets time series metrics for charts.
*
* @param array $options
* Options for filtering and grouping.
*
* @return array
* The time series data.
*/
public function getTimeSeriesMetrics(array $options = []) {
$interval = $options['interval'] ?? 'day';
$period = $options['period'] ?? 'month';
$metric = $options['metric'] ?? 'tokens';

$query = $this->database->select('ai_usage_metrics', 'm');

// Format date for grouping
switch ($interval) {
case 'hour':
$query->addExpression("DATE_FORMAT(FROM_UNIXTIME(timestamp), '%Y-%m-%d %H:00')", 'date');
break;
case 'day':
$query->addExpression("DATE_FORMAT(FROM_UNIXTIME(timestamp), '%Y-%m-%d')", 'date');
break;
case 'week':
$query->addExpression("DATE_FORMAT(FROM_UNIXTIME(timestamp), '%Y-%u')", 'date');
break;
case 'month':
$query->addExpression("DATE_FORMAT(FROM_UNIXTIME(timestamp), '%Y-%m')", 'date');
break;
}

// Add metric expression. Whitelist the column name: it is interpolated
// directly into SQL, so never pass raw user input through.
$allowed_metrics = ['tokens', 'cost', 'latency'];
if (!in_array($metric, $allowed_metrics, TRUE)) {
$metric = 'tokens';
}
$query->addExpression("SUM($metric)", 'value');

// Apply period filter
switch ($period) {
case 'day':
$query->condition('timestamp', strtotime('-1 day'), '>=');
break;
case 'week':
$query->condition('timestamp', strtotime('-1 week'), '>=');
break;
case 'month':
$query->condition('timestamp', strtotime('-1 month'), '>=');
break;
case 'year':
$query->condition('timestamp', strtotime('-1 year'), '>=');
break;
}

// Apply provider filter
if (!empty($options['provider'])) {
$query->condition('provider', $options['provider']);
}

// Apply operation filter
if (!empty($options['operation'])) {
$query->condition('operation', $options['operation']);
}

// Apply user filter
if (!empty($options['uid'])) {
$query->condition('uid', $options['uid']);
}

// Group by date
$query->groupBy('date');
// Order by date
$query->orderBy('date', 'ASC');

return $query->execute()->fetchAll(\PDO::FETCH_ASSOC);
}
}
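The service writes to an ai_usage_metrics table that your module must define in hook_schema(). A sketch: column names follow logMetrics() above, while the types and indexes are reasonable starting points, not a prescribed schema:

```php
/**
 * Implements hook_schema().
 */
function my_module_schema() {
  return [
    'ai_usage_metrics' => [
      'description' => 'Per-request AI usage metrics.',
      'fields' => [
        'id' => ['type' => 'serial', 'not null' => TRUE],
        'timestamp' => ['type' => 'int', 'not null' => TRUE, 'default' => 0],
        'provider' => ['type' => 'varchar', 'length' => 64, 'not null' => TRUE, 'default' => ''],
        'operation' => ['type' => 'varchar', 'length' => 64, 'not null' => TRUE, 'default' => ''],
        'tokens' => ['type' => 'int', 'not null' => TRUE, 'default' => 0],
        'latency' => ['type' => 'float', 'not null' => TRUE, 'default' => 0],
        'cost' => ['type' => 'float', 'not null' => TRUE, 'default' => 0],
        'streaming' => ['type' => 'int', 'size' => 'tiny', 'not null' => TRUE, 'default' => 0],
        'uid' => ['type' => 'int', 'not null' => TRUE, 'default' => 0],
        'data' => ['type' => 'text', 'size' => 'big'],
      ],
      'primary key' => ['id'],
      'indexes' => [
        'timestamp' => ['timestamp'],
        'provider' => ['provider'],
        'uid' => ['uid'],
      ],
    ],
  ];
}
```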

Integration Examples

Content Recommendation System

Implement a complete content recommendation system:

/**
* Implements hook_entity_insert().
*/
function my_module_entity_insert(EntityInterface $entity) {
my_module_update_entity_embedding($entity);
}

/**
* Implements hook_entity_update().
*/
function my_module_entity_update(EntityInterface $entity) {
my_module_update_entity_embedding($entity);
}

/**
* Updates entity embedding.
*/
function my_module_update_entity_embedding(EntityInterface $entity) {
// Skip non-node entities
if ($entity->getEntityTypeId() !== 'node') {
return;
}

// Skip unsupported bundles
$supported_bundles = ['article', 'page', 'blog'];
if (!in_array($entity->bundle(), $supported_bundles)) {
return;
}

// Get services
$vector_service = \Drupal::service('llm_vector.storage');
$embedding_service = \Drupal::service('llm_vector.embedding');

// Generate content text
$text = $entity->getTitle() . ' ';
if ($entity->hasField('body') && !$entity->get('body')->isEmpty()) {
$text .= strip_tags($entity->get('body')->value);
}

// Add taxonomy terms if available
if ($entity->hasField('field_tags') && !$entity->get('field_tags')->isEmpty()) {
$tags = [];
foreach ($entity->get('field_tags')->referencedEntities() as $term) {
$tags[] = $term->label();
}
if ($tags) {
$text .= ' ' . implode(', ', $tags);
}
}

// Add author information if available
if ($entity->hasField('uid') && !$entity->get('uid')->isEmpty()) {
$author = $entity->get('uid')->entity;
if ($author) {
$text .= ' Author: ' . $author->getDisplayName();
}
}

try {
// Generate embedding
$embedding = $embedding_service->generateEmbedding([
'text' => $text,
'model' => 'text-embedding-ada-002',
]);

// Store embedding
$vector_service->storeVector([
'id' => 'node:' . $entity->id(),
'vector' => $embedding['vector'],
'metadata' => [
'entity_type' => 'node',
'bundle' => $entity->bundle(),
'title' => $entity->getTitle(),
'url' => $entity->toUrl()->setAbsolute()->toString(),
'created' => $entity->getCreatedTime(),
'changed' => $entity->getChangedTime(),
'status' => $entity->isPublished(),
'author_id' => $entity->getOwnerId(),
'author_name' => $entity->getOwner()->getDisplayName(),
'tags' => $entity->hasField('field_tags')
? array_map(function($term) {
return $term->label();
}, $entity->get('field_tags')->referencedEntities())
: [],
],
'collection' => 'drupal_content',
]);

\Drupal::logger('my_module')->notice('Updated embedding for @type @id', [
'@type' => $entity->getEntityTypeId(),
'@id' => $entity->id(),
]);
}
catch (\Exception $e) {
\Drupal::logger('my_module')->error('Failed to update embedding: @message', [
'@message' => $e->getMessage(),
]);
}
}

/**
* Implements hook_preprocess_node().
*/
function my_module_preprocess_node(&$variables) {
// Only show recommendations on full node view
if ($variables['view_mode'] !== 'full') {
return;
}

$node = $variables['node'];

// Get vector services
$vector_search = \Drupal::service('llm_vector.search');

try {
// Find similar content
$results = $vector_search->findSimilarVectors([
'id' => 'node:' . $node->id(),
'collection' => 'drupal_content',
'limit' => 5,
'scoreThreshold' => 0.7,
'filter' => [
'status' => true,
'entity_type' => 'node',
'id' => ['$ne' => 'node:' . $node->id()],
],
'cache' => TRUE,
'cache_time' => 3600,
]);

// Build recommendations
if (!empty($results)) {
$items = [];
foreach ($results as $result) {
$entity_id = str_replace('node:', '', $result['id']);
$recommended_node = \Drupal\node\Entity\Node::load($entity_id);

if ($recommended_node && $recommended_node->access('view')) {
$items[] = [
'title' => $recommended_node->getTitle(),
'url' => $recommended_node->toUrl()->toString(),
'score' => $result['score'],
'bundle' => $recommended_node->bundle(),
'date' => \Drupal::service('date.formatter')->format($recommended_node->getCreatedTime(), 'short'),
];
}
}

if (!empty($items)) {
$variables['content']['recommendations'] = [
'#theme' => 'my_module_recommendations',
'#recommendations' => $items,
'#title' => t('You might also like'),
'#cache' => [
'tags' => ['node:' . $node->id()],
'contexts' => ['user.permissions'],
'max-age' => 3600,
],
];
}
}
}
catch (\Exception $e) {
\Drupal::logger('my_module')->error('Failed to get recommendations: @message', [
'@message' => $e->getMessage(),
]);
}
}
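The 'score' returned by findSimilarVectors() and compared against scoreThreshold (0.7 above) is a vector similarity measure. The exact metric depends on the vector backend's configuration, but cosine similarity is the typical choice for text embeddings; a standalone sketch of what such a score computes:

```javascript
// Cosine similarity between two equal-length vectors: 1.0 means the same
// direction, 0 means orthogonal. Thresholds like 0.7 above are typically
// values of this function (assuming a cosine-based backend).
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  // Guard against zero vectors, which have no defined direction.
  if (normA === 0 || normB === 0) return 0;
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```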

Conclusion

The LLM-MCP Drupal integration provides powerful AI capabilities to your Drupal site, including vector storage for semantic search, tool orchestration for custom AI workflows, and direct access to language models. By following the examples and best practices in this guide, you can create rich AI-powered experiences in your Drupal applications.

If you encounter issues with the Drupal integration, refer to our Troubleshooting Guide.