MCP LLM
An MCP server that provides access to LLMs using the LlamaIndexTS library.
Features
This MCP server provides the following tools:
- generate_code: Generate code based on a description
- generate_code_to_file: Generate code and write it directly to a file at a specific line number
- generate_documentation: Generate documentation for code
- ask_question: Ask a question to the LLM
Installation
Update your MCP config to add the mcp-llm server:
{
  "mcpServers": {
    "llm": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-llm"
      ],
      "env": {
        "LLM_MODEL_NAME": "deepseek-r1:7b-qwen-distill-q6_k_l",
        "LLM_MODEL_PROVIDER": "ollama",
        "LLM_BASE_URL": "http://localhost:11434",
        "LLM_ALLOW_FILE_WRITE": "true",
        "LLM_TIMEOUT_S": "240"
      },
      "disabled": false,
      "autoApprove": [
        "generate_code",
        "generate_documentation",
        "ask_question",
        "generate_code_to_file"
      ],
      "timeout": 300
    }
  }
}
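The same structure works for the other providers. As a sketch, an OpenAI-backed configuration might look like the following; the model name and API key are placeholders, and any of the optional variables listed under Configuration can be added to the env block:
{
  "mcpServers": {
    "llm": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-llm"
      ],
      "env": {
        "LLM_MODEL_NAME": "<openai-model-name>",
        "LLM_MODEL_PROVIDER": "openai",
        "OPENAI_API_KEY": "<your-api-key>",
        "LLM_TIMEOUT_S": "240"
      },
      "disabled": false,
      "autoApprove": [
        "generate_code",
        "generate_documentation",
        "ask_question"
      ],
      "timeout": 300
    }
  }
}
File writing is left at its default (disabled) in this sketch, so generate_code_to_file is omitted from autoApprove.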
Available Scripts
- npm run build - Build the project
- npm run watch - Watch for changes and rebuild
- npm start - Start the MCP server
- npm run example - Run the example script
- npm run inspector - Run the MCP inspector
Configuration
The MCP server is configurable using environment variables:
Required Environment Variables
- LLM_MODEL_NAME: The name of the model to use (e.g., qwen2-32b:q6_k, anthropic.claude-3-7-sonnet-20250219-v1:0)
- LLM_MODEL_PROVIDER: The model provider (e.g., bedrock, ollama, openai, openai-compatible)
Optional Environment Variables
- LLM_BASE_URL: Base URL for the model provider (e.g., https://ollama.internal, http://my-openai-compatible-server.com:3000/v1)
- LLM_TEMPERATURE: Temperature parameter for the model (e.g., 0.2)
- LLM_NUM_CTX: Context window size (e.g., 16384)
- LLM_TOP_P: Top-p parameter for the model (e.g., 0.85)
- LLM_TOP_K: Top-k parameter for the model (e.g., 40)
- LLM_MIN_P: Min-p parameter for the model (e.g., 0.05)
- LLM_REPETITION_PENALTY: Repetition penalty parameter for the model (e.g., 1.05)
- LLM_SYSTEM_PROMPT_GENERATE_CODE: System prompt for the generate_code tool
- LLM_SYSTEM_PROMPT_GENERATE_DOCUMENTATION: System prompt for the generate_documentation tool
- LLM_SYSTEM_PROMPT_ASK_QUESTION: System prompt for the ask_question tool
- LLM_TIMEOUT_S: Timeout in seconds for LLM requests (e.g., 240 for 4 minutes)
- LLM_ALLOW_FILE_WRITE: Set to true to allow the generate_code_to_file tool to write to files (default: false)
- OPENAI_API_KEY: API key for OpenAI (required when using the OpenAI provider)
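As an illustration, a more heavily tuned env block for the Ollama configuration shown earlier could combine several of these variables; every value below is taken from the examples in this list rather than being a recommendation:
"env": {
  "LLM_MODEL_NAME": "deepseek-r1:7b-qwen-distill-q6_k_l",
  "LLM_MODEL_PROVIDER": "ollama",
  "LLM_BASE_URL": "http://localhost:11434",
  "LLM_TEMPERATURE": "0.2",
  "LLM_NUM_CTX": "16384",
  "LLM_TOP_P": "0.85",
  "LLM_TOP_K": "40",
  "LLM_MIN_P": "0.05",
  "LLM_REPETITION_PENALTY": "1.05",
  "LLM_TIMEOUT_S": "240",
  "LLM_ALLOW_FILE_WRITE": "false"
}
Note that environment variable values are passed as strings in the MCP configuration.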
Manual Install From Source
- Clone the repository
- Install dependencies: npm install
- Build the project: npm run build
- Update your MCP configuration
Using the Example Script
The repository includes an example script that demonstrates how to use the MCP server programmatically:
node examples/use-mcp-server.js
This script starts the MCP server and sends requests to it using curl commands.
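Under the hood, an MCP tool invocation is a JSON-RPC 2.0 tools/call request. As a rough sketch, a request body for the ask_question tool looks like the following; the exact framing and transport used by the example script may differ:
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_question",
    "arguments": {
      "question": "What does this server do?"
    }
  }
}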
Examples
Generate Code
{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript"
}
Generate Code to File
{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript",
  "filePath": "/path/to/factorial.js",
  "lineNumber": 10,
  "replaceLines": 0
}
The generate_code_to_file tool supports both relative and absolute file paths. If a relative path is provided, it will be resolved relative to the current working directory of the MCP server.
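For instance, a call like the following (illustrative values, with a relative filePath) would write the generated code to src/factorial.js under the server's working directory:
{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript",
  "filePath": "src/factorial.js",
  "lineNumber": 1,
  "replaceLines": 0
}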
Generate Documentation
{
  "code": "function factorial(n) {\n if (n <= 1) return 1;\n return n * factorial(n - 1);\n}",
  "language": "JavaScript",
  "format": "JSDoc"
}
Ask Question
{
  "question": "What is the difference between var, let, and const in JavaScript?",
  "context": "I'm a beginner learning JavaScript and confused about variable declarations."
}
License