Added MCP support

Pavel Pivovarov
2025-12-15 15:15:40 +11:00
parent 272d223f73
commit 25e263b7a6
9 changed files with 635 additions and 201 deletions

README.md

@@ -7,8 +7,9 @@ A CLI application that provides AI-powered search and information retrieval usin
 - 🔍 **Web Search**: Powered by SearXNG for comprehensive internet searches
 - 📄 **URL Fetching**: Automatically fetches and converts web pages to clean Markdown
 - 🤖 **Local LLM Support**: Works with any OpenAI-compatible API (Ollama, LM Studio, etc.)
+- 🔌 **MCP Support**: Extend capabilities with Model Context Protocol servers
 - 💻 **Simple CLI**: Clean terminal interface for easy interaction
-- ⚙️ **Configurable**: Easy INI-based configuration
+- ⚙️ **Configurable**: Easy YAML-based configuration with customizable prompts
 - 🔒 **Privacy-Focused**: All processing happens locally
 ## Prerequisites
@@ -50,50 +51,58 @@ Create the config directory and copy the example configuration:
 ```bash
 mkdir -p ~/.config
-cp tell-me.ini.example ~/.config/tell-me.ini
+cp tell-me.yaml.example ~/.config/tell-me.yaml
 ```
 ### 5. Edit your configuration
-Open `~/.config/tell-me.ini` in your favorite editor and configure:
+Open `~/.config/tell-me.yaml` in your favorite editor and configure:
-```ini
+```yaml
-[llm]
 # Your LLM API endpoint (e.g., Ollama, LM Studio)
-api_url = http://localhost:11434/v1
+api_url: http://localhost:11434/v1
 # Model name to use
-model = llama3.2
+model: llama3.2
 # Context window size
-context_size = 16000
+context_size: 16000
 # API key (leave empty if not required)
-api_key =
+api_key: ""
-[searxng]
 # Your SearXNG instance URL
-url = http://localhost:8080
+searxng_url: http://localhost:8080
+# System prompt (customize the AI's behavior)
+prompt: |
+  You are a helpful AI research assistant...
+  (see tell-me.yaml.example for full prompt)
+# MCP Server Configuration (optional)
+mcp_servers: {}
+# Add MCP servers to extend functionality
+# See MCP section below for examples
 ```
 **Example configurations:**
 **For Ollama:**
-```ini
+```yaml
-[llm]
-api_url = http://localhost:11434/v1
-model = llama3.2
-context_size = 16000
-api_key =
+api_url: http://localhost:11434/v1
+model: llama3.2
+context_size: 16000
+api_key: ""
+searxng_url: http://localhost:8080
 ```
 **For LM Studio:**
-```ini
+```yaml
-[llm]
-api_url = http://localhost:1234/v1
-model = your-model-name
-context_size = 16000
-api_key =
+api_url: http://localhost:1234/v1
+model: your-model-name
+context_size: 16000
+api_key: ""
+searxng_url: http://localhost:8080
 ```
 ## Usage
@@ -129,6 +138,37 @@ The AI will:
 Type `exit` or `quit` to exit the application, or press Ctrl-C.
+## MCP (Model Context Protocol) Support
+Tell-Me supports the [Model Context Protocol](https://modelcontextprotocol.io/), allowing you to extend the AI assistant's capabilities with additional tools from MCP servers.
+### Supported MCP Servers
+Tell-Me supports **stdio-based MCP servers** (local command execution). Remote SSE-based servers are not supported for security reasons.
+### Configuration
+Add MCP servers to your `~/.config/tell-me.yaml`:
+```yaml
+mcp_servers:
+  # Example: Filesystem access
+  filesystem:
+    command: /usr/local/bin/mcp-server-filesystem
+    args:
+      - --root
+      - /path/to/allowed/directory
+    env:
+      LOG_LEVEL: info
+  # Example: Weather information
+  weather:
+    command: /usr/local/bin/mcp-server-weather
+    args: []
+    env:
+      API_KEY: your-weather-api-key
+```
 ## How It Works
 1. **User asks a question** - You type your query in the terminal
@@ -143,14 +183,16 @@ Type `exit` or `quit` to exit the application, or press Ctrl-C.
 tell-me/
 ├── main.go              # Main application entry point
 ├── config/
-│   └── config.go        # Configuration loading from INI file
+│   └── config.go        # Configuration loading from YAML file
 ├── llm/
 │   └── client.go        # OpenAI-compatible API client with tool calling
+├── mcp/
+│   └── manager.go       # MCP server connection and tool management
 ├── tools/
 │   ├── search.go        # SearXNG web search implementation
 │   └── fetch.go         # URL fetching and HTML-to-Markdown conversion
 ├── go.mod               # Go module dependencies
-├── tell-me.ini.example  # Example configuration file
+├── tell-me.yaml.example # Example YAML configuration file
 └── README.md            # This file
 ```

config/config.go

@@ -5,29 +5,35 @@ import (
 	"os"
 	"path/filepath"
-	"gopkg.in/ini.v1"
+	"gopkg.in/yaml.v3"
 )
 // Config holds the application configuration
 type Config struct {
-	LLM     LLMConfig
-	SearXNG SearXNGConfig
+	// LLM Configuration
+	APIURL      string `yaml:"api_url"`
+	Model       string `yaml:"model"`
+	ContextSize int    `yaml:"context_size"`
+	APIKey      string `yaml:"api_key"`
+	// SearXNG Configuration
+	SearXNGURL string `yaml:"searxng_url"`
+	// System Prompt
+	Prompt string `yaml:"prompt"`
+	// MCP Server Configuration
+	MCPServers map[string]MCPServer `yaml:"mcp_servers"`
 }
-// LLMConfig holds LLM API configuration
-type LLMConfig struct {
-	APIURL      string
-	Model       string
-	ContextSize int
-	APIKey      string
-}
-// SearXNGConfig holds SearXNG configuration
-type SearXNGConfig struct {
-	URL string
-}
-// Load reads and parses the INI configuration file from ~/.config/tell-me.ini
+// MCPServer represents a single MCP server configuration (stdio transport only)
+type MCPServer struct {
+	Command string            `yaml:"command"`
+	Args    []string          `yaml:"args,omitempty"`
+	Env     map[string]string `yaml:"env,omitempty"`
+}
+// Load reads and parses the YAML configuration file from ~/.config/tell-me.yaml
 func Load() (*Config, error) {
 	// Get home directory
 	homeDir, err := os.UserHomeDir()
@@ -36,47 +42,43 @@ func Load() (*Config, error) {
 	}
 	// Build config path
-	configPath := filepath.Join(homeDir, ".config", "tell-me.ini")
+	configPath := filepath.Join(homeDir, ".config", "tell-me.yaml")
 	// Check if config file exists
 	if _, err := os.Stat(configPath); os.IsNotExist(err) {
-		return nil, fmt.Errorf("config file not found at %s. Please create it from tell-me.ini.example", configPath)
+		return nil, fmt.Errorf("config file not found at %s. Please create it from tell-me.yaml.example", configPath)
 	}
-	// Load INI file
-	cfg, err := ini.Load(configPath)
+	// Read YAML file
+	data, err := os.ReadFile(configPath)
 	if err != nil {
-		return nil, fmt.Errorf("failed to load config file: %w", err)
+		return nil, fmt.Errorf("failed to read config file: %w", err)
 	}
-	// Parse LLM section
-	llmSection := cfg.Section("llm")
-	llmConfig := LLMConfig{
-		APIURL:      llmSection.Key("api_url").String(),
-		Model:       llmSection.Key("model").String(),
-		ContextSize: llmSection.Key("context_size").MustInt(16000),
-		APIKey:      llmSection.Key("api_key").String(),
-	}
-	// Parse SearXNG section
-	searxngSection := cfg.Section("searxng")
-	searxngConfig := SearXNGConfig{
-		URL: searxngSection.Key("url").String(),
-	}
+	// Parse YAML
+	var cfg Config
+	if err := yaml.Unmarshal(data, &cfg); err != nil {
+		return nil, fmt.Errorf("failed to parse config file: %w", err)
+	}
+	// Set defaults
+	if cfg.ContextSize == 0 {
+		cfg.ContextSize = 16000
+	}
 	// Validate required fields
-	if llmConfig.APIURL == "" {
-		return nil, fmt.Errorf("llm.api_url is required in config")
+	if cfg.APIURL == "" {
+		return nil, fmt.Errorf("api_url is required in config")
 	}
-	if llmConfig.Model == "" {
-		return nil, fmt.Errorf("llm.model is required in config")
+	if cfg.Model == "" {
+		return nil, fmt.Errorf("model is required in config")
 	}
-	if searxngConfig.URL == "" {
-		return nil, fmt.Errorf("searxng.url is required in config")
+	if cfg.SearXNGURL == "" {
+		return nil, fmt.Errorf("searxng_url is required in config")
 	}
+	if cfg.Prompt == "" {
+		return nil, fmt.Errorf("prompt is required in config")
+	}
-	return &Config{
-		LLM:     llmConfig,
-		SearXNG: searxngConfig,
-	}, nil
+	return &cfg, nil
 }
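The new Load applies its default before validation: a zero `context_size` becomes 16000, and four fields are required. That flow can be exercised in isolation; a minimal sketch, where `finalize` is a hypothetical helper mirroring Load's checks and the YAML parsing itself is elided:

```go
package main

import (
	"errors"
	"fmt"
)

// config mirrors the fields that Load validates; YAML parsing is elided.
type config struct {
	APIURL      string
	Model       string
	ContextSize int
	SearXNGURL  string
	Prompt      string
}

// finalize applies the same defaulting and required-field checks as Load:
// defaults are filled in first, then each mandatory field is verified.
func finalize(c *config) error {
	if c.ContextSize == 0 {
		c.ContextSize = 16000 // default context window
	}
	switch {
	case c.APIURL == "":
		return errors.New("api_url is required in config")
	case c.Model == "":
		return errors.New("model is required in config")
	case c.SearXNGURL == "":
		return errors.New("searxng_url is required in config")
	case c.Prompt == "":
		return errors.New("prompt is required in config")
	}
	return nil
}

func main() {
	c := config{
		APIURL:     "http://localhost:11434/v1",
		Model:      "llama3.2",
		SearXNGURL: "http://localhost:8080",
		Prompt:     "You are a helpful AI research assistant.",
	}
	if err := finalize(&c); err != nil {
		fmt.Println("invalid:", err)
		return
	}
	fmt.Println("context_size defaulted to", c.ContextSize)
}
```

One consequence of checking `ContextSize == 0` after unmarshalling: an omitted key and an explicit `context_size: 0` are indistinguishable, and both get the default.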

go.mod

@@ -1,16 +1,20 @@
-module git.netra.pivpav.com/public/tell-me
+module tell-me
-go 1.21
+go 1.23.0
 require (
 	github.com/JohannesKaufmann/html-to-markdown v1.6.0
 	github.com/jlubawy/go-boilerpipe v0.4.0
+	github.com/modelcontextprotocol/go-sdk v1.1.0
 	github.com/sashabaranov/go-openai v1.41.2
-	gopkg.in/ini.v1 v1.67.0
+	gopkg.in/yaml.v3 v3.0.1
 )
 require (
 	github.com/PuerkitoBio/goquery v1.9.2 // indirect
 	github.com/andybalholm/cascadia v1.3.2 // indirect
+	github.com/google/jsonschema-go v0.3.0 // indirect
+	github.com/yosida95/uritemplate/v3 v3.0.2 // indirect
 	golang.org/x/net v0.25.0 // indirect
+	golang.org/x/oauth2 v0.30.0 // indirect
 )

go.sum

@@ -5,13 +5,20 @@ github.com/PuerkitoBio/goquery v1.9.2/go.mod h1:GHPCaP0ODyyxqcNoFGYlAprUFH81NuRP
 github.com/andybalholm/cascadia v1.3.2 h1:3Xi6Dw5lHF15JtdcmAHD3i1+T8plmv7BQ/nsViSLyss=
 github.com/andybalholm/cascadia v1.3.2/go.mod h1:7gtRlve5FxPPgIgX36uWBX58OdBsSS6lUvCFb+h7KvU=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
+github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
+github.com/google/jsonschema-go v0.3.0 h1:6AH2TxVNtk3IlvkkhjrtbUc4S8AvO0Xii0DxIygDg+Q=
+github.com/google/jsonschema-go v0.3.0/go.mod h1:r5quNTdLOYEz95Ru18zA0ydNbBuYoo9tgaYcxEYhJVE=
 github.com/jlubawy/go-boilerpipe v0.4.0 h1:9OWr5DBO6q+Dq9qv/2+XIIzJ0+okCE/YMZ0Ztn3daJw=
 github.com/jlubawy/go-boilerpipe v0.4.0/go.mod h1:myVVbfThICMP+2GZM9weT1m+1kvA20pq/t2SG5IO3F8=
+github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
 github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
 github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
+github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
 github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
+github.com/modelcontextprotocol/go-sdk v1.1.0 h1:Qjayg53dnKC4UZ+792W21e4BpwEZBzwgRW6LrjLWSwA=
+github.com/modelcontextprotocol/go-sdk v1.1.0/go.mod h1:6fM3LCm3yV7pAs8isnKLn07oKtB0MP9LHd3DfAcKw10=
 github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
 github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
 github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
@@ -24,8 +31,9 @@ github.com/sergi/go-diff v1.3.1 h1:xkr+Oxo4BOQKmkn/B9eMK0g5Kg/983T9DqqPHwYqD+8=
 github.com/sergi/go-diff v1.3.1/go.mod h1:aMJSSKb2lpPvRNec0+w3fl7LP9IOFzdc9Pa4NFbPK1I=
 github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
 github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
+github.com/stretchr/testify v1.4.0 h1:2E4SXV/wtOkTonXsotYi4li6zVWxYlZuYNCXe9XRJyk=
 github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
+github.com/yosida95/uritemplate/v3 v3.0.2 h1:Ed3Oyj9yrmi9087+NczuL5BwkIc4wvTb5zIM+UJPGz4=
+github.com/yosida95/uritemplate/v3 v3.0.2/go.mod h1:ILOh0sOhIJR3+L/8afwt/kE++YT040gmv5BQTMR2HP4=
 github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
 github.com/yuin/goldmark v1.7.1 h1:3bajkSilaCbjdKVsKdZjZCLBNPL9pYzrCakKaf4U49U=
 github.com/yuin/goldmark v1.7.1/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
@@ -47,6 +55,8 @@ golang.org/x/net v0.21.0/go.mod h1:bIjVDfnllIU7BJ2DNgfnXvpSvtn8VRwhlsaeUTyUS44=
 golang.org/x/net v0.24.0/go.mod h1:2Q7sJY5mzlzWjKtYUEXSlBWCdyaioyXzRB2RtU8KVE8=
 golang.org/x/net v0.25.0 h1:d/OCCoBEUq33pjydKrGQhw7IlUPI2Oylr+8qLx49kac=
 golang.org/x/net v0.25.0/go.mod h1:JkAGAh7GEvH74S6FOH42FLoXpXbE/aqXSrIQjXgsiwM=
+golang.org/x/oauth2 v0.30.0 h1:dnDm7JmhM45NNpd8FDDeLhK6FwqbOf4MLCM9zb1BOHI=
+golang.org/x/oauth2 v0.30.0/go.mod h1:B++QgG3ZKulg6sRPGD/mqlHQs5rB3Ml9erfeDY7xKlU=
 golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
 golang.org/x/sync v0.1.0/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -80,11 +90,14 @@ golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGm
 golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
 golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
 golang.org/x/tools v0.6.0/go.mod h1:Xwgl3UAJ/d3gWutnCtw505GrjyAbvKui8lOU390QaIU=
+golang.org/x/tools v0.34.0 h1:qIpSLOxeCYGg9TrcJokLBG4KFA6d795g0xkBkiESGlo=
+golang.org/x/tools v0.34.0/go.mod h1:pAP9OwEaY1CAW3HOmg3hLZC5Z0CCmzjAF2UQMSqNARg=
 golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
 gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
+gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15 h1:YR8cESwS4TdDjEe65xsg0ogRM/Nc3DYOhEAlW+xobZo=
 gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
-gopkg.in/ini.v1 v1.67.0 h1:Dgnx+6+nfE+IfzjUEISNeydPJh9AXNNsWbGP9KzCsOA=
-gopkg.in/ini.v1 v1.67.0/go.mod h1:pNLf8WUiyNEtQjuu5G5vTm06TEv9tsIgeAvK8hOrP4k=
 gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
 gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
 gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
+gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
+gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

View File

@@ -4,9 +4,12 @@ import (
 	"context"
 	"encoding/json"
 	"fmt"
+	"strings"
 	"time"
-	"git.netra.pivpav.com/public/tell-me/tools"
+	"tell-me/mcp"
+	"tell-me/tools"
 	"github.com/sashabaranov/go-openai"
 )
@@ -16,10 +19,11 @@ type Client struct {
 	model       string
 	contextSize int
 	searxngURL  string
+	mcpManager  *mcp.Manager
 }
 // NewClient creates a new LLM client
-func NewClient(apiURL, apiKey, model string, contextSize int, searxngURL string) *Client {
+func NewClient(apiURL, apiKey, model string, contextSize int, searxngURL string, mcpManager *mcp.Manager) *Client {
 	config := openai.DefaultConfig(apiKey)
 	config.BaseURL = apiURL
@@ -30,59 +34,18 @@ func NewClient(apiURL, apiKey, model string, contextSize int, searxngURL string)
 		model:       model,
 		contextSize: contextSize,
 		searxngURL:  searxngURL,
+		mcpManager:  mcpManager,
 	}
 }
-// GetSystemPrompt returns the system prompt that enforces search-first behavior
-func GetSystemPrompt() string {
+// GetSystemPrompt returns the system prompt with current date appended
+func GetSystemPrompt(prompt string) string {
 	currentDate := time.Now().Format("2006-01-02")
-	return fmt.Sprintf(`You are a helpful AI research assistant with access to web search and article fetching capabilities.
-RESEARCH WORKFLOW - MANDATORY STEPS:
-1. For questions requiring current information, facts, or knowledge beyond your training data:
-   - Perform MULTIPLE searches (typically 2-3) with DIFFERENT query angles to gather comprehensive information
-   - Vary your search terms to capture different perspectives and sources
-2. After completing ALL searches, analyze the combined results:
-   - Review ALL search results from your multiple searches together
-   - Identify the 3-5 MOST relevant and authoritative URLs across ALL searches
-   - Prioritize: official sources, reputable news sites, technical documentation, expert reviews
-   - Look for sources that complement each other (e.g., official specs + expert analysis + user reviews)
-3. Fetch the selected articles:
-   - Use fetch_articles with the 3-5 best URLs you identified from ALL your searches
-   - Read all fetched content thoroughly before formulating your answer
-   - Synthesize information from multiple sources for a comprehensive response
-HANDLING USER CORRECTIONS - CRITICAL:
-When a user indicates your answer is incorrect, incomplete, or needs clarification:
-1. NEVER argue or defend your previous answer
-2. IMMEDIATELY acknowledge the correction: "Let me search for more accurate information"
-3. Perform NEW searches with DIFFERENT queries based on the user's feedback
-4. Fetch NEW sources that address the specific correction or clarification needed
-5. Provide an updated answer based on the new research
-6. If the user provides specific information, incorporate it and verify with additional searches
-Remember: The user may have more current or specific knowledge. Your role is to research and verify, not to argue.
-OUTPUT FORMATTING RULES:
-- NEVER include source URLs or citations in your response
-- DO NOT use Markdown formatting (no **, ##, -, *, [], etc.)
-- Write in plain text only - use natural language without any special formatting
-- For emphasis, use CAPITAL LETTERS instead of bold or italics
-- For lists, use simple numbered lines (1., 2., 3.) or write as flowing paragraphs
-- Keep output clean and readable for terminal display
-Available tools:
-- web_search: Search the internet (can be used multiple times with different queries)
-- fetch_articles: Fetch and read content from 1-5 URLs at once
-CURRENT DATE: %s`, currentDate)
+	return fmt.Sprintf("%s\n\nCURRENT DATE: %s", prompt, currentDate)
 }
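The `"2006-01-02"` layout string in GetSystemPrompt is Go's reference-time notation, not a placeholder: Go formats dates by example against the fixed reference time Mon Jan 2 15:04:05 MST 2006. A minimal sketch, where `isoDate` is a hypothetical helper for illustration:

```go
package main

import (
	"fmt"
	"time"
)

// isoDate returns a date in ISO-8601 form. The layout "2006-01-02" is
// Go's reference time (Mon Jan 2 15:04:05 MST 2006) written by example.
func isoDate(year, month, day int) string {
	return time.Date(year, time.Month(month), day, 0, 0, 0, 0, time.UTC).
		Format("2006-01-02")
}

func main() {
	// The commit date of this change, formatted the same way
	// GetSystemPrompt formats the current date.
	fmt.Println(isoDate(2025, 12, 15)) // 2025-12-15
}
```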
-// GetTools returns the tool definitions for the LLM
-func GetTools() []openai.Tool {
+// GetBuiltInTools returns the tool definitions for the LLM (built-in tools only)
+func GetBuiltInTools() []openai.Tool {
 	return []openai.Tool{
 		{
 			Type: openai.ToolTypeFunction,
@@ -135,12 +98,25 @@ func GetTools() []openai.Tool {
 	}
 }
+// GetTools returns all available tools (built-in + MCP tools)
+func (c *Client) GetTools() []openai.Tool {
+	tools := GetBuiltInTools()
+	// Add MCP tools if manager is available
+	if c.mcpManager != nil {
+		mcpTools := c.mcpManager.GetAllTools()
+		tools = append(tools, mcpTools...)
+	}
+	return tools
+}
 // Chat sends a message and handles tool calls
 func (c *Client) Chat(ctx context.Context, messages []openai.ChatCompletionMessage) (string, []openai.ChatCompletionMessage, error) {
 	req := openai.ChatCompletionRequest{
 		Model:    c.model,
 		Messages: messages,
-		Tools:    GetTools(),
+		Tools:    c.GetTools(),
 	}
 	resp, err := c.client.CreateChatCompletion(ctx, req)
@@ -154,48 +130,7 @@ func (c *Client) Chat(ctx context.Context, messages []openai.ChatCompletionMessa
 	// Handle tool calls
 	if len(choice.Message.ToolCalls) > 0 {
 		for _, toolCall := range choice.Message.ToolCalls {
-			var result string
-			switch toolCall.Function.Name {
-			case "web_search":
-				var args struct {
-					Query string `json:"query"`
-				}
-				if err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args); err != nil {
-					result = fmt.Sprintf("Error parsing arguments: %v", err)
-				} else {
-					fmt.Printf("Searching: %s\n", args.Query)
-					result, err = tools.WebSearch(c.searxngURL, args.Query)
-					if err != nil {
-						result = fmt.Sprintf("Search error: %v", err)
-					}
-				}
-			case "fetch_articles":
-				var args struct {
-					Articles []struct {
-						Title string `json:"title"`
-						URL   string `json:"url"`
-					} `json:"articles"`
-				}
-				if err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args); err != nil {
-					result = fmt.Sprintf("Error parsing arguments: %v", err)
-				} else {
-					fmt.Printf("Reading %d articles:\n", len(args.Articles))
-					urls := make([]string, len(args.Articles))
-					for i, article := range args.Articles {
-						fmt.Printf("  - %s\n", article.Title)
-						urls[i] = article.URL
-					}
-					result, err = tools.FetchArticles(urls)
-					if err != nil {
-						result = fmt.Sprintf("Fetch error: %v", err)
-					}
-				}
-			default:
-				result = fmt.Sprintf("Unknown tool: %s", toolCall.Function.Name)
-			}
+			result := c.handleToolCall(ctx, toolCall)
 			// Add tool response to messages
 			messages = append(messages, openai.ChatCompletionMessage{
@@ -211,3 +146,60 @@ func (c *Client) Chat(ctx context.Context, messages []openai.ChatCompletionMessa
 	return choice.Message.Content, messages, nil
 }
+// handleToolCall routes tool calls to the appropriate handler
+func (c *Client) handleToolCall(ctx context.Context, toolCall openai.ToolCall) string {
+	toolName := toolCall.Function.Name
+	// Check if it's a built-in tool
+	switch toolName {
+	case "web_search":
+		var args struct {
+			Query string `json:"query"`
+		}
+		if err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args); err != nil {
+			return fmt.Sprintf("Error parsing arguments: %v", err)
+		}
+		fmt.Printf("Searching: %s\n", args.Query)
+		result, err := tools.WebSearch(c.searxngURL, args.Query)
+		if err != nil {
+			return fmt.Sprintf("Search error: %v", err)
+		}
+		return result
+	case "fetch_articles":
+		var args struct {
+			Articles []struct {
+				Title string `json:"title"`
+				URL   string `json:"url"`
+			} `json:"articles"`
+		}
+		if err := json.Unmarshal([]byte(toolCall.Function.Arguments), &args); err != nil {
+			return fmt.Sprintf("Error parsing arguments: %v", err)
+		}
+		fmt.Printf("Reading %d articles:\n", len(args.Articles))
+		urls := make([]string, len(args.Articles))
+		for i, article := range args.Articles {
+			fmt.Printf("  - %s\n", article.Title)
+			urls[i] = article.URL
+		}
+		result, err := tools.FetchArticles(urls)
+		if err != nil {
+			return fmt.Sprintf("Fetch error: %v", err)
+		}
+		return result
+	default:
+		// Check if it's an MCP tool (format: servername_toolname)
+		if c.mcpManager != nil && strings.Contains(toolName, "_") {
+			fmt.Printf("Calling MCP tool: %s\n", toolName)
+			result, err := c.mcpManager.CallTool(ctx, toolName, toolCall.Function.Arguments)
+			if err != nil {
+				return fmt.Sprintf("MCP tool error: %v", err)
+			}
+			return result
+		}
+		return fmt.Sprintf("Unknown tool: %s", toolName)
+	}
+}
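The dispatch order in handleToolCall matters: built-in names are matched first, and only names containing an underscore fall through to the MCP manager. A standalone sketch of that routing (`routeTool` and the `builtins` set are illustrative, not part of the codebase):

```go
package main

import (
	"fmt"
	"strings"
)

// builtins mirrors the two tools the client registers itself.
var builtins = map[string]bool{
	"web_search":     true,
	"fetch_articles": true,
}

// routeTool reproduces handleToolCall's dispatch order: built-in names
// win, any other name containing "_" is treated as a candidate MCP tool
// (servername_toolname), and everything else is unknown.
func routeTool(name string) string {
	if builtins[name] {
		return "builtin"
	}
	if strings.Contains(name, "_") {
		return "mcp"
	}
	return "unknown"
}

func main() {
	for _, name := range []string{"web_search", "filesystem_read_file", "ping"} {
		fmt.Printf("%s -> %s\n", name, routeTool(name))
	}
}
```

Because the underscore check runs only in the `default` branch, an MCP server can never shadow `web_search` or `fetch_articles`, but an MCP tool whose name contains no underscore would be reported as unknown.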

main.go

@@ -4,13 +4,16 @@ import (
 	"bufio"
 	"context"
 	"fmt"
+	"log"
 	"os"
 	"os/signal"
 	"strings"
 	"syscall"
-	"git.netra.pivpav.com/public/tell-me/config"
-	"git.netra.pivpav.com/public/tell-me/llm"
+	"tell-me/config"
+	"tell-me/llm"
+	"tell-me/mcp"
 	"github.com/sashabaranov/go-openai"
 )
@@ -19,29 +22,42 @@ func main() {
 	cfg, err := config.Load()
 	if err != nil {
 		fmt.Fprintf(os.Stderr, "Error loading configuration: %v\n", err)
-		fmt.Fprintf(os.Stderr, "Please create ~/.config/tell-me.ini from tell-me.ini.example\n")
+		fmt.Fprintf(os.Stderr, "Please create ~/.config/tell-me.yaml from tell-me.yaml.example\n")
 		os.Exit(1)
 	}
-	// Create LLM client
+	ctx := context.Background()
+	// Initialize MCP manager
+	mcpManager := mcp.NewManager(ctx)
+	defer mcpManager.Close()
+	// Connect to MCP servers if configured
+	if len(cfg.MCPServers) > 0 {
+		fmt.Println("Connecting to MCP servers...")
+		if err := mcpManager.ConnectServers(cfg.MCPServers); err != nil {
+			log.Printf("Warning: Failed to connect to some MCP servers: %v", err)
+		}
+	}
+	// Create LLM client with MCP manager
 	client := llm.NewClient(
-		cfg.LLM.APIURL,
-		cfg.LLM.APIKey,
-		cfg.LLM.Model,
-		cfg.LLM.ContextSize,
-		cfg.SearXNG.URL,
+		cfg.APIURL,
+		cfg.APIKey,
+		cfg.Model,
+		cfg.ContextSize,
+		cfg.SearXNGURL,
+		mcpManager,
 	)
-	// Initialize conversation with system prompt
+	// Initialize conversation with system prompt from config
 	messages := []openai.ChatCompletionMessage{
 		{
 			Role:    openai.ChatMessageRoleSystem,
-			Content: llm.GetSystemPrompt(),
+			Content: llm.GetSystemPrompt(cfg.Prompt),
 		},
 	}
-	ctx := context.Background()
 	// Check if arguments are provided (non-interactive mode)
 	if len(os.Args) > 1 {
 		query := strings.Join(os.Args[1:], " ")
@@ -58,14 +74,21 @@ func main() {
 		os.Exit(0)
 	}()
-	// Print welcome message
+	// Print welcome message with MCP status
 	fmt.Println("╔════════════════════════════════════════════════════════════════╗")
 	fmt.Println("║                          Tell-Me CLI                           ║")
 	fmt.Println("║           AI-powered search with local LLM support             ║")
 	fmt.Println("╚════════════════════════════════════════════════════════════════╝")
 	fmt.Println()
-	fmt.Printf("Using model: %s\n", cfg.LLM.Model)
-	fmt.Printf("SearXNG: %s\n", cfg.SearXNG.URL)
+	fmt.Printf("Using model: %s\n", cfg.Model)
+	fmt.Printf("SearXNG: %s\n", cfg.SearXNGURL)
+	// Display MCP server status
+	if len(cfg.MCPServers) > 0 {
+		fmt.Println()
+		displayMCPStatusInline(mcpManager)
+	}
 	fmt.Println()
 	fmt.Println("Type your questions below. Type 'exit' or 'quit' to exit, or press Ctrl-C.")
 	fmt.Println("────────────────────────────────────────────────────────────────")
@@ -131,3 +154,29 @@ func processQuery(ctx context.Context, client *llm.Client, messages []openai.Cha
 	return messages
 }
+// displayMCPStatusInline shows MCP server status in the header
+func displayMCPStatusInline(manager *mcp.Manager) {
+	statuses := manager.GetDetailedStatus()
+	if len(statuses) == 0 {
+		return
+	}
+	fmt.Print("MCP Servers: ")
+	for i, status := range statuses {
+		if i > 0 {
+			fmt.Print(", ")
+		}
+		if status.Error != "" {
+			// Red X for error
+			fmt.Printf("\033[31m✗\033[0m %s", status.Name)
+		} else {
+			// Green checkmark for OK
+			fmt.Printf("\033[32m✓\033[0m %s (%d tools)", status.Name, len(status.Tools))
+		}
+	}
+	fmt.Println()
+}

mcp/manager.go (new file)

@@ -0,0 +1,261 @@
package mcp
import (
"context"
"encoding/json"
"fmt"
"log"
"os/exec"
"sync"
"tell-me/config"
"github.com/modelcontextprotocol/go-sdk/mcp"
"github.com/sashabaranov/go-openai"
)
// Manager manages multiple MCP server connections
type Manager struct {
servers map[string]*ServerConnection
mu sync.RWMutex
ctx context.Context
cancel context.CancelFunc
}
// ServerConnection represents a connection to an MCP server
type ServerConnection struct {
Name string
Config config.MCPServer
Client *mcp.Client
Session *mcp.ClientSession
Tools []*mcp.Tool
Error string // Connection error if any
}
// NewManager creates a new MCP manager
func NewManager(ctx context.Context) *Manager {
ctx, cancel := context.WithCancel(ctx)
return &Manager{
servers: make(map[string]*ServerConnection),
ctx: ctx,
cancel: cancel,
}
}
// ConnectServers connects to all configured MCP servers
func (m *Manager) ConnectServers(servers map[string]config.MCPServer) error {
m.mu.Lock()
defer m.mu.Unlock()
for name, serverCfg := range servers {
if err := m.connectServer(name, serverCfg); err != nil {
log.Printf("Warning: Failed to connect to MCP server %s: %v", name, err)
// Store the error in the connection
m.servers[name] = &ServerConnection{
Name: name,
Config: serverCfg,
Error: err.Error(),
}
continue
}
log.Printf("Successfully connected to MCP server: %s", name)
}
return nil
}
// connectServer connects to a single MCP server
func (m *Manager) connectServer(name string, serverCfg config.MCPServer) error {
// Create MCP client
client := mcp.NewClient(&mcp.Implementation{
Name: "tell-me",
Version: "1.0.0",
}, nil)
// Only stdio transport is supported for local servers
if serverCfg.Command == "" {
return fmt.Errorf("command is required for MCP server")
}
cmd := exec.CommandContext(m.ctx, serverCfg.Command, serverCfg.Args...)
// Merge configured variables into the inherited environment; appending to the
// nil cmd.Env would launch the server with ONLY these variables set (no PATH etc.)
if len(serverCfg.Env) > 0 {
cmd.Env = append(cmd.Environ(), m.envMapToSlice(serverCfg.Env)...)
}
transport := &mcp.CommandTransport{Command: cmd}
// Connect to the server
session, err := client.Connect(m.ctx, transport, nil)
if err != nil {
return fmt.Errorf("failed to connect: %w", err)
}
// List available tools
toolsResult, err := session.ListTools(m.ctx, &mcp.ListToolsParams{})
if err != nil {
session.Close()
return fmt.Errorf("failed to list tools: %w", err)
}
// Store the connection
m.servers[name] = &ServerConnection{
Name: name,
Config: serverCfg,
Client: client,
Session: session,
Tools: toolsResult.Tools,
}
return nil
}
// envMapToSlice converts environment map to slice format
func (m *Manager) envMapToSlice(envMap map[string]string) []string {
result := make([]string, 0, len(envMap))
for key, value := range envMap {
result = append(result, fmt.Sprintf("%s=%s", key, value))
}
return result
}
// GetAllTools returns all tools from all connected servers as OpenAI tool definitions
func (m *Manager) GetAllTools() []openai.Tool {
m.mu.RLock()
defer m.mu.RUnlock()
var tools []openai.Tool
for serverName, conn := range m.servers {
for _, mcpTool := range conn.Tools {
// Convert MCP tool to OpenAI tool format
tool := openai.Tool{
Type: openai.ToolTypeFunction,
Function: &openai.FunctionDefinition{
Name: fmt.Sprintf("%s_%s", serverName, mcpTool.Name),
Description: mcpTool.Description,
Parameters: mcpTool.InputSchema,
},
}
tools = append(tools, tool)
}
}
return tools
}
// CallTool calls a tool on the appropriate MCP server
func (m *Manager) CallTool(ctx context.Context, toolName string, arguments string) (string, error) {
m.mu.RLock()
defer m.mu.RUnlock()
// Parse the tool name to extract server name and actual tool name
// Format: serverName_toolName
var serverName, actualToolName string
for sName := range m.servers {
prefix := sName + "_"
if len(toolName) > len(prefix) && toolName[:len(prefix)] == prefix {
serverName = sName
actualToolName = toolName[len(prefix):]
break
}
}
if serverName == "" {
return "", fmt.Errorf("unknown tool: %s", toolName)
}
conn, exists := m.servers[serverName]
if !exists {
return "", fmt.Errorf("server not found: %s", serverName)
}
// Parse arguments
var args map[string]interface{}
if arguments != "" {
if err := json.Unmarshal([]byte(arguments), &args); err != nil {
return "", fmt.Errorf("failed to parse arguments: %w", err)
}
}
// Call the tool
result, err := conn.Session.CallTool(ctx, &mcp.CallToolParams{
Name: actualToolName,
Arguments: args,
})
if err != nil {
return "", fmt.Errorf("tool call failed: %w", err)
}
if result.IsError {
return "", fmt.Errorf("tool %s returned an error", actualToolName)
}
// Format the result
var response string
for _, content := range result.Content {
switch c := content.(type) {
case *mcp.TextContent:
response += c.Text + "\n"
case *mcp.ImageContent:
response += fmt.Sprintf("[Image: %s]\n", c.MIMEType)
case *mcp.EmbeddedResource:
response += fmt.Sprintf("[Resource: %s]\n", c.Resource.URI)
}
}
return response, nil
}
// GetServerInfo returns information about connected servers
func (m *Manager) GetServerInfo() map[string][]string {
m.mu.RLock()
defer m.mu.RUnlock()
info := make(map[string][]string)
for name, conn := range m.servers {
if conn.Error == "" {
toolNames := make([]string, len(conn.Tools))
for i, tool := range conn.Tools {
toolNames[i] = tool.Name
}
info[name] = toolNames
}
}
return info
}
// GetDetailedStatus returns detailed status information for all servers
func (m *Manager) GetDetailedStatus() []*ServerConnection {
m.mu.RLock()
defer m.mu.RUnlock()
statuses := make([]*ServerConnection, 0, len(m.servers))
for _, conn := range m.servers {
statuses = append(statuses, conn)
}
return statuses
}
// Close closes all MCP server connections
func (m *Manager) Close() error {
m.mu.Lock()
defer m.mu.Unlock()
var lastErr error
for name, conn := range m.servers {
if conn.Session != nil {
if err := conn.Session.Close(); err != nil {
log.Printf("Error closing connection to %s: %v", name, err)
lastErr = err
}
}
}
// Cancel context after closing all sessions
m.cancel()
m.servers = make(map[string]*ServerConnection)
return lastErr
}

tell-me.ini.example

@@ -1,13 +0,0 @@
[llm]
# OpenAI-compatible API endpoint (e.g., Ollama, LM Studio)
api_url = http://localhost:11434/v1
# Model name to use
model = llama3.2
# Context size for the model
context_size = 16000
# API key (leave empty if not required)
api_key =
[searxng]
# SearXNG instance URL
url = http://localhost:8080

tell-me.yaml.example Normal file

@@ -0,0 +1,84 @@
# Tell-Me Configuration File
# Copy this file to ~/.config/tell-me.yaml and customize it
# OpenAI-compatible API endpoint (e.g., Ollama, LM Studio)
api_url: http://localhost:11434/v1
# Model name to use
model: llama3.2
# Context size for the model
context_size: 16000
# API key (leave empty if not required)
api_key: ""
# SearXNG instance URL
searxng_url: http://localhost:8080
# System Prompt Configuration
# This prompt defines the AI assistant's behavior and capabilities
prompt: |
You are a helpful AI research assistant with access to web search and article fetching capabilities.
RESEARCH WORKFLOW - MANDATORY STEPS:
1. For questions requiring current information, facts, or knowledge beyond your training data:
- Perform MULTIPLE searches (typically 2-3) with DIFFERENT query angles to gather comprehensive information
- Vary your search terms to capture different perspectives and sources
2. After completing ALL searches, analyze the combined results:
- Review ALL search results from your multiple searches together
- Identify the 3-5 MOST relevant and authoritative URLs across ALL searches
- Prioritize: official sources, reputable news sites, technical documentation, expert reviews
- Look for sources that complement each other (e.g., official specs + expert analysis + user reviews)
3. Fetch the selected articles:
- Use fetch_articles with the 3-5 best URLs you identified from ALL your searches
- Read all fetched content thoroughly before formulating your answer
- Synthesize information from multiple sources for a comprehensive response
HANDLING USER CORRECTIONS - CRITICAL:
When a user indicates your answer is incorrect, incomplete, or needs clarification:
1. NEVER argue or defend your previous answer
2. IMMEDIATELY acknowledge the correction: "Let me search for more accurate information"
3. Perform NEW searches with DIFFERENT queries based on the user's feedback
4. Fetch NEW sources that address the specific correction or clarification needed
5. Provide an updated answer based on the new research
6. If the user provides specific information, incorporate it and verify with additional searches
Remember: The user may have more current or specific knowledge. Your role is to research and verify, not to argue.
OUTPUT FORMATTING RULES:
- NEVER include source URLs or citations in your response
- DO NOT use Markdown formatting (no **, ##, -, *, [], etc.)
- Write in plain text only - use natural language without any special formatting
- For emphasis, use CAPITAL LETTERS instead of bold or italics
- For lists, use simple numbered lines (1., 2., 3.) or write as flowing paragraphs
- Keep output clean and readable for terminal display
Available tools:
- web_search: Search the internet (can be used multiple times with different queries)
- fetch_articles: Fetch and read content from 1-5 URLs at once
# MCP (Model Context Protocol) Server Configuration
# MCP servers extend the assistant's capabilities with additional tools
# Only stdio-based (local command) servers are supported for security
# Leave empty ({}) if you don't want to use MCP servers
mcp_servers: {}
# Example MCP server configuration:
# filesystem:
# command: /usr/local/bin/mcp-server-filesystem
# args:
# - --root
# - /path/to/allowed/directory
# env:
# LOG_LEVEL: info
#
# weather:
# command: /usr/local/bin/mcp-server-weather
# args: []
# env:
# API_KEY: your-weather-api-key
#
# Note: Tools from MCP servers will be automatically available to the LLM
# Tool names will be prefixed with the server name (e.g., filesystem_read_file)