Added MCP support
README.md
@@ -7,8 +7,9 @@ A CLI application that provides AI-powered search and information retrieval usin
 - 🔍 **Web Search**: Powered by SearXNG for comprehensive internet searches
 - 📄 **URL Fetching**: Automatically fetches and converts web pages to clean Markdown
 - 🤖 **Local LLM Support**: Works with any OpenAI-compatible API (Ollama, LM Studio, etc.)
-- 💻 **Simple CLI**: Clean terminal interface for easy interaction
-- ⚙️ **Configurable**: Easy INI-based configuration
+- 🔌 **MCP Support**: Extend capabilities with Model Context Protocol servers
+- 💻 **Simple CLI**: Clean terminal interface for easy interaction
+- ⚙️ **Configurable**: Easy YAML-based configuration with customizable prompts
 - 🔒 **Privacy-Focused**: All processing happens locally
 
 ## Prerequisites
@@ -50,50 +51,58 @@ Create the config directory and copy the example configuration:
 
 ```bash
 mkdir -p ~/.config
-cp tell-me.ini.example ~/.config/tell-me.ini
+cp tell-me.yaml.example ~/.config/tell-me.yaml
 ```
 
 ### 5. Edit your configuration
 
-Open `~/.config/tell-me.ini` in your favorite editor and configure:
+Open `~/.config/tell-me.yaml` in your favorite editor and configure:
 
-```ini
-[llm]
+```yaml
 # Your LLM API endpoint (e.g., Ollama, LM Studio)
-api_url = http://localhost:11434/v1
+api_url: http://localhost:11434/v1
 
 # Model name to use
-model = llama3.2
+model: llama3.2
 
 # Context window size
-context_size = 16000
+context_size: 16000
 
 # API key (leave empty if not required)
-api_key =
+api_key: ""
 
-[searxng]
 # Your SearXNG instance URL
-url = http://localhost:8080
+searxng_url: http://localhost:8080
 
+# System prompt (customize the AI's behavior)
+prompt: |
+  You are a helpful AI research assistant...
+  (see tell-me.yaml.example for full prompt)
+
+# MCP Server Configuration (optional)
+mcp_servers: {}
+# Add MCP servers to extend functionality
+# See MCP section below for examples
 ```
 
 **Example configurations:**
 
 **For Ollama:**
-```ini
-[llm]
-api_url = http://localhost:11434/v1
-model = llama3.2
-context_size = 16000
-api_key =
+```yaml
+api_url: http://localhost:11434/v1
+model: llama3.2
+context_size: 16000
+api_key: ""
+searxng_url: http://localhost:8080
 ```
 
 **For LM Studio:**
-```ini
-[llm]
-api_url = http://localhost:1234/v1
-model = your-model-name
-context_size = 16000
-api_key =
+```yaml
+api_url: http://localhost:1234/v1
+model: your-model-name
+context_size: 16000
+api_key: ""
+searxng_url: http://localhost:8080
 ```
 
 ## Usage
@@ -129,6 +138,37 @@ The AI will:
 
 Type `exit` or `quit` to exit the application, or press Ctrl-C.
 
+## MCP (Model Context Protocol) Support
+
+Tell-Me supports the [Model Context Protocol](https://modelcontextprotocol.io/), allowing you to extend the AI assistant's capabilities with additional tools from MCP servers.
+
+### Supported MCP Servers
+
+Tell-Me supports **stdio-based MCP servers** (local command execution). Remote SSE-based servers are not supported for security reasons.
+
+### Configuration
+
+Add MCP servers to your `~/.config/tell-me.yaml`:
+
+```yaml
+mcp_servers:
+  # Example: Filesystem access
+  filesystem:
+    command: /usr/local/bin/mcp-server-filesystem
+    args:
+      - --root
+      - /path/to/allowed/directory
+    env:
+      LOG_LEVEL: info
+
+  # Example: Weather information
+  weather:
+    command: /usr/local/bin/mcp-server-weather
+    args: []
+    env:
+      API_KEY: your-weather-api-key
+```
+
 ## How It Works
 
 1. **User asks a question** - You type your query in the terminal
@@ -141,17 +181,19 @@ Type `exit` or `quit` to exit the application, or press Ctrl-C.
 
 ```
 tell-me/
-├── main.go              # Main application entry point
+├── main.go               # Main application entry point
 ├── config/
-│   └── config.go        # Configuration loading from INI file
+│   └── config.go         # Configuration loading from YAML file
 ├── llm/
-│   └── client.go        # OpenAI-compatible API client with tool calling
+│   └── client.go         # OpenAI-compatible API client with tool calling
+├── mcp/
+│   └── manager.go        # MCP server connection and tool management
 ├── tools/
-│   ├── search.go        # SearXNG web search implementation
-│   └── fetch.go         # URL fetching and HTML-to-Markdown conversion
-├── go.mod               # Go module dependencies
-├── tell-me.ini.example  # Example configuration file
-└── README.md            # This file
+│   ├── search.go         # SearXNG web search implementation
+│   └── fetch.go          # URL fetching and HTML-to-Markdown conversion
+├── go.mod                # Go module dependencies
+├── tell-me.yaml.example  # Example YAML configuration file
+└── README.md             # This file
 ```
 
 ## License