Refined readme

Pavel Pivovarov
2025-12-12 12:10:15 +11:00
parent 50a439a499
commit 272d223f73
4 changed files with 9 additions and 48 deletions

README.md

@@ -25,10 +25,10 @@ Before using Tell-Me, you need:
 ## Installation
-### 1. Clone or download this repository
+### 1. Clone the repository
 ```bash
-git clone <repository-url>
+git clone https://git.netra.pivpav.com/public/tell-me
 cd tell-me
 ```
@@ -118,7 +118,7 @@ SearXNG: http://localhost:8080
 Type your questions below. Type 'exit' or 'quit' to exit.
 ────────────────────────────────────────────────────────────────
-You: What are the latest developments in AI?
+What are the latest developments in AI?
 ```
 The AI will:
@@ -127,7 +127,7 @@ The AI will:
 3. Synthesize the information into a comprehensive answer
 4. Cite sources with URLs
-Type `exit` or `quit` to exit the application.
+Type `exit` or `quit` to exit the application, or press Ctrl-C.
 ## How It Works
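The numbered steps above (search, synthesize, cite sources with URLs) rely on the OpenAI-style tool/function calling that the README's troubleshooting section says the model must support. A minimal sketch of declaring a web-search tool with github.com/sashabaranov/go-openai, assuming a hypothetical `search_web` function and a placeholder model name; the project's actual `tools` package is not shown in this diff:

```go
package main

import (
	"encoding/json"
	"fmt"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Declare a hypothetical "search_web" function the model may call.
	searchTool := openai.Tool{
		Type: openai.ToolTypeFunction,
		Function: &openai.FunctionDefinition{
			Name:        "search_web", // assumed name; the real tools package may differ
			Description: "Search the web via SearXNG and return result snippets",
			Parameters: json.RawMessage(`{
				"type": "object",
				"properties": {
					"query": {"type": "string", "description": "the search query"}
				},
				"required": ["query"]
			}`),
		},
	}

	// Attach the tool to a chat request; the model can answer directly or
	// respond with ToolCalls that the application then executes.
	req := openai.ChatCompletionRequest{
		Model: "llama3", // assumed model name
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "What are the latest developments in AI?"},
		},
		Tools: []openai.Tool{searchTool},
	}
	fmt.Println("request carries", len(req.Tools), "tool(s)")
}
```

A response that comes back with `ToolCalls` would then be executed against SearXNG and the results appended as tool messages before the model writes its final, cited answer.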
@@ -154,45 +154,6 @@ tell-me/
 └── README.md # This file
 ```
-## Configuration Reference
-### LLM Section
-- `api_url`: The base URL for your OpenAI-compatible API endpoint
-- `model`: The model name/identifier to use
-- `context_size`: Maximum context window size (default: 16000)
-- `api_key`: API key if required (leave empty for local APIs like Ollama)
-### SearXNG Section
-- `url`: The URL of your SearXNG instance
-## Troubleshooting
-### "Config file not found"
-Make sure you've created `~/.config/tell-me.ini` from the example file.
-### "Search request failed"
-Check that your SearXNG instance is running and accessible at the configured URL.
-### "Chat completion failed"
-Verify that:
-- Your LLM API is running
-- The API URL is correct
-- The model name is correct
-- The model supports tool/function calling
-### Connection refused errors
-Ensure both SearXNG and your LLM API are running before starting Tell-Me.
-## Tips
-- Use specific questions for better results
-- The AI will automatically search before answering
-- Sources are cited with URLs for verification
-- You can ask follow-up questions in the same session
-- The conversation history is maintained throughout the session
 ## License
 MIT
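
The Configuration Reference removed above documents the keys read from `~/.config/tell-me.ini`. A minimal sketch of loading those keys with gopkg.in/ini.v1, the parser listed in go.mod; the `llm` and `searxng` section names and the `Config` struct are assumptions based on the removed text, not the project's actual config package:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"

	"gopkg.in/ini.v1"
)

// Config mirrors the keys described in the removed Configuration Reference.
type Config struct {
	APIURL      string // llm: api_url, the OpenAI-compatible endpoint
	Model       string // llm: model
	ContextSize int    // llm: context_size (default 16000)
	APIKey      string // llm: api_key, may be empty for local APIs
	SearxngURL  string // searxng: url
}

func loadConfig() (*Config, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return nil, err
	}
	f, err := ini.Load(filepath.Join(home, ".config", "tell-me.ini"))
	if err != nil {
		return nil, fmt.Errorf("config file not found: %w", err)
	}
	llm := f.Section("llm")
	return &Config{
		APIURL:      llm.Key("api_url").String(),
		Model:       llm.Key("model").String(),
		ContextSize: llm.Key("context_size").MustInt(16000),
		APIKey:      llm.Key("api_key").String(),
		SearxngURL:  f.Section("searxng").Key("url").String(),
	}, nil
}

func main() {
	cfg, err := loadConfig()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("model %q via %s, search at %s\n", cfg.Model, cfg.APIURL, cfg.SearxngURL)
}
```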

go.mod

@@ -1,9 +1,10 @@
-module github.com/tell-me
+module git.netra.pivpav.com/public/tell-me
 go 1.21
 require (
 	github.com/JohannesKaufmann/html-to-markdown v1.6.0
+	github.com/jlubawy/go-boilerpipe v0.4.0
 	github.com/sashabaranov/go-openai v1.41.2
 	gopkg.in/ini.v1 v1.67.0
 )
@@ -11,6 +12,5 @@ require (
 require (
 	github.com/PuerkitoBio/goquery v1.9.2 // indirect
 	github.com/andybalholm/cascadia v1.3.2 // indirect
-	github.com/jlubawy/go-boilerpipe v0.4.0 // indirect
 	golang.org/x/net v0.25.0 // indirect
 )


@@ -6,8 +6,8 @@ import (
 	"fmt"
 	"time"
+	"git.netra.pivpav.com/public/tell-me/tools"
 	"github.com/sashabaranov/go-openai"
-	"github.com/tell-me/tools"
 )
 // Client wraps the OpenAI client for LLM interactions
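
The `Client` wrapper points go-openai at whatever OpenAI-compatible endpoint the config provides. A minimal sketch of that wiring, assuming an Ollama-style local endpoint and a placeholder model name; the real implementation in this file is not part of the diff:

```go
package main

import (
	"context"
	"fmt"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// api_key may be empty for local APIs such as Ollama (per the README).
	cfg := openai.DefaultConfig("")
	// Assumed api_url value; Tell-Me reads the real one from tell-me.ini.
	cfg.BaseURL = "http://localhost:11434/v1"

	client := openai.NewClientWithConfig(cfg)
	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		Model: "llama3", // assumed model name
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "What are the latest developments in AI?"},
		},
	})
	if err != nil {
		fmt.Println("Chat completion failed:", err)
		return
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```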


@@ -9,9 +9,9 @@ import (
 	"strings"
 	"syscall"
+	"git.netra.pivpav.com/public/tell-me/config"
+	"git.netra.pivpav.com/public/tell-me/llm"
 	"github.com/sashabaranov/go-openai"
-	"github.com/tell-me/config"
-	"github.com/tell-me/llm"
 )
 func main() {
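
The `syscall` import next to the interactive loop suggests signal handling for the Ctrl-C exit path the refined README mentions. A minimal sketch of that pattern, assuming a plain shutdown on SIGINT/SIGTERM; the actual `main` body is not shown here:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"os/signal"
	"syscall"
)

func main() {
	// Exit cleanly on Ctrl-C (SIGINT) or SIGTERM, matching the README note.
	sigCh := make(chan os.Signal, 1)
	signal.Notify(sigCh, os.Interrupt, syscall.SIGTERM)
	go func() {
		<-sigCh
		fmt.Println()
		os.Exit(0)
	}()

	// Placeholder for the interactive question loop.
	scanner := bufio.NewScanner(os.Stdin)
	for scanner.Scan() {
		line := scanner.Text()
		if line == "exit" || line == "quit" {
			return
		}
		fmt.Println("You asked:", line)
	}
}
```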