A local AI-powered assistant that lets you enter natural language queries like:
"I want to go from Alexanderplatz to Grunewald at 17:00"
It then:
- Parses your query using a local LLaMA3 model via Ollama (see the parsing sketch below)
- Fetches real-time public transport journeys from the VBB API
- Calculates CO₂ emissions per journey from segment distances and transport modes (see the emissions sketch below)
- Generates a natural language summary and recommendation
- All inside your terminal, without sending data to the cloud
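The parsing step can be pictured roughly as follows. This is a minimal sketch rather than the project's actual code: it assumes the `ollama` Python client and a prompt that asks the model to return origin, destination, and departure time as JSON; the real prompt and field names may differ.

```python
import json
import ollama  # Python client for the local Ollama daemon

def parse_query(query: str) -> dict:
    """Ask the local llama3 model to extract structured trip fields (sketch)."""
    prompt = (
        "Extract the origin, destination and departure time from this request. "
        "Reply with JSON using the keys 'origin', 'destination', 'time'.\n\n"
        f"Request: {query}"
    )
    response = ollama.chat(
        model="llama3",
        messages=[{"role": "user", "content": prompt}],
    )
    # The model is instructed to answer with plain JSON; parse it into a dict.
    return json.loads(response["message"]["content"])

# Example: parse_query("I want to go from Alexanderplatz to Grunewald at 17:00")
# might yield {"origin": "Alexanderplatz", "destination": "Grunewald", "time": "17:00"}
```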
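The CO₂ estimate is per segment: distance travelled multiplied by an emission factor for the transport mode, summed over the journey. Below is a sketch with illustrative factors and mode names; the project's actual values may differ.

```python
# Illustrative emission factors in grams CO₂ per passenger-kilometre (assumed values).
EMISSION_FACTORS_G_PER_KM = {
    "suburban": 40,   # S-Bahn
    "subway": 40,     # U-Bahn
    "tram": 35,
    "bus": 80,
    "regional": 45,
    "walking": 0,
}

def journey_co2_grams(segments: list[dict]) -> float:
    """Sum CO₂ over segments, each given as {'mode': str, 'distance_km': float}."""
    return sum(
        seg["distance_km"] * EMISSION_FACTORS_G_PER_KM.get(seg["mode"], 60)
        for seg in segments
    )

# Example: a 10 km S-Bahn leg plus a 2 km bus leg
# journey_co2_grams([{"mode": "suburban", "distance_km": 10},
#                    {"mode": "bus", "distance_km": 2}])  # -> 560.0 g
```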
You'll need:

- Python 3.8+
- Ollama installed and running
- A CPU-capable LLM (like `llama3`) pulled via Ollama
1. **Create a virtual environment**
```bash
python3 -m venv venv
source venv/bin/activate
```
2. **Install dependencies**
```bash
pip install -r requirements.txt
```
3. **Pull the model with Ollama**
```bash
ollama pull llama3
```
4. **Run Ollama in the background** (a quick reachability check is sketched after these steps)
```bash
ollama run llama3
```
5. **Run the main script**
```bash
python main.py
```
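If the assistant can't reach the model, a quick sanity check helps. The snippet below is a minimal sketch, not part of the project itself: it assumes Ollama's default REST endpoint on `localhost:11434` and the `llama3` model pulled above.

```python
import requests

# Ask the local Ollama daemon (default port 11434) for a one-off completion.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Reply with the single word OK.", "stream": False},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["response"])  # a short reply confirms the model is loaded and responding
```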