
Ollama-GeminiCLI Proxy

Proxy server for connecting Ollama-compatible clients (such as Open WebUI or n8n) to Google Gemini Code Assist models via the internal Gemini CLI authentication flow. Intended for personal and experimental use only.


Overview

This project provides a local proxy compatible with Ollama and OpenAI API clients, translating chat and completion requests to the Gemini Code Assist API endpoints using your personal Google account credentials (as generated by gemini auth).
It enables the use of Gemini 2.5 Pro/Flash models through local automation, dashboards, or scripting environments, provided the usage remains strictly personal.


Features

  • Compatible with Ollama /api/chat, /api/generate, /api/tags, and /api/ps endpoints.
  • Implements OpenAI-compatible /v1/models and /api/chat/completions endpoints (see the example after this list).
  • Supports streaming (NDJSON) and non-streaming responses.
  • Handles token refresh and project ID discovery for Gemini Code Assist.
  • Designed to integrate with n8n and local dashboard tools.
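
For clients that speak the OpenAI API rather than Ollama's, the same request shape works against the OpenAI-compatible routes. A minimal sketch, assuming the proxy accepts a standard OpenAI chat-completions body on its /api/chat/completions route (model identifiers are listed under Usage):

curl http://localhost:11434/v1/models

curl -X POST http://localhost:11434/api/chat/completions \
     -H "Content-Type: application/json" \
     -d '{
         "model": "gemini-2.5-pro:latest",
         "messages": [{"role":"user","content":"Hello, Gemini!"}]
     }'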

Requirements

  • Node.js 18+ and npm.
  • A valid oauth_creds.json file in your $HOME/.gemini/ directory, generated via Gemini CLI (gemini auth).
  • Personal Google account (Gemini Code Assist endpoints are not available for Workspace/Enterprise accounts).

Installation

git clone https://github.com/RoderickGrc/gemini-cli-proxy-for-ollama/
cd gemini-cli-proxy-for-ollama
npm install

Authenticate with Gemini CLI and ensure credentials are present:

gemini auth
ls ~/.gemini/oauth_creds.json

Start the proxy:

npm start

The server will listen on http://localhost:11434 by default.
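
To confirm the proxy is up, query the model list via the Ollama-compatible /api/tags endpoint; the response should include the Gemini model identifiers listed under Usage:

curl http://localhost:11434/api/tags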


Usage

Point your Ollama-compatible client (e.g., Open WebUI, or an n8n HTTP Request node) at the proxy URL. The following model identifiers are available:

  • gemini-2.5-pro:latest
  • gemini-2.5-flash:latest

Example (curl):

curl -X POST http://localhost:11434/api/chat \
     -H "Content-Type: application/json" \
     -d '{
         "model": "gemini-2.5-pro:latest",
         "messages": [{"role":"user","content":"Hello, Gemini!"}]
     }'
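
Streaming responses arrive as NDJSON: each line of the response body is a standalone JSON object carrying a partial message. A sketch, assuming the proxy follows Ollama's convention of streaming by default (the stream flag is shown explicitly here):

curl -X POST http://localhost:11434/api/chat \
     -H "Content-Type: application/json" \
     -d '{
         "model": "gemini-2.5-flash:latest",
         "stream": true,
         "messages": [{"role":"user","content":"Write a short haiku."}]
     }'

In Ollama's protocol, the final streamed object is marked with "done": true.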

Limitations

  • Personal Use Only: This proxy is intended strictly for personal, experimental, or research purposes.
  • No Multi-user or Production Support: Do not use as a public API, SaaS, or to serve requests from other users.
  • Rate Limits: Requests are subject to Gemini Code Assist quotas (e.g., 60 requests/minute, 1,000/day per Google account).
  • No Stability Guarantee: The Gemini Code Assist endpoints are internal and may change or be restricted by Google at any time.
  • No Warranty: No warranty or support is provided.

Legal Notice and Disclaimer

This project is not affiliated with, endorsed by, or supported by Google LLC. The Gemini Code Assist endpoints are not intended for public redistribution. Do not publish, resell, or provide this proxy as a service to third parties.

All use is at your own risk. The author is not responsible for any loss, account suspension, or legal consequences arising from the use or misuse of this code.


License

MIT (see LICENSE)


Credits

  • Gemini CLI – for authentication and internal endpoint research.
  • Inspired by open-source Ollama and OpenAI API proxy projects.
