Detect and redact PII locally with SOTA performance
Updated Mar 25, 2025 - Python
Extract structured data from local or remote LLM models
A Chrome extension for querying a local LLM model using llama-cpp-python; includes a pip package for running the server ('pip install local-llama' to install)
An entirely open-source, locally running version of Recall (originally revealed by Microsoft for Copilot+ PCs)
A small VLM that sees everything
A simple framework for using Claude Code or Codex CLI as the frontend to any cloud or local LLM on Apple Silicon. Connect locally via LiteLLM + MLX or LM Studio, or remotely via Z.AI, Gemini/Google AI Studio, DeepSeek, or OpenRouter.
Bell inequalities and local models via Frank-Wolfe algorithms
A 💅 stylish 💅 local multi-model AI assistant and API.
Main code chunks used for models in the publication "Exploring the Potential of Adaptive, Local Machine Learning (ML) in Comparison to the Prediction Performance of Global Models: A Case Study from Bayer's Caco-2 Permeability Database"
Local-first AI orchestrator for engines & models. Modular, scriptable, and cross-platform (Linux, macOS, Termux, WSL2). Proof of Concept build.
Extracting complete webpage articles from a screen recording using local models
ODK: An open-source AI shell to control your computer with natural language.
Running the Multimodal AI Chat App with LM Studio using a locally loaded model
Codes for a published work "Global or local modeling for XGBoost in geospatial studies upon simulated data and German COVID-19 infection forecasting"
The AI-OS in userspace.
Vision-based avatar, reads Google News and extracts news by itself using only local models
A comprehensive learning repository for Model Context Protocol (MCP) - from simple tools to complex agentic workflows using local Ollama models
Running the Multimodal AI Chat App with Ollama using a locally loaded model
A streamlined interface for interacting with local Large Language Models (LLMs) using Streamlit. Features interactive chat, configurable model parameters, and more.
Desktop application for generating AI-powered educational case studies with support for OpenAI, Anthropic, Google Gemini, and local Ollama models