🚧 Project Status: Draft/In Progress
This project is in early development. Core algorithms and implementations are being actively developed.
Liquid Neural Networks (LNNs) are continuous-time neural networks inspired by biological neural systems, particularly the C. elegans nervous system. Unlike traditional feed-forward networks, LNNs are governed by ordinary differential equations and can adapt in real time to changing inputs.
- Parameter Efficiency: Solve complex tasks with tens to a few hundred neurons (19-302 in the results below)
- Continuous-Time Dynamics: Based on ordinary differential equations (ODEs)
- Real-Time Adaptation: Networks that evolve and adapt during inference
- Interpretability: Compact, sparse dynamics that are far easier to inspect than large black-box models
- Edge AI Ready: Efficient enough for deployment on resource-constrained devices
Traditional neural networks often require millions of parameters and struggle with:
- Adapting to new situations without retraining
- Explaining their decision-making process
- Running efficiently on edge devices
- Handling time-series data naturally
LNNs address these limitations by mimicking biological neurons more closely, using differential equations to model continuous-time dynamics.
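For intuition, here is a minimal sketch, independent of this library's API, of how a single ODE-based neuron evolves: each input nudges a continuously evolving state rather than producing a fresh feed-forward activation. The weight, bias, and time constant below are illustrative values.

```python
import numpy as np

# One explicit-Euler step of a single liquid neuron:
# dx/dt = -x/tau + tanh(w*u + b), so x <- x + dt * (dx/dt)
def liquid_neuron_step(x, u, w=1.0, b=0.0, tau=0.5, dt=0.01):
    dxdt = -x / tau + np.tanh(w * u + b)
    return x + dt * dxdt

# The state x persists and evolves as inputs stream in
x = 0.0
for u in [0.1, 0.5, 0.9]:
    x = liquid_neuron_step(x, u)
```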
```bash
# Using pip
pip install liquid-neural-networks

# Using uv (recommended)
uv pip install liquid-neural-networks

# Development installation
git clone https://github.com/aygp-dr/liquid-neural-networks
cd liquid-neural-networks
uv pip install -e ".[dev]"
```
```clojure
;; Add to your deps.edn
{:deps {aygp-dr/liquid-neural-networks
        {:git/url "https://github.com/aygp-dr/liquid-neural-networks"
         :git/sha "LATEST_SHA"}}}
```

```bash
# Or use from source
git clone https://github.com/aygp-dr/liquid-neural-networks
cd liquid-neural-networks
clojure -M:dev
```
```python
import torch

from liquid_neural_networks import LiquidNetwork

# Create a simple liquid neural network
network = LiquidNetwork(
    input_size=10,
    hidden_size=32,  # just 32 neurons!
    output_size=2,
)

# Loss and optimizer (assuming a PyTorch-style API, as the
# training loop's loss.backward() implies)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(network.parameters())

# Train on time-series data; inputs, targets, and time_constants
# are assumed to be tensors prepared for your task
for epoch in range(100):
    optimizer.zero_grad()
    outputs = network(inputs, time_constants)
    loss = criterion(outputs, targets)
    loss.backward()
    optimizer.step()
```
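Once trained, the same forward call can serve streaming inference; per the real-time adaptation point above, the network keeps evolving as data arrives. A hedged sketch, where `stream` is a hypothetical iterable of prepared tensors:

```python
# Run the trained network on live data; gradients are not needed here
with torch.no_grad():
    for inputs, time_constants in stream:
        outputs = network(inputs, time_constants)
```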
```clojure
(require '[liquid-neural-networks.core :as lnn])

;; Create a liquid network
(def network (lnn/create-network {:input-size 10
                                  :hidden-size 32
                                  :output-size 2}))

;; Process time-series data
(def result (lnn/forward network input-data time-constants))
```
- Robotics & Control: Drone navigation with 19 neurons, self-driving car control, robotic arm manipulation
- Time-Series Forecasting: Financial market prediction, weather forecasting, sensor data processing
- Healthcare: ECG analysis, brain signal interpretation, disease progression modeling
- Edge Computing: IoT device intelligence, embedded system control, real-time anomaly detection
LNNs consist of three main components:
- Liquid Time-Constant (LTC) Neurons: Neurons with adaptive time constants that change based on input
- Continuous-Time Dynamics: ODEs that govern neuron behavior
- Sparse Connectivity: Efficient wiring patterns inspired by biological systems
The mathematical foundation:

```
dx/dt = -x/τ(t) + f(Wx + b)
```

where τ(t) is the adaptive time constant, W the weight matrix, and b the bias.
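As a concrete, deliberately simplified sketch of these components together, the equation above can be integrated with explicit Euler steps while making τ(t) input-dependent. The adaptation rule `tau_base / (1 + |pre|)` and the added input term are illustrative choices, not this library's implementation:

```python
import numpy as np

# Euler integration of dx/dt = -x/tau(t) + f(Wx + b), with an input
# drive W_in @ u added and tau(t) shrinking as the drive grows
def ltc_step(x, u, W_in, W_rec, b, tau_base=1.0, dt=0.01):
    pre = W_in @ u + W_rec @ x + b
    tau = tau_base / (1.0 + np.abs(pre))  # adaptive time constant τ(t)
    dxdt = -x / tau + np.tanh(pre)
    return x + dt * dxdt

# Example: 32 hidden neurons driven by a 10-dimensional input
rng = np.random.default_rng(0)
x = np.zeros(32)
W_in = 0.1 * rng.normal(size=(32, 10))
W_rec = 0.1 * rng.normal(size=(32, 32))
b = np.zeros(32)
x = ltc_step(x, rng.normal(size=10), W_in, W_rec, b)
```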
Benchmark results comparing LNNs to traditional architectures:
| Task | Traditional NN | LNN | Parameter Reduction |
|---|---|---|---|
| Drone Control | 100K params | 19 neurons | 99.98% |
| Time-Series | 1M params | 302 neurons | 99.97% |
| Image Classification | 25M params | 1K neurons | 99.99% |
We welcome contributions! See our Contributing Guide for:
- Code style guidelines
- Testing requirements
- Pull request process
- Development setup
- Tutorials: Step-by-step guides
- API Reference: Detailed documentation
- Examples: Working code samples
- Development Setup: For contributors
This implementation is based on:
- Hasani et al., “Liquid Time-constant Networks” (2021)
- Lechner et al., “Neural Circuit Policies Enabling Auditable Autonomy” (2020)
- MIT CSAIL research on continuous-time neural models
MIT License - see LICENSE for details.
- GitHub Discussions
- Issue Tracker
- Research papers and citations in docs/papers/