Problem
There are several issues with using `langchain` as the model library of choice in Jupyter AI (JAI):

- The hard requirement on LangChain makes it difficult for contributors to extend model support.
- Models are only listed if their provider dependencies are installed (Chat panel fails without selected provider dependencies installed #680, Inconsistent experience with respect to non-installed third-party packages #840).
- Magics and chat use different authentication paths ([v3-future] Enhance and generalize secret/key management? #1237).
- LangChain modules are slow to import and to load from entry points (`jupyter_ai` import time is too slow #1115).
This issue proposes a solution to #1312.
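As a rough illustration of the import-cost point above, import time can be measured directly in-process (a minimal sketch using only the standard library; absolute numbers vary by machine and whether modules are already cached):

```python
import importlib
import time

def import_seconds(module_name: str) -> float:
    """Time how long importing `module_name` takes in the current process."""
    start = time.perf_counter()
    importlib.import_module(module_name)
    return time.perf_counter() - start

# A stdlib module is used here so the snippet runs anywhere; substitute
# "langchain" or "jupyter_ai" locally to reproduce the slowdown.
print(f"json imported in {import_seconds('json'):.4f}s")
```

Note that a second call for the same module is near-instant because Python caches imports in `sys.modules`, so measurements should be taken in a fresh interpreter.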
Proposed Solution
We propose that Jupyter AI migrate from `langchain` to `litellm` as the model library providing universal LLM support. `litellm` has a much simpler dependency tree and is noticeably faster to import. It also provides access to a vast array of models out-of-the-box, without the need for partner packages. This entirely solves three major issues in JAI v2:
- Dealing with provider dependencies: `litellm` provides everything out-of-the-box.
- Forcing users to wait for us to regularly update the list of models for each provider: `litellm` maintains its own lists.
- Forcing developers to create a package to add a model via the entry points API: `litellm` supports custom models.
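For reference, `litellm`'s provider-agnostic interface looks roughly like this (a hedged sketch, assuming `litellm` is installed and provider credentials are set in the environment; the model names and the `ask` helper are illustrative, not part of Jupyter AI):

```python
try:
    import litellm
except ImportError:  # litellm may not be installed in every environment
    litellm = None

def ask(model: str, prompt: str) -> str:
    """Send one prompt through litellm's uniform completion() API."""
    if litellm is None:
        raise RuntimeError("litellm is not installed")
    response = litellm.completion(
        model=model,  # e.g. "gpt-4o" or "anthropic/claude-3-5-sonnet-20240620"
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Self-hosted or OpenAI-compatible endpoints can also be reached by passing
# an api_base (the URL below is illustrative):
# litellm.completion(model="openai/my-model",
#                    api_base="http://localhost:8000/v1",
#                    messages=[{"role": "user", "content": "hi"}])
```

The point is that one call signature covers many providers, so Jupyter AI would no longer need per-provider dependency checks.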
As Jupyter AI v3 will only provide Jupyternaut, the requirement on `langchain` can be dropped if this proposal is accepted. For models that are not available through our `litellm` implementation, developers can write custom personas. These are easy to write and can be defined in any `.jupyter/` directory.
Additional context
This would also fix #1308.