add AI/ML API as model provider #1589

OctavianTheI opened this issue Feb 11, 2025 · 10 comments

Comments

@OctavianTheI

Is your feature request related to a problem? Please describe.

No response

Describe the solution you'd like

add an entry like the following one to the docs:

To use Open Interpreter with a model from OpenRouter, set the `model` flag to begin with `openrouter/`:
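In the docs that sentence is followed by a CLI example, and an AI/ML API entry could take the same shape. A minimal sketch, assuming litellm's OpenAI-compatible `openai/` prefix is used; the base URL and the reuse of the OpenAI env var below are assumptions for illustration, not confirmed behavior:

```shell
# Existing OpenRouter pattern from the Open Interpreter docs:
interpreter --model openrouter/openai/gpt-3.5-turbo

# Possible AI/ML API equivalent via litellm's OpenAI-compatible route.
# The base URL and the reuse of OPENAI_API_KEY are assumptions.
export OPENAI_API_KEY="<your AI/ML API key>"
interpreter --model openai/gpt-4o --api_base "https://api.aimlapi.com/v1"
```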

We can do it ourselves, tbh: if you give us the green light, it'd be done within a couple of days.

Describe alternatives you've considered

No response

Additional context

Hi!
I'm from the Integrations team over at AI/ML API

I've pinged you guys a couple of times 'cos I think your product is dope, and we'd love to have a native integration with it.
It seems that doing so is now easier than ever, because the way you add providers is ingenious and very simple.

Say you're interested, and we'll test the compatibility, update your docs to include us, and add a tutorial on using Open Interpreter with AI/ML API to our docs too.

Best,
Sergey

@Kreijstal

The product seems dead, which is a shame... What do you like most about this project?

@OctavianTheI
Author

Shame :c
This seemed to have the early beginnings of many cool things, like agentic interactions with calendar and email.
And the style was hella dope.
I wonder why they've stopped.

@Kreijstal

> Shame :c This seemed to have the early beginnings of many cool things, like agentic interactions with calendar and email. And the style was hella dope. I wonder why they've stopped.

Calendar and e-mail? Just from the command line?

@Notnaton
Collaborator

Not stopped btw... Look at the development branch.

I'm now the main developer for open-interpreter, while Killian will focus on some other project.
I have gotten a few people helping me maintain and develop OI in the discord channel 😄

Does litellm support AI/ML API?

@Kreijstal

> Not stopped btw... Look at the development branch.
>
> I'm now the main developer for open-interpreter, while Killian will focus on some other project. I have gotten a few people helping me maintain and develop OI in the discord channel 😄
>
> Does litellm support AI/ML API?

Thank you for responding and interacting.

@OctavianTheI
Author

> Not stopped btw... Look at the development branch.
>
> I'm now the main developer for open-interpreter, while Killian will focus on some other project. I have gotten a few people helping me maintain and develop OI in the discord channel 😄
>
> Does litellm support AI/ML API?

Sup!
Yep, it does
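
For anyone wanting to double-check that, litellm's proxy CLI can be pointed at an OpenAI-compatible endpoint; a quick sketch, where the base URL and model name are illustrative assumptions rather than anything from this thread:

```shell
# Start a local litellm proxy against the (assumed) AI/ML API base URL
export OPENAI_API_KEY="<your AI/ML API key>"
litellm --model openai/gpt-4o --api_base "https://api.aimlapi.com/v1"

# In another shell, send litellm's built-in test request to the proxy
litellm --test
```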

@Kreijstal

Kreijstal commented Mar 31, 2025

> Not stopped btw... Look at the development branch.
>
> I'm now the main developer for open-interpreter, while Killian will focus on some other project. I have gotten a few people helping me maintain and develop OI in the discord channel 😄
>
> Does litellm support AI/ML API?

litellm is a library that supports pretty much all providers so you don't have to; it was written after open-interpreter... It'd be nice to use OpenRouter with Open Interpreter. That way we can finally use Gemini and DeepSeek.
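
A sketch of what that could look like, assuming the `--model` flag works the same way as in the 0.x docs; the model IDs are just examples, not taken from this thread:

```shell
# litellm reads OPENROUTER_API_KEY for openrouter/ models
export OPENROUTER_API_KEY="<your OpenRouter key>"

# Example model IDs; check OpenRouter's model list for current names
interpreter --model openrouter/google/gemini-2.0-flash-001
interpreter --model openrouter/deepseek/deepseek-chat
```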

@Notnaton
Collaborator

@Kreijstal

@Notnaton


kreijstal@kreijstalnuc:~/git/workspace/cosim$ curl https://raw.githubusercontent.com/OpenInterpreter/open-interpreter/refs/heads/development/installers/new-installer.sh | sh
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1281  100  1281    0     0   5307      0 --:--:-- --:--:-- --:--:--  5293
Installing Python 3.12...
Creating virtual environment...
Using CPython 3.12.9
Creating virtual environment at: /home/kreijstal/.openinterpreter/venv
Activate with: source /home/kreijstal/.openinterpreter/venv/bin/activate
Installing package...
Using Python 3.12.9 environment at: /home/kreijstal/.openinterpreter/venv
Resolved 72 packages in 747ms
Prepared 3 packages in 188ms
Installed 72 packages in 38ms
 + aiohappyeyeballs==2.6.1
 + aiohttp==3.11.14
 + aiosignal==1.3.2
 + annotated-types==0.7.0
 + anthropic==0.39.0
 + anyio==4.9.0
 + attrs==25.3.0
 + certifi==2025.1.31
 + charset-normalizer==3.4.1
 + click==8.1.8
 + colorama==0.4.6
 + distro==1.9.0
 + evdev==1.9.1
 + fastapi==0.115.12
 + filelock==3.18.0
 + frozenlist==1.5.0
 + fsspec==2025.3.2
 + h11==0.14.0
 + httpcore==1.0.7
 + httpx==0.27.2
 + huggingface-hub==0.30.1
 + idna==3.10
 + importlib-metadata==8.6.1
 + jinja2==3.1.6
 + jiter==0.9.0
 + jsonschema==4.23.0
 + jsonschema-specifications==2024.10.1
 + litellm==1.65.0
 + markupsafe==3.0.2
 + mouseinfo==0.1.3
 + multidict==6.2.0
 + open-interpreter==1.0.0 (from git+https://github.com/OpenInterpreter/open-interpreter.git@fa06bfda18b70546baa85b9d314873e048f70c99)
 + openai==1.70.0
 + packaging==24.2
 + pillow==11.1.0
 + prompt-toolkit==3.0.50
 + propcache==0.3.1
 + pyautogui==0.9.54
 + pydantic==2.11.1
 + pydantic-core==2.33.0
 + pygetwindow==0.0.9
 + pygments==2.19.1
 + pymsgbox==1.0.9
 + pynput==1.8.1
 + pyperclip==1.9.0
 + pyrect==0.2.0
 + pyscreeze==1.0.1
 + pyte==0.8.2
 + python-dotenv==1.1.0
 + python-xlib==0.33
 + python3-xlib==0.15
 + pytweening==1.2.0
 + pyyaml==6.0.2
 + readchar==4.2.1
 + referencing==0.36.2
 + regex==2024.11.6
 + requests==2.32.3
 + rpds-py==0.24.0
 + screeninfo==0.8.1
 + six==1.17.0
 + sniffio==1.3.1
 + starlette==0.46.1
 + tiktoken==0.9.0
 + tokenizers==0.21.1
 + tqdm==4.67.1
 + typing-extensions==4.13.0
 + typing-inspection==0.4.0
 + urllib3==2.3.0
 + uvicorn==0.32.1
 + wcwidth==0.2.13
 + yarl==1.18.3
 + zipp==3.21.0
Testing...
...Traceback (most recent call last):
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/interpreter/tools/computer.py", line 14, in <module>
    import pyautogui
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/pyautogui/__init__.py", line 246, in <module>
    import mouseinfo
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/mouseinfo/__init__.py", line 223, in <module>
    _display = Display(os.environ['DISPLAY'])
                       ~~~~~~~~~~^^^^^^^^^^^
  File "<frozen os>", line 714, in __getitem__
KeyError: 'DISPLAY'
Failed to import pyautogui. Computer tool will not work.

   Traceback (most recent call last):
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 726, in completion
    raise e
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 610, in completion
    return self.streaming(
           ^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 867, in streaming
    openai_client: OpenAI = self._get_openai_client(  # type: ignore
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 378, in _get_openai_client
    _new_client = OpenAI(
                  ^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/openai/_client.py", line 114, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/main.py", line 1743, in completion
    raise e
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/main.py", line 1716, in completion
    response = openai_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 737, in completion
    raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/kreijstal/.openinterpreter/venv/bin/interpreter", line 10, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/interpreter/cli.py", line 301, in main
    asyncio.run(async_main(args))
  File "/home/kreijstal/.local/share/uv/python/cpython-3.12.9-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.local/share/uv/python/cpython-3.12.9-linux-x86_64-gnu/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.local/share/uv/python/cpython-3.12.9-linux-x86_64-gnu/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/interpreter/cli.py", line 225, in async_main
    async for _ in global_interpreter.async_respond():
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/interpreter/interpreter.py", line 740, in async_respond
    raw_response = litellm.completion(**params)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/utils.py", line 1235, in wrapper
    raise e
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/utils.py", line 1113, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/main.py", line 3148, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2214, in exception_type
    raise e
  File "/home/kreijstal/.openinterpreter/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 360, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

Despite wanting to connect through OpenRouter, the installation only works with OPENAI_API_KEY? I haven't used OpenAI in months!
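
For what it's worth, if this is just the post-install test defaulting to an OpenAI model, pointing it at OpenRouter might get past it. A sketch, assuming the development-branch CLI still accepts a `--model` flag like 0.x does:

```shell
# litellm picks up OPENROUTER_API_KEY for openrouter/ models, so no
# OPENAI_API_KEY should be needed. Flag names on the dev branch may differ.
export OPENROUTER_API_KEY="<your OpenRouter key>"
~/.openinterpreter/venv/bin/interpreter --model openrouter/deepseek/deepseek-chat
```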

@Notnaton
Collaborator

This is just for the tests that run...

I have never used the installers before.
I recommend (see the sketch below):
- Git clone
- Cd
- Poetry install
- Poetry run open-interpreter
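
Spelled out, that is roughly the following. The repository URL matches the installer log above; the entry-point name comes from the traceback (`.../venv/bin/interpreter`), and whether the development branch still uses Poetry here is an assumption:

```shell
git clone https://github.com/OpenInterpreter/open-interpreter.git
cd open-interpreter
git checkout development
poetry install
# Notnaton wrote "open-interpreter"; the installed console script is named "interpreter"
poetry run interpreter
```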
