[Bug]: groq/whisper-large-v3 returns 400 BadRequestError with OPENAI_TRANSCRIPTION_PARAMS #11402

@afreenss

Description

What happened?

I'm using the groq/whisper-large-v3 model with litellm.transcription(), as shown in the official documentation.

This same code ran fine on Google Colab about 3–4 days ago, but it now consistently fails with a BadRequestError. No changes were made to the code or environment.

import os
from litellm import transcription

os.environ["GROQ_API_KEY"] = "your-key-here"

audio_file = open("/path/to/audio.mp3", "rb")

transcript = transcription(
    model="groq/whisper-large-v3",
    file=audio_file,
    prompt="Specify context or spelling",
    temperature=0,
    response_format="json"
)

print("response=", transcript)

Error message:

BadRequestError: Error code: 400 - {'error': {'message': 'unknown paramOPENAI_TRANSCRIPTION_PARAMS[]', 'type': 'invalid_request_error'}}
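While this is open, a possible client-side workaround is to pass only the parameters the Groq transcription endpoint is believed to accept, so nothing unexpected is forwarded to the provider. The helper below is a hypothetical sketch — the allowed-parameter set is an assumption based on Groq's docs, not taken from litellm's source:

```python
# Hypothetical workaround sketch: keep only kwargs assumed to be supported
# by groq/whisper-large-v3, so the provider is not sent anything it would
# reject with a 400 invalid_request_error.
ALLOWED_GROQ_TRANSCRIPTION_PARAMS = {
    # assumed set -- verify against Groq's transcription API docs
    "language", "prompt", "response_format", "temperature",
}

def filter_transcription_kwargs(**kwargs):
    """Return only the kwargs believed to be accepted by Groq transcription."""
    return {
        k: v for k, v in kwargs.items()
        if k in ALLOWED_GROQ_TRANSCRIPTION_PARAMS
    }

# Usage (assumed):
# transcript = transcription(
#     model="groq/whisper-large-v3",
#     file=audio_file,
#     **filter_transcription_kwargs(prompt="Specify context or spelling",
#                                   temperature=0,
#                                   response_format="json"),
# )
```

Separately, litellm exposes a `litellm.drop_params = True` setting that drops provider-unsupported parameters; it may or may not help here if the stray `OPENAI_TRANSCRIPTION_PARAMS` value is injected by litellm itself.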

Relevant log output

9 audio_file = open(audio_path, "rb")
---> 11 transcript = litellm.transcription(
     12     model="groq/whisper-large-v3",
     13     file=audio_file,
     14     prompt="Specify context or spelling",
     15     temperature=0,
     16     response_format="json"
     17 )
     19 transcribed_text = transcript.text

File d:\agenticvenv\Lib\site-packages\litellm\utils.py:1283, in client.<locals>.wrapper(*args, **kwargs)
   1279 if logging_obj:
   1280     logging_obj.failure_handler(
   1281         e, traceback_exception, start_time, end_time
   1282     )  # DO NOT MAKE THREADED - router retry fallback relies on this!
-> 1283 raise e

File d:\agenticvenv\Lib\site-packages\litellm\utils.py:1161, in client.<locals>.wrapper(*args, **kwargs)
   1159         print_verbose(f"Error while checking max token limit: {str(e)}")
   1160 # MODEL CALL
...
-> 1037         raise self._make_status_error_from_response(err.response) from None
   1039     break
   1041 assert response is not None, "could not resolve response (should never happen)"

Are you a ML Ops Team?

No

What LiteLLM version are you on?

litellm==1.72.0

Twitter / LinkedIn details

No response

    Labels

    bug (Something isn't working)
