[Bug]: `logprobs` is not compatible with the OpenAI spec #4795
Labels: bug, good first issue, help wanted
Your current environment

I'm using Runpod Serverless vLLM (https://github.com/runpod-workers/worker-vllm), so I can't run this command. However, I confirmed that the issue is present on `main`: https://github.com/vllm-project/vllm/blob/0fca3cdcf265cd375bca684d951702b6b7adf65a/vllm/entrypoints/openai/protocol.py
🐛 Describe the bug
The behavior of `logprobs=True` does not match OpenAI's. I identified two issues:

(1) vLLM throws an error when `logprobs=True` and `top_logprobs` is missing.

OpenAI works fine:
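Here is a minimal sketch of such a request using the official `openai` Python client (the model name and prompt are illustrative); OpenAI treats `top_logprobs` as optional:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# `logprobs=True` without `top_logprobs` is accepted by the OpenAI API.
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hi"}],
    logprobs=True,
)
print(resp.choices[0].logprobs)  # per-token logprobs come back fine
```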
vLLM breaks:
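The same request against a vLLM OpenAI-compatible server is rejected (the base URL and model name below are illustrative, and the exact error wording may differ):

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# Identical parameters, but vLLM raises a validation error instead of
# defaulting `top_logprobs` the way OpenAI does.
resp = client.chat.completions.create(
    model="meta-llama/Llama-2-7b-chat-hf",
    messages=[{"role": "user", "content": "Say hi"}],
    logprobs=True,
)
# -> openai.BadRequestError (HTTP 400) from the request validation
```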
(The error is raised in vllm/entrypoints/openai/protocol.py, line 162 at commit 0fca3cd.)
(2) Even with `top_logprobs=1`, the behavior doesn't match.

OpenAI:
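With `top_logprobs=1`, OpenAI returns per-token entries under `choices[0].logprobs.content`. An abridged sketch of the shape (tokens and values are illustrative):

```python
# Shape of OpenAI's chat logprobs (abridged; values illustrative):
openai_logprobs = {
    "content": [
        {
            "token": "Hello",
            "logprob": -0.31,
            "bytes": [72, 101, 108, 108, 111],
            "top_logprobs": [
                {"token": "Hello", "logprob": -0.31, "bytes": [72, 101, 108, 108, 111]}
            ],
        },
        # ...one entry per generated token
    ]
}
```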
vLLM:
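vLLM instead returns something like the legacy text-completion logprobs object. This sketch is reconstructed from the report; apart from `token_logprobs`, the exact fields and values are assumptions:

```python
# Sketch of vLLM's response shape as reported (values illustrative;
# fields other than `token_logprobs` assumed from the legacy format):
vllm_logprobs = {
    "text_offset": [0],
    "token_logprobs": [-0.31],  # not part of OpenAI's chat logprobs
    "tokens": ["Hello"],
    "top_logprobs": [{"Hello": -0.31}],
}
```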
Notice that, for example, `token_logprobs` appears in vLLM's response but not in OpenAI's. These issues break libraries that expect OpenAI-compatible responses, such as Rust's async_openai, which we are using.