Search before asking

I have searched the Inference issues and found no similar bug report.

Bug
I tried to run inference via the CLI, and it appears to require a running Docker daemon (rather than using the pip package or hitting the Hosted API). The error message is extremely long and non-obvious.
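For context, a concise pre-flight check like the sketch below is the kind of thing that would make this failure mode obvious. This is a hypothetical helper, not part of the `inference` CLI: it only probes whether anything is listening on a local server port (9001 is assumed here as the default) and returns a boolean, so a caller could print a one-line hint instead of a long traceback.

```python
import socket


def docker_daemon_reachable(host: str = "localhost", port: int = 9001,
                            timeout: float = 1.0) -> bool:
    """Return True if something is listening on the inference server port.

    Hypothetical sketch: the CLI could run a cheap check like this before
    attempting inference and emit a short "is the Docker container running?"
    message when the connection fails, rather than a multi-page stack trace.
    """
    try:
        # create_connection raises OSError (refused/timeout) if nothing listens
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if not docker_daemon_reachable():
        print("Inference server not reachable on localhost:9001 -- "
              "is the Docker container running?")
```

The host, port, and message are assumptions for illustration; the actual port and transport used by the CLI may differ.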
Environment
pytorch/pytorch:2.2.0-cuda12.1-cudnn8-devel Docker image, after running pip install inference-gpu
Minimal Reproducible Example
(Snipped some of the error output for brevity, since GitHub complained: "There was an error creating your Issue: body is too long (maximum is 65536 characters).")

Additional
No response
Are you willing to submit a PR?