Failure converting ONNX to engine file with TensorRT 8.6 when running "deepstream-app -c deepstream_app_config.txt" on NVIDIA Orin NX 64GB #3832
Labels
triaged
Issue has been triaged by maintainers
Description
When I followed the steps provided by DeepStream-Yolo to convert the YOLOv5n model from ONNX to an engine file, I received the following error message:
Running Environment
deepstream-app version 6.4.0
DeepStreamSDK 6.4.0
CUDA Driver Version: 12.2
CUDA Runtime Version: 12.2
TensorRT Version: 8.6
cuDNN Version: 8.9
libNVWarp360 Version: 2.0.1d3
NVIDIA GPU: NVIDIA Corporation Device 229e (rev a1)
NVIDIA Driver Version: 540.2.0
Operating System: Jetson (JetPack 6.0)
Running command
python3 export_yoloV5.py -w yolov5s.pt --dynamic
cd DeepStream-Yolo
CUDA_VER=12.2 make -C nvdsinfer_custom_impl_Yolo
sudo deepstream-app -c deepstream_app_config.txt
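For context, my nvinfer configuration follows the DeepStream-Yolo template, pointing at the exported ONNX and the custom parser library built above. A minimal sketch of config_infer_primary_yoloV5.txt (file names and batch size are illustrative, not exact copies of my setup):

```ini
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-color-format=0
# Exported ONNX model; the engine is (re)built from it on first run
onnx-file=yolov5s.onnx
model-engine-file=model_b1_gpu0_fp32.engine
labelfile-path=labels.txt
batch-size=1
# 0 = FP32, 1 = INT8, 2 = FP16
network-mode=0
num-detected-classes=80
gie-unique-id=1
cluster-mode=2
maintain-aspect-ratio=1
symmetric-padding=1
# Custom YOLO output parser from nvdsinfer_custom_impl_Yolo
parse-bbox-func-name=NvDsInferParseYolo
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
engine-create-func-name=NvDsInferYoloCudaEngineGet
```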
Others
I referred to issue #2535 and tried generating the ONNX model with "python3 export_yoloV5.py -w yolov5s.pt --dynamic --simplify" and with "python3 export_yoloV5.py -w yolov5s.pt", but the problem described above still persists.
Please help me solve this issue. Thank you so much!