
How to use bpnet on Jetson? #67

Open
AK51 opened this issue Aug 3, 2022 · 1 comment
Comments

AK51 commented Aug 3, 2022

Hi,

I originally asked this question on the NVIDIA forum: https://forums.developer.nvidia.com/t/how-to-run-bpnet-in-tao-toolkit/222628/6
I was then directed to this GitHub repo.
When I tried to install it on my Jetson Orin, I had to change some code in order to run make:

nvidia@ubuntu:~/deepstream_tao_apps/apps/tao_others$ export CUDA_VER=11.4
nvidia@ubuntu:~/deepstream_tao_apps/apps/tao_others$ make
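
For reference, CUDA_VER has to match the CUDA toolkit that JetPack installed. A quick way to confirm the version before exporting it (a sketch only, assuming the default /usr/local/cuda install path on the Orin):

nvidia@ubuntu:~$ /usr/local/cuda/bin/nvcc --version    # on JetPack 5.x / DeepStream 6.1 this should report release 11.4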

And when I try to run an example to verify the installation, there is an error.
Sorry, I am new to the TAO Toolkit; did I miss any installation or setup step?
Note: My AGX Orin 64 GB is new and was flashed with DeepStream 6.1 using SDK Manager.
Thanks

nvidia@ubuntu:~/deepstream_tao_apps/apps/tao_others/deepstream-bodypose2d-app$ ./deepstream-bodypose2d-app 1 ../../../configs/bodypose2d_tao/sample_bodypose2d_model_config.txt 0 0 file:///usr/data/bodypose2d_test.png ./body2dout
(gst-plugin-scanner:15202): GStreamer-WARNING **: 16:29:49.954: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so': libtritonserver.so: cannot open shared object file: No such file or directory

(gst-plugin-scanner:15202): GStreamer-WARNING **: 16:29:49.969: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory
Request sink_0 pad from streammux
joint Edges 1 , 8
joint Edges 8 , 9
joint Edges 9 , 10
joint Edges 1 , 11
joint Edges 11 , 12
joint Edges 12 , 13
joint Edges 1 , 2
joint Edges 2 , 3
joint Edges 3 , 4
joint Edges 2 , 16
joint Edges 1 , 5
joint Edges 5 , 6
joint Edges 6 , 7
joint Edges 5 , 17
joint Edges 1 , 0
joint Edges 0 , 14
joint Edges 0 , 15
joint Edges 14 , 16
joint Edges 15 , 17
connections 0 , 1
connections 1 , 2
connections 1 , 5
connections 2 , 3
connections 3 , 4
connections 5 , 6
connections 6 , 7
connections 2 , 8
connections 8 , 9
connections 9 , 10
connections 5 , 11
connections 11 , 12
connections 12 , 13
connections 0 , 14
connections 14 , 16
connections 8 , 11
connections 15 , 17
connections 0 , 15
Now playing: file:///usr/data/bodypose2d_test.png
WARNING: Deserialize engine failed because file path: /home/nvidia/deepstream_tao_apps/configs/bodypose2d_tao/../../models/bodypose2d/model.etlt_b32_gpu0_fp16.engine open error
0:00:05.148199758 15201 0xaaaae3449e00 WARN                 nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger:<primary-infer-engine1> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1888> [UID = 1]: deserialize engine from file :/home/nvidia/deepstream_tao_apps/configs/bodypose2d_tao/../../models/bodypose2d/model.etlt_b32_gpu0_fp16.engine failed
0:00:05.304385506 15201 0xaaaae3449e00 WARN                 nvinfer gstnvinfer.cpp:643:gst_nvinfer_logger:<primary-infer-engine1> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1993> [UID = 1]: deserialize backend context from engine from file :/home/nvidia/deepstream_tao_apps/configs/bodypose2d_tao/../../models/bodypose2d/model.etlt_b32_gpu0_fp16.engine failed, try rebuild
0:00:05.304678340 15201 0xaaaae3449e00 INFO                 nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary-infer-engine1> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
WARNING: [TRT]: onnx2trt_utils.cpp:363: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
WARNING: [TRT]: DLA requests all profiles have same min, max, and opt value. All dla layers are falling back to GPU

0:11:05.334551750 15201 0xaaaae3449e00 INFO                 nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary-infer-engine1> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1946> [UID = 1]: serialize cuda engine to file: /home/nvidia/deepstream_tao_apps/models/bodypose2d/model.etlt_b32_gpu0_fp16.engine successfully
INFO: [FullDims Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1:0       288x384x3       min: 1x288x384x3     opt: 32x288x384x3    Max: 32x288x384x3    
1   OUTPUT kFLOAT heatmap_out/BiasAdd:0 36x48x19        min: 0               opt: 0               Max: 0               
2   OUTPUT kFLOAT conv2d_transpose_1/BiasAdd:0 144x192x38      min: 0               opt: 0               Max: 0               

0:11:06.277470953 15201 0xaaaae3449e00 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary-infer-engine1> [UID 1]: Load new model:../../../configs/bodypose2d_tao/bodypose2d_pgie_config.txt sucessfully
Decodebin child added: source
Decodebin child added: decodebin0
Running...
ERROR from element source: Resource not found.
Error details: gstfilesrc.c(532): gst_file_src_start (): /GstPipeline:pipeline/GstBin:source-bin-00/GstURIDecodeBin:uri-decode-bin/GstFileSrc:source:
No such file "/usr/data/bodypose2d_test.png"
Returned, stopping playback
Average fps 0.000233
Totally 0 persons are inferred
Deleting pipeline
nvidia@ubuntu:~/deepstream_tao_apps/apps/tao_others/deepstream-bodypose2d-app$ 

PaRowsome commented Aug 11, 2022

I saw a similar issue. When I removed the file:// prefix from the file path, it worked, i.e. change
file:///usr/data/bodypose2d_test.png to /usr/data/bodypose2d_test.png
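
For example, the original command with the plain path in place of the file:// URI (a sketch only, assuming the test image actually exists at /usr/data/bodypose2d_test.png; the "No such file" error above suggests checking that first):

nvidia@ubuntu:~/deepstream_tao_apps/apps/tao_others/deepstream-bodypose2d-app$ ./deepstream-bodypose2d-app 1 ../../../configs/bodypose2d_tao/sample_bodypose2d_model_config.txt 0 0 /usr/data/bodypose2d_test.png ./body2dout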
