
imageNet performance on Jetson Xavier AGX #1816

Open
roundPot opened this issue Mar 25, 2024 · 0 comments

Comments

@roundPot

Hey dusty,

how do you reach such good performance with the imageNet example?
What happens under the hood when you call predictions = net.Classify(img, topK=args.topK) in the following example: https://github.com/dusty-nv/jetson-inference/blob/master/python/examples/imagenet.py? I have trouble understanding how you load the ONNX model and use it as a TensorRT model.

I tried using the same model with Triton Inference Server, but the performance is much worse (0.06 s inference time for your solution vs. 0.3 s for the Triton server solution).

As a novice, I find it quite hard to find the right information.
Any advice?
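For context, the call in that example follows the jetson-inference Python API pattern below. This is a minimal sketch, assuming the library is installed on a Jetson; the model, labels, and image paths are placeholders, and the input/output blob names depend on how the ONNX model was exported:

```python
# Minimal sketch of classifying an image with jetson-inference (Python API).
# Assumes a Jetson device with jetson-inference installed; all file paths
# below are placeholders.
import jetson_inference
import jetson_utils

# When given an ONNX file, jetson-inference builds a TensorRT engine from it
# on first load and caches the serialized engine alongside the model file,
# so later runs skip the (slow) engine build and run optimized inference.
net = jetson_inference.imageNet(model="model.onnx",
                                labels="labels.txt",
                                input_blob="input_0",
                                output_blob="output_0")

img = jetson_utils.loadImage("example.jpg")

# Without topK, Classify() returns the single best (class_id, confidence).
class_id, confidence = net.Classify(img)
print(net.GetClassDesc(class_id), confidence)
```

The engine caching is also a likely source of the timing gap in the comparison: measuring the first call includes engine construction, while subsequent calls reflect pure TensorRT inference.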
