[Bug] Could not load the library of tensorrt plugins. & Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
#2747
xiao-song2022 opened this issue on Apr 23, 2024 · 2 comments
Checklist
1. I have searched related issues but cannot get the expected help.
2. I have read the FAQ documentation but cannot get the expected help.
3. The bug has not been fixed in the latest version.
Describe the bug
04/23 11:49:15 - mmengine - WARNING - Could not load the library of tensorrt plugins. Because the file does not exist:
[04/23/2024-11:49:17] [TRT] [E] 1: [pluginV2Runner.cpp::load::290] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[04/23/2024-11:49:17] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
I can use the deploy tool to convert the model to ONNX and then to a TensorRT engine without errors, but when I load the TensorRT engine I get the error messages above, and I don't know why.
Reproduction
import mmengine
from mmdeploy.backend.tensorrt import TRTWrapper
import tensorrt as trt
import torch
engine_file = './work_dirs/rtmdet/end2end.engine'
model = TRTWrapper(engine_file)
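The two TensorRT errors indicate that the engine was serialized with custom plugin layers, but the process deserializing it has no matching IPluginCreator in the global plugin registry. The earlier mmengine warning ("Could not load the library of tensorrt plugins. Because the file does not exist") suggests mmdeploy could not find its compiled plugin library, so those creators were never registered. A minimal sketch of the registration step, assuming the plugin library was built and its path is known (the filename `libmmdeploy_tensorrt_ops.so` and the loading approach are assumptions for illustration, not taken from this issue):

```python
import ctypes
import os


def load_trt_plugins(lib_path):
    """Load a TensorRT plugin shared library with ctypes.

    Opening the .so runs its static initializers, which is what
    registers each custom IPluginCreator in the global plugin
    registry -- this must happen *before* deserializing an engine
    that uses those plugins.
    """
    if not os.path.exists(lib_path):
        # Mirrors the mmengine warning: the plugin library is missing,
        # so deserialization of a plugin-using engine will fail.
        return False
    ctypes.CDLL(lib_path)
    return True


# Hypothetical path -- point this at your actual mmdeploy build output.
load_trt_plugins('/path/to/mmdeploy/lib/libmmdeploy_tensorrt_ops.so')
```

If the library is genuinely missing, the usual cause is that mmdeploy was installed without building its TensorRT custom ops; rebuilding mmdeploy with TensorRT support enabled produces the plugin library that the warning is looking for.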
Environment
CUDA 11.3, TensorRT 8.2.3.0, RTX 3090
# in .bashrc
export PATH=/home/home_node7/pxs/Tensorrt/TensorRT-8.2.3.0:$PATH
export LD_LIBRARY_PATH=/home/home_node7/pxs/Tensorrt/TensorRT-8.2.3.0/lib:$LD_LIBRARY_PATH
Error traceback
No response