
Convert a PyTorch pretrained model -> TensorRT engine directly for CRAFT (Character Region Awareness for Text Detection)

Overview

Implementation of an inference pipeline using TensorRT for the CRAFT text detector, with dynamic-shape (multi-size input) support. Two modules are included:

  • Convert a pretrained PyTorch model -> ONNX -> TensorRT
  • Inference using TensorRT

Note: this repo covers the conversion steps needed to obtain a TensorRT engine, plus inference on that engine. For more repos related to TensorRT inference, check out:

Author

k9ele7en. Give it a star if you find some value in this repo. Thank you.

License

MIT License: a short, permissive software license. Basically, you can do whatever you want as long as you include the original copyright and license notice in any copy of the software or source.

Updates

7 Aug 2021: Initialized the repo; the converters run successfully. Inference via ONNX works. Inference via the TensorRT engine currently returns wrong output.

Getting started

1. Install dependencies

Requirements

$ pip install -r requirements.txt

Install ONNX and TensorRT

See ./README_Env.md for details.
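
A quick way to verify the environment before converting is to import each package and print its version. A minimal sketch, assuming the usual package names; the exact set is whatever requirements.txt and ./README_Env.md install:

```python
# Sanity check that the conversion/inference stack imports cleanly.
import torch
import onnx
import onnxruntime
import tensorrt as trt

print("PyTorch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("ONNX:", onnx.__version__)
print("ONNX Runtime:", onnxruntime.__version__)
print("TensorRT:", trt.__version__)
```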

2. Download the trained models

Model name  | Used datasets         | Languages | Purpose                     | Model Link
General     | SynthText, IC13, IC17 | Eng + MLT | For general purpose         | Click
IC15        | SynthText, IC15       | Eng       | For IC15 only               | Click
LinkRefiner | CTW1500               | -         | Used with the General Model | Click

3. Convert PyTorch -> TensorRT

Use a single .sh script to run the converters; the engine is ready for inference once the script completes successfully:

$ sh prepare.sh

Or run the individual converters separately (a sketch of what these two steps do follows the commands):

$ cd converters
$ python pth2onnx.py
$ python onnx2trt.py
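
The two converters roughly amount to a torch.onnx.export call with dynamic batch/height/width axes, followed by building a TensorRT engine whose optimization profile covers the expected input-size range. A minimal sketch of that flow, assuming the TensorRT 8 API and placeholder file names and shape ranges; converters/pth2onnx.py and converters/onnx2trt.py define the repo's actual settings:

```python
# Rough sketch of what the two converters do. Checkpoint/file names, shape
# ranges, and the CRAFT import path are assumptions.
import torch
import tensorrt as trt
from craft import CRAFT  # model definition from the original craft-pytorch repo (assumed import)

# --- PyTorch checkpoint -> ONNX, with dynamic batch/height/width axes ---
model = CRAFT()
state = torch.load("weights/craft_mlt_25k.pth", map_location="cpu")
model.load_state_dict({k.replace("module.", ""): v for k, v in state.items()})
model.eval()

dummy = torch.randn(1, 3, 768, 768)  # any valid size; dynamic_axes below relax H/W
torch.onnx.export(
    model, dummy, "craft.onnx",
    input_names=["input"], output_names=["output", "feature"],
    dynamic_axes={"input": {0: "batch", 2: "height", 3: "width"}},
    opset_version=11,
)

# --- ONNX -> TensorRT engine with an optimization profile (TensorRT 8 API) ---
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)
with open("craft.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
profile = builder.create_optimization_profile()
# min / opt / max input shapes define the size range the engine will accept
profile.set_shape("input", (1, 3, 256, 256), (1, 3, 768, 768), (1, 3, 1280, 1280))
config.add_optimization_profile(profile)

with open("craft.trt", "wb") as f:
    f.write(builder.build_serialized_network(network, config))
```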

4. Run inference on the TensorRT engine

$ python infer_trt.py
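
With dynamic shapes, inference on the engine means setting the actual input shape on the execution context before allocating buffers and executing. A minimal sketch using the binding-based TensorRT/pycuda API; the engine path, binding order, and preprocessing are assumptions, and infer_trt.py contains the repo's full pipeline including CRAFT post-processing:

```python
# Minimal dynamic-shape inference sketch against the serialized engine.
import numpy as np
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
with open("craft.trt", "rb") as f:
    engine = trt.Runtime(logger).deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

image = np.random.rand(1, 3, 960, 640).astype(np.float32)  # stand-in for a preprocessed image
context.set_binding_shape(0, image.shape)  # tell TensorRT the actual size for this call

# Allocate buffers now that all binding shapes are resolved
bindings, device_bufs, host_outputs = [], [], []
for i in range(engine.num_bindings):
    shape = tuple(context.get_binding_shape(i))
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.empty(shape, dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    bindings.append(int(dev))
    device_bufs.append(dev)
    if not engine.binding_is_input(i):
        host_outputs.append((i, host))

cuda.memcpy_htod(device_bufs[0], np.ascontiguousarray(image))
context.execute_v2(bindings)
for i, host in host_outputs:
    cuda.memcpy_dtoh(host, device_bufs[i])

score_maps = host_outputs[0][1]  # first output: region/affinity score maps
print(score_maps.shape)
```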

5. Run inference on the ONNX model

$ python infer_onnx.py
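
The ONNX path is the simpler reference point, since ONNX Runtime handles the dynamic axes directly. A minimal sketch, assuming the model path and provider list shown; see infer_onnx.py for the actual pre/post-processing:

```python
# Minimal ONNX Runtime counterpart; dynamic axes let any input H/W through.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "craft.onnx", providers=["CUDAExecutionProvider", "CPUExecutionProvider"]
)
image = np.random.rand(1, 3, 960, 640).astype(np.float32)  # stand-in for a preprocessed image
outputs = session.run(None, {session.get_inputs()[0].name: image})
print([o.shape for o in outputs])  # score maps + backbone feature
```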
