
Fixing ONNX-to-NCNN model export problems + complete int8 quantization steps #53

Closed
ChaucerG opened this issue Oct 20, 2021 · 24 comments · Fixed by #9

@ChaucerG
Collaborator

Could you please share the code you use to export the onnx file? Using the files that come with the model folder, the ncnn inference results are wrong.

@ppogg
Owner

ppogg commented Oct 20, 2021

Which model are you using?

@ppogg
Owner

ppogg commented Oct 20, 2021

Could you please share the code you use to export the onnx file? Using the files that come with the model folder, the ncnn inference results are wrong.

Did you maybe get the anchor values wrong? Your reshape values also need adjusting.

@ChaucerG
Collaborator Author

I'm using the v5lite-s.pt weights provided by the repo, exported the onnx file with export, and converted it to .bin and .param with the compiled onnx2ncnn tool, but the results are wrong.

@ChaucerG
Collaborator Author

The anchor values were filled in from the yaml file provided by the repo. The executable works and produces predictions, but the results are wrong.

@ppogg
Owner

ppogg commented Oct 20, 2021

The ncnn version I'm using is 2021.05.25. Also, post the param file you modified so I can take a look.

@ChaucerG
Collaborator Author

7767517
216 242
Input images 0 1 images
Convolution Conv_0 1 1 images 806 0=32 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=864
ReLU Relu_1 1 1 806 389
Pooling MaxPool_2 1 1 389 390 0=0 1=3 11=3 2=2 12=2 3=1 13=1 14=1 15=1 5=1
Split splitncnn_0 1 2 390 390_splitncnn_0 390_splitncnn_1
ConvolutionDepthWise Conv_3 1 1 390_splitncnn_1 809 0=32 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=288 7=32
Convolution Conv_4 1 1 809 812 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=1920
ReLU Relu_5 1 1 812 395
Convolution Conv_6 1 1 390_splitncnn_0 815 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=1920
ReLU Relu_7 1 1 815 398
ConvolutionDepthWise Conv_8 1 1 398 818 0=60 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=540 7=60
Convolution Conv_9 1 1 818 821 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_10 1 1 821 403
Concat Concat_11 2 1 395 403 404 0=0
ShuffleChannel Reshape_16 1 1 404 409 0=2 1=0
Slice Split_17 1 2 409 410 411 -23300=2,60,-233 1=0
Convolution Conv_18 1 1 411 824 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_19 1 1 824 414
ConvolutionDepthWise Conv_20 1 1 414 827 0=60 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=540 7=60
Convolution Conv_21 1 1 827 830 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_22 1 1 830 419
Concat Concat_23 2 1 410 419 420 0=0
ShuffleChannel Reshape_28 1 1 420 425 0=2 1=0
Slice Split_29 1 2 425 426 427 -23300=2,60,-233 1=0
Convolution Conv_30 1 1 427 833 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_31 1 1 833 430
ConvolutionDepthWise Conv_32 1 1 430 836 0=60 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=540 7=60
Convolution Conv_33 1 1 836 839 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_34 1 1 839 435
Concat Concat_35 2 1 426 435 436 0=0
ShuffleChannel Reshape_40 1 1 436 441 0=2 1=0
Slice Split_41 1 2 441 442 443 -23300=2,60,-233 1=0
Convolution Conv_42 1 1 443 842 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_43 1 1 842 446
ConvolutionDepthWise Conv_44 1 1 446 845 0=60 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=540 7=60
Convolution Conv_45 1 1 845 848 0=60 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=3600
ReLU Relu_46 1 1 848 451
Concat Concat_47 2 1 442 451 452 0=0
ShuffleChannel Reshape_52 1 1 452 457 0=2 1=0
Split splitncnn_1 1 3 457 457_splitncnn_0 457_splitncnn_1 457_splitncnn_2
ConvolutionDepthWise Conv_53 1 1 457_splitncnn_2 851 0=120 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=1080 7=120
Convolution Conv_54 1 1 851 854 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13920
ReLU Relu_55 1 1 854 462
Convolution Conv_56 1 1 457_splitncnn_1 857 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13920
ReLU Relu_57 1 1 857 465
ConvolutionDepthWise Conv_58 1 1 465 860 0=116 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_59 1 1 860 863 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_60 1 1 863 470
Concat Concat_61 2 1 462 470 471 0=0
ShuffleChannel Reshape_66 1 1 471 476 0=2 1=0
Slice Split_67 1 2 476 477 478 -23300=2,116,-233 1=0
Convolution Conv_68 1 1 478 866 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_69 1 1 866 481
ConvolutionDepthWise Conv_70 1 1 481 869 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_71 1 1 869 872 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_72 1 1 872 486
Concat Concat_73 2 1 477 486 487 0=0
ShuffleChannel Reshape_78 1 1 487 492 0=2 1=0
Slice Split_79 1 2 492 493 494 -23300=2,116,-233 1=0
Convolution Conv_80 1 1 494 875 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_81 1 1 875 497
ConvolutionDepthWise Conv_82 1 1 497 878 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_83 1 1 878 881 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_84 1 1 881 502
Concat Concat_85 2 1 493 502 503 0=0
ShuffleChannel Reshape_90 1 1 503 508 0=2 1=0
Slice Split_91 1 2 508 509 510 -23300=2,116,-233 1=0
Convolution Conv_92 1 1 510 884 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_93 1 1 884 513
ConvolutionDepthWise Conv_94 1 1 513 887 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_95 1 1 887 890 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_96 1 1 890 518
Concat Concat_97 2 1 509 518 519 0=0
ShuffleChannel Reshape_102 1 1 519 524 0=2 1=0
Slice Split_103 1 2 524 525 526 -23300=2,116,-233 1=0
Convolution Conv_104 1 1 526 893 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_105 1 1 893 529
ConvolutionDepthWise Conv_106 1 1 529 896 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_107 1 1 896 899 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_108 1 1 899 534
Concat Concat_109 2 1 525 534 535 0=0
ShuffleChannel Reshape_114 1 1 535 540 0=2 1=0
Slice Split_115 1 2 540 541 542 -23300=2,116,-233 1=0
Convolution Conv_116 1 1 542 902 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_117 1 1 902 545
ConvolutionDepthWise Conv_118 1 1 545 905 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_119 1 1 905 908 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_120 1 1 908 550
Concat Concat_121 2 1 541 550 551 0=0
ShuffleChannel Reshape_126 1 1 551 556 0=2 1=0
Slice Split_127 1 2 556 557 558 -23300=2,116,-233 1=0
Convolution Conv_128 1 1 558 911 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_129 1 1 911 561
ConvolutionDepthWise Conv_130 1 1 561 914 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_131 1 1 914 917 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_132 1 1 917 566
Concat Concat_133 2 1 557 566 567 0=0
ShuffleChannel Reshape_138 1 1 567 572 0=2 1=0
Slice Split_139 1 2 572 573 574 -23300=2,116,-233 1=0
Convolution Conv_140 1 1 574 920 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_141 1 1 920 577
ConvolutionDepthWise Conv_142 1 1 577 923 0=116 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=1044 7=116
Convolution Conv_143 1 1 923 926 0=116 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=13456
ReLU Relu_144 1 1 926 582
Concat Concat_145 2 1 573 582 583 0=0
ShuffleChannel Reshape_150 1 1 583 588 0=2 1=0
Split splitncnn_2 1 3 588 588_splitncnn_0 588_splitncnn_1 588_splitncnn_2
ConvolutionDepthWise Conv_151 1 1 588_splitncnn_2 929 0=232 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=2088 7=232
Convolution Conv_152 1 1 929 932 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_153 1 1 932 593
Convolution Conv_154 1 1 588_splitncnn_1 935 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_155 1 1 935 596
ConvolutionDepthWise Conv_156 1 1 596 938 0=232 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=2088 7=232
Convolution Conv_157 1 1 938 941 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_158 1 1 941 601
Concat Concat_159 2 1 593 601 602 0=0
ShuffleChannel Reshape_164 1 1 602 607 0=2 1=0
Slice Split_165 1 2 607 608 609 -23300=2,232,-233 1=0
Convolution Conv_166 1 1 609 944 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_167 1 1 944 612
ConvolutionDepthWise Conv_168 1 1 612 947 0=232 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=2088 7=232
Convolution Conv_169 1 1 947 950 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_170 1 1 950 617
Concat Concat_171 2 1 608 617 618 0=0
ShuffleChannel Reshape_176 1 1 618 623 0=2 1=0
Slice Split_177 1 2 623 624 625 -23300=2,232,-233 1=0
Convolution Conv_178 1 1 625 953 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_179 1 1 953 628
ConvolutionDepthWise Conv_180 1 1 628 956 0=232 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=2088 7=232
Convolution Conv_181 1 1 956 959 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_182 1 1 959 633
Concat Concat_183 2 1 624 633 634 0=0
ShuffleChannel Reshape_188 1 1 634 639 0=2 1=0
Slice Split_189 1 2 639 640 641 -23300=2,232,-233 1=0
Convolution Conv_190 1 1 641 962 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_191 1 1 962 644
ConvolutionDepthWise Conv_192 1 1 644 965 0=232 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=2088 7=232
Convolution Conv_193 1 1 965 968 0=232 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=53824
ReLU Relu_194 1 1 968 649
Concat Concat_195 2 1 640 649 650 0=0
ShuffleChannel Reshape_200 1 1 650 655 0=2 1=0
Convolution Conv_201 1 1 655 656 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=59392
Swish Mul_203 1 1 656 658
Split splitncnn_3 1 2 658 658_splitncnn_0 658_splitncnn_1
Interp Resize_205 1 1 658_splitncnn_1 663 0=1 1=2.000000e+00 2=2.000000e+00 3=0 4=0 6=0
Concat Concat_206 2 1 663 588_splitncnn_0 664 0=0
Split splitncnn_4 1 2 664 664_splitncnn_0 664_splitncnn_1
Convolution Conv_207 1 1 664_splitncnn_1 665 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=23040
Swish Mul_209 1 1 665 667
Convolution Conv_210 1 1 667 668 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=4096
Swish Mul_212 1 1 668 670
Convolution Conv_213 1 1 670 671 0=64 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=36864
Swish Mul_215 1 1 671 673
Convolution Conv_216 1 1 664_splitncnn_0 674 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=23040
Swish Mul_218 1 1 674 676
Concat Concat_219 2 1 673 676 677 0=0
Convolution Conv_220 1 1 677 678 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=16384
Swish Mul_222 1 1 678 680
Convolution Conv_223 1 1 680 681 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=8192
Swish Mul_225 1 1 681 683
Split splitncnn_5 1 2 683 683_splitncnn_0 683_splitncnn_1
Interp Resize_227 1 1 683_splitncnn_1 688 0=1 1=2.000000e+00 2=2.000000e+00 3=0 4=0 6=0
Concat Concat_228 2 1 688 457_splitncnn_0 689 0=0
Split splitncnn_6 1 2 689 689_splitncnn_0 689_splitncnn_1
Convolution Conv_229 1 1 689_splitncnn_1 690 0=32 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=5888
Swish Mul_231 1 1 690 692
Convolution Conv_232 1 1 692 693 0=32 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=1024
Swish Mul_234 1 1 693 695
Convolution Conv_235 1 1 695 696 0=32 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=9216
Swish Mul_237 1 1 696 698
Convolution Conv_238 1 1 689_splitncnn_0 699 0=32 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=5888
Swish Mul_240 1 1 699 701
Concat Concat_241 2 1 698 701 702 0=0
Convolution Conv_242 1 1 702 703 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=4096
Swish Mul_244 1 1 703 705
Split splitncnn_7 1 2 705 705_splitncnn_0 705_splitncnn_1
Convolution Conv_245 1 1 705_splitncnn_1 706 0=64 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=36864
Swish Mul_247 1 1 706 708
Concat Concat_248 2 1 708 683_splitncnn_0 709 0=0
Split splitncnn_8 1 2 709 709_splitncnn_0 709_splitncnn_1
Convolution Conv_249 1 1 709_splitncnn_1 710 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=8192
Swish Mul_251 1 1 710 712
Convolution Conv_252 1 1 712 713 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=4096
Swish Mul_254 1 1 713 715
Convolution Conv_255 1 1 715 716 0=64 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=36864
Swish Mul_257 1 1 716 718
Convolution Conv_258 1 1 709_splitncnn_0 719 0=64 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=8192
Swish Mul_260 1 1 719 721
Concat Concat_261 2 1 718 721 722 0=0
Convolution Conv_262 1 1 722 723 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=16384
Swish Mul_264 1 1 723 725
Split splitncnn_9 1 2 725 725_splitncnn_0 725_splitncnn_1
Convolution Conv_265 1 1 725_splitncnn_1 726 0=128 1=3 11=3 2=1 12=1 3=2 13=2 4=1 14=1 15=1 16=1 5=1 6=147456
Swish Mul_267 1 1 726 728
Concat Concat_268 2 1 728 658_splitncnn_0 729 0=0
Split splitncnn_10 1 2 729 729_splitncnn_0 729_splitncnn_1
Convolution Conv_269 1 1 729_splitncnn_1 730 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=32768
Swish Mul_271 1 1 730 732
Convolution Conv_272 1 1 732 733 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=16384
Swish Mul_274 1 1 733 735
Convolution Conv_275 1 1 735 736 0=128 1=3 11=3 2=1 12=1 3=1 13=1 4=1 14=1 15=1 16=1 5=1 6=147456
Swish Mul_277 1 1 736 738
Convolution Conv_278 1 1 729_splitncnn_0 739 0=128 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=32768
Swish Mul_280 1 1 739 741
Concat Concat_281 2 1 738 741 742 0=0
Convolution Conv_282 1 1 742 743 0=256 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=65536
Swish Mul_284 1 1 743 745
Convolution Conv_285 1 1 705_splitncnn_0 746 0=255 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=16320
Reshape Reshape_299 1 1 746 764 0=6400 1=85 2=3
Permute Transpose_300 1 1 764 output 0=1
Convolution Conv_301 1 1 725_splitncnn_0 766 0=255 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=32640
Reshape Reshape_315 1 1 766 784 0=1600 1=85 2=3
Permute Transpose_316 1 1 784 785 0=1
Convolution Conv_317 1 1 745 786 0=255 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=65280
Reshape Reshape_331 1 1 786 804 0=400 1=85 2=3
Permute Transpose_332 1 1 804 805 0=1

@ppogg
Owner

ppogg commented Oct 21, 2021

Is this the fp32 model??
You also need to handle scaling: change the last few lines to this:

Reshape Reshape_299 1 1 746 764 0=-1 1=85 2=3
Permute Transpose_300 1 1 764 output 0=1
Convolution Conv_301 1 1 725_splitncnn_0 766 0=255 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=32640
Reshape Reshape_315 1 1 766 784 0=-1 1=85 2=3
Permute Transpose_316 1 1 784 785 0=1
Convolution Conv_317 1 1 745 786 0=255 1=1 11=1 2=1 12=1 3=1 13=1 4=0 14=0 15=0 16=0 5=1 6=65280
Reshape Reshape_331 1 1 786 804 0=-1 1=85 2=3
Permute Transpose_332 1 1 804 805 0=1

@ChaucerG
Collaborator Author

Thanks for your reply. Following your hint I made the changes and it now produces output, but the results still don't match. Exporting onnx from v5lite-s.pt and converting gives a .bin of about 6.5 MB, while the one you provide is 3.3 MB.
This is the result with my .bin weights:
[screenshot]

This is the result with your .bin weights (all other parameters identical):
[screenshot]

@ppogg
Owner

ppogg commented Oct 21, 2021

Mine is 3.3 MB because I used ncnnoptimize to convert the model from fp32 to fp16. Try the fp16 model, and check whether this line in your fp16 param file matches the one I released; if it doesn't, something probably went wrong:
[screenshot]
Also, is this a model you trained yourself?

@ppogg
Owner

ppogg commented Oct 21, 2021

If you really need to use the fp32 model, change the stride 16 and stride 32 blocks in v5lite-s.cpp:

 // stride 16
    {
        ncnn::Mat out;
        ex.extract("785", out);

        ncnn::Mat anchors(6);
        anchors[0] = 30.f;
        anchors[1] = 61.f;
        anchors[2] = 62.f;
        anchors[3] = 45.f;
        anchors[4] = 59.f;
        anchors[5] = 119.f;

        std::vector<Object> objects16;
        generate_proposals(anchors, 16, in_pad, out, prob_threshold, objects16);

        proposals.insert(proposals.end(), objects16.begin(), objects16.end());
    }
    // stride 32
    {
        ncnn::Mat out;
        ex.extract("805", out);

        ncnn::Mat anchors(6);
        anchors[0] = 116.f;
        anchors[1] = 90.f;
        anchors[2] = 156.f;
        anchors[3] = 198.f;
        anchors[4] = 373.f;
        anchors[5] = 326.f;

        std::vector<Object> objects32;
        generate_proposals(anchors, 32, in_pad, out, prob_threshold, objects32);

        proposals.insert(proposals.end(), objects32.begin(), objects32.end());
    }

That is, update the output node names in the two ex.extract("xxx", out); calls.
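
For completeness, the stride 8 branch follows the same pattern. This is only a sketch, assuming the fp32 param above (where the first head's Permute output is named "output") and the standard YOLOv5 P3 anchors; verify both against your own param and yaml:

 // stride 8
    {
        ncnn::Mat out;
        ex.extract("output", out);   // "output" taken from the fp32 param above

        // assumed: standard YOLOv5 P3 anchors (10,13, 16,30, 33,23)
        ncnn::Mat anchors(6);
        anchors[0] = 10.f;
        anchors[1] = 13.f;
        anchors[2] = 16.f;
        anchors[3] = 30.f;
        anchors[4] = 33.f;
        anchors[5] = 23.f;

        std::vector<Object> objects8;
        generate_proposals(anchors, 8, in_pad, out, prob_threshold, objects8);

        proposals.insert(proposals.end(), objects8.begin(), objects8.end());
    }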

@ChaucerG
Collaborator Author

I've already modified the 3 places you mentioned, and I've also tried the fp16 model; the result is still the same as after those changes. Also, the two lines you mentioned are indeed different from yours on my side.

7767517
156 182
Input                    images                   0 1 images
Convolution              Conv_0                   1 1 images 389 0=24 1=3 3=2 4=1 5=1 6=648 9=1

Could you walk me through how you generated yours?
To add: the weights I'm using are from the repo link https://drive.google.com/file/d/1zkKrJsfUx9WVKuFpPQgDemzf2geAHEx-/view?usp=sharing, and the results under PyTorch are correct.

@ppogg
Owner

ppogg commented Oct 21, 2021

Yes, that's the right weight. Which ncnn version are you on?

@ChaucerG
Collaborator Author

2021.05.25, compiled exactly the way you described.

@ChaucerG
Collaborator Author

Does model/export.py in the YOLOv5-Lite repo need to be modified?

@ppogg
Owner

ppogg commented Oct 21, 2021

Does model/export.py in the YOLOv5-Lite repo need to be modified?

No. common.py, yolo.py and export.py are all already set up. The problem is that with the same ncnn version and the same model, the param structure changed after extraction.

@ppogg
Owner

ppogg commented Oct 21, 2021

Does model/export.py in the YOLOv5-Lite repo need to be modified?

Did you run it through onnxsim once?

@ChaucerG
Collaborator Author

Yes, I did.

@ppogg
Owner

ppogg commented Oct 21, 2021

1138099162, add me on QQ and I'll take a look.

@ChaucerG
Collaborator Author

OK, please accept the request.

@ChaucerG
Collaborator Author

ChaucerG commented Oct 21, 2021

Thanks a lot for the help, the problem has been solved. A note for everyone: when converting from onnx to ncnn param and bin files, pay close attention to the version; the YOLOv5-Lite repo corresponds to NCNN-20210525. Also, after conversion remember to change the outputs of the 3 heads to -1. The commands are listed below.
The modified part of xxxx.param is as follows; mainly, for the Reshape nodes that carry the 85 (80 classes + 5) dimension, change the corresponding first value to -1:

Reshape                      Reshape_468               1 1 632 650 0=-1 1=85 2=3
Permute                      Transpose_469            1 1 650 output 0=1
Convolution              Conv_470                       1 1 611_splitncnn_0 652 0=255 1=1 5=1 6=32640
Reshape                      Reshape_484               1 1 652 670 0=-1 1=85 2=3      
Permute                      Transpose_485            1 1 670 671 0=1
Convolution              Conv_486                       1 1 631 672 0=255 1=1 5=1 6=65280
Reshape                      Reshape_500               1 1 672 690 0=-1 1=85 2=3
Permute                      Transpose_501            1 1 690 691 0=1

0. Export the onnx model and simplify it with onnxsim:

python model/export.py --weights='./weights/v5lite-s.pt'
python -m onnxsim ./weights/v5lite-s.onnx ./weights/v5lite-s.onnx

1. The commands to export the fp16 model are as follows:

xxxx@PC:~/ncnn-20210525/build/tools/onnx$ ./onnx2ncnn ./v5lite-s.onnx v5lite-s.param v5lite-s.bin
xxxx@PC:~/ncnn-20210525/build/tools/onnx$ cd ..
xxxx@PC:~/ncnn-20210525/build/tools$ ./ncnnoptimize ./v5lite-s.param ./v5lite-s.bin ./v5lite-s-fp16.param ./v5lite-s-fp16.bin 65536

2. Export the int8 quantized model (the calibration set here is val2017 from the coco2017 dataset):

xxxx@PC:~/ncnn-20210525/build/tools/quantize$ find val2017/ -type f > imagelist.txt
xxxx@PC:~/ncnn-20210525/build/tools/quantize$ ./ncnn2table v5lite-s.param v5lite-s.bin imagelist.txt v5.table mean=[0,0,0] norm=[0.003922,0.003922,0.003922] shape=[640,640,3] pixel=BGR method=kl
xxxx@PC:~/ncnn-20210525/build/tools/quantize$ ./ncnn2int8 v5lite-s.param v5lite-s.bin v5lite-s-int8.param v5lite-s-int8.bin v5.table 

A few tips:

  1. When converting to fp16, use 65536 for the flag rather than 1; using 1 may cause problems.
  2. Pay close attention to the ncnn version; to avoid unnecessary trouble it is best to use ncnn-20210525 directly: https://github.com/Tencent/ncnn/tree/20210525
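
As a quick sanity check, here is a minimal C++ sketch of loading the converted pair with ncnn and reading the three heads. It is a sketch under assumptions: the blob names are taken from the optimized param pasted earlier in this thread (output / 671 / 691; the raw fp32 param uses output / 785 / 805), the mean/norm values mirror the ncnn2table arguments above, the image path is a placeholder, and letterbox padding plus decoding/NMS are left to the repo's v5lite-s.cpp:

#include "net.h"                      // ncnn
#include <opencv2/opencv.hpp>

int main()
{
    // Loading is the same for the fp16 and int8 pairs; the int8 option only
    // matters for the quantized model.
    ncnn::Net net;
    net.opt.use_int8_inference = true;
    net.load_param("v5lite-s-int8.param");
    net.load_model("v5lite-s-int8.bin");

    // Placeholder image; proper letterbox padding is omitted for brevity.
    cv::Mat bgr = cv::imread("test.jpg");
    ncnn::Mat in = ncnn::Mat::from_pixels_resize(bgr.data, ncnn::Mat::PIXEL_BGR,
                                                 bgr.cols, bgr.rows, 640, 640);

    // mean=[0,0,0], norm=[1/255,1/255,1/255], same as the ncnn2table call.
    const float norm_vals[3] = {1 / 255.f, 1 / 255.f, 1 / 255.f};
    in.substract_mean_normalize(0, norm_vals);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("images", in);           // input blob name from the param file

    // The three detection heads; check the names in your own .param file.
    ncnn::Mat out8, out16, out32;
    ex.extract("output", out8);
    ex.extract("671", out16);
    ex.extract("691", out32);

    // Decode each head with generate_proposals() + NMS as in v5lite-s.cpp.
    return 0;
}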

Here are my results after the conversion:

1. fp16 detection result:

[screenshot]

2. int8 detection result:

[screenshot]

As you can see, even with calibration the detection quality drops somewhat; the bicycle is still missed in the int8 result.

@ppogg
Owner

ppogg commented Oct 21, 2021

Nice, cheers!!!
This issue will stay open permanently so that others can avoid the same pitfalls. Thanks to the OP for the great write-up!

@ppogg ppogg added the good first issue Good for newcomers label Oct 21, 2021
@ChaucerG ChaucerG changed the title from "onnx file export problem" to "Fixing ONNX-to-NCNN model export problems + complete int8 quantization steps" Oct 21, 2021
@ppogg ppogg linked a pull request Nov 2, 2021 that will close this issue
@ppogg ppogg pinned this issue Dec 31, 2021
@chen990627

For my self-trained yolov5s-ghost, ncnn inference produces two boxes for a single object. How can I fix that?

@Klaus-Chow

Hi, I'd like to ask about something. I'm using the latest ncnn (20220420) with yolov5 5.0, mainly to look at quantization. fp16 quantization works fine, but with int8 quantization no detection boxes show up. My workflow is: convert to onnx, simplify with onnxsim, run ncnnoptimize with the flag set to 0, and the int8 quantization step itself succeeds, but no detection boxes are drawn. By the way, I kept the focus layer.

@xxr11

xxr11 commented Jun 22, 2022

Same here. Did you find a solution?
