DeepStream: how do I get calibration.table if the network was modified?

I have used the project to run yolov3 with TensorRT. I set calibration_image.txt, and yolov3-calibration.table was provided, so it works well.
After I modified the backbone net of yolov3, I get the following error:

New calibration table will be created to build the engine
trt-yolo-app: …/builder/cudnnBuilder2.cpp:685: virtual std::vector<nvinfer1::query::Ports<nvinfer1::query::TensorRequirements> > nvinfer1::builder::Node::getSupportedFormats(const nvinfer1::query::Ports<nvinfer1::query::AbstractTensor>&, const nvinfer1::cudnn::HardwareContext&, nvinfer1::builder::Format::Type, const nvinfer1::builder::FormatTypeHack&) const: Assertion `sf' failed.

How to solve this problem?

Could you please provide some more information regarding your setup?

  1. TensorRT and CUDA versions
  2. What are the changes you made in the backbone net?

Additionally, please check if the following discussion helps solve your issue.

2) I replaced darknet53 with resnet34/darknet19. I also found that just modifying yolov3-darknet53 to detect one class (changing filters to 18) gives the same result: the calibration.table is not compatible, and a new calibration table can't be created.

I have used yolov3.cfg and yolov3.weights to run kFLOAT/kINT8 successfully, but it is not working on my dataset after small changes to filters (one-class detection).

I will try TensorRT later.

@NvCJR Thanks for the link, TensorRT 5.0 works well.

Now I have replaced darknet53 with darknet19, the anchors are 3, and the output layer is 13×13×18. kFLOAT was OK, but kINT8 failed:
YoloV3::YoloV3(uint): Assertion `m_OutputIndex1 != -1’ failed.
Aborted (core dumped)

I have checked that kOUTPUT_BLOB_NAME_1 is right, so how can I solve it?

You will need to update the new output blob names in network_config.cpp

You can find the new output blob names in the logs displayed on the console when the network is being built.

If you have already tried this, please provide a repro so that we can take a look at it.

It is solved. I forgot to delete the darknet53 engine built before. Thanks a lot!