TensorRT 7 INT8 quantization

For TRT 7 INT8, I created a calibrator and loaded 1000 images for calibration. After generating the files and running ./yolox, I get this error. Has anyone met this before?
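For reference, my calibrator roughly follows the standard IInt8EntropyCalibrator2 pattern, something like the sketch below (the batch size, single input binding, cache file name, and preprocessing are simplified placeholders, not my exact code; TensorRT 8 would additionally need noexcept on the overrides):

```cpp
// Minimal sketch of a TensorRT 7 INT8 entropy calibrator.
// Batch size, binding layout, and cache file name are illustrative only.
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

class Int8Calibrator : public nvinfer1::IInt8EntropyCalibrator2
{
public:
    Int8Calibrator(int batchSize, int inputVolume, std::vector<std::string> imagePaths)
        : mBatchSize(batchSize), mInputVolume(inputVolume), mImagePaths(std::move(imagePaths))
    {
        // One batch worth of input data on the device.
        cudaMalloc(&mDeviceInput, static_cast<size_t>(mBatchSize) * mInputVolume * sizeof(float));
    }
    ~Int8Calibrator() override { cudaFree(mDeviceInput); }

    int getBatchSize() const override { return mBatchSize; }

    bool getBatch(void* bindings[], const char* names[], int nbBindings) override
    {
        if (mCurrentImage + mBatchSize > static_cast<int>(mImagePaths.size()))
            return false;  // no more calibration batches

        // Preprocess the next batch of images into a host buffer
        // (resize / normalize is application specific and omitted here).
        std::vector<float> hostBatch(static_cast<size_t>(mBatchSize) * mInputVolume);
        // ... fill hostBatch from mImagePaths[mCurrentImage .. mCurrentImage + mBatchSize) ...
        mCurrentImage += mBatchSize;

        cudaMemcpy(mDeviceInput, hostBatch.data(),
                   hostBatch.size() * sizeof(float), cudaMemcpyHostToDevice);
        bindings[0] = mDeviceInput;  // assumes a single input binding
        return true;
    }

    const void* readCalibrationCache(size_t& length) override
    {
        // Reuse a previously written cache so calibration only runs once.
        mCache.clear();
        std::ifstream input("calib.cache", std::ios::binary);
        if (input.good())
            mCache.assign(std::istreambuf_iterator<char>(input), std::istreambuf_iterator<char>());
        length = mCache.size();
        return mCache.empty() ? nullptr : mCache.data();
    }

    void writeCalibrationCache(const void* cache, size_t length) override
    {
        std::ofstream output("calib.cache", std::ios::binary);
        output.write(static_cast<const char*>(cache), length);
    }

private:
    int mBatchSize;
    int mInputVolume;
    std::vector<std::string> mImagePaths;
    int mCurrentImage = 0;
    void* mDeviceInput = nullptr;
    std::vector<char> mCache;
};
```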

I printed the log and found that this line of code reports the error:
CUDA_CHECK(cudaMemcpyAsync(output, buffers[1], output_size * sizeof(float), cudaMemcpyDeviceToHost, stream));

Hi, please refer to the links below on performing inference in INT8.

Thanks!

Cheers, I found the problem: I had not allocated the correct amount of memory for the output buffer.
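For anyone hitting the same crash: the host and device output buffers must be at least as large as the output_size used in the cudaMemcpyAsync call. A sketch of deriving output_size from the engine's binding dimensions instead of hardcoding it (the binding name "output" and variable names are just examples):

```cpp
// Size the output buffers from the engine's binding dimensions
// rather than a hardcoded constant (names/indices are illustrative).
#include <NvInfer.h>
#include <cuda_runtime_api.h>
#include <vector>

size_t bindingVolume(const nvinfer1::ICudaEngine& engine, int bindingIndex)
{
    nvinfer1::Dims dims = engine.getBindingDimensions(bindingIndex);
    size_t volume = 1;
    for (int i = 0; i < dims.nbDims; ++i)
        volume *= dims.d[i];
    // Note: with an implicit-batch engine, multiply by the batch size as well.
    return volume;
}

// Usage sketch:
// const int outputIndex = engine->getBindingIndex("output");   // name is an assumption
// const size_t output_size = bindingVolume(*engine, outputIndex);
// std::vector<float> output(output_size);                      // host buffer, correctly sized
// cudaMalloc(&buffers[outputIndex], output_size * sizeof(float));
// ...
// CUDA_CHECK(cudaMemcpyAsync(output.data(), buffers[outputIndex],
//                            output_size * sizeof(float),
//                            cudaMemcpyDeviceToHost, stream));
```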
