How to do inference with two engines?

Hi,

How can I load two engines and run inference with both?

I tried, and I am able to run inference with the first engine, but when I run inference with the second engine I get the following error:

[TensorRT] ERROR: ../rtSafe/safeRuntime.cpp (32) - Cuda Error in free: 77 (an illegal memory access was encountered)
terminate called after throwing an instance of 'nvinfer1::CudaError'

Hi,

Please refer to the thread-safety section of the TensorRT Best Practices guide:
https://docs.nvidia.com/deeplearning/sdk/tensorrt-archived/tensorrt-700/tensorrt-best-practices/index.html#thread-safety
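
In general, you can deserialize two engines in the same process as long as each engine gets its own execution context, its own host/device buffers, and its own CUDA stream; a common cause of cudaErrorIllegalAddress is reusing buffers sized for one engine with the other, or launching work through a context whose buffers have already been freed. Below is a minimal sketch using the TensorRT 7 Python API with pycuda. The engine file names are placeholders, and it assumes both engines were built with an explicit batch dimension (for implicit-batch engines, use execute_async instead of execute_async_v2).

import numpy as np
import pycuda.autoinit  # creates and manages a CUDA context on device 0
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(runtime, path):
    # Deserialize a serialized engine file from disk
    with open(path, "rb") as f:
        return runtime.deserialize_cuda_engine(f.read())

def allocate_buffers(engine):
    # Every engine gets its OWN buffers and stream; sharing buffers
    # sized for one engine with another is a common source of
    # illegal-memory-access errors.
    inputs, outputs, bindings = [], [], []
    stream = cuda.Stream()
    for i in range(engine.num_bindings):
        size = trt.volume(engine.get_binding_shape(i))
        dtype = trt.nptype(engine.get_binding_dtype(i))
        host_mem = cuda.pagelocked_empty(size, dtype)
        dev_mem = cuda.mem_alloc(host_mem.nbytes)
        bindings.append(int(dev_mem))
        if engine.binding_is_input(i):
            inputs.append((host_mem, dev_mem))
        else:
            outputs.append((host_mem, dev_mem))
    return inputs, outputs, bindings, stream

def infer(context, inputs, outputs, bindings, stream, input_array):
    # Copy input to device, run the engine, and copy outputs back,
    # all on this engine's private stream
    host_in, dev_in = inputs[0]
    np.copyto(host_in, input_array.ravel())
    cuda.memcpy_htod_async(dev_in, host_in, stream)
    context.execute_async_v2(bindings=bindings, stream_handle=stream.handle)
    for host_out, dev_out in outputs:
        cuda.memcpy_dtoh_async(host_out, dev_out, stream)
    stream.synchronize()
    return [host_out for host_out, _ in outputs]

runtime = trt.Runtime(TRT_LOGGER)
engine1 = load_engine(runtime, "model1.engine")  # placeholder paths
engine2 = load_engine(runtime, "model2.engine")

# One execution context and one buffer set per engine
context1 = engine1.create_execution_context()
context2 = engine2.create_execution_context()
in1, out1, bind1, stream1 = allocate_buffers(engine1)
in2, out2, bind2, stream2 = allocate_buffers(engine2)

dummy1 = np.zeros(trt.volume(engine1.get_binding_shape(0)), dtype=np.float32)
dummy2 = np.zeros(trt.volume(engine2.get_binding_shape(0)), dtype=np.float32)
result1 = infer(context1, in1, out1, bind1, stream1, dummy1)
result2 = infer(context2, in2, out2, bind2, stream2, dummy2)

If you run the two engines from different threads, note from the thread-safety section linked above that an execution context is not safe to share across threads; each thread should use its own context.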

CUDA error 77 means:

/**
 * The device encountered a load or store instruction on an invalid memory address.
 * This leaves the process in an inconsistent state and any further CUDA work
 * will return the same error. To continue using CUDA, the process must be terminated
 * and relaunched.
 */
cudaErrorIllegalAddress = 77,

To help us debug, could you please share a small repro that demonstrates the error you are seeing?
Also, can you provide details on the platform you are using:
o Linux distro and version
o GPU type
o NVIDIA driver version
o CUDA version
o CUDNN version
o Python version [if using python]
o TensorFlow and PyTorch versions
o TensorRT version

Thanks