Is it possible to run jetson-inference sample code on DPX2?

I tried to run the detectnet-camera sample from jetson-inference.
There were some errors related to nvinfer1, but I solved them.

However, I now get the error below.

detectnet-camera: cudnnEngine.cpp:605: bool nvinfer1::cudnn::Engine::deserialize(const void*, std::size_t, nvinfer1::IPluginFactory*): Assertion `size >= bsize && "Mismatch between allocated memory size and expected size of serialized engine."' failed.
Aborted (core dumped)

Does anyone have any clues how to solve such issues?

Dear bgk,

I'm not sure how to run that app on DPX2.
Also, the Jetson camera and the DPX2 camera module are not the same, so the app will not work properly even if it starts.
DPX2 ships with its own detectnet and TensorRT samples, so could you try those instead of the Jetson sample? Thanks.
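As a side note, that particular assertion ("Mismatch between allocated memory size and expected size of serialized engine") often means TensorRT is deserializing an engine cache that was built with a different TensorRT version or on a different GPU. If you want to retry the jetson-inference build anyway, a sketch of clearing the cache is below; the cache directory path and the `*.tensorcache` file naming are assumptions based on typical jetson-inference layouts, so adjust them for your checkout:

```shell
# Assumption: jetson-inference caches serialized TensorRT engines as
# *.tensorcache files next to the downloaded network models.
# Deleting them forces the engine to be rebuilt for the local
# GPU / TensorRT version on the next run.
CACHE_DIR="${CACHE_DIR:-$HOME/jetson-inference/data/networks}"
find "$CACHE_DIR" -name '*.tensorcache' -print -delete 2>/dev/null

# Then re-run the sample; the first launch will be slow while
# TensorRT rebuilds and re-serializes the engine.
# ./detectnet-camera
```

The rebuild can take several minutes on the first run, which is expected.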