I am running the test5 app inside Docker and can't see the engine file

I am running the app in a container. Where is the engine file generated for the secondary classifier? Do I need to generate the engine file outside Docker and then use it inside the container? I can see that a file is being generated, but I can't find the engine inside the Docker container, and the pipeline is stuck at 0 FPS.

If your model is in UFF or ONNX format, you can set it directly (refer to the nvinfer options uff-file or onnx-file), and the engine file will be created automatically. Otherwise, you need to create the engine file yourself and set it with the model-engine-file option.
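As a rough sketch, the relevant part of a secondary classifier's nvinfer config could look like the fragment below. The file names and the batch/precision suffix in the engine name are illustrative assumptions, not values from this thread; nvinfer serializes the auto-built engine next to the model with a name of that general form.

```
[property]
# ONNX model: nvinfer builds the TensorRT engine from this automatically
onnx-file=sgie_classifier.onnx

# Optional: point to an already-built engine to skip the (slow) build step.
# If the file is missing, nvinfer falls back to rebuilding from onnx-file.
model-engine-file=sgie_classifier.onnx_b16_gpu0_fp16.engine
```

Note that the engine is written inside the container's filesystem, so it disappears when the container is removed unless the directory is bind-mounted from the host (e.g. with `docker run -v`). Also make sure the process has write permission to that directory; otherwise the build is silently redone on every run.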

Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is being used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)

Able to build the engine from ONNX.

Thanks for the update. Is this still a DeepStream issue to support?

Issue solved

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.