About the DeepStream SDK 3.0 model file

I have a question about the model file in DeepStream SDK 3.0.
Is the file specified by "model-engine-file=XXX" in the [primary-gie] section limited to Caffe models? For example, can YOLO model files be used?

"model-engine-file" is the plan (engine) file generated by the TensorRT optimizer. You can check this link:

https://docs.nvidia.com/deeplearning/sdk/tensorrt-developer-guide/index.html#work
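As an illustration, a [primary-gie] section might look like the sketch below. The file names are hypothetical; in DeepStream 3.0 the engine file is referenced here, while the original network files (e.g. caffemodel/prototxt) are usually listed in the nvinfer config file that "config-file" points to.

```
[primary-gie]
enable=1
# Pre-built TensorRT engine (plan) file; TensorRT generates it on first run if absent
model-engine-file=model_b4_int8.engine
# nvinfer plugin config; this file names the original model (Caffe, ONNX, etc.)
config-file=config_infer_primary.txt
```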

I checked the official page, but I could not understand it at first.
Now I understand that the file is created automatically by TensorRT.

I created a container from "nvcr.io/nvidia/deepstream:3.0-18.11"
and ran the "source30_720p_dec_infer-resnet_tiled_display_int8.txt" sample, but I do not know the specific steps to run other inference models.
Please let me know if there are other inference procedures or websites that describe them.

You can check the test1, test2, and test3 samples, the deepstream-app source code and other source code, and the documentation in the SDK.
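As a sketch of how another sample configuration is run (the paths below are assumptions based on a typical DeepStream 3.0 container layout; adjust them to your install):

```
# From the DeepStream SDK samples directory inside the container
cd /root/DeepStream_Release/samples        # path is an assumption
# Run deepstream-app against any of the provided config files
deepstream-app -c configs/deepstream-app/source30_720p_dec_infer-resnet_tiled_display_int8.txt
```

The same deepstream-app binary is reused for different pipelines; only the config file changes.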

Thank you very much.
I will check each piece of code.

Another question: I built "GitHub - NVIDIA-AI-IOT/deepstream_reference_apps: Samples for TensorRT/DeepStream for Tesla & Jetson" in the container "nvcr.io/nvidia/deepstream:3.0-18.11" and now want to run "deepstream-yolo-app".
Which location should I specify for "cmake -D DS_SDK_ROOT=<DS_SDK root>" as described in the README?
Please tell me if there is a file that serves as a marker for the right directory.

It does not matter which location you specify when running deepstream-yolo-app; just make sure the file paths in the config are correct.
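For example, a build might look like the following sketch. The directory names are illustrative assumptions; DS_SDK_ROOT should point at the root of the extracted DeepStream SDK (the directory containing its sources and libraries), wherever that lives in your container:

```
# Inside the container; all paths are assumptions, adjust to your layout
cd deepstream_reference_apps            # cloned repo
mkdir build && cd build
cmake -D DS_SDK_ROOT=/root/DeepStream_Release ..
make
```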

I checked the configuration file and corrected the paths.
Thank you for your reply.