How to run sampleSSD in TensorRT5

How do I run sampleSSD in TensorRT 5 like the other samples?
How should I modify the original prototxt file? Do I need to modify sampleSSD.cpp the way sampleFasterRCNN does?

Please refer to the README.md that accompanies sampleSSD:

  1. Compile the sample
  2. Download the models
  3. Read the notes on ssd.prototxt

But we didn’t find the ssd.prototxt. Could you please tell us its location?

Please refer to: https://docs.nvidia.com/deeplearning/sdk/tensorrt-release-notes/tensorrt-5.html#rel_5-0-3

sampleSSD
This sample demonstrates how to perform inference on the Caffe SSD network in TensorRT, use TensorRT plugins to speed up inference, and perform INT8 calibration on an SSD network. To generate the required prototxt file for this sample, perform the following steps:
Download models_VGGNet_VOC0712_SSD_300x300.tar.gz from: https://drive.google.com/file/d/0BzKzrI_SkD1_WVVTSmQxU0dVRzA/view
Extract the contents of the tar file:
tar xvf ~/Downloads/models_VGGNet_VOC0712_SSD_300x300.tar.gz
Edit the deploy.prototxt file and change all the Flatten layers to Reshape operations with the following parameters:
reshape_param {
  shape {
    dim: 0
    dim: -1
    dim: 1
    dim: 1
  }
}
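For example, one of the mbox flatten layers would change as follows (the layer names here are taken from the standard VGG SSD 300x300 deploy.prototxt and may differ slightly in your copy). Before:

```
layer {
  name: "conv4_3_norm_mbox_loc_flat"
  type: "Flatten"
  bottom: "conv4_3_norm_mbox_loc_perm"
  top: "conv4_3_norm_mbox_loc_flat"
  flatten_param {
    axis: 1
  }
}
```

After:

```
layer {
  name: "conv4_3_norm_mbox_loc_flat"
  type: "Reshape"
  bottom: "conv4_3_norm_mbox_loc_perm"
  top: "conv4_3_norm_mbox_loc_flat"
  reshape_param {
    shape {
      dim: 0
      dim: -1
      dim: 1
      dim: 1
    }
  }
}
```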
Update the detection_out layer by adding the keep_count output, for example, add:
top: "keep_count"
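That is, the detection_out layer gains a second top (the surrounding fields shown here are an assumption based on the standard SSD deploy.prototxt and are abbreviated):

```
layer {
  name: "detection_out"
  type: "DetectionOutput"
  bottom: "mbox_loc"
  bottom: "mbox_conf_flatten"
  bottom: "mbox_priorbox"
  top: "detection_out"
  top: "keep_count"
  ...
}
```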
Rename the deploy.prototxt file to ssd.prototxt and run the sample.
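If you prefer to script the edits rather than apply them by hand, here is a minimal Python sketch that performs the same text transforms. The sample layer is an assumption based on the standard VGG SSD 300x300 deploy.prototxt; double-check the rewritten file before running the sample.

```python
import re

# reshape_param block that replaces each flatten_param block,
# as described in the release notes above.
RESHAPE_PARAM = """reshape_param {
    shape {
      dim: 0
      dim: -1
      dim: 1
      dim: 1
    }
  }"""

def convert(prototxt: str) -> str:
    # 1. Turn every Flatten layer into a Reshape layer.
    out = prototxt.replace('type: "Flatten"', 'type: "Reshape"')
    # 2. Replace each flatten_param block with the Reshape parameters.
    out = re.sub(r'flatten_param\s*\{[^}]*\}', RESHAPE_PARAM, out)
    # 3. Add the keep_count output to the detection_out layer.
    out = out.replace('top: "detection_out"',
                      'top: "detection_out"\n  top: "keep_count"')
    return out

# Sample input: one Flatten layer from the stock deploy.prototxt
# (layer names are assumptions; your file may differ).
sample = '''layer {
  name: "conv4_3_norm_mbox_loc_flat"
  type: "Flatten"
  bottom: "conv4_3_norm_mbox_loc_perm"
  top: "conv4_3_norm_mbox_loc_flat"
  flatten_param {
    axis: 1
  }
}'''

print(convert(sample))
```

To convert the real file, read deploy.prototxt, pass its contents through convert, and write the result out as ssd.prototxt.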
To run the sample in INT8 mode, first install Pillow by issuing the $ pip install Pillow command, then follow the instructions in the README.

Thanks! I got it.