Jetson inference - DetectNet generates many Tactic messages

I am using pednet and a custom network with detectnet.

Unlike with SSD-Mobilenet, it only works after waiting through many Tactic messages.

This does not happen only when the network is first loaded; it happens every time I run it, which makes it inconvenient to use.

It takes about 5 minutes.

Is this a common situation?

Hi,

To give a better suggestion, could you share the message with us first?

Thanks.

After the first run, when the tactics are profiled, the TensorRT engine should be saved to disk in the same directory as the source model. For example, after you run pednet, do you have this file?

<jetson-inference>/data/networks/ped-100/snapshot_iter_70800.caffemodel.1.1.GPU.FP16.engine

Likewise, you should have a similar engine file for your custom network after you run it the first time. If not, it is failing to save it for some reason (are you out of disk space, perhaps?). If you don’t have those engine files, please post the terminal log as Aasta suggested, so we can check it for errors.
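As a quick sanity check, something along these lines from a terminal will show whether the engine cache was written and whether the disk has room for it (the exact engine filename depends on your model, precision, and TensorRT version, so treat the path as an example):

ls -lh <jetson-inference>/data/networks/ped-100/*.engine
df -h

If the .engine file is missing, or df shows the partition is full, that would explain the re-profiling on every run.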

Thank you.

I solved it by adding options.

The error log was: cache file not found, profiling network model on device GPU
So I added the options --model=snapshot_iter_70800.caffemodel --prototxt=deploy.prototxt
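For anyone who hits the same message, the full command ended up looking roughly like this (the model directory and the image filenames are placeholders; adjust them to wherever your custom model actually lives):

detectnet --model=<your-model-dir>/snapshot_iter_70800.caffemodel --prototxt=<your-model-dir>/deploy.prototxt input.jpg output.jpg

After the first run finishes profiling, the serialized .engine file should be written next to the caffemodel, and subsequent runs should load it instead of re-profiling.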