What's wrong with imagenet/detectnet, continuous printf?

imagenet.log (424.8 KB)

Just followed the guide: jetson-inference/docs/imagenet-console-2.md at master · dusty-nv/jetson-inference · GitHub

And I got continuous printf output?

JetPack 6.0, L4T 36.3, any ideas?


It looks very much like this issue: could not find engine cache ... MonoDepth-FCN-Mobilenet/monodepth_fcn_mobilenet.onnx.1.1.8602.GPU.FP16.engine ? · Issue #1855 · dusty-nv/jetson-inference · GitHub.

Hi,

jetson-inference deploys models with TensorRT.
The log output you are seeing comes from the TensorRT engine-building process, which can take quite a while.

Please wait longer; detection will start once the conversion is done.
This is a one-time job, so the cached engine will be reused on subsequent launches.
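If you want to confirm the cache was actually written, you can look for the serialized engine file that gets saved alongside the network files. A minimal sketch, assuming the default jetson-inference data layout (adjust `NET_DIR` to wherever you built or installed it):

```shell
# Hypothetical path: jetson-inference typically keeps downloaded models
# under its data/networks directory; change this to match your install.
NET_DIR=~/jetson-inference/data/networks

# Look for any serialized TensorRT engines. If one exists, later
# launches of imagenet/detectnet skip the long compile step.
engines=$(find "$NET_DIR" -name '*.engine' 2>/dev/null)

if [ -n "$engines" ]; then
    echo "cached engines found:"
    echo "$engines"
else
    echo "no cached engine yet - first run is still compiling"
fi
```

If the `.engine` file never appears after a run completes, check that the directory is writable, since the engine cannot be cached otherwise.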

Thanks.

It has been running for more than 40 minutes and is still printing messages…
How long should the first launch take?


EDIT: it really does take time. Just go out and have a meal; it will probably be done when you get back.
