NVIDIA Jetson Xavier NX 8 GB - Can this board run inference on 4K images?

I have an NVIDIA Jetson Xavier NX 8 GB and I need to run inference on 4K images. On this site https://learnopencv.com/object-detection-using-yolov5-and-opencv-dnn-in-c-and-python/ they use OpenCV and a function named blobFromImage. One thing I know is that the size we pass to this function needs to match the size of the images used for training; in my case the model is yolov5m6. So, my question is the following: can the board handle four YOLOv5 models (run in series) where the input image is 4K (without resizing)? The total inference time should be no longer than 5 seconds for the 4 models.
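For context, this is roughly the OpenCV DNN call the tutorial uses, as a minimal sketch assuming an ONNX export of yolov5m6 (file names here are placeholders). Note that blobFromImage always resizes the frame to whatever size you pass it, so a true 4K input "without resizing" would require a network exported with a ~3840x2160 input shape, which is much heavier than the default 1280x1280 used by yolov5m6:

```python
import cv2

# Sketch of the OpenCV DNN path from the tutorial above.
# File names are placeholders; yolov5m6 is trained at 1280x1280 by default.
INPUT_SIZE = 1280                       # must match the exported model's input size
net = cv2.dnn.readNet("yolov5m6.onnx")  # assumed ONNX export of yolov5m6

image = cv2.imread("frame_4k.jpg")      # e.g. a 3840x2160 frame

# blobFromImage resizes the 4K frame to the network input size,
# scales pixel values to [0, 1] and swaps BGR -> RGB.
blob = cv2.dnn.blobFromImage(
    image,
    scalefactor=1.0 / 255,
    size=(INPUT_SIZE, INPUT_SIZE),
    swapRB=True,
    crop=False,
)

net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())
print([o.shape for o in outputs])
```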

Hi,

On Jetson, it is recommended to use TensorRT to accelerate inference.
If the model accepts dynamic input, you can test it directly with our trtexec tool.
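For example, something like the following builds and times an FP16 engine from an ONNX export (a sketch; the file name, the input tensor name "images", and the shape ranges are assumptions to adapt to your actual export):

```bash
# Build and benchmark a TensorRT engine from an ONNX export of yolov5m6.
# The input tensor name ("images") and the shapes are illustrative only.
/usr/src/tensorrt/bin/trtexec \
    --onnx=yolov5m6.onnx \
    --minShapes=images:1x3x640x640 \
    --optShapes=images:1x3x1280x1280 \
    --maxShapes=images:1x3x1280x1280 \
    --fp16 \
    --saveEngine=yolov5m6_fp16.engine
```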

Thanks.

Is TensorRT performance WAY better than OpenCV? I mean, right now I have the code built for OpenCV; it would take some time to port it to TensorRT. Anyway, I tried the code on the board and Ubuntu crashed.

@AastaLLL Please help.

Thanks

Hi,

Sorry for the late update.

Usually, yes.
Do you use the official Ultralytics source?
If yes, they have a tutorial to build it for TensorRT directly:
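For reference, from a clone of the Ultralytics YOLOv5 repository the TensorRT build is essentially one command (a sketch; the weights file, image size, and device index are assumptions, and the export should be run on the Jetson itself so the engine matches its GPU):

```bash
# Export yolov5m6.pt -> ONNX -> TensorRT engine (FP16) at 1280x1280.
python export.py --weights yolov5m6.pt --include engine --imgsz 1280 --half --device 0

# The resulting .engine file can then be used directly, e.g.:
python detect.py --weights yolov5m6.engine --imgsz 1280 --source frame_4k.jpg --device 0
```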

Thanks.
