YOLOv5 inference on Jetson TX2

Hey everyone, I have trained a custom object detection model on my desktop (NOT a Jetson product) with https://github.com/ultralytics/yolov5. Now I have a best.pt file containing the model weights, and I want to run inference on the Jetson. I have read about TensorRT, but my time and knowledge are limited when it comes to both GPU computing and TensorRT, so I decided not to use a TensorRT engine for now and just run the inference code directly on the Jetson TX2. However, when I hit dependencies like scipy and matplotlib, I cannot simply install them with pip3 or pip. Why is this happening and what should I do?

Best Regards

Hi,

Due to the ARM architecture, some modules do not have prebuilt packages available for installation.
For scipy, please build and install it from source with the following commands:
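To see why a generic pip install can fail, you can inspect the architecture pip builds wheels for on your machine. A stdlib-only sketch (the printed values are examples, not guarantees for your exact JetPack image):

```python
import platform
import sysconfig

# pip will only install a prebuilt wheel whose platform tag matches the
# local machine. On a typical desktop this is x86_64; on a Jetson TX2 it
# is aarch64, so packages that publish only x86_64 wheels fall back to a
# source build (or fail outright).
print("machine:", platform.machine())             # e.g. 'x86_64' or 'aarch64'
print("platform tag:", sysconfig.get_platform())  # e.g. 'linux-aarch64'
```

If this prints aarch64, any package without an aarch64 (or pure-Python) wheel on PyPI has to be built from source.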

$ wget https://github.com/scipy/scipy/releases/download/v1.3.3/scipy-1.3.3.tar.gz
$ tar -xzvf scipy-1.3.3.tar.gz
$ cd scipy-1.3.3/
$ python3 setup.py install --user

Once scipy is installed, you should be able to install matplotlib through pip3.
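After the build finishes, you can confirm that both modules resolve without fully importing them. A quick stdlib-only check (the module names are the only assumption here):

```python
import importlib.util

# find_spec() only locates a module on the current Python path; it does
# not import it, so this is safe even if a build is incomplete.
results = {}
for name in ("scipy", "matplotlib"):
    results[name] = importlib.util.find_spec(name) is not None
    print(name, "OK" if results[name] else "NOT FOUND")
```

If a module still reports NOT FOUND, re-run the install and check the build log for missing system libraries (BLAS/LAPACK headers are a common culprit for scipy).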

Another alternative is to use the Docker image here.
It has common ML modules preinstalled, so you don't have to install them one by one.

Thanks.

Hello @AastaLLL, thanks for the information, that helped a lot. Can we say that whenever I want to install a Python module on the Jetson TX2, I should build it from source? I also had the same issue with caffe.

Best Regards

Hi,

It depends.

You will need to build a module from source if it doesn't have a prebuilt package for ARM systems.
But you don't need to if a corresponding package exists, e.g. keras.
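A quick way to tell the two cases apart is to look at the wheel filenames a package publishes on PyPI: the last filename component is the platform tag. A small sketch with hypothetical filenames (check the real ones on the package's PyPI "Download files" page):

```python
import platform

def wheel_platform_tag(filename):
    """Return the platform tag encoded in a wheel filename.

    Wheel names follow:
    {dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl
    """
    stem = filename[: -len(".whl")]
    return stem.split("-")[-1]

# Hypothetical wheel names for illustration only.
wheels = [
    "scipy-1.3.3-cp36-cp36m-manylinux1_x86_64.whl",  # desktop x86_64 only
    "Keras-2.3.1-py2.py3-none-any.whl",              # pure Python: any arch
]

machine = platform.machine()  # 'aarch64' on a Jetson TX2
for w in wheels:
    tag = wheel_platform_tag(w)
    usable = tag == "any" or machine in tag
    print(w, "->", tag, "| usable here:", usable)
```

A tag of "any" means a pure-Python package that pip can install on the Jetson as-is; an x86_64-only tag means you are back to building from source.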

Thanks.