Tested YOLOv8 on Jetson Nano

@shaun.johnson May I ask a question? Thank you!
Noticing that you tested YOLOv8 on Jetson Nano, here is your topic.
Did you set up DeepStream 6.2 on Jetson Nano? Another user also tested YOLOv8 with DeepStream 6.0.1 on Jetson Nano; did you hit his issue, “Assertion failed: inputs.at(0).isInt32() && "For range operator with dynamic inputs, this version of TensorRT only supports INT32!"”? Here is his topic.

Hello Fanzh. I did not try to install DeepStream 6.2 on the Jetson Nano. I did get DeepStream 6.0.1 working on the Jetson Nano, but I could not get YOLOv8 working with it. The ONNX inference model could not be built properly; I think this was due to the Python 3.6.x interpreter I used to build the model. When I switched to DeepStream 6.2 and started using Python 3.8.x, the ONNX model was created with the correct output shape and began working in my program.
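Since the fix above came down to exporting the ONNX model under Python 3.8.x rather than 3.6.x, a guard on the interpreter version before running the export can save a confusing failure. This is a minimal sketch, assuming the `ultralytics` package is installed and using a hypothetical checkpoint name `yolov8s.pt`; it is not the exact script used in this thread.

```python
# Sketch: guard the YOLOv8 -> ONNX export behind a Python-version check,
# since the export only produced the correct output shape under 3.8+ here.
import sys

def python_supported(min_version=(3, 8)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

if __name__ == "__main__":
    if not python_supported():
        sys.exit("Python 3.8+ required; the ONNX export misbehaved on 3.6.x")
    # Assumption: `pip install ultralytics` has been run on the export machine.
    from ultralytics import YOLO
    model = YOLO("yolov8s.pt")   # hypothetical checkpoint name
    model.export(format="onnx")  # writes an .onnx file next to the weights
```

The exported `.onnx` file can then be pointed to from the DeepStream nvinfer config, letting TensorRT build the engine on first run.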
