Getting GStreamer or CSI camera input into VisionWorks examples

Hello,

I have successfully compiled and tested all of the VisionWorks examples on the Nano, but I cannot get them to take CSI or GStreamer input. The CSI camera (IMX219) and GStreamer both work separately, so that should be all right.

I’ve followed the documentation and passed the --source argument, but no luck at all. Is there anything I’m missing?

Thank you all very much.

Hi davidsas,

Could you please check whether your CSI camera is listed:

ls /dev/video*
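If the device shows up, a quick capture sanity check through GStreamer can help isolate whether the problem is the camera stack or the VisionWorks sample. This is a sketch assuming JetPack 4.x on the Nano, where the CSI element is nvarguscamerasrc (older L4T releases used nvcamerasrc instead):

```shell
# Confirm the CSI camera is enumerated
ls /dev/video*

# Grab ~4 seconds of 720p30 frames through the Argus camera stack
# and render them on screen (nvarguscamerasrc is JetPack 4.x;
# older releases use nvcamerasrc)
gst-launch-1.0 nvarguscamerasrc num-buffers=120 \
    ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' \
    ! nvoverlaysink
```

If this pipeline plays, the camera and driver side are fine and the issue is in how the sample is given its source URI.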

Also, can you check some of the related forum topics, such as:

https://devtalk.nvidia.com/default/topic/1050567/jetson-nano/running-visionworks-nvx_sample_nvgstcamera_capture-use-usb-camera-logitechc310-error/post/5365543/#5365543

Hello ak-nv,

The camera is detected and works fine in other applications; GStreamer commands in the terminal work perfectly, as do the TensorRT examples from jetson-inference.

ls /dev/video* outputs /dev/video0 as expected.

I’ve read all the threads related to this, not only for the Nano but also for the TX1 and TX2. I’ve tried several different strings (on other versions of VisionWorks the URI can be slightly different), but it always returns “can’t open source uri device”.

I’ve also tried passing the --source argument a path to a different video than the one in the examples, and that works perfectly. Do you know what I could be missing?

Thank you very much for your answer.

Just found my answer by taking a look at the nvgstcamera_capture demo. In case anyone else hits this issue, the argument to pass is:

--source="device:///nvcamera"

At least in my case, using an IMX219 (Raspberry Pi Camera V2). The string indicated elsewhere in the documentation seems to be wrong.
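For reference, a full invocation might look like the sketch below. The sample directory and the feature tracker demo name are assumptions from my own build of the VisionWorks 1.6 samples; adjust the path to wherever you built them:

```shell
# Assumed default location of the built VisionWorks samples;
# change this to your own build directory
cd ~/VisionWorks-1.6-Samples/bin/aarch64/linux/release

# Run a demo against the CSI camera instead of a video file.
# Note: plain ASCII double dash and straight quotes are required;
# copy-pasting typographic dashes/quotes from a web page breaks parsing.
./nvx_demo_feature_tracker --source="device:///nvcamera"
```

The quoting detail matters: the forum software renders `--` as an en dash and straight quotes as curly ones, and pasting that styled text is itself enough to produce the “can’t open source uri device” error.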

The samples run very nicely, at more or less the same speed as with video files, but the video stabilization one gets stuck every few frames. I will take a look at it, as I am particularly interested in that one.

Regards.