How to change the video capture device resolution in VisionWorks?

Hi,

I’ve followed several links to get the video showing from my video capture device.
Can’t get images from webcam with Visionworks 1.6 - Jetson & Embedded Systems / Jetson Xavier NX - NVIDIA Developer Forums

If I set it to 640x480, the display is 640x480.
If I skip the width and height, or set it to 720x480, the display is 720x480.
But if I set the width and height to 1920x1080 or 1280x720, the device cannot be found.

If I just omit the height and width settings, the resolution I get is only 720x480, even though the device is supposed to output 1920x1080 (verified through ffmpeg).

Then I roughly followed this link, but there are some slight differences: it is for feature_tracker, not feature_tracker_nvxcu. Even when I open feature_tracker, I cannot find the (nvxio) location to modify; maybe it is a version issue?


After this modification, the resolution is still 720x480. May I know what I should try next? Thx
Btw, I have tried my C270 webcam; its resolution is stuck at 640x480 only… I cannot get to 720p.

Hi,
Please follow the steps in
Jetson Nano FAQ
Q: I have a USB camera. How can I launch it on Jetson Nano?

And check whether 720x480 is supported and whether you can launch it with a gst-launch-1.0 command.
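
You can sanity-check the mode outside VisionWorks first; a minimal sketch (the device path /dev/video0 is an assumption, adjust it to your setup):

```shell
# Check the camera outside VisionWorks first: if a plain GStreamer pipeline
# cannot negotiate the caps, the VisionWorks sample cannot either.
DEV=/dev/video0                       # assumption: adjust to your device node
CAPS="video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720"
if command -v gst-launch-1.0 >/dev/null 2>&1 && [ -e "$DEV" ]; then
    # Grab 30 frames and discard them; -v prints the caps actually negotiated.
    gst-launch-1.0 -v v4l2src device="$DEV" num-buffers=30 ! $CAPS ! fakesink
else
    echo "gst-launch-1.0 or $DEV not available; run this on the Jetson"
fi
```

If this pipeline errors out, the problem is in the camera/driver caps, not in VisionWorks.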

Besides, we would suggest using VPI rather than VisionWorks. Please check:
VPI - Vision Programming Interface: Installation

Most functions in VisionWorks are supported in VPI. Please take a look at the document:

Hi,

For sure the webcam can go to 720p and my video capture device can go to 1920x1080. I have used them before with detectnet from dusty-nv's jetson-inference.

I am using JetPack 4.6.

It seems your link does not really relate to the question…
Is there any instruction for setting 1920x1080 in VisionWorks?

And I have tried to install VPI, but I got errors… I just followed the instructions one by one… Thx


Thanks.

Hi,
The dense optical flow function is not supported on Jetson Nano. To use that VPI function, you would need one of the other Jetson platforms.

There is no document for setting the resolution to 1920x1080. Since the VisionWorks samples are open source, please add prints to get more information. You can also enable GStreamer debug logs to check further:

$ export GST_DEBUG=*:5
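
To keep the log focused (and to turn it off again afterwards), you can scope the debug output to the v4l2 element and send it to a file; a sketch:

```shell
# Scope GStreamer debug to the v4l2 source only and write it to a file,
# so the probed/selected caps are easy to find in the output.
export GST_DEBUG=v4l2src:5
export GST_DEBUG_FILE=nvx_debug.log   # logs go here instead of stderr
# ./nvx_demo_feature_tracker_nvxcu -s "device:///v4l2?index=0"  # run the sample here
# To restore normal (quiet) logging afterwards:
unset GST_DEBUG GST_DEBUG_FILE
```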

Other users may share their experience.

If you need to do object detection, you may also try the DeepStream SDK.

Mmm… But at least 1280x720? The car.mp4 sample is 1280x720.

And may I know the original link for the VisionWorks GitHub? Maybe I can find some hints there.

And for VPI, do I need an NX to run it? Thx

Hi,
For VisionWorks, the code is open source and is located in:

/usr/share/visionworks/sources
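
The system copy there is read-only, so copy the sources somewhere writable before editing; a sketch (the install-samples.sh helper and its location are an assumption based on the stock JetPack layout):

```shell
# Copy the read-only system sources to a writable location before editing.
# The install-samples.sh path is an assumption from the stock JetPack layout.
SRC=/usr/share/visionworks/sources
if [ -x "$SRC/install-samples.sh" ]; then
    "$SRC/install-samples.sh" ~/      # unpacks to ~/VisionWorks-1.6-Samples
else
    echo "install-samples.sh not found under $SRC; check your JetPack install"
fi
```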

For VPI, if you need other functions, it is fine to use Jetson Nano. If you need the dense optical flow function, you would need a Xavier or Xavier NX.

Hi,

The VPI documentation says it can run on the NX, at the bottom of the page:
https://docs.nvidia.com/vpi/index.html
My JetPack version is 4.6-b199, but there are some errors during installation. I wonder if anyone has successfully run VPI? Thx

I have tried on the NX, but there is still an error.



It seems VPI cannot be run on the Nano. (It was a distraction… :<)
May I have instructions on how to run a 1080p live feed in VisionWorks?
If it is open source, may I have the original GitHub link? I did search GitHub, but all the results are clones…
I hope I can find the answer in their FAQ or issues.
It has been a long… journey…
Thanks

Hi,
Please share which VPI sample does not work. We would like to reproduce the failure and check.

For using VisionWorks, please add prints to the source code for your development. Do you see the 1920x1080p30 mode being picked in v4l2src? You should be able to get more information from the GST_DEBUG prints.

Hi,

I have run it with GST_DEBUG and here is the log…
Btw, how do I disable the debug output (i.e., restore the original behavior)?
What I don't understand is that the mp4 can go to 1280x720 in the sample demo, but the live stream cannot…

Here is some info, but I guess it is not that useful…

gstv4l2object.c:4238:gst_v4l2_object_probe_caps:<v4l2src0:src> probed caps:
video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1;
video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)1200, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1;
video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)1024, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)8/1;
video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)8/1;
video/x-raw, format=(string)YUY2, width=(int)1360, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)8/1;
video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1;
video/x-raw, format=(string)YUY2, width=(int)1024, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1;
video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 10/1, 5/1 };
video/x-raw, format=(string)YUY2, width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 25/1, 20/1, 10/1, 5/1 };
video/x-raw, format=(string)YUY2, width=(int)720, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 20/1, 10/1, 5/1 };
video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 20/1, 10/1, 5/1 };
image/jpeg, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 25/1, 20/1, 10/1 };
image/jpeg, width=(int)1600, height=(int)1200, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 25/1, 20/1, 10/1 };
image/jpeg, width=(int)1280, height=(int)1024, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 25/1, 20/1, 10/1 };
image/jpeg, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 25/1, 20/1, 10/1 };
image/jpeg, width=(int)1360, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 25/1, 20/1, 10/1 };
image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 };
image/jpeg, width=(int)1024, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 };
image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 };
image/jpeg, width=(int)720, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 };
image/jpeg, width=(int)720, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 };
image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 50/1, 30/1, 20/1, 10/1 }

Thanks,
error1.log (18.1 MB)

Hi,
You have mentioned that it works with a gst-launch-1.0 command. Please get the log of that gst-launch-1.0 run so that you can compare the two logs.

And we would like to know which VisionWorks function is needed in your use case. If you need object detection and tracking, it is better to use the DeepStream SDK.

Hi,

I guess I should summarize my questions.
2. May I know if VisionWorks has a GitHub repository? My intention is to search for answers there instead. Is there another place where I should ask this question? Thx
3. I can use detectnet with a 1920x1080 video stream, but I don't need it in this application. May I know how to use gst-launch-1.0 to get more info on the streaming problem? What parameters shall I put? I'm not really familiar with it, sorry about that. Is it the line in my picture of GStreamerCameraFrameSourceImpl.cpp, std::string caps_string("video/x-raw, format=(string){")? I have tried 1920x1080 already…
4. I just need feature_tracker or feature_tracker_nvxcu.
5. My goal is to feed a 1920x1080 stream to feature_tracker_nvxcu. If I run the sample demo mp4, it can show 1280x720. So far I can only get a 720x480 or 640x480 stream.
Thx

Hi,
Please modify/rebuild GStreamerCameraFrameSourceImpl.cpp from

    stream << "video/x-raw, format=(string){RGB}, width=[1," << configuration.frameWidth <<
              "], height=[1," << configuration.frameHeight << "], framerate=" << configuration.fps << "/1;";

to

    configuration.frameWidth = 1920;
    configuration.frameHeight = 1080;
    configuration.fps = 30;
    stream << "video/x-raw, format=(string){YUY2}, width=" << configuration.frameWidth <<
              ", height=" << configuration.frameHeight << ", framerate=" << configuration.fps << "/1;";

With the modification we can run the sample in 1920x1080p30 with Logitech BRIO 4K usb camera. The command is:

VisionWorks-1.6-Samples/bin/aarch64/linux/release$ ./nvx_demo_feature_tracker_nvxcu -s "device:///v4l2?index=1"

Please give it a try.
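
If it helps, the edit-rebuild-run loop looks roughly like this (the exact location of GStreamerCameraFrameSourceImpl.cpp inside the samples tree is an assumption, so locate it with find first):

```shell
# Rough edit-rebuild-run loop for the samples tree; the exact location of
# GStreamerCameraFrameSourceImpl.cpp is an assumption, so locate it first.
SAMPLES=~/VisionWorks-1.6-Samples
if [ -d "$SAMPLES" ]; then
    find "$SAMPLES" -name GStreamerCameraFrameSourceImpl.cpp
    # ... edit the caps string in that file, then rebuild and run:
    make -C "$SAMPLES" -j"$(nproc)"
    "$SAMPLES/bin/aarch64/linux/release/nvx_demo_feature_tracker_nvxcu" -s "device:///v4l2?index=1"
else
    echo "samples tree not found at $SAMPLES; adjust the path"
fi
```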


Hi,

I have tried it, but it fails…

error: Can't open source URI device:///v4l2?index=0

I created a new VisionWorks folder (from the original) and pasted your code in…

But if I change your code to
configuration.frameWidth = 720;
configuration.frameHeight = 480;
(or 640x480)
a video window shows up.

Also, I have modified this line as well, but it fails in the same way.

std::string caps_string("video/x-raw,width=(int){1920}, height=(int){1080}), format=(string){");

I really want to make this work; may I know which part I have done wrong? Thx

Hi,

If I use the original last line, as in

    configuration.frameWidth = 1920;
    configuration.frameHeight = 1080;
    configuration.fps = 30;
    stream << "video/x-raw, format=(string){YUY2}, width=[1," << configuration.frameWidth <<
              "], height=[1," << configuration.frameHeight << "], framerate=" << configuration.fps << "/1;";

There is a 640x480 video, but not 1920x1080… Thx

Hi,
It looks like your USB camera does not support 1920x1080p30 in YUYV. Do you have another USB camera to try? A USB3 camera would be better.

Mmmm… but if I use dusty-nv's jetson-inference, I can get 1920x1080. And in my previous message, I did check the resolutions, and 1920x1080 is listed.
Is there any way I can force the resolution? Thx

Hi,
If I use yours, even when I set it to 640x480, there is an error:
error: Can't open source URI device:///v4l2?index=0

    configuration.frameWidth = 1920;
    configuration.frameHeight = 1080;
    configuration.fps = 30;
    stream << "video/x-raw, format=(string){YUY2}, width=" << configuration.frameWidth <<
              ", height=" << configuration.frameHeight << ", framerate=" << configuration.fps << "/1;";

If I use this, it displays 640x480 even if I set 1920x1080.

    configuration.frameWidth = 1920;
    configuration.frameHeight = 1080;
    configuration.fps = 30;
    stream << "video/x-raw, format=(string){YUY2}, width=[1," << configuration.frameWidth <<
              "], height=[1," << configuration.frameHeight << "], framerate=" << configuration.fps << "/1;";

Which one shall I use? Thx

Hi,
Probably your source supports 1920x1080 at only 5 fps, like:

                Size: Discrete 1920x1080
                        Interval: Discrete 0.200s (5.000 fps)

Please confirm and set the correct fps.
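
The probed caps in your earlier log point the same way: in raw YUY2 this device tops out at 5 fps for 1920x1080, while its MJPEG modes go up to 60 fps. You can list every mode and its frame intervals with v4l2-ctl; a sketch (the device path is an assumption, adjust the index):

```shell
# List every pixel format, frame size, and frame interval the device reports;
# the fps set in the sample must match one of these.
DEV=/dev/video0                      # assumption: adjust to your device node
if command -v v4l2-ctl >/dev/null 2>&1 && [ -e "$DEV" ]; then
    v4l2-ctl --device="$DEV" --list-formats-ext
else
    echo "v4l2-ctl or $DEV not available; run this on the Jetson"
fi
```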

Hi,

Yes, if I change this to 5 fps here, the 1920x1080 video shows up:

configuration.fps = 30;

So, is it a limitation of the optical flow? I got a smooth 1920x1080 in dusty-nv's detectnet though… I assume it is not a problem with my video capture device? Or is it?

And can I increase the fps? And where can I check the maximum fps?
Thx