I am developing an active 3D system to stream real-time visuals on a Jetson Orin Nano using a stereoscopic camera module.
Initially I worked at a resolution of 540×320, which works well at a shuttering frequency of 48 Hz. I then increased the resolution to 720p, Full HD, and 2K, but as the resolution increases, the alternating display of left/right frames slows down, presumably because of the rendering limits of the Jetson Orin Nano.
Please suggest a few alternative solutions to increase the processing speed. I've used OpenCV and Python for my development.
I've been using JetPack 5.1.2.
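One common suggestion for this kind of bottleneck is to move capture and colour conversion off the CPU by opening the camera through OpenCV's GStreamer backend with `nvarguscamerasrc`, so debayering and scaling happen in hardware. A minimal sketch, assuming a CSI sensor supported by the Argus stack and an OpenCV build with GStreamer enabled (the sensor ID, resolution, and frame rate below are placeholders):

```python
def build_argus_pipeline(sensor_id=0, width=1920, height=1080, fps=48):
    """Build a GStreamer pipeline string for hardware-accelerated capture
    on Jetson via nvarguscamerasrc. drop=true / max-buffers=1 on the
    appsink keeps only the newest frame, which helps latency at the cost
    of dropped frames."""
    return (
        f"nvarguscamerasrc sensor-id={sensor_id} ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, "
        f"framerate={fps}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! "
        "appsink drop=true max-buffers=1"
    )

# Hypothetical usage (requires a Jetson with a CSI camera attached):
# import cv2
# cap = cv2.VideoCapture(build_argus_pipeline(sensor_id=0),
#                        cv2.CAP_GSTREAMER)
```

For a stereoscopic module you would open one capture per sensor ID; whether both sensors can sustain the target frame rate at 2K is something to verify on the actual hardware.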
cat /etc/nv_tegra_release
R35 (release), REVISION: 4.1, GCID: 33958178, BOARD: t186ref, EABI: aarch64, DATE: Tue Aug 1 19:57:35 UTC 2023
ls /usr/lib/aarch64-linux-gnu/libargus.so
ls: cannot access ‘/usr/lib/aarch64-linux-gnu/libargus.so’: No such file or directory
jetson@ubuntu:~/jetson_camera_project$ ls /usr/lib/aarch64-linux-gnu/tegra-egl/libnv-eglstream.so
ls: cannot access ‘/usr/lib/aarch64-linux-gnu/tegra-egl/libnv-eglstream.so’: No such file or directory
These files are not available on my system, so please help me proceed: how can I maximise the camera rendering speed without delay?
There has been no update from you for some time, so we assume this is no longer an issue.
We are therefore closing this topic. If you need further support, please open a new one.
Thanks
Sorry for the late response.
Is this still an issue that needs support? Are there any results you can share?