I use a GStreamer pipeline to get multimedia streams from 2 cameras. The streams are independent. When I get the data via EGL, the frames don't carry timestamps, so when I stitch them and show the panoramic video, it can't stay in sync. For example, when a moving object goes from camera 1 to camera 2, it disappears from camera 1 but doesn't appear in camera 2 at the same time. How can I deal with this?
Please share your GStreamer pipeline. Is there a way we can reproduce your issue on the default carrier board?
I used a TX2 with JetPack 3.1, and that fixed the problem.
It is good to hear the issue is fixed with JetPack 3.1, but it is not clear what the issue was. If possible, please share the pipeline you run for reference. It may help others who face similar issues. Thanks.
Actually, the problem hasn't been fixed yet. When I tested on Monday, it only seemed to be fixed; that may have been caused by network delay.
I use EGL frames to capture the streams from 8 cameras, but the frames still don't include timestamps, so frames from different points in time may get stitched together.
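Since the capture path described above is hard to reproduce here, the following is only a minimal sketch of one possible workaround: if the EGL frames carry no timestamps, you can record an arrival timestamp yourself for each frame, and then have the stitcher pair frames across cameras by nearest timestamp, dropping pairs that are too far apart. The `Frame` record and the 20 ms tolerance below are hypothetical, not part of any NVIDIA or GStreamer API.

```python
from bisect import bisect_left
from collections import namedtuple

# Hypothetical frame record: 'ts' is an arrival timestamp (in ms) that you
# record yourself when each frame is received, since the EGL frames carry none.
Frame = namedtuple("Frame", ["ts", "data"])

def pair_frames(cam1, cam2, tolerance_ms=20):
    """Pair each camera-1 frame with the camera-2 frame whose timestamp is
    closest, dropping pairs farther apart than tolerance_ms.
    Both input lists must be sorted by ts."""
    ts2 = [f.ts for f in cam2]
    pairs = []
    for f1 in cam1:
        i = bisect_left(ts2, f1.ts)
        # Candidates are the two neighbours around the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(cam2)]
        if not candidates:
            continue
        j = min(candidates, key=lambda j: abs(ts2[j] - f1.ts))
        if abs(ts2[j] - f1.ts) <= tolerance_ms:
            pairs.append((f1, cam2[j]))
    return pairs
```

With 30 fps streams (one frame every ~33 ms), a tolerance of roughly half a frame interval keeps each output pair within one frame of true simultaneity, which is usually enough to make an object crossing the seam look continuous.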
Please share more detail. Do you use Bayer sensors through the Tegra ISP, or YUV sensors under V4L2 control?