I'm using the motion_estimation demo and want to render to an off-screen buffer so that I can stream the resulting image over Ethernet to a remote display. The TX1 consumes raw RTP video and produces a second RTP stream containing the rendered video (nothing ever goes to a local screen).
nvxio::Render does not appear to let you access the rendered buffer after calling nvxio::Render::flush().
Does anyone know a way to access the vx_image from Render after all the primitives have been processed?
Hi,
nvxio::Render supports several output modes.
For example, if you execute
./nvx_demo_motion_estimation --nvxio_render video
nvxio::Render will write the output to a video file instead of displaying it in a window.
Passing an invalid value lists the supported modes:

VisionWorks library info:
VisionWorks version : 1.5.3
OpenVX Standard version : 1.1.0

./nvx_demo_motion_estimation: invalid value for option --nvxio_render: must be one of "default", "image", "stub", "video", "window" (got "test")
Could you give that a try first?
Thanks for the suggestion, but I need to render the image off screen so that I can encode the video stream as RTP for streaming. There is no display in my application and the box is effectively headless.
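For anyone following along, the RTP leg on a Tegra is typically handled by the hardware encoder through GStreamer. Here is a rough sketch (not from my application; element names assume the L4T gst-omx plugins, and the host/port are placeholders). In a real app an appsrc element fed with the rendered buffer replaces videotestsrc:

```
gst-launch-1.0 videotestsrc \
  ! 'video/x-raw,format=I420,width=1280,height=720' \
  ! omxh264enc \
  ! h264parse \
  ! rtph264pay pt=96 \
  ! udpsink host=192.168.1.100 port=5000
```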
The quick solution was to ditch nvxio::Render and use CUDA to render the motion vectors directly into the RGBX buffer. In the future I may try using OpenGL to render to an off-screen buffer instead.
If anyone wants to know how I did this, the code has been posted to GitHub here: