If so, what would you need, and how would you do it?
There is a hardware converter in TX2. A general usecase is to use it in video playback:
$ export DISPLAY=:0
$ gst-launch-1.0 uridecodebin uri=file:///home/nvidia/a.mp4 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=640,height=360' ! nv3dsink
In this pipeline, the nvvidconv plugin uses the hardware engine to downscale the source to 640x360. The engine also supports upscaling. If you use gstreamer, you may refer to the above pipeline.
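For upscaling, the same pipeline structure should work; you only set larger width/height caps after nvvidconv. A sketch, assuming the same sample file and an arbitrary 1920x1080 target (both are placeholders):

```shell
$ export DISPLAY=:0
# Caps after nvvidconv request a larger resolution, so the hardware engine upscales
$ gst-launch-1.0 uridecodebin uri=file:///home/nvidia/a.mp4 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1920,height=1080' ! nv3dsink
```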
Besides gstreamer, we also support jetson_multimedia_api.
gstreamer coding has always confused me. Can you suggest how to upscale rather than downscale to 640x360? Also, how would I hook it up to use my live input rather than a .mp4 file?
Actually, we don't have experience with this use case. Is your source a v4l2 source? Generally, frame capture on Jetson platforms goes through the v4l2 interface.
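If the source is v4l2, a typical approach is to replace uridecodebin with v4l2src. A sketch, assuming a camera at /dev/video0 that outputs 1280x720 raw frames (device node, resolution, and format are assumptions to adapt to your capture device):

```shell
$ export DISPLAY=:0
# v4l2src captures from the camera; nvvidconv then converts/scales into NVMM memory
$ gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw,width=1280,height=720' ! nvvidconv ! 'video/x-raw(memory:NVMM),width=640,height=360' ! nv3dsink
```

You can check what formats and resolutions your device supports with `v4l2-ctl --list-formats-ext -d /dev/video0` before choosing the source caps.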