If you are using the EncodeFromBuffer() path, you would use the NvBuffer object.
It appears that the jpeg_encode sample supports both ways, so you could check the difference. The EncodeFromFd() path uses NvVideoConverter.
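For reference, a rough sketch of how the two paths differ (illustrative only, not compile-tested here since it needs the Jetson Multimedia API headers; the colour space and quality values are assumptions, the sample itself is the authoritative version):

```cpp
#include "NvJpegEncoder.h"  // Jetson Multimedia API

// Path 1: EncodeFromBuffer — the encoder reads a CPU-mapped NvBuffer
// that you have already filled with YUV planes.
void encode_from_buffer(NvJPEGEncoder *jpegenc, NvBuffer &buffer,
                        unsigned char **out_buf, unsigned long &out_size)
{
    jpegenc->encodeFromBuffer(buffer, JCS_YCbCr, out_buf, out_size, 75 /* quality */);
}

// Path 2: EncodeFromFd — the encoder is handed a hardware DMA buffer
// by file descriptor (in the sample this fd comes out of NvVideoConverter).
void encode_from_fd(NvJPEGEncoder *jpegenc, int dmabuf_fd,
                    unsigned char **out_buf, unsigned long &out_size)
{
    jpegenc->encodeFromFd(dmabuf_fd, JCS_YCbCr, out_buf, out_size, 75 /* quality */);
}
```

The practical difference is where the pixels live: the fd path keeps the frame in hardware buffers end-to-end, while the NvBuffer path goes through a CPU mapping.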
All things considered, I’ve been meaning to write Python bindings for the gstEncoder / gstDecoder C++ classes from jetson-utils, and while doing that I would probably extend gstEncoder to support nvjpegenc as well. It uses GStreamer from C, but could also easily do network streaming (like RTP/RTSP).
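In the meantime, the nvjpegenc path can be tried straight from the command line with gst-launch-1.0 (assuming the nvjpegenc / nvarguscamerasrc / nvv4l2decoder elements from JetPack; the resolution, caps, and filenames here are just examples):

```shell
# Grab one frame from the CSI camera and hardware-encode it to JPEG
gst-launch-1.0 nvarguscamerasrc num-buffers=1 ! \
  'video/x-raw(memory:NVMM), width=1920, height=1080' ! \
  nvjpegenc ! filesink location=frame.jpg

# Or decode a file and dump its frames as JPEGs
gst-launch-1.0 filesrc location=input.mp4 ! qtdemux ! h264parse ! \
  nvv4l2decoder ! nvjpegenc ! multifilesink location=frame_%05d.jpg
```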
I am very new to this. I have a Quadro P2000 with compute capability 6.1 on a Windows 10 machine. I want sample code or APIs to read a video file frame by frame from the hard disk, do some processing on each frame, and show the frame on the display. Does CUDA have APIs for video read and display, or do we have to use OpenGL for that?
Hi there, discrete GPUs such as your Quadro card and Windows use different APIs for video decoding. It would seem from your other post that you are on the right track with the Video Codec SDK. Jetson uses GStreamer and V4L2 on Linux for the hardware codecs.
Thanks for your response.
I am not able to run the sample programs of the Video Codec SDK; they give error MSB3073.
Do I need to install FFmpeg separately to run the sample programs?
Sorry, I am not familiar with using the Video Codec SDK (as mentioned, we use other APIs on Jetson). But quickly looking at your other thread, it appears the Visual Studio post-build event is looking for the CUDA Toolkit 10.1 path, so if you have 10.0 installed, you could try editing the post-build event to reflect your actual path. Please continue posting about this in your topic on the Video Codec forum.
Just to let you know that “zeroCopy=True” did help in my code to save a frame only when it detected the object of interest; without this argument, it wouldn’t.
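For anyone finding this later, a minimal sketch of that pattern, assuming the jetson.inference / jetson.utils Python API (the model name, camera index, and filenames are just examples). zeroCopy=True maps the captured frame into CPU-accessible memory, which is what makes saving it from Python possible; the save decision itself is only a check on the detection list:

```python
def should_save(detections):
    """Save a frame only when at least one object of interest was detected."""
    return len(detections) > 0

def main():
    # Jetson-only imports kept inside main() so the helper above is testable anywhere
    import jetson.inference
    import jetson.utils

    net = jetson.inference.detectNet("ssd-mobilenet-v2", threshold=0.5)
    camera = jetson.utils.gstCamera(1280, 720, "0")

    frame_num = 0
    while True:
        # zeroCopy=True -> the capture is CPU-mappable, so it can be saved later
        img, width, height = camera.CaptureRGBA(zeroCopy=True)
        detections = net.Detect(img, width, height)
        if should_save(detections):
            jetson.utils.saveImageRGBA("detected_%05d.jpg" % frame_num,
                                       img, width, height)
            frame_num += 1

# On the Jetson, run: main()
```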