NVIFR + NVENC H.264 video streaming concrete example

I am running rendering software on an AWS NVIDIA GRID EC2 instance and want to stream the output of that renderer to my local machine, ideally to VLC.

I’ve looked at the NVIDIA GRID SDK examples, but none of them really demonstrate streaming the output of an application. This is a very simple use case that many people would want: run a rendering application in the cloud and stream its output to a local machine.
All the NvIFR examples show how to create a render target and capture from it, but not how to capture from an existing rendering application.
Is anyone aware of a solution?


Just want to say that I am also trying to do the same thing, but there really are no clear step-by-step instructions, especially for people who are less familiar with low-level programming. I have managed to set up an Ubuntu machine by going through these instructions:


and also these instructions:


After that, I really can’t compile any of the samples that come with the NVIDIA GRID SDK.

I have been at this for about 14 hours now. Very frustrating.


The streaming is on my to-do list… But first, I have to remux the ‘.h264’ file into an ‘.mp4’ file (video players generally do not support raw .h264).
I’m thinking of using libavformat (part of FFmpeg). It may also handle the streaming for free…
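For reference, both steps can also be done with the ffmpeg command-line tool, which wraps libavformat. This is only a sketch: it assumes the encoder wrote an Annex-B elementary stream named `capture.h264`, and `CLIENT_IP` is a placeholder for the local machine’s address.

```shell
# Wrap the raw H.264 elementary stream in an MP4 container,
# copying the bitstream without re-encoding:
ffmpeg -i capture.h264 -c copy capture.mp4

# Or skip the intermediate file and stream directly, e.g. as
# MPEG-TS over UDP (CLIENT_IP = the machine that will run VLC):
ffmpeg -re -i capture.h264 -c copy -f mpegts udp://CLIENT_IP:1234

# On the local machine, open the stream in VLC:
vlc udp://@:1234
```

Since `-c copy` only changes the container, this is cheap on the server; the `-re` flag makes ffmpeg feed the stream at its native frame rate instead of as fast as possible.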


Hey guys, I’m currently working on the same thing. Any update on this?

Many developers who use the GRID SDK to stream existing applications use API hooking into OpenGL or DirectX. The shim intercepts the application’s Present/SwapBuffers calls; at that point NvIFR captures the render target and hands it to the GRID API, which encodes it as H.264.


If you use NvFBC (which captures the full desktop), app shimming is not necessary: NvFBC runs in a separate process and captures the screen or desktop to H.264.