Nvidia graphics card makes an image blurry when rendering through OpenGL.

I am running the NvDecodeGL sample from the NVIDIA Video Codec SDK. The resolution of the input image is 1376*768 and I render it on an OpenGL window which is 1920*1080 in size. The Nvidia decoder output is good, but the Nvidia graphics card makes it blurry when it is rendered through OpenGL. Can anyone suggest how I can get rid of this?

You’re upscaling an image from 1376*768 to 1920*1080 using OpenGL (or D3D in the other post).
But you don’t explain how you’re doing that exactly.

For example, if you’re displaying the image with a full-window textured quad and linear texture filtering is enabled on that texture object, the texture will be filtered and come out smoothed.
If you use nearest filtering instead, you will see duplicated rows and columns, because the target size is about 40% bigger in each dimension, and that also looks odd.
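
For reference, that trade-off comes down to two sampler parameters. A minimal sketch, assuming the decoded frame lives in a GL_TEXTURE_2D object called videoTex (a hypothetical name, not from the SDK sample):

// Bind the texture that holds the decoded 1376*768 frame.
glBindTexture(GL_TEXTURE_2D, videoTex);

// Bilinear filtering: smooth, but blurry when stretched to 1920*1080.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Nearest filtering: no blur, but rows and columns get duplicated unevenly
// because the target is only about 40% larger.
// glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
// glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);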

The better solution would be a dedicated upscaling shader which filters this more cleverly.
You can find a lot of publications on the web about different upscaling filters that try to solve that general problem.
Simply search for “video upscaling filter” to get a general idea and “video upscaling filter OpenGL GLSL” to find some implementations using OpenGL.
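
To give a concrete idea of what such a filter looks like, here is a minimal sketch of a Catmull-Rom (bicubic) resampling fragment shader, embedded as a C++ string so it can be handed to glShaderSource. The names uTex, uTexSize and vUV are assumptions made for this illustration, not anything from the SDK sample:

// Hedged sketch: a 4x4-tap Catmull-Rom upscaler in GLSL 3.30,
// stored as a C++ raw string literal.
const char* kCatmullRomFS = R"GLSL(
#version 330 core
uniform sampler2D uTex;    // decoded video frame (e.g. 1376*768)
uniform vec2 uTexSize;     // source size in texels, e.g. vec2(1376.0, 768.0)
in vec2 vUV;               // texture coordinate from the vertex shader
out vec4 fragColor;

// Catmull-Rom weights for the four taps around a fractional position t.
vec4 weights(float t)
{
    float t2 = t * t, t3 = t2 * t;
    return vec4(-0.5*t3 +     t2 - 0.5*t,
                 1.5*t3 - 2.5*t2 + 1.0,
                -1.5*t3 + 2.0*t2 + 0.5*t,
                 0.5*t3 - 0.5*t2);
}

void main()
{
    vec2 pos  = vUV * uTexSize - 0.5;   // position in texel space
    vec2 base = floor(pos);
    vec2 f    = pos - base;             // fractional offset inside the texel
    vec4 wx = weights(f.x);
    vec4 wy = weights(f.y);

    vec4 color = vec4(0.0);
    for (int y = 0; y < 4; ++y)
        for (int x = 0; x < 4; ++x)
        {
            vec2 uv = (base + vec2(float(x - 1), float(y - 1)) + 0.5) / uTexSize;
            color += texture(uTex, uv) * wx[x] * wy[y];
        }
    fragColor = color;
}
)GLSL";

The quad drawing code stays the same; only the fragment shader changes, so the resampling uses sharper weights than plain GL_LINEAR.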

This is not graphics API related. The same is applicable for D3D, to answer your duplicate question in the other sub-forum.

Thanks for your reply. Now I will tell you how exactly I am doing it for D3D:

For Device Creation:-

DWORD dwBehaviorFlags = 0;

if( d3dCaps.VertexProcessingCaps != 0 )
	dwBehaviorFlags = D3DCREATE_HARDWARE_VERTEXPROCESSING;
else
	dwBehaviorFlags = D3DCREATE_SOFTWARE_VERTEXPROCESSING;


ZeroMemory(&D3DParameter, sizeof(D3DParameter));   // clear out the struct for use
D3DParameter.BackBufferWidth            = imageWidth;        // current image width is 1376              
D3DParameter.BackBufferHeight           = imageHeight;       // current image height is 768
D3DParameter.BackBufferFormat           = d3ddm.Format;   
D3DParameter.Windowed                   = true;                  // windowed mode, not fullscreen
D3DParameter.FullScreen_RefreshRateInHz = D3DPRESENT_RATE_DEFAULT; //Default Refresh Rate
D3DParameter.PresentationInterval       = D3DPRESENT_INTERVAL_DEFAULT;   //Default Presentation rate
D3DParameter.BackBufferCount            = 1;                            
D3DParameter.SwapEffect                 = D3DSWAPEFFECT_DISCARD;      // discard old frames


if(FAILED(D3D -> CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd, dwBehaviorFlags, &D3DParameter, &D3Device)))
{
	MessageBox(NULL, TEXT("CreateDevice"), TEXT("Error"), MB_OK);
}

    // hwnd: the rendering window handle; the window size is 1920*1080

For Rendering :-

if( SUCCEEDED(D3Device->BeginScene()) )
{
	if(FAILED(D3Device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &BackBuffer)))
	{
		MessageBox(NULL, TEXT("GetBackBuffer"), TEXT("Error"), MB_OK);
	}

	hr = D3Device->EndScene();

	if(FAILED(D3Device->StretchRect(surface, 0, BackBuffer, 0, D3DTEXF_LINEAR)))
	{
		MessageBox(NULL, TEXT("StretchRect"), TEXT("Error"), MB_OK);
	}
}

hr = D3Device->Present(0, 0, 0, 0);

I think now you can tell me what my implementation fault is. It would be helpful for me.

"D3Device->StretchRect(surface, 0, BackBuffer, 0, D3DTEXF_LINEAR)"
That’s your issue. You’re just resizing the image with a linear filter.

If you want the video image to preserve more sharpness during that upscaling operation, you’d need to apply a much more complex filtering/resampling operation.
Just do some research about this on your own. Here’s a starting point: https://en.wikipedia.org/wiki/Image_scaling

To get an idea of what’s possible, have a look at the screenshots on this site. You’re doing what’s shown in the first image, bilinear filtering:
https://github.com/mpv-player/mpv/wiki/Upscaling
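
If you want to stay with the D3D9 path from your post, the usual structure is to stop using StretchRect for the scaling step and instead draw the decoded frame as a full-window textured quad, so that your own pixel shader (for example a Catmull-Rom resampler like the GLSL sketch earlier) does the filtering. This is only a rough sketch under assumptions: videoTexture (an IDirect3DTexture9 containing the decoded frame) and scalerPS (a compiled resampling pixel shader) are hypothetical names, not part of your posted code:

#include <d3d9.h>

// Pre-transformed vertex: screen-space position plus one texture coordinate.
struct QuadVertex { float x, y, z, rhw, u, v; };
const DWORD kQuadFVF = D3DFVF_XYZRHW | D3DFVF_TEX1;

const float windowWidth  = 1920.0f;   // assumed render window size
const float windowHeight = 1080.0f;

// Full-window quad; the -0.5 offset maps texels to pixels correctly in D3D9.
QuadVertex quad[4] =
{
	{ -0.5f,              -0.5f,               0.0f, 1.0f, 0.0f, 0.0f },
	{ windowWidth - 0.5f, -0.5f,               0.0f, 1.0f, 1.0f, 0.0f },
	{ -0.5f,              windowHeight - 0.5f, 0.0f, 1.0f, 0.0f, 1.0f },
	{ windowWidth - 0.5f, windowHeight - 0.5f, 0.0f, 1.0f, 1.0f, 1.0f },
};

if (SUCCEEDED(D3Device->BeginScene()))
{
	D3Device->SetTexture(0, videoTexture);   // decoded 1376*768 frame as a texture
	D3Device->SetPixelShader(scalerPS);      // custom resampling shader does the upscaling
	D3Device->SetFVF(kQuadFVF);
	D3Device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(QuadVertex));
	D3Device->EndScene();
}
hr = D3Device->Present(0, 0, 0, 0);

The decoded surface would first have to be copied into videoTexture (for example with a non-scaling StretchRect onto the texture's surface), so that the only scaling that happens is the one your shader controls.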

OK, I understand that I have to use some scaling algorithm to render a smooth image, but the same API "D3Device->StretchRect(surface, 0, BackBuffer, 0, D3DTEXF_LINEAR)" works fine with the Intel graphics card and with the default driver on the Nvidia graphics card (if I disable the Nvidia driver).
Does that mean the Intel driver has a built-in scaling algorithm, while in the case of the Nvidia driver we have to use a scaling algorithm explicitly?

Define “works fine”. You only presented screenshots of the linearly filtered image in the other post. Do the other drivers filter linearly?

In any case, this is solely your responsibility to implement as you need it. Bilinear filtering isn’t sufficient for the quality you’re looking for, but that’s what you told the driver to do.
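
One quick way to see what each driver is actually doing is to force the filter explicitly and compare the output. The call below uses the same arguments as your code with only the last parameter changed; it assumes the hardware reports point filtering support for StretchRect in StretchRectFilterCaps:

// Point sampling: hard, blocky edges, no smoothing at all.
D3Device->StretchRect(surface, 0, BackBuffer, 0, D3DTEXF_POINT);

// Linear sampling: what your current code requests.
D3Device->StretchRect(surface, 0, BackBuffer, 0, D3DTEXF_LINEAR);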

“Works fine” means there is no visible loss of image quality after stretching.
I ran the same piece of code that I posted above on both drivers (Intel and Nvidia). The Intel output is “IntelScreen.png” and the Nvidia output is “NvidiaScreen.png”. Please find the attachments and compare the two images. To me the Intel output is smooth and sharp, unlike the Nvidia one.


Could you please provide the system configurations on which you generated these images to allow reproduction of the results for our QA?

OS version, installed GPU(s), display driver version for both vendors.

Thanks.

System Configurations:-

Windows Edition:-

Windows 10 Enterprise

System:-

Processor : Intel® Core™ i7-6700HQ CPU @ 2.60GHz 2.59GHz
Installed memory(RAM) : 4.00 GB
System type : 64-bit Operating System, x64-based processor

Display adapters:-

NVIDIA:-
Card name : NVIDIA GeForce GTX 950
Driver Version : 376.54

Intel:-
Card name : Intel® HD Graphics 4600
Driver Version : 20.19.15.4531

I am attaching a D3D9 image rendering sample; please run it with both drivers (Intel and Nvidia). You will see the difference.

If you can tell me how I can solve this problem, it will be helpful for me.

Directx 9 Rendering.zip (381 KB)