How do I develop an OpenGL ES 2.0 library (+ EGL) on Windows / Visual Studio?

I want to develop an OpenGL ES 2.0 library for an embedded board with Linux on it (ARM processor). Since it is much more comfortable, I would like to develop and debug the library on Windows, with Visual Studio.
I have an NVIDIA graphics card in my Windows PC.

What is the best strategy to do that (I would need OpenGL ES 2.0 + EGL)? Should I use desktop OpenGL and restrict myself to the subset (and constraints) covered by OpenGL ES 2.0, or should I use an OpenGL ES 2.0 emulator? For example, there is the ANGLE project mentioned in http://stackoverflow.com/questions/1446461/getting-started-with-opengl-es-2-0-on-windows and http://www.g-truc.net/post-0457.html

And how do I get EGL for Windows?

Are you looking for an OpenGL ES emulator or something else? (I ask because you mentioned the machine architecture, but for an OpenGL ES emulator the architecture does not matter.) If you do need an OpenGL ES emulator, I would recommend first checking what GPU is on the embedded board and whether the hardware vendor has an emulator on their website. With that said:

ARM (Mali GPUs) has an OpenGL ES emulator on their website.
Imagination Tech (PowerVR GPUs) has an OpenGL ES emulator on their website.
Qualcomm (Adreno GPUs) has an OpenGL ES emulator on their website.

Most of the emulators are GPU-agnostic, as they are just a shim over the desktop GL implementation running on the host platform.

As for EGL, it will be provided with the respective OpenGL ES emulators listed above.
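Once you have one of those emulators installed, the EGL setup code is the same as on the device itself. A minimal sketch (assuming you link against the emulator's libEGL/libGLESv2 and already have a native window handle, e.g. an HWND on Windows) might look like this:

```cpp
// Minimal EGL + OpenGL ES 2.0 context creation sketch for use with an
// emulator's libEGL. Error handling is reduced to early returns.
#include <EGL/egl.h>

bool InitEGL(EGLNativeWindowType window, EGLDisplay* outDpy,
             EGLSurface* outSurf, EGLContext* outCtx)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL))
        return false;

    // Ask for an RGBA8888 window config that can render OpenGL ES 2.x
    const EGLint configAttribs[] = {
        EGL_SURFACE_TYPE,    EGL_WINDOW_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8,
        EGL_BLUE_SIZE, 8, EGL_ALPHA_SIZE, 8,
        EGL_DEPTH_SIZE, 24,
        EGL_NONE
    };
    EGLConfig config;
    EGLint numConfigs = 0;
    if (!eglChooseConfig(dpy, configAttribs, &config, 1, &numConfigs) ||
        numConfigs == 0)
        return false;

    EGLSurface surf = eglCreateWindowSurface(dpy, config, window, NULL);
    if (surf == EGL_NO_SURFACE)
        return false;

    // EGL_CONTEXT_CLIENT_VERSION = 2 requests an OpenGL ES 2.x context
    const EGLint ctxAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, config, EGL_NO_CONTEXT, ctxAttribs);
    if (ctx == EGL_NO_CONTEXT)
        return false;

    if (!eglMakeCurrent(dpy, surf, surf, ctx))
        return false;

    *outDpy = dpy; *outSurf = surf; *outCtx = ctx;
    return true;
}
```

This is the same code you would ship on the board, which is the main attraction of the emulator route: only the libEGL/libGLESv2 you link against changes between desktop and device.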

I actually use all three, and each has its quirks, but they are very convenient for development. Also, DO NOT expect the behavior of an OpenGL ES emulator to be exactly the same as the device's, so test often on the actual device itself to save yourself headaches later on…

Cheers!!

It’s a bit outdated, but here is my 2 cents’ contribution:

I was also looking for a code snippet to create an OpenGL ES context on a desktop NVIDIA card.
I was aware that on Windows, NVIDIA exposes ES contexts through the WGL_EXT_create_context_es_profile and WGL_EXT_create_context_es2_profile extensions, and on Linux, NVIDIA allows them through the GLX_EXT_create_context_es_profile and GLX_EXT_create_context_es2_profile extensions.

Of course, there are also the aforementioned OpenGL ES drivers/emulators from PowerVR / Adreno / Mali / ANGLE, but I was looking for a raw implementation, i.e., my own homegrown libGLESv2.dll and libEGL.dll.

After some struggle, I got something like the code below.
The basic idea is to get the required extensions (by hand or using GLEW), create a dummy context, and then create your GL-ES context. Error handling is omitted for clarity.
You also need to declare and load all core and extension proc functions, and make them public for your libs.

/**
 * adapted from
 * from https://www.opengl.org/wiki/Tutorial:_OpenGL_3.1_The_First_Triangle_%28C%2B%2B/Win%29
 */
bool CGLRenderer::CreateGLContext(CDC* pDC)
{
    PIXELFORMATDESCRIPTOR pfd =
    {
        sizeof(PIXELFORMATDESCRIPTOR),
        1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER, // Flags
        PFD_TYPE_RGBA,    // The kind of framebuffer. RGBA or palette.
        32,               // Color depth of the framebuffer.
        0, 0, 0, 0, 0, 0,
        0,
        0,
        0,
        0, 0, 0, 0,
        24,               // Number of bits for the depth buffer
        8,                // Number of bits for the stencil buffer
        0,                // Number of aux buffers in the framebuffer.
        PFD_MAIN_PLANE,
        0,
        0, 0, 0
    };

    int nPixelFormat = ChoosePixelFormat(pDC->m_hDC, &pfd);
    if (nPixelFormat == 0) return false;

    BOOL bResult = SetPixelFormat(pDC->m_hDC, nPixelFormat, &pfd);
    if (!bResult) return false;

    // Create a dummy legacy context first, so the extension entry
    // points needed for the real context can be loaded
    HGLRC tempContext = wglCreateContext(pDC->m_hDC);
    wglMakeCurrent(pDC->m_hDC, tempContext);

    // Using GLEW. Initialize it after context creation
    GLenum err = glewInit();
    if (GLEW_OK != err)
    {
        AfxMessageBox(_T("GLEW is not initialized!"));
    }

    // Request an OpenGL ES 2 profile. The driver may return a compatible
    // but higher GL-ES version, such as 3.0 or 3.1
    int attribList[] =
    {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 2,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_ES2_PROFILE_BIT_EXT,
        0,
    };

    if (wglewIsSupported("WGL_ARB_create_context") == 1)
    {
        m_hrc = wglCreateContextAttribsARB(pDC->m_hDC, 0, attribList);
        wglMakeCurrent(NULL, NULL);
        wglDeleteContext(tempContext);
        wglMakeCurrent(pDC->m_hDC, m_hrc);
    }
    else // Failed to create a GL-ES context.
    {
        m_hrc = NULL;
    }

    // Debug info - print out GL version
    const GLubyte* glVersionString = glGetString(GL_VERSION);
    const char* vendorChar   = (char*)glGetString(GL_VENDOR);
    const char* rendererChar = (char*)glGetString(GL_RENDERER);

    int glVersion[2] = { -1, -1 };
    glGetIntegerv(GL_MAJOR_VERSION, &glVersion[0]);
    glGetIntegerv(GL_MINOR_VERSION, &glVersion[1]);

    cout << "GL version string: " << glVersionString << endl;
    cout << "OpenGL version: " << glVersion[0] << "." << glVersion[1] << endl;
    cout << "GPU: " << vendorChar << " - " << rendererChar << endl;

    if (!m_hrc) return false;

    return true;
} // end of CreateGLContext
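As for loading the proc functions by hand instead of via GLEW, it boils down to wglGetProcAddress while the dummy context is current. A sketch for the one extension entry point the code above relies on (the typedef follows the WGL_ARB_create_context extension spec):

```cpp
// Manually loading the wglCreateContextAttribsARB entry point, as an
// alternative to GLEW. This must be called while a (dummy) GL context
// is current, because wglGetProcAddress requires a current context.
#include <windows.h>
#include <GL/gl.h>

// Function pointer type from the WGL_ARB_create_context extension spec
typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(
    HDC hDC, HGLRC hShareContext, const int* attribList);

static PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB = NULL;

bool LoadWglCreateContextAttribsARB()
{
    wglCreateContextAttribsARB = (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");
    // wglGetProcAddress returns NULL when the driver does not export
    // the function, so this doubles as the support check
    return wglCreateContextAttribsARB != NULL;
}
```

A homegrown libGLESv2.dll repeats this pattern for every GL-ES entry point you want to expose, which is exactly the "declare and load all core and extension proc functions" step mentioned above.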

This is available at
http://stackoverflow.com/questions/31971373/how-to-create-egl-context-on-nvidia-desktop