
Possible Memory leak in OpenGL drivers for HD and Series 4 graphics chipsets on Windows 7

idata
Employee

We have been experiencing a memory leak when using pixel buffer objects to update textures. The leak only occurs on certain platform/OS combinations, so we have begun to suspect it is occurring inside the Intel Windows 7 drivers for the HD and Series 4 chipsets. Below is a summary of our testing on different platforms.

Can anyone tell us whether we are doing something wrong in the code, or whether this is potentially a driver leak?

Platform                          | Operating System | Result
Series 4 Chipset (Lenovo SL300)   | Windows XP SP3   | No leak
Series 4 Chipset (Lenovo SL300)   | Windows 7        | Leaks ~500 kB/min
Intel HD Series (Lenovo X1)       | Windows 7        | Leaks ~500 kB/min
Intel HD 3000 (11" MacBook Air)   | Mac OS 10.7.3    | No leak
Nvidia Quadro NVS                 | Windows XP       | No leak

Here is the code (VS2008 project at http://www.viionsystems.com/Intel_HD_Series_Win7_Leak_Test_Case.zip). Extensive testing of this code shows no memory leaks detectable by VS2008's memory leak detector, yet GPU memory appears to grow indefinitely (according to Process Explorer).
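For anyone who wants to reproduce the measurement without Process Explorer, a rough sketch like the one below can be called from the timer callback to log growth over time. This is just an illustrative helper of ours (assuming Windows and linking against psapi.lib), and it only reports system-side commit and working set, not the driver's video memory:

#include <windows.h>
#include <psapi.h>   // link with psapi.lib
#include <cstdio>

// Logs the process's private commit (PagefileUsage) and working set,
// the same counters Process Explorer graphs for the process.
void logProcessMemory()
{
    PROCESS_MEMORY_COUNTERS pmc = { sizeof(pmc) };
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
    {
        printf("commit: %lu kB, working set: %lu kB\n",
               (unsigned long)(pmc.PagefileUsage / 1024),
               (unsigned long)(pmc.WorkingSetSize / 1024));
    }
}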

We would appreciate any thoughts from the community or Intel on this issue.

#include <cstdio>    // the bracketed include targets were lost by the forum formatting;
#include <cstring>   // <cstdio>/<cstring> are needed for printf/memset
#include "GL\GLee.h"
#include "GL\freeglut.h"

unsigned int w = 640;
unsigned int h = 480;
unsigned int s = 4;

char* img = NULL;
char* texData1 = NULL;
char* texData2 = NULL;
char* mappedBuf = NULL;
GLuint pixelbufferHandle;

void timerCallback(int value);
void initializeTextureBuffer();
void mapAndCopyToBuffer(char* img1);
void paintGL();
void changeSize(int w, int h);

GLuint errorCode;
#define checkForGLError() \
    if ((errorCode = glGetError()) != GL_NO_ERROR) \
        printf("OpenGL error at %s line %i: %s", __FILE__, __LINE__-1, gluErrorString(errorCode));

int main(int argc, char **argv)
{
    texData1 = new char[w * h * s];
    texData2 = new char[w * h * s];
    memset(texData1, 85, w * h * s);
    memset(texData2, 170, w * h * s);
    img = texData1;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowPosition(300, 300);
    glutInitWindowSize(w, h);
    glutCreateWindow("Window");
    glutDisplayFunc(paintGL);
    glutReshapeFunc(changeSize);

    initializeTextureBuffer();
    timerCallback(0);
    glutMainLoop();

    glDeleteBuffers(1, &pixelbufferHandle);
    delete[] texData1;
    delete[] texData2;
    return 0;
}

void initializeTextureBuffer()
{
    glGenBuffers(1, &pixelbufferHandle);
    checkForGLError();
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelbufferHandle);
    checkForGLError();
    glBufferData(GL_PIXEL_UNPACK_BUFFER, w * h * s, 0, GL_DYNAMIC_DRAW);
    checkForGLError();

    // initialize and upload the texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, 0);
    checkForGLError();

    // Specify filtering and edge actions
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

    img = (char*) glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
    if (img == NULL) {
        return;
    }
    memset(img, 0, w * h * s);
    glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
}

void mapAndCopyToBuffer(char* img1)
{
    glBindBuffer(GL_PIXEL_UNPACK...
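The listing above is cut off inside mapAndCopyToBuffer. The per-frame update follows the usual PBO upload pattern; a sketch of that pattern (not the exact code from the zip) looks like this:

void mapAndCopyToBuffer(char* img1)
{
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pixelbufferHandle);
    // Orphan the previous storage so the driver can hand back fresh memory
    // instead of stalling on an upload still in flight.
    glBufferData(GL_PIXEL_UNPACK_BUFFER, w * h * s, 0, GL_DYNAMIC_DRAW);
    mappedBuf = (char*) glMapBuffer(GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY);
    if (mappedBuf != NULL) {
        memcpy(mappedBuf, img1, w * h * s);
        glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);
        // Copy from the bound PBO into the texture (the last argument is an
        // offset into the PBO, not a client pointer).
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, 0);
    }
    glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
    checkForGLError();
}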
ND
Novice

I don't know if this is the same thing, but I also have an OpenGL issue, though I don't know whether it's a memory leak.

I have Intel HD Graphics

I want to play Il2 1946, a flight simulator that uses OpenGL 2.0 or 2.1. My driver reports OpenGL 2.1, but I'm not able to use all of the OpenGL features.

I get strange clouds with paper-like lines in them, and the ground flashes. I can't enable 3D water, even though I'm supposed to be able to. I can only play in DirectX mode, but then I can't use the "Perfect" settings, because Perfect mode is only available with OpenGL.

I asked technical support in a chat whether Intel could test this game with Intel HD Graphics.

Some screenshots are at the end of my post here: http://communities.intel.com/thread/27375?tstart=60

EDIT: A guy on another forum wrote this:

"The bigger issue is that your graphics system is designed mostly for low powered performance. 3D acceleration on the Intel GMA cards (including the "HD" versions) is oriented around acceptable throughput for running HD video streams and having a smooth experience in Windows Vista/7.

 

Even bigger an issue is that Intel may say that they support OpenGL 2.1 but it's not likely to be an optimized implementation. They did it to check off the boxes for simple OpenGL applications but flight simulators may be right off their charts. If you can get it running in Excellent mode I'd say you're doing very well. If it works at all in Perfect I'd be surprised"
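If it helps to confirm what the driver is actually claiming, a minimal sketch (assuming freeglut, since a GL context must exist before the strings are valid) is to print the vendor, renderer, and version strings:

#include <cstdio>
#include "GL\freeglut.h"

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("version check");  // creating a window gives us a current GL context
    printf("GL_VENDOR:   %s\n", (const char*) glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*) glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*) glGetString(GL_VERSION));
    return 0;
}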

Moderator, can you improve 3D performance for OpenGL 2.1 on Intel HD Graphics?
