
Shaders use more RAM than needed

Benjamin_L_Intel
Employee

Hi, I am here to report potential bugs that seem to affect all Intel HD/Iris GPUs.

 

Test platform

  • CPU: i7-6700HQ | m3-7Y30
  • GPU: Intel HD 530 | Intel HD 615
  • OS: Windows 10
  • Drivers: 15.60.4901
  • Software: CEMU

Shaders use more RAM than needed

In the application (CEMU) we generate shaders dynamically and they often end up being quite complex (Example Vertex and Fragment shader). Furthermore, we deal with large numbers of shaders, in the range of 5k to 20k. The problem we are facing is that the graphics driver allocates multiple gigabytes of RAM just for compiled shaders. The question is: is this intended behavior or a bug? We have already double- and triple-checked to make sure this is not a mistake on our end.

 

Here are average results of the shader compile test (available below) on an NVIDIA GTX 1060 and an Intel HD 615.

  • NVIDIA: ~2.2GB
  • Intel: ~5GB

Here is a test application to demonstrate the issue. Source is available here (VS2015). It links one set of vertex + fragment shaders 1000 times and then prints the amount of RAM committed by the application. The application itself does not allocate any extra memory. Additionally, the .zip comes with multiple sets of example shaders taken from our application to show the difference in RAM usage. For more details see main.cpp.
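For reference, here is a minimal sketch of what the test loop does. It is illustrative only, not the exact code from main.cpp: it assumes an OpenGL context has already been created (e.g. via GLFW/GLEW), that vertexSrc/fragmentSrc hold one of the example shader sources from the .zip, and that the project links against Psapi.lib for the memory query.

```cpp
// Illustrative sketch of the link-1000-programs test; names are hypothetical.
#include <cstdio>
#include <vector>
#include <GL/glew.h>
#include <windows.h>
#include <psapi.h>

GLuint CompileShader(GLenum type, const char* src)
{
    GLuint s = glCreateShader(type);
    glShaderSource(s, 1, &src, nullptr);
    glCompileShader(s);
    return s;
}

void RunLinkTest(const char* vertexSrc, const char* fragmentSrc)
{
    std::vector<GLuint> programs;
    for (int i = 0; i < 1000; ++i)
    {
        GLuint vs = CompileShader(GL_VERTEX_SHADER, vertexSrc);
        GLuint fs = CompileShader(GL_FRAGMENT_SHADER, fragmentSrc);
        GLuint prog = glCreateProgram();
        glAttachShader(prog, vs);
        glAttachShader(prog, fs);
        glLinkProgram(prog);
        programs.push_back(prog); // keep all programs alive, as the test does
    }

    // Report memory committed by the process; the application itself
    // allocates almost nothing, so the growth is driver-side.
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    GetProcessMemoryInfo(GetCurrentProcess(),
                         (PROCESS_MEMORY_COUNTERS*)&pmc, sizeof(pmc));
    printf("Committed: %zu MB\n", (size_t)(pmc.PrivateUsage / (1024 * 1024)));
}
```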

Some other observations that have been made:

  • Occurs on all driver versions and all Windows versions
  • RAM usage is proportional to shader complexity (no surprise here)
  • Conditionals (if clauses and the '?' operator) seem to massively increase RAM usage and compile times
  • The size of uniform buffer arrays only slightly affects RAM usage
  • Detaching and deleting shaders (glDetachShader + glDeleteShader) after glLinkProgram helps only a little (see the sketch after this list)
  • Calling glDeleteProgram() correctly releases all memory, indicating there is no leak
  • The same problem occurs when the shader programs are loaded via glProgramBinary
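
To make the cleanup-related observations concrete, here is a hedged sketch of the two variants; the handles are assumed to come from a compile/link step like the one shown above, and the function names are illustrative.

```cpp
// Hedged sketch of the two cleanup variants mentioned in the observations.
#include <GL/glew.h>

void CleanupAfterLink(GLuint prog, GLuint vs, GLuint fs)
{
    // Detaching and deleting the shader objects after glLinkProgram
    // frees only a small part of the memory:
    glDetachShader(prog, vs);
    glDetachShader(prog, fs);
    glDeleteShader(vs);
    glDeleteShader(fs);
}

void DestroyProgram(GLuint prog)
{
    // Deleting the program itself releases everything, which is why this
    // looks like over-allocation rather than a leak:
    glDeleteProgram(prog);
}
```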

Shaders get corrupted when stored and reloaded

On driver branch 15.45.xx.xxxx, the same application saves compiled shaders so they can be reloaded at the next launch without recompiling. The OpenGL implementation in the Intel drivers appears to be incorrect here: shaders get corrupted when they are stored and reloaded via glGetProgramBinary() and glProgramBinary().
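
For illustration, here is a minimal sketch of the store/reload round trip we use, assuming a context that supports program binaries (GL 4.1+ / ARB_get_program_binary); file I/O and error handling are omitted and the function names are hypothetical.

```cpp
// Illustrative save/load of a linked program via the program binary API.
#include <vector>
#include <GL/glew.h>

std::vector<char> SaveBinary(GLuint prog, GLenum* formatOut)
{
    GLint length = 0;
    glGetProgramiv(prog, GL_PROGRAM_BINARY_LENGTH, &length);
    std::vector<char> blob(length);
    glGetProgramBinary(prog, length, nullptr, formatOut, blob.data());
    return blob; // written to disk by the application, reloaded next launch
}

GLuint LoadBinary(const std::vector<char>& blob, GLenum format)
{
    GLuint prog = glCreateProgram();
    glProgramBinary(prog, format, blob.data(), (GLsizei)blob.size());

    GLint ok = GL_FALSE;
    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    // On the 15.45.xx branch the reloaded program can report success yet
    // render incorrectly, i.e. the binary appears to be corrupted.
    return ok ? prog : 0;
}
```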

 

Thanks in advance!

2 Replies
Benjamin_L_Intel
Employee

UP

lazyradly
Beginner

Bump, this issue is having noticeable effects on certain games and programs.
