
nVidia hobbles WebGL performance on laptops

by Ashley Gullen | 30th December 2013

On dual-GPU laptops, nVidia appears to be deliberately disabling use of the faster GPU in browsers. This means that on these systems, modern web-based games using WebGL can only ever use the weak GPU - which can be even slower than a mobile phone's GPU.

A little background: mid- and high-end laptops commonly ship with two GPUs. There's a weak, slow one which is low-power and designed to maximise your battery life while doing non-intensive work like email. Then there's a powerful, fast, power-hungry GPU designed to maximise performance while playing games or running on mains power. Sometimes the laptop's power settings will switch between them depending on whether mains power is connected. There may also be a special tool to designate which GPU to use for each application on the computer, so you can, for example, set games to always use the powerful GPU.

In the case of nVidia, that tool is the nVidia Control Panel. However, with the latest driver updates, the nVidia Control Panel strangely has the GPU setting disabled for Chrome, Firefox and IE - all of which support WebGL in their latest versions.

(Screenshot: the nVidia Control Panel with the preferred GPU setting greyed out for Chrome.)

This appears to be a deliberate decision by nVidia, to "prevent using power needlessly". This doesn't make sense. It would be a sensible default when running on battery, but you can't change the setting even when you're plugged into mains power and want to kick off some awesome browser-based gaming like the Epic Citadel demo, or any Construct 2 game.

I have an expensive laptop at home which has a weak GPU (Intel HD 4000) paired with a powerful chip (GeForce GTX 675M). Running this performance test, it scores a paltry 1250 sprites at 30 FPS in Chrome. For comparison, my Nexus 5 phone (with a chipset by Qualcomm) happily shoots up to over 17000 sprites - well over ten times faster.
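
As an aside, you can check which GPU the browser actually picked from JavaScript. Here's a minimal sketch using the WEBGL_debug_renderer_info extension - bearing in mind not every browser exposes it, so a null result just means "unknown":

    // Minimal sketch: ask WebGL which GPU the browser is rendering with.
    // The WEBGL_debug_renderer_info extension isn't available everywhere.
    var canvas = document.createElement("canvas");
    var gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
    if (gl) {
        var info = gl.getExtension("WEBGL_debug_renderer_info");
        if (info) {
            // On a dual-GPU laptop this names the chip actually in use,
            // e.g. "Intel(R) HD Graphics 4000" rather than the GeForce.
            console.log(gl.getParameter(info.UNMASKED_VENDOR_WEBGL));
            console.log(gl.getParameter(info.UNMASKED_RENDERER_WEBGL));
        } else {
            console.log("WEBGL_debug_renderer_info not available");
        }
    } else {
        console.log("WebGL not supported");
    }

If the renderer string names the Intel chip, the browser is stuck on the weak GPU.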

There's a crazy workaround (see the answer involving the mklink command). Using that, my laptop blows past my Nexus 5, scoring well over 25000 sprites at 30 FPS in Chrome (which, surprisingly, is still not all that much faster than the Nexus 5...). Unfortunately, nVidia don't provide any way at all for ordinary users to set this up. So you might have paid a lot of money for a really powerful chip, and nVidia won't let you use it for browser games.
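
For the record, the gist of that workaround - sketched here with an illustrative install path and link name, so adjust for your own system - is to run the browser under an executable name the driver's per-application detection doesn't recognise:

    :: Run from an elevated command prompt. A hard link to chrome.exe
    :: under a different name sidesteps the per-exe detection, while
    :: keeping Chrome's DLLs resolving from the same folder.
    cd "C:\Program Files (x86)\Google\Chrome\Application"
    mklink /H chrome-webgl.exe chrome.exe

You can then assign the high-performance GPU to chrome-webgl.exe in the nVidia Control Panel and launch that instead of the normal shortcut.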

The good news is that node-webkit exports appear to be unaffected - nVidia seem to have hard-coded the detection to the browsers' executable names.

Perhaps the real reason is that they are unable to write drivers that work, since browsers are significantly different to the games the drivers are designed for. Still, I'll definitely think twice before buying a dual-GPU nVidia laptop in future. I don't know if dual AMD GPUs are similarly affected, but let's hope not. I don't see how the incumbent desktop GPU vendors can hope to compete with the new mobile world if you can pop out your phone and get a much better experience than on your laptop. Let's hope nVidia change their mind!


Comments

Lordshiva1948 10.7k rep

Thank you so much

Monday, December 30, 2013 at 3:00:37 PM
Kyatric 41.3k rep

Time to start a campaign similar to "We want OGG" and shove it in nVidia's face :/

Monday, December 30, 2013 at 3:02:35 PM
Jaydon 4,732 rep

Another fantastic blog! Good read!

Monday, December 30, 2013 at 3:03:00 PM
andreyin 7,837 rep

I have a laptop like that too (Intel HD 3000 / GeForce 540M), and a while ago the same thing happened to me: I downloaded a beta version of the nVidia GeForce driver and this option was disabled like that. I had to spend a whole day asking about it on their forums, only to get shitty responses, then revert everything back.

I really don't get why they would do something like this. I'm pretty sure everything is set to use the Intel HD card by default, not the nVidia one.

Monday, December 30, 2013 at 3:51:55 PM
andreyin 7,837 rep

Also, since my game is being exported with Node-Webkit, I really dodged a bullet there... at least now it runs at 60fps on Intel HD cards thanks to that new fullscreen option. Thanks guys!

Monday, December 30, 2013 at 3:53:24 PM
CandyFace 2,780 rep

Awesome post, and such a stupid decision by Nvidia. Why are we not allowed to select whatever GPU we want to use? Why do they have to force the option disabled... I don't get it :/

Monday, December 30, 2013 at 4:07:14 PM
iceangel 5,540 rep

Stupid decision by Nvidia. Thanks guys!

Monday, December 30, 2013 at 4:20:11 PM
newt 22.4k rep

I wonder if it has anything to do with competition with the Nvidia Shield?

Monday, December 30, 2013 at 4:32:51 PM
danialgoodwin 2,341 rep

Thank you for sharing, this is definitely great information to know!

Monday, December 30, 2013 at 7:55:38 PM
DatapawWolf 4,243 rep

I absolutely hate dual GPU anyway. I got suckered into my current laptop with it, and even AMD's drivers don't always work the way they're supposed to.
Thanks for the information. Their decision smells like corrupt money to me.

Monday, December 30, 2013 at 8:03:36 PM
Ashley 112.0k rep

@DatapawWolf - I agree, dual GPU seems to be a weird hack. Somehow mobiles get by just fine with low-power, high-performance GPUs though - hopefully one day we'll be able to get fast, low-power laptop GPUs too!

Monday, December 30, 2013 at 8:28:05 PM
gbolt 501 rep

I have AMD/ATI and the option's disabled for me as well. I don't know why that is, but I can say that a while ago, when I tried to make Firefox run with the more powerful GPU, it ran very weirdly (the browser would get very glitchy). Thankfully I have an APU, so I can run quite smoothly with that alone. Maybe Nvidia cards have that problem as well? Maybe it's something that doesn't always happen... well, I hope both Nvidia and AMD fix it (or the browser makers, if it's on the browser side) so we can enjoy more intensive WebGL games. :D

Tuesday, December 31, 2013 at 2:04:59 PM
AbelaNET 6,012 rep

Thanks for this useful info. I tested this out on my laptop, since I hadn't noticed it before, and yeah, it is disabled. What a shame :( Thanks once again for sharing.

Wednesday, January 01, 2014 at 6:51:16 PM
mindfaQ 2,196 rep

Okay, now I am angry. I tried the fix, but it seems the ATI driver still detects the linked exe as a browser and always uses the integrated graphics for it. So there is no way around it. Seriously, why is customer service getting worse and worse? Win 8.1 was already an impertinence when setting up my new laptop; now I have to put up with this shit as well? Not to mention that most new computers have dual graphics, because it does make sense for energy saving and processors bring their own graphics solution anyway.
In what kind of world is it a good idea to completely prevent a person from using hardware they own? If they want to use it, let them use it. Hatemail will be going out; maybe complaining will bring them to their senses, although I doubt it :(.

@Ashley: can you post your benchmark as a node-webkit export? I kind of want to test how fast my system would be if it wasn't hindered by STUPID decisions like locking the user in a cage.

Tuesday, January 07, 2014 at 3:07:24 PM
lennaert 5,872 rep

@scirra
From what I read, it has to do with blacklisting cards which don't 'properly' support various settings:
www.sitepoint.com/firefox-enable-webgl-blacklisted-graphics-card/

That one also has a fix for Firefox.

Temp solution: www.biodigitalhuman.com/home/enabling-webgl.html

That one has some fixes for other browsers.

Tuesday, January 07, 2014 at 9:11:35 PM
