On dual-GPU laptops, nVidia appears to be deliberately disabling use of the faster GPU in browsers. This means that on those systems, modern web-based games using WebGL can only ever use the weak GPU, which can be even slower than a mobile phone's GPU.
A little background: mid- and high-end laptops commonly ship with two GPUs. There's a weak, slow one which is low-power and designed to maximise your battery life while doing non-intensive work like email. Then there's a powerful, fast, power-hungry GPU that's designed to maximise performance while playing games or plugged in to mains power. Sometimes the laptop's power settings will switch between them depending on whether it has mains power. There may also be a special tool to designate which GPU to use for each application on the computer, so you can, for example, set games to always use the powerful GPU.
In the case of nVidia, that tool is the nVidia Control Panel. However, with the latest driver updates, the nVidia Control Panel strangely seems to have the GPU setting disabled for Chrome, Firefox and IE - all of which support WebGL in their latest versions.
This appears to be a deliberate decision by nVidia, to "prevent using power needlessly". This doesn't make sense. It would be a sensible default when running on battery, but you can't change the setting even when you're plugged in to mains power and want to kick off some awesome browser-based gaming like the Epic Citadel demo, or any Construct 2 game.
I have an expensive laptop at home which has a weak GPU (Intel HD 4000) paired with a powerful chip (GeForce GTX 675M). Running this performance test, it scores a paltry 1250 sprites at 30 FPS in Chrome. For comparison, my Nexus 5 phone (with a chipset by Qualcomm) happily shoots up to over 17000 sprites - well over ten times faster.
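If you want to confirm which of the two GPUs your browser actually picked, the WEBGL_debug_renderer_info extension exposes the renderer string. Here's a rough sketch - the classification function and the example strings are my own illustration, and some browsers may mask or block the extension:

```javascript
// In a browser, you can read the renderer string like this:
//   var gl = document.createElement('canvas').getContext('webgl');
//   var ext = gl.getExtension('WEBGL_debug_renderer_info');
//   var renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);

// Classify which vendor's GPU a WebGL renderer string reports.
function gpuVendor(renderer) {
  var r = renderer.toLowerCase();
  if (r.indexOf('nvidia') !== -1 || r.indexOf('geforce') !== -1) return 'NVIDIA';
  if (r.indexOf('intel') !== -1) return 'Intel';
  if (r.indexOf('amd') !== -1 || r.indexOf('radeon') !== -1) return 'AMD';
  if (r.indexOf('adreno') !== -1 || r.indexOf('qualcomm') !== -1) return 'Qualcomm';
  return 'unknown';
}

console.log(gpuVendor('Intel(R) HD Graphics 4000'));  // "Intel" - the weak GPU
console.log(gpuVendor('NVIDIA GeForce GTX 675M'));    // "NVIDIA" - the fast one
```

On my laptop the renderer string reports the Intel chip, which is how you can tell the browser never gets near the GeForce.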
There's a crazy workaround (see the answer involving the mklink command). Using that, my laptop can then blow past my Nexus 5, scoring well over 25000 sprites at 30 FPS in Chrome (surprisingly, not that much faster than the Nexus 5...). Unfortunately, nVidia don't provide any way at all for ordinary users to apply that setting. So you might have paid a lot of money for a really powerful chip, and nVidia won't let you use it for browser games.
The good news is node-webkit exports appear to be unaffected - nVidia seem to have hard-coded the detection of the browser executables, and a node-webkit export isn't on the list.
Perhaps the real reason is they are unable to write drivers that work, since browsers are significantly different to the games the drivers are designed for. Still, I'll definitely think twice before buying a laptop with a dual nVidia GPU in it in future. I don't know if dual AMD GPUs are similarly affected, but let's hope not. I don't see how the incumbent desktop GPU vendors can hope to compete with the new mobile world if you can pop out your phone and get a much better experience than your laptop. Let's hope nVidia change their mind!