I have a question regarding CPU utilization. I realize that this might be better directed right at Google, and they will be my next port of call - but before I (as one man with limited technical experience in this field) ask a multi-billion dollar industry leader to consider making changes to their browser, I need to check my understanding of what goes on under a browser's hood...
When I monitor my in-game CPU utilization using events, I find there are occasional spikes (on my laptop) above 50%, and these correlate with lots of activity visible in the JavaScript Console / Timeline. Most of it is my own fault, so I have been working hard to optimize my events and layouts. Spikes above 75% tend to be accompanied by drops in framerate that are not GPU related. Data manipulation in an array is my prime suspect, because when I turn those events off, cpuutilisation drops markedly. I can't do without the array data crunching (well, I would rather not...), so I need to work around this limitation.

One thing struck me as rather disappointing: with a cpuutilisation value of 75%+, Chrome's use of my laptop's CPU was just 1.2%, and that figure didn't noticeably change when my game became more demanding and started to jank. My game was using 75% of 1.2%!
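For context, my (possibly wrong) understanding is that cpuutilisation is roughly the fraction of each tick's frame budget spent running game logic on the main JavaScript thread. Here's a minimal sketch of that idea in plain TypeScript - crunchArrays and the data size are placeholders I've made up, not anything from my actual project:

```typescript
// Minimal sketch: estimate "CPU utilisation" the way I assume the engine does -
// time spent in game logic each tick, divided by the frame interval.

const FRAME_BUDGET_MS = 1000 / 60; // ~16.7 ms per tick at 60 fps

// Placeholder for my per-tick array data crunching.
function crunchArrays(data: number[]): number {
  let sum = 0;
  for (let i = 0; i < data.length; i++) {
    sum += Math.sqrt(data[i]);
  }
  return sum;
}

const data = Array.from({ length: 500000 }, (_, i) => i);

function tick(): void {
  const start = performance.now();

  crunchArrays(data); // all the game logic for this tick runs here

  const elapsed = performance.now() - start;
  const utilisation = elapsed / FRAME_BUDGET_MS; // ~0.75 means 75% of the budget

  if (utilisation > 0.75) {
    console.warn(`Tick used ${(utilisation * 100).toFixed(0)}% of the frame budget`);
  }

  requestAnimationFrame(tick);
}

requestAnimationFrame(tick);
```

If that model is right, then 75% means three quarters of one core's frame budget, which is a very different number from Chrome's share of the whole multi-core CPU - hence the 1.2% I see for the process overall.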
Has anyone else noticed this / does anyone care / have any suggestions? My feeling is that Chrome should be able to simply demand more CPU time when running a demanding game, so games like mine can run all of their data processing unimpeded. Perhaps a game maker could raise a flag to permit this. Am I being naive - is Chrome's access to processor time controlled by Windows, or can it demand more if it needs it?
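For what it's worth, the only route I've read about for getting game work onto another core is a Web Worker, since the main JavaScript thread seems to be limited to one core no matter how much the OS is willing to hand over. A rough, untested sketch of how I imagine that would look - "crunch-worker.js" and the message contents are placeholders, not anything real in my project:

```typescript
// Hypothetical sketch: offloading the array crunching to a Web Worker so it can
// run on another core instead of eating into the main thread's frame budget.
// "crunch-worker.js" is a placeholder filename.

// --- main thread ---
const worker = new Worker("crunch-worker.js");

worker.onmessage = (event: MessageEvent<number>) => {
  // The result arrives asynchronously; the game loop never blocked while it was computed.
  console.log("crunch result:", event.data);
};

const data = Array.from({ length: 500000 }, (_, i) => i);
worker.postMessage(data);

// --- crunch-worker.js (the separate worker file) ---
// self.onmessage = (event) => {
//   const data = event.data;
//   let sum = 0;
//   for (let i = 0; i < data.length; i++) {
//     sum += Math.sqrt(data[i]);
//   }
//   self.postMessage(sum);
// };
```

Whether something like that is practical from an event-based project, rather than hand-written JavaScript, is exactly the kind of thing I'm hoping someone here can tell me.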