CPU Utilization...?

Discussion and feedback on Construct 2

Post » Wed Jun 17, 2015 8:48 pm

I have a question regarding cpu utilization. I realize that this might be better directed right at Google, and they will be my next port of call - but before I (as one man with limited technical experience in this field) ask a multi-billion dollar industry leader to consider making changes to their browser, I need to check my understanding of what goes on under a browser's hood...

When I monitor my in-game CPU utilization using events, I find there are occasional spikes (on my laptop) greater than 50%, and these correlate with lots of activity visible in the JavaScript Console / Timeline. Most of them are my own fault, so I have been working hard to optimize my events and layouts. Spikes above 75% tended to be accompanied by drops in framerate that were not GPU related; data manipulation in an array is my suspect here because, when I turn those events off, cpuutilisation drops markedly. I can't do without the array data crunching (well, I would rather not...), so I need to work around this limitation. One thing struck me as rather disappointing: with a cpuutilisation value of 75%+, Chrome's use of my laptop's CPU was just 1.2%, and it didn't noticeably change when my game became more demanding and started to jank. My game was using 75% of 1.2%!
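To make the comparison concrete, here is a rough sketch (plain JavaScript, not C2's actual internals) of how an engine could estimate a per-tick "CPU utilisation" figure from main-thread timings. The names `measureTick` and `heavyTick` are illustrative, not real C2 APIs:

```javascript
// Estimate "utilisation" as main-thread time spent per frame,
// relative to the 16.7 ms budget of a 60 fps game.
const FRAME_BUDGET_MS = 1000 / 60;

function measureTick(tickFn) {
  const start = performance.now();
  tickFn();                          // run one frame of game logic
  const elapsed = performance.now() - start;
  return elapsed / FRAME_BUDGET_MS;  // 0.75 means "75% utilisation"
}

// A deliberately heavy tick that crunches an array, standing in
// for the kind of data processing described above.
function heavyTick() {
  const arr = new Array(50000).fill(0).map((_, i) => i);
  arr.reduce((a, b) => a + b, 0);
}

console.log(`Estimated utilisation: ${(measureTick(heavyTick) * 100).toFixed(1)}%`);
```

Note this only times one thread's work; it says nothing about how much of the whole machine's CPU is in use, which is the discrepancy described above.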

Has anyone else noticed this / does anyone care / have any suggestions? My thoughts are that Chrome should be capable of simply demanding more CPU access when running a demanding game, so games like mine can run all of their data processing unimpeded. Perhaps a game maker could raise a flag to permit this. Am I being naive? Is Chrome's access to processor time controlled by Windows, or can it demand more if it needs it?
A big fan of JavaScript.

Post » Wed Jun 17, 2015 11:02 pm

Just to expand on the above. I have been testing on a reasonably powerful laptop (i7, GTX 870 with 6 GB / Intel HD4600, 3.4 GHz); all of my previous testing was done using the HD4600 (I want the game to run well on average hardware if possible). However, after changing to run the game with NVidia card support, in both NW and Chrome, the game does not suffer any CPU spikes or slow-downs... which implies that the browser is able to offload some of its work to the more powerful GPU, enabling it to handle the array functions without a hitch. Not what I expected, as other indications are that my Intel card is supported by Chrome. So the lesson is to tell game-playing customers that they need to own high-powered hardware, I guess... :/

Post » Thu Jun 18, 2015 1:26 am

That's really interesting, Colludium. I am not really sure about this, but aren't those built-in Intel HD graphics supposed to use the Intel processor for their workload as well? I remember the first Sandy Bridge Intel processors, and many people explained how the built-in graphics improved performance by actually pushing the workload onto the Intel processor. I am not too sure about the newer built-in graphics, though.

Post » Thu Jun 18, 2015 1:58 am

@rekjl, I'm not sure what to make of it, really. The point that causes me consternation is that the browser demands an almost negligible amount of computer CPU time, even when it's in focus and the user is doing nothing else, but it allows the game to become maxed out with jank etc. because the CPU time allocated to the JavaScript is not enough... Now, this might just mean that things cannot be done faster on one thread in JavaScript, which doesn't bode well for the future of HTML5. If that's not it, I just don't understand the philosophy behind the design and wondered if anyone could shed light on it, or had thoughts on whether it's worth a discussion with Google. By my rough maths, using the Intel HD4600, my game is allowed to jank in Chrome because of CPU demand even though the computer has 98.8% CPU available (I know, probably not accurate even though it's from Task Manager, but you get the idea).

Post » Thu Jun 18, 2015 5:54 am

That's definitely frustrating! To know that your game should be able to run much faster but, because of certain programming "restrictions" in place, it is not able to take full advantage of the hardware. One thing that does look bad: if Google were serious about improving HTML5, this issue would have been resolved quite some time ago. Perhaps they really aren't that interested in going that way. If that's the case, and with Chrome being the benchmark of browsers, our hopes might really rest on Microsoft Edge. It's still some way off from being stable or the "it" browser, but maybe this is a good thing for us. Since they need to play catch-up, they will have to look at areas that give them an advantage over Chrome; hopefully this is one aspect they might concentrate on. Not sure if there's anything Intel is able to do about this on their part.

But that's the problem nowadays with any technology that is not direct or native (not criticizing Scirra here): everything else has to depend on so many other groups to be able to function efficiently. It is worth a shot trying to bring this to Google's attention. Not sure if they will take it seriously, but at this point it is the only thing we can do besides waiting.

Post » Thu Jun 18, 2015 8:45 am

I noticed that Chrome is performing very poorly compared to IE when I test my game. My game in IE uses somewhere between 10-15% CPU at all times. In Chrome, my game uses 25% CPU, and even more just by turning on debug mode. I've been trying to optimise events, but I think I've taken it as far as I can. It's kind of funny... when I test my game in the Windows Phone browser, it performs better than Chrome on my desktop... :p
Follow my progress on Twitter
or in this thread Archer Devlog

Post » Thu Jun 18, 2015 9:32 am

The stutter and FPS drop you are seeing with the HD4600 is a known issue in complex games, or ones that use a lot of effects and, in particular, WebGL shaders. Intel's OpenGL performance is atrocious, and their drivers for it are even worse. They seem to only care about DX, not OpenGL, for gaming.

I posted something regarding CPU usage here: nw-js-v0-12-0-chromium-41-5th-march-discussion_t126227

Essentially, C2 is single-threaded for the most part: all that logic uses 1 out of 8 threads on your i7 CPU. It can max out that one thread; when it shows 99% CPU use, the game will bog down and frames will be dropped, regardless of your GPU.
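As an aside: C2's event logic itself stays on the main thread, but in hand-written JavaScript the kind of heavy array crunching mentioned earlier can be moved to a Web Worker so it no longer eats into the frame budget. A minimal browser-side sketch, with purely illustrative names (`crunch`, `workerSource`):

```javascript
// The expensive part: stands in for the array processing.
function crunch(data) {
  let sum = 0;
  for (let i = 0; i < data.length; i++) sum += data[i] * data[i];
  return sum;
}

// Browser-only wiring: build the worker from a Blob so the sketch
// needs no separate .js file. Guarded so it is a no-op elsewhere.
if (typeof Worker !== 'undefined' && typeof Blob !== 'undefined') {
  const workerSource = `
    ${crunch.toString()}
    onmessage = (e) => postMessage(crunch(e.data));
  `;
  const url = URL.createObjectURL(new Blob([workerSource]));
  const worker = new Worker(url);
  worker.onmessage = (e) => console.log('result from worker:', e.data);
  worker.postMessage([1, 2, 3, 4]); // main thread stays free meanwhile
}
```

The trade-off is that data must be posted back and forth as messages, which is why an engine can't transparently spread arbitrary event logic across threads.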

So be very careful when you design a bigger, complex game: ensure that the reported CPU use stays below 50% on your i7, as it's a very strong CPU with high single-thread performance. Someone on an older CPU, or an AMD CPU with weaker single-thread performance, will suffer greatly if you design your game to go beyond 50% of an i7's single-thread capability.

Also, if you want gamers to play it on Intel HD graphics, don't use WebGL shaders. :o Maybe do a "Low/High FX" option: Low for people on Intel iGPUs and High for AMD/NV GPUs.
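The suggested "Low/High FX" option could be sketched as a simple quality gate; the effect names and helper here are illustrative (in C2 this would be a global variable plus events toggling effects), not a real API:

```javascript
// Tag each effect with the minimum quality tier it needs.
const EFFECTS = { glow: 'high', blur: 'high', tint: 'low' };

function effectsFor(quality) {
  // 'low' keeps only cheap effects for Intel iGPUs;
  // 'high' enables everything for discrete AMD/NVIDIA GPUs.
  return Object.keys(EFFECTS).filter(
    (name) => quality === 'high' || EFFECTS[name] === 'low'
  );
}

console.log(effectsFor('low'));   // only the cheap effects
console.log(effectsFor('high'));  // everything
```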

Post » Thu Jun 18, 2015 9:50 am

tunepunk wrote:... when i test my game in windows phone browser it performs better than chrome on my desktop... :p


! This is so disappointing, yet somehow hilarious...! HTML5 is such a good idea but never properly embraced. I hope that W10, Edge, and universal apps perform as well as IE11 does now...


Silverforce wrote:Essentially, C2 is single threaded for the most part, all that logic uses 1 out of 8 threads on your i7 CPU. It can max out on 1 thread, when it shows 99% CPU use, the game will bog down and frames will be dropped, regardless of your GPU....
Also if you want gamers to play it on Intel HD graphics, don't use WebGL shaders....


This and your following advice are gold. I hadn't really noted the implications of this limitation before. Luckily, I think I'm going to get away without using shaders (only lots of blend modes), but the 50% max is good advice.

So, let's hope W10 delivers what it promises, or it'll be a long wait for hardware to improve before Chrome really becomes viable...

Post » Thu Jun 18, 2015 10:05 am

Your numbers are very likely wrong. Chrome will happily use up 100% of the CPU if you give it that much work. However, measuring CPU time is actually pretty complicated. C2's 'cpuutilisation' expression is really "time spent in the main thread", literally based on timer measurements. There isn't a direct way to get CPU usage in JavaScript, so this is used as an estimate, but it has some drawbacks:

- time spent in the main thread is not necessarily the CPU usage, e.g. if the OS suspends the thread and the CPU goes idle for a while, then resumes the thread, all that time is still included in cpuutilisation, making it look like the CPU usage is higher than it really is
- the timers only cover the main event loop, and some things like input triggers are not covered
- the timers only cover the main thread - anything the browser dispatches to another thread is not covered. For example Chrome basically saves all draw calls then forwards them to another thread to run in parallel, whereas some other browsers run the draw calls on the main thread. This means cpuutilisation excludes the draw calls work in Chrome because they happen on a different thread, but includes the draw calls work in other browsers which do run on the main thread. In actual fact the CPU work done may be identical, but this causes different measurements. Similarly there are other features which may or may not be included in cpuutilisation depending on the threading model of the browser.
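The first caveat can be demonstrated in plain JavaScript (this is not C2 code, just an illustration): a wall-clock timer cannot tell busy work from idle waiting, so both spans below measure roughly 50 ms even though only the first one used any CPU.

```javascript
// Spin the CPU for roughly `ms` milliseconds: genuine CPU work.
function busyWait(ms) {
  const end = performance.now() + ms;
  while (performance.now() < end) {}
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function demo() {
  let t0 = performance.now();
  busyWait(50); // thread busy the whole time
  const busyMs = performance.now() - t0;

  t0 = performance.now();
  await sleep(50); // thread idle: no CPU used, but the clock still ran
  const idleMs = performance.now() - t0;

  // A timer-based "utilisation" estimate would count both the same way.
  console.log(`busy: ~${busyMs.toFixed(0)} ms, idle: ~${idleMs.toFixed(0)} ms`);
}

demo();
```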

Further, if you look at the per-process CPU usage in Task Manager, there are more gotchas:
- Chrome is a multi-process browser, so you need to take into account the sum of CPU usage across all of Chrome's processes
- Windows tends to merge all cores into one percentage reading, so e.g. 25% CPU usage on a four-core system may indicate one core at full usage, two cores at 50% usage, four cores at 25% usage, or something else. It can be hard to tell.
- Since C2's cpuutilisation is only measuring one thread, it could read 100% when Task Manager indicates 25% (assuming a quad-core system again). Both are correct: the main thread is entirely busy, but only one core out of four is fully utilised.
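The quad-core arithmetic above, made explicit (`taskManagerReading` is an illustrative helper, not a real Chrome or C2 API):

```javascript
// Windows-style merged reading: average load across all cores, so
// one fully busy thread on a four-core CPU shows up as 25% even
// while the main thread itself is at 100%.
function taskManagerReading(busyCores, totalCores) {
  return (busyCores / totalCores) * 100;
}

console.log(taskManagerReading(1, 4)); // → 25: one maxed core on a quad-core
console.log(taskManagerReading(2, 8)); // → 25: same reading, different cause
```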

If you are not aware of these various complexities, you will simply confuse yourself by making misleading measurements. There are reasons that C2 could measure way higher than Task Manager, or way lower. Your impression that Chrome is not able to use the full system resources is almost certainly incorrect and probably just a result of these complications.

My advice: just look at the measurements in C2's own profiler. That should give you a reasonable idea of which areas of your events are slow. And only treat the specific numbers as a ballpark, as if they can only read "low", "medium" or "high".

If you're wondering why C2 doesn't "just" run everything over all cores, then see my blog post Why do events only run on one core?
Scirra Founder

Post » Thu Jun 18, 2015 10:20 am

Colludium wrote:Has anyone else noticed this / does anyone care / have any suggestions? My thoughts are that Chrome should be capable of simply demanding more CPU access when running a demanding game, so games like mine can run all of their data processing unimpeded. Perhaps a game maker could raise a flag to permit this. Am I being naive? Is Chrome's access to processor time controlled by Windows, or can it demand more if it needs it?


You have to be very careful how you test your game when it comes to CPU and FPS, and what conclusions you draw from the numbers.

Using the debugger to check CPU and FPS is extremely unreliable: depending on how you measure, you get different results that you can't really use for much, I think.

Here is an example:
[image attachment: three screenshots comparing the measurements]

All tests are made from the same place in the game.

1. The first image shows the stats I get using the debugger in NW.js with the System tab selected.

2. The second image shows how much this information changes simply by switching from the System tab to another tab.

3. The third is without the debugger, tracking the data in-game, and it fits well with the second test using the debugger.

So when testing performance, you have to take into account which tab you have selected in the debugger. And, at least in my opinion, doing any testing in the System tab is completely useless, because you can't use any of the data you get there for anything: it doesn't reflect your game's performance, but the game's and the debugger's performance combined.
