# Wait question.

Discussion and feedback on Construct 2

### » Thu Nov 10, 2016 6:21 pm

Hi, I have a misunderstanding with wait.

If I use "Wait 1 second" but the game runs very slowly on a slow device, the wait time will not adapt to game time. It will still wait 1 real second even though the game runs slowly.

Is that right? I mean, is "Wait" independent of the speed of the game?
To solve this, is it appropriate to use the Timer behavior, am I right?

https://www.scirra.com/tutorials/56/how-to-use-the-system-wait-action
https://www.scirra.com/tutorials/67/delta-time-and-framerate-independence/en

### » Thu Nov 10, 2016 9:36 pm

If you want the wait to follow the game's speed, use dt (delta-time) in the expression instead of a flat number.
Example: Wait(1*dt) means wait for a single tick.
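As an aside, the difference between dt-scaled values and flat seconds can be sketched outside Construct 2. A minimal Python illustration (the frame loop and helper are hypothetical, not C2 code):

```python
# Sketch: dt is the length of the last frame, so Wait(1*dt)
# expires after exactly one tick at any framerate, while a flat
# Wait(1) takes however many ticks fit into one real second.

def ticks_until_elapsed(wait_seconds, frame_time):
    """Frames needed before `wait_seconds` of real time has passed,
    when every frame lasts `frame_time` seconds."""
    elapsed, ticks = 0.0, 0
    while elapsed < wait_seconds - 1e-9:  # tolerance for float error
        elapsed += frame_time
        ticks += 1
    return ticks

dt_60 = 1 / 60  # dt on a 60 fps device
dt_10 = 1 / 10  # dt on a 10 fps device

print(ticks_until_elapsed(1 * dt_60, dt_60))  # 1 tick
print(ticks_until_elapsed(1 * dt_10, dt_10))  # 1 tick
print(ticks_until_elapsed(1.0, dt_60))        # 60 ticks
print(ticks_until_elapsed(1.0, dt_10))        # 10 ticks
```

So Wait(1*dt) always lasts one tick, but how much real time that tick represents depends on the device.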

### » Fri Nov 11, 2016 9:17 am

Sethmaster wrote: If you want the wait to follow the game's speed, use dt (delta-time) in the expression instead of a flat number.
Example: Wait(1*dt) means wait for a single tick.

Ah, but measuring time in ticks is complicated, I think.

Supposedly 60 ticks make a second, but on a computer that runs the game slowly, I'm not sure that works well, or that the wait remains 1 second regardless of the game's speed.
Have you tried it? Are you sure it works correctly and the time adapts to the game's execution speed?

My question was: does the Timer behavior work like this?
I mean, does the Timer behavior take the speed of the game into account?

### » Sun Nov 13, 2016 3:06 am

Mirlas wrote:
Sethmaster wrote: If you want the wait to follow the game's speed, use dt (delta-time) in the expression instead of a flat number.
Example: Wait(1*dt) means wait for a single tick.

Ah, but measuring time in ticks is complicated, I think.

Supposedly 60 ticks make a second, but on a computer that runs the game slowly, I'm not sure that works well, or that the wait remains 1 second regardless of the game's speed.
Have you tried it? Are you sure it works correctly and the time adapts to the game's execution speed?

My question was: does the Timer behavior work like this?
I mean, does the Timer behavior take the speed of the game into account?

"Wait 1 second" is based on real time (I believe), so if your game runs at 10 fps on one device and 60 fps on another, 1 second passes in the same real time on both, while Wait(60*dt) would take one second to complete on a device capable of 60 fps but six seconds on a device capable of 10 fps...
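The arithmetic behind that can be restated in a few lines of Python (the helper name is made up, and dt clamping is deliberately ignored here):

```python
# Wait(1) requests one real second on every device; Wait(60*dt)
# requests 60 frame-lengths, which stretches as the framerate drops.

def dt_wait_in_real_seconds(frames, fps):
    """Real-time length of Wait(frames * dt) at a given framerate
    (ignoring Construct 2's dt clamping for simplicity)."""
    return frames / fps  # same as frames * dt, since dt = 1/fps

print(dt_wait_in_real_seconds(60, 60))  # 1.0 second on a 60 fps device
print(dt_wait_in_real_seconds(60, 10))  # 6.0 seconds on a 10 fps device
```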


### » Mon Nov 14, 2016 7:54 pm

If I use "Wait 1 second" but the game runs very slowly on a slow device, the wait time will not adapt to game time. It will still wait 1 real second even though the game runs slowly.

Yes and no. "Wait" is fps-independent until the game drops to the minimum fps. By default, the minimum fps is 30.
You can lower the minimum fps with the system action: System > Set minimum framerate.
But if you do so, most position-based conditions (Is overlapping / On collision) will fail. As explained in the manual:

Set minimum framerate
Set the maximum delta-time (dt) value based on a framerate. The default minimum framerate is 30 FPS, meaning the maximum dt is 1/30 (≈ 33 ms). If the framerate drops below 30 FPS, dt will still not exceed 1/30. This has the effect of the game going into slow motion as it drops below the minimum framerate, rather than objects stepping further every frame to keep up the same real-world speed. This helps avoid skipped collisions due to stepping a very large distance every frame.

In general, wherever you need to fill in a value in seconds or something-per-second, it is fps-independent by nature. The Timer behavior takes its input in seconds, so it is fps-independent, within the limits explained above.
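The clamping described in that manual excerpt amounts to a single min(); a hedged Python sketch (the function name is hypothetical):

```python
# dt never exceeds 1/min_fps, per "Set minimum framerate": below
# that framerate the game slows down rather than letting objects
# step huge distances (and skip collisions) each frame.

def clamped_dt(frame_time, min_fps=30):
    """Delta-time value handed to events and behaviors."""
    return min(frame_time, 1.0 / min_fps)

print(clamped_dt(1 / 60))              # 1/60: a 60 fps frame passes through unclamped
print(clamped_dt(1 / 10))              # 1/30: a 10 fps frame is capped
print(clamped_dt(1 / 10, min_fps=10))  # 1/10: cap lowered via Set minimum framerate
```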

### » Mon Nov 14, 2016 8:23 pm

99Instances2Go wrote: If I use "Wait 1 second" but the game runs very slowly on a slow device, the wait time will not adapt to game time. It will still wait 1 real second even though the game runs slowly.

Yes and no. "Wait" is fps-independent until the game drops to the minimum fps. By default, the minimum fps is 30.
You can lower the minimum fps with the system action: System > Set minimum framerate.
But if you do so, most position-based conditions (Is overlapping / On collision) will fail. As explained in the manual:

Set minimum framerate
Set the maximum delta-time (dt) value based on a framerate. The default minimum framerate is 30 FPS, meaning the maximum dt is 1/30 (≈ 33 ms). If the framerate drops below 30 FPS, dt will still not exceed 1/30. This has the effect of the game going into slow motion as it drops below the minimum framerate, rather than objects stepping further every frame to keep up the same real-world speed. This helps avoid skipped collisions due to stepping a very large distance every frame.

In general, wherever you need to fill in a value in seconds or something-per-second, it is fps-independent by nature. The Timer behavior takes its input in seconds, so it is fps-independent, within the limits explained above.

This is a blow to me. I understand what you're saying... I thought the Timer behavior would adapt to the speed of the game; my game uses the Timer behavior everywhere, and I guess it will not work well on slow devices.

So I guess the best approach is what was said before:
Wait (number of ticks to wait) * dt

60 ticks = 1 second (at 60 fps)
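For what it's worth, a timer that accumulates the clamped dt each tick behaves like this on a slow device (a hypothetical Python sketch, not actual Construct 2 internals):

```python
# Each tick adds the clamped dt to an accumulator; the timer fires
# when the accumulated *game* time reaches the target, so on slow
# devices the real-time wait stretches (the slow-motion effect).

def real_seconds_until_fire(target_game_seconds, fps, min_fps=30):
    """Real time until an accumulated-dt timer reaches its target,
    rendering at `fps` with dt capped at 1/min_fps (C2-style)."""
    frame_time = 1.0 / fps
    dt = min(frame_time, 1.0 / min_fps)
    accumulated = real = 0.0
    while accumulated < target_game_seconds - 1e-9:
        accumulated += dt
        real += frame_time
    return real

print(round(real_seconds_until_fire(1.0, 60), 3))  # 1.0: on time at 60 fps
print(round(real_seconds_until_fire(1.0, 10), 3))  # 3.0: 30 ticks of 0.1 s at 10 fps
```

Above the minimum framerate the timer tracks real time; below it, a "1 second" timer takes longer than one real second.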

### » Mon Nov 14, 2016 8:52 pm

Making something fps-independent by using dt is exactly what behaviors (Bullet, Platform, Timer...) and also "Wait" and "Every X seconds" already do.

Same system, same problems when crossing the limits.
(Isn't 10 fps a bit too low?)

If you want to measure time accurately, use the system expression 'wallclocktime'.

But even then you are, well, a little bit screwed. Say the game runs at 10 fps. Then 1 tick is one tenth of a second.
In other words, comparing the current wallclocktime to the previous one can only happen once every 1/10 of a second. So the error is a one-off rounding to the nearest tick, not an accumulating dt error.
That is still better than an error that accumulates every tick once you go below 30 fps.

Sorry if this sounds confusing.

But in the end, it might be better to optimise the game so it doesn't run below 25 fps. That is the best solution.
And do you really want to support devices that are no longer even supported by their manufacturers?
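To illustrate the wallclock point: polling a real-time clock once per tick can overshoot by at most one frame, and the error does not grow tick after tick. A simulated Python sketch (no real clock involved; tick lengths are assumed):

```python
# Poll a wall clock once per tick: the timer can only fire on a
# tick boundary, so it may overshoot by up to one frame, but the
# error stays bounded instead of accumulating every tick.

def wall_clock_measurement(target_seconds, frame_time):
    """Simulated: advance a clock one frame per tick and check it."""
    now = 0.0
    while now < target_seconds - 1e-9:  # checked once per tick
        now += frame_time
    return now  # measured duration; overshoot <= frame_time

print(round(wall_clock_measurement(1.0, 0.1), 3))   # 1.0: ticks line up exactly
print(round(wall_clock_measurement(1.0, 0.15), 2))  # 1.05: fires one partial frame late
```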

### » Tue Nov 15, 2016 9:22 am

99Instances2Go wrote: Making something fps-independent by using dt is exactly what behaviors (Bullet, Platform, Timer...) and also "Wait" and "Every X seconds" already do.

Same system, same problems when crossing the limits.
(Isn't 10 fps a bit too low?)

If you want to measure time accurately, use the system expression 'wallclocktime'.

But even then you are, well, a little bit screwed. Say the game runs at 10 fps. Then 1 tick is one tenth of a second.
In other words, comparing the current wallclocktime to the previous one can only happen once every 1/10 of a second. So the error is a one-off rounding to the nearest tick, not an accumulating dt error.
That is still better than an error that accumulates every tick once you go below 30 fps.

Sorry if this sounds confusing.

But in the end, it might be better to optimise the game so it doesn't run below 25 fps. That is the best solution.
And do you really want to support devices that are no longer even supported by their manufacturers?

I think I understand. In many cases in my game I should have used flags instead of time measurement.

I think I have to take devices with Android 4.x.x into account; I suppose those devices might not run the game at 60 fps, and the number of devices still on Android 4.x.x is large.

### » Tue Nov 15, 2016 11:11 pm

Mirlas, this is not a solution, just an illustration. The loop simulates a very, very slow device: 7 fps on my machine.

https://www.dropbox.com/s/y3atcsbxfyvzu ... .capx?dl=0
B
33
S
18
G
29
Posts: 2,493
Reputation: 21,450
