Is timing in C2 accurate?

  • So I have been looking into timing and such - and, obviously, it's always "use dt for smooth and proper timing in C2" and "behaviours have dt built in, so they are accurate". I built a little test and there's some strangeness afoot - the longer it runs, the more the objects drift from the "Every 1 second" reference, and the behaviour and a simple +dt*X action also drift apart from each other.

    I realise it can be fixed with some extra variables and math, but can that be avoided and reasonable accuracy expected with simple behaviour/dt usage? dt seems to be rounded down to 3 decimal places, and the sum total over 1 second almost never comes out to a clean 1? Or am I doing something wrong?
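
    As a rough sketch of what such a test measures - the frame times below are invented, and the idea that dt is truncated to 3 decimal places is taken from the observation above, not from C2's documentation - summing a truncated dt and a full-precision dt side by side shows how neither lands on a clean 1:

        // Hypothetical reconstruction of the test in TypeScript (C2's runtime is JavaScript).
        const frameMs = [16.7, 16.6, 16.8, 16.7, 16.6]; // invented jitter around 60 fps
        let truncatedSum = 0; // dt cut down to 3 decimal places, as described above
        let exactSum = 0;     // dt kept at full precision

        for (let i = 0; i < 60; i++) {
          const dt = frameMs[i % frameMs.length] / 1000;
          truncatedSum += Math.floor(dt * 1000) / 1000;
          exactSum += dt;
        }

        // Neither total is a clean 1: the frames don't sum to exactly one second,
        // and the truncated copy drifts noticeably further on top of that.
        console.log(truncatedSum.toFixed(6), exactSum.toFixed(6));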

  • This is normal behaviour; accumulating floating point values also accumulates rounding and encoding inaccuracies and will never give an exact number. Which is why we never test floating point numbers for equality; you rarely get exactly "0.1 + 0.2 = 0.3", you get "0.3 + epsilon". When using floating point values, always work with ranges and thresholds.

    If you know the expected result (like in your behaviour), add a manual correcting factor or snapping mechanism for key values.

    Also, dt doesn't guarantee proper timing, it guarantees framerate independence. "Proper" timing is a very loose concept that should be considered only in the context of a specific platform, as the timing logic differs widely depending on the hardware.
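
    A minimal sketch of the "ranges and thresholds" and snapping advice above; EPSILON and the snapping window are illustrative values, not anything C2 provides:

        const EPSILON = 1e-6;

        // Never compare accumulated floats for equality; compare within a threshold.
        function nearlyEqual(a: number, b: number, eps: number = EPSILON): boolean {
          return Math.abs(a - b) <= eps;
        }

        console.log(0.1 + 0.2 === 0.3);           // false: encoding inaccuracy
        console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true: within the threshold

        // Manual correction: if the angle should be exactly 90 after a rotation,
        // snap it once the accumulated value is close enough to the key value.
        let angle = 89.9973; // accumulated from rotateSpeed * dt over many ticks
        if (nearlyEqual(angle, 90, 0.05)) {
          angle = 90;
        }
        console.log(angle); // 90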

  • Which makes me feel like dt should have this built in - always making the end value come out to a round 1.0 - wouldn't that make sense? If it's advertised as "1.0 every second", then some of these 0.01-0.05 errors can quickly mount up.

  • Having an automated correcting factor built in would implicitly mean knowing what the exact value actually is. That's just not possible; you only get values from sensors and timers to a specific accuracy defined by the limitations of the hardware. And having C2 try to compensate to ensure that sum(dt, 0.0, 1.0) = 1.0 would introduce time dilation unrelated to the frame time, which would break all consistency in the simulations.

    And why over 1 sec? That's arbitrary, and in a different example we might prefer time consistency over 0.1 sec, or maybe 60 sec, etc. The definition of dt is not to accumulate exactly to 1.0 over 1 sec; it is to represent the time elapsed since the last frame, and one doesn't guarantee the other.

    If you know that after x seconds your value should be 123.456, great, you can have a correction factor for your simulation; but only you know that.

    Plus it's a chicken-and-egg problem; assuming you want to enforce that after 1 sec your sum(dt) is indeed 1.0... how do you know that exactly 1.0 sec has elapsed, since your measured timer value is inherently inaccurate, limited by its resolution and by the encoding of floating point values? Exact values just don't exist in that sense; you might be surprised to learn that many simple decimal values, such as 0.1, cannot even be represented exactly in binary floating point.
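
    A small sketch of that chicken-and-egg point, using an easy-to-follow 0.1 s "frame" instead of a real dt; it shows both the accumulation error and the time dilation a built-in correction would have to inject:

        let sum = 0;
        for (let i = 0; i < 10; i++) sum += 0.1; // ten "frames" of 0.1 s each
        console.log(sum === 1.0);                // false
        console.log(sum);                        // 0.9999999999999999

        // Forcing the total to land exactly on 1.0 means stretching the last dt
        // by time that never actually elapsed - i.e. time dilation.
        const dilation = 1.0 - sum;
        console.log(dilation); // any simulation driven by dt would jump by this amount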

  • "The definition of dt is not to accumulate exactly to 1.0 over 1 sec ; it is to represent the time elapsed since the last frame, and one doesn't guarantee the other."

    Actually it is also that, except that the time between two frames not only varies slightly, the game also only refreshes when the screen refreshes. I don't think accumulating rounding errors are the main issue here; I would suspect more that the update (recalculation) does not happen at exactly the time you need it: on the last frame the sprite was not yet at the right angle, and by the frame after that it had already gone too far.

    Summing up dt will make it add 1 every second at a timescale of 1 (that is a side effect of dt's definition, since it is in seconds per frame), but remember that the game only calculates once per frame, so timing cannot be precise to the point of happening between two ticks.
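
    A sketch of that tick-granularity limit; the 16.7 ms frame time and the 90 deg/s speed are assumptions for illustration:

        const rotateSpeed = 90; // degrees per second
        const frameDt = 0.0167; // ~60 fps tick, deliberately a little off from 1/60

        let t = 0;
        let angle = 0;
        while (t + frameDt <= 1.0) { // only the ticks that fit inside 1 second run
          t += frameDt;
          angle += rotateSpeed * frameDt;
        }

        console.log(t, angle);
        // t stops at ~0.985 s with angle at ~88.7 deg; the next tick would overshoot.
        // The error is bounded by one tick's worth of rotation (rotateSpeed * frameDt ~ 1.5 deg).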

  • Framerate definitely has to be considered as well and could easily explain why things don't appear aligned properly. The animation shows a major error though, so I'm still suspicious... I'll check the .capx if I get the chance!

    Though I disagree with the following statement: "summing up dt will make it add 1 every second"; I would correct it to "summing up dt will make it add ~1 every second", with "~1" being "approximately 1, i.e. 1.0 + epsilon", as inaccuracy will accumulate over time.

  • "And why over 1 sec? That's arbitrary, and in a different example we might prefer time consistency over 0.1 sec, or maybe 60 sec, etc. The definition of dt is not to accumulate exactly to 1.0 over 1 sec; it is to represent the time elapsed since the last frame, and one doesn't guarantee the other."

    Well, if I read Ashley's post on dt it certainly creates the impression that we can rely on it summing up to 1:

    "Notice that if you add dt to a variable every tick, it adds 1 every second, because the time between all the ticks over a period of 1 second must add up to 1! Here's an example showing just that. (Adding dt to an object's instance variable is also a handy way to have a timer in an object.)"

    Also "that's arbitrary" - in that case dt might be any random number, but it just happens to be a fraction of a second.

    I'm not necessarily arguing about definitions here - I'd like reasonable precision in things - if it's set to 90 degrees per second and we get 88, 91, 92, 89, 90, 88, etc., that seems a little vague, that's all.

  • "Framerate definitely has to be considered as well and could easily explain why things don't appear aligned properly. The animation shows a major error though, so I'm still suspicious... I'll check the .capx if I get the chance!

    Though I disagree with the following statement: 'summing up dt will make it add 1 every second'; I would correct it to 'summing up dt will make it add ~1 every second', with '~1' being 'approximately 1, i.e. 1.0 + epsilon', as inaccuracy will accumulate over time."

    I will agree that it does not hold exactly in a case where dt is not a fixed number with 1/dt being an integer; it is a general rule. However, I think the time expression can actually be used if you need a more accurate way too, Somebody.
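
    A sketch of the "time expression" idea, with a plain parameter standing in for C2's time expression (how it would be wired into an event sheet is not shown here):

        const rotateSpeed = 90; // degrees per second
        const startAngle = 0;

        // One multiplication from an absolute clock: the error stays bounded
        // instead of growing with the number of ticks, unlike accumulating dt.
        function angleAt(now: number): number {
          return (startAngle + rotateSpeed * now) % 360;
        }

        console.log(angleAt(1.0));   // 90
        console.log(angleAt(10.0));  // 180
        console.log(angleAt(123.4)); // still within a tiny rounding error, however long it runs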

  • Yeah, what puzzles me is the kind of error you get; several degrees off on just a single rotation is just plain weird and would not be explained by floating point error accumulation.

    What I mentioned before certainly doesn't explain the massive inaccuracy you're getting. You could be getting 90.00000001 instead of 90, but 93 or so, that's just odd.

  • "If you know that after x seconds your value should be 123.456, great, you can have a correction factor for your simulation; but only you know that."

    OR... the behaviour should know that - if it's set to turn 90 degrees after a second I'd like that to happen. So it uses dt and isn't 100% reliable, but why then is the dt*90 sprite at a different angle than the behaviour one? Something somewhere seems a little off, you know.

    "Yeah, what puzzles me is the kind of error you get; several degrees off on just a single rotation is just plain weird and would not be explained by floating point error accumulation.

    What I mentioned before certainly doesn't explain the massive inaccuracy you're getting. You could be getting 90.00000001 instead of 90, but 93 or so, that's just odd."

    Which is why I built that demo - to show the rather big displacement in angles - so perhaps it isn't 3 degrees, but still plenty. And it doesn't just accumulate, but rather jumps back and forth, so in reality it's even bigger at times.

  • I still think this is the expected behaviour of C2, since it recalculates every frame rather than continuously (which implies it is inaccurate to an extent); however, Ashley could explain it if that is not the case.

    I do not remember the exact Chrome command to break v-sync so the frame rate can go very high, but if you find it and test it, and the result is better, that would be one piece of information validating my version of the facts.

  • I think the example is flawed; it assumes that "every 1.0 seconds" is actually accurate and uses it as the reference. I wouldn't rely on "every x seconds" for anything that needs to be accurate, as there's little in the documentation about the scheduling of these triggers.

    I modified the capx to keep just the rotate behaviour and change the dt logic to a simple:

    Every tick > set angle to Self.Angle + rotateSpeed * dt

    The accumulated dt error is minimal and the behaviour and dt bars overlap perfectly from a visual point of view; this proves the behaviour logic and dt logic are consistent with each other.

  • I modified the example further, to log the system time every time the dt bar completed a full circle; the difference between each iteration is less than 0.0001 s, which is within the expected margin given the frame time.

    I'd say blame the "every x seconds"; use it for gameplay stuff where accuracy doesn't matter (e.g. "spawn an enemy every 2 seconds"; it doesn't really matter if it's 1.95 or 2.05), but not for timing behaviours.
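
    A sketch of that per-revolution measurement; the constant frame time and the logging are simplified stand-ins for what the modified capx does:

        const rotateSpeed = 90; // degrees per second -> one revolution every 4 s
        const dt = 1 / 60;      // held constant here for simplicity
        let angle = 0;
        let simTime = 0;        // accumulated game time, standing in for the logged clock
        let lastLap = 0;

        for (let tick = 0; tick < 2000; tick++) {
          simTime += dt;
          angle += rotateSpeed * dt;
          if (angle >= 360) {
            angle -= 360; // carry the remainder so the error does not build up lap after lap
            console.log("lap length:", (simTime - lastLap).toFixed(6));
            lastLap = simTime;
          }
        }
        // Successive lap lengths agree to far better than 0.0001 s; only floating
        // point noise separates them.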

  • I'll need to check your version once I'm back on a PC, but then again, if an example is "flawed" because it assumes the documented system features act the way they are presented - is it really flawed?

    Most of my questions are based on the assumption that one can use just the behaviours and basic events to get reliable results, yet almost every solution ends up with "use these custom events with variables" and so on.

    There was another question about "sub-tick" object creation and movement, and it seems the answers there are also "basically programming". Call me lazy, but I would prefer reliable solutions using the built-in systems.

    As for the precision - then why isn't Every X seconds precise if one can get that precision with events?

  • The good news is that you can rely on behaviours.

    I believe Ashley said in other related topics that "Every X seconds" is not affected by the framerate; it runs on real-world time rather than ticks. Therefore I am assuming it must be running on a less accurate timer (the system clock, maybe?), or using a parallel scheduler.

    The only flaw in your example, if my analysis is correct, is that you were using a less accurate method (world time) to time a more accurate method (tick based).

    I'm pretty sure you should be able to replace "every x seconds" in your example with a Timer behaviour, which would then be tick based, and then get consistent results.

    Basically, see "every x seconds" as a way to schedule high-level events, e.g. "every ~20 seconds spawn a new monster", not as something to be used for animation or frame times.

    As for "sub tick", I can definitely see cases where that can be useful (usually used for particle effects, etc.) but that's usually very specific and hard to make generic.
