When we set a fixed framerate, as I understand it now, we basically set the speed at which the computer repeats its processing & rendering cycle, which is why there is no lag.
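If I sketch my mental model of a fixed-framerate loop in code (the function and its parameters are my own invention, not any real engine's API), it would look roughly like this: do the work, then sleep away whatever is left of the frame's time budget:

```python
import time

def run_fixed_framerate(target_fps, seconds, update=None, render=None):
    """Toy fixed-framerate loop: process + render, then sleep out the
    remainder of each frame's time budget. Returns frames completed."""
    frame_budget = 1.0 / target_fps
    end = time.monotonic() + seconds
    frames = 0
    while time.monotonic() < end:
        start = time.monotonic()
        if update:
            update()   # process input / advance game state
        if render:
            render()   # draw the frame
        # sleep whatever is left of this frame's time slice
        leftover = frame_budget - (time.monotonic() - start)
        if leftover > 0:
            time.sleep(leftover)
        frames += 1
    return frames
```

So the loop itself is slowed down to the target rate; there is never a pile of finished frames waiting to be shown, which would fit why I see no lag with this setting.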
When we set V-Sync, it is possible to get lag if the processing and rendering is much faster than the refresh rate of the screen. I don't understand this. Does that mean V-Sync doesn't set the speed of the processing cycles at all, but just lets them run and only allows a new image to be displayed once the screen is finished with the old one? Why doesn't it simply work like a fixed framerate set to the refresh rate of the screen?
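The way I currently picture V-Sync (just my mental model, not real driver code): rendering finishes whenever it finishes, but the buffer swap blocks until the screen's next refresh tick. A toy simulation of that idea:

```python
import math

def simulate_vsync(refresh_hz, render_time, seconds):
    """Toy V-Sync model: a frame takes render_time to produce, but the
    swap waits (blocks) until the next refresh tick of the screen.
    Returns how many frames actually get shown within `seconds`."""
    refresh_period = 1.0 / refresh_hz
    t = 0.0
    frames_shown = 0
    while t < seconds:
        t += render_time                   # CPU/GPU finish the frame
        # the swap blocks until the next multiple of the refresh period
        t = math.ceil(t / refresh_period) * refresh_period
        frames_shown += 1
    return frames_shown
```

In this model, no matter how small render_time is, the displayed framerate is capped at roughly refresh_hz, because the extra speed is spent waiting at the swap rather than producing extra visible frames.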
Now let me define lag. Lag means executing an input but having the related action appear on screen delayed, not in time with the input.
Now if it is that way, I don't understand another thing. If the lag is due to the speed difference, it should theoretically get worse over time. Say the refresh rate of the screen is 60 Hz and the processing runs at 120 FPS, twice as fast. To keep the numbers small, let the screen show 2 frames per second and the PC produce 4. Then it would go like this, or not?
1st second - Screen (showing): 2 frames; PC (has): 4 frames
2nd second - Screen: 4 frames; PC: 8 frames
3rd second - Screen: 6 frames; PC: 12 frames
In other words, in this example we start with a difference of 2 frames, and after 3 seconds we are lagging 6 frames behind.
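My arithmetic above, spelled out as code (this just reproduces my own assumption that every frame the PC produces is queued up and shown in order; if my conclusion is wrong, the flaw must be in that assumption):

```python
def backlog(screen_fps, pc_fps, seconds):
    """Frames shown vs. frames produced, under my (possibly wrong)
    assumption that produced frames accumulate and are shown in order."""
    shown = screen_fps * seconds       # what the screen has displayed
    produced = pc_fps * seconds        # what the PC has finished
    return shown, produced, produced - shown

backlog(2, 4, 1)   # (2, 4, 2)  -> 2 frames behind after 1 second
backlog(2, 4, 3)   # (6, 12, 6) -> 6 frames behind after 3 seconds
```

Under this assumption the backlog grows without bound, which is exactly the ever-increasing lag I would expect but never actually observe.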
Why does none of this match what I actually see?
I just recently tried to read up on this again, to see what causes screen tearing. Interestingly enough, I understand how it happens, but it never happens, even when I experiment on my laptop. I know its refresh rate is 60 Hz, but no matter what fixed framerate I set, it never tears. So I don't get that either.
I feel like I missed something.
Thank you for any help clarifying this.