You don't need to change your monitor's refresh rate to test different framerates - change the framerate to 'Fixed' at 10fps in the application properties, or set it to 'Unlimited' (which should reach at least a few hundred frames per second), and the difference will become obvious.
The problem is that framerate-dependent code (like 'set X to .X + 1') doesn't have a fixed speed - it depends on the framerate - whereas TimeDelta code does have a fixed speed. So how can you possibly convert between them? TimeDelta always means one speed, but without TimeDelta it could be any speed, depending on your framerate. So to convert to or from framerate-dependent code, you have to pick a framerate as your reference, and 60fps is a common refresh rate, so it's a good choice.
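To make the conversion concrete, here's a minimal sketch in Python (Construct events aren't text, so this is just an analogy - the function names are made up for illustration). It shows the 60fps reference framerate idea: a per-frame speed times 60 gives the equivalent per-second speed for TimeDelta code.

```python
# Converting a framerate-dependent speed to a framerate-independent one,
# using 60fps as the reference framerate. Illustrative names only.

REFERENCE_FPS = 60

def per_frame_to_per_second(speed_per_frame):
    # 'set X to .X + 1' moves 1 pixel per frame; at 60fps that's
    # 60 pixels per second, so multiply by the reference framerate.
    return speed_per_frame * REFERENCE_FPS

def move(x, speed_per_second, time_delta):
    # TimeDelta version: distance = speed (px/s) * frame time (s)
    return x + speed_per_second * time_delta

speed = per_frame_to_per_second(1)   # 1 px/frame -> 60 px/s
x = move(0.0, speed, 1/60)           # one frame at 60fps moves 1 px
```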
My advice, though, is to completely ditch framerate-dependent code. Forget it. Just always use TimeDelta, and get everything moving at the right speed using that. Don't bother converting between them - it's totally unnecessary if you start making your game with TimeDelta right off the bat. Also, don't forget the built-in behaviors are already framerate-independent, so you don't need to worry about them.
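You can see why this works with a quick simulation (again a Python sketch as an analogy, not actual Construct events): run one second of movement at two very different framerates, and the TimeDelta version covers the same distance in both.

```python
# Why TimeDelta code runs at the same speed at any framerate:
# simulate one second of movement at two framerates and compare.

SPEED = 100.0  # pixels per second

def simulate(fps, seconds=1.0):
    dt = 1.0 / fps          # TimeDelta for each frame at this framerate
    x = 0.0
    for _ in range(int(fps * seconds)):
        x += SPEED * dt     # like 'set X to .X + 100 * TimeDelta'
    return x

# Both reach (approximately) 100 pixels after one second:
distance_slow = simulate(10)    # 10 frames of 0.1s each
distance_fast = simulate(100)   # 100 frames of 0.01s each
```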
I hope that's clear - it can be a tricky area to explain! By the way, you should avoid the 'Every X ticks' event in games, because as the name suggests, it runs depending on the framerate (every 10 ticks is once a second at 10fps, but ten times a second at 100fps!). You always want to use 'real world' times, so use 'every X milliseconds'. TimeDelta is another way of using real world times, so with the two, your game should run at the same speed everywhere!
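For the curious, here's roughly how an 'every X milliseconds' style timer works under the hood - a Python sketch of the idea, not Construct's actual implementation (the built-in condition does this for you). It accumulates real elapsed time instead of counting ticks, so it fires at the same rate at any framerate.

```python
# An 'every X milliseconds' timer built on real elapsed time
# rather than tick counts, so the framerate doesn't matter.

def count_firings(fps, interval_ms=500, seconds=2.0):
    dt_ms = 1000.0 / fps    # frame time in milliseconds
    elapsed = 0.0
    firings = 0
    for _ in range(int(fps * seconds)):
        elapsed += dt_ms
        while elapsed >= interval_ms:   # fire once per interval passed
            firings += 1
            elapsed -= interval_ms
    return firings

# Fires 4 times in 2 seconds, regardless of framerate:
at_10fps = count_firings(10)
at_100fps = count_firings(100)
```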