Is there a way to do this? I want to always show one decimal place after my number, even when the value is a whole number like 1.0. I know I can do Int(Number*10)/10 to get 1.1, 1.2, etc., which limits the result to one decimal place, but if the original number was 1.0 it just returns '1'.
So for 1 I want to show 1.0
For 1.1 I want to show 1.1
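The question doesn't name a language, but the usual fix is to treat this as a *formatting* problem rather than a rounding problem: keep the number as-is and format it with a fixed number of decimal places when displaying it. A minimal sketch in Python, assuming string formatting is acceptable:

```python
# ":.1f" always renders exactly one digit after the decimal point,
# padding whole numbers with ".0" instead of dropping the fraction.
for value in (1, 1.1):
    print(f"{value:.1f}")
# prints:
# 1.0
# 1.1
```

Most languages have an equivalent fixed-point format specifier (e.g. `%.1f` in C-style `printf`), so the same idea carries over directly.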