OK. I printed out some computed values showing what happens at 60 fps in the game:
Code:
0.205800 3.430000 -0.004200 -0.00025200
0.201684 6.791400 -0.004116 -0.00024696
0.197650 10.085572 -0.004034 -0.00024202
0.193697 13.313861 -0.003953 -0.00023718
0.189823 16.477583 -0.003874 -0.00023244
0.186027 19.578032 -0.003796 -0.00022779
...
0.067745 116.174625 -0.001383 -0.00008295
0.066390 117.281133 -0.001355 -0.00008129
0.065063 118.365510 -0.001328 -0.00007967
0.063761 119.428200 -0.001301 -0.00007808
0.062486 120.469636 -0.001275 -0.00007651
The first column is the speed as currently calculated, the second is r, the third is the speed difference between logic updates, and the fourth is the acceleration calculated from your formula.
So it appears that the (desired) acceleration is not constant, and the change in acceleration doesn't appear to be linear either.
The math is getting quite hard. I wonder if I could find a linear equation for the drop in acceleration that approximates the current behaviour, and hence end up with something like the following:
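Out of curiosity, I took the ratios of consecutive rows from the printout above (a quick Python check; the numbers are copied straight from the table). If the ratio between consecutive values is constant, the decay is geometric (i.e. exponential), rather than linear:

```python
# Values copied from the first six rows of the printout above.
speeds = [0.205800, 0.201684, 0.197650, 0.193697, 0.189823, 0.186027]
accels = [-0.00025200, -0.00024696, -0.00024202, -0.00023718,
          -0.00023244, -0.00022779]

# Ratio of each value to the previous one; a constant ratio means
# each frame multiplies the value by a fixed factor.
speed_ratios = [b / a for a, b in zip(speeds, speeds[1:])]
accel_ratios = [b / a for a, b in zip(accels, accels[1:])]

print(speed_ratios)  # each ratio is ~0.98
print(accel_ratios)  # each ratio is ~0.98
```

Both columns seem to shrink by a factor of about 0.98 per frame, which would mean exponential decay.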
Code:
acceleration += acceleration * delta_time * some_constant   // some_constant < 0, so acceleration decays each frame
speed += acceleration * delta_time                          // integrate acceleration into speed
r += speed * delta_time                                     // integrate speed into position r
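For what it's worth, seeding that update rule with the first row of the printout seems to reproduce the speed column almost exactly, if some_constant is chosen to match the ~0.98 per-frame decay. This is just a sketch: the starting values are taken from the table, but the value -1.2 and the per-second conversion of the acceleration are my assumptions:

```python
DT = 1.0 / 60.0

# Hypothetical constant: with the update below, acceleration shrinks by a
# factor of (1 + SOME_CONSTANT * DT) each frame; matching a 0.98 per-frame
# decay gives SOME_CONSTANT = 60 * (0.98 - 1) = -1.2.
SOME_CONSTANT = -1.2

speed = 0.205800               # first speed value from the printout
acceleration = -0.004200 / DT  # per-second form of the per-frame speed delta
r = 3.430000                   # first r value from the printout

for _ in range(5):
    acceleration += acceleration * DT * SOME_CONSTANT
    speed += acceleration * DT
    r += speed * DT  # note: the r column in the printout grows much faster
                     # than speed * DT, so there is presumably a unit
                     # conversion in the game that this sketch doesn't model

print(round(speed, 6))  # -> 0.186027, matching the sixth row of the printout
```

So the proposed loop would match the observed speeds, but note that `acceleration += acceleration * dt * c` gives exponential decay of acceleration, not a linear drop.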