I am looking for an algorithm that assesses the stability of a noisy signal over time. To be more specific: let's say a program sets a new setpoint on a temperature controller, and the algorithm inside the program should now decide whether the setpoint has been reached and, more importantly, whether the temperature can be considered constant.
I couldn't find anything useful on the internet.
Of course I have some ideas about how to tackle this, like fitting a linear regression in a moving window and checking the slope, but since this seems like a fairly basic and general problem, I assume there are many tried-and-tested solutions for it.
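To illustrate what I mean, here is a minimal sketch of the moving-window idea (the function name, window size, and tolerance values are just placeholders I made up; they would obviously need tuning for a real process):

```python
import numpy as np

def is_settled(samples, setpoint, window=30, slope_tol=0.01, band=0.5):
    """Decide whether the last `window` samples sit inside a tolerance
    band around `setpoint` and the fitted trend is flat enough to call
    the signal constant. Thresholds (slope_tol in units/sample, band in
    signal units) are illustrative only."""
    if len(samples) < window:
        return False  # not enough data yet
    w = np.asarray(samples[-window:], dtype=float)
    # 1) every sample must lie inside the tolerance band
    if np.any(np.abs(w - setpoint) > band):
        return False
    # 2) the least-squares slope over the window must be near zero
    t = np.arange(window)
    slope = np.polyfit(t, w, 1)[0]
    return abs(slope) <= slope_tol
```

This sort of works, but I have no principled way to choose the window length or the thresholds, which is why I suspect there is a more standard approach.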
Some useful keywords would be great as a starting point for further searching.
Thanks in advance