We are currently analyzing ADR issues:
It is correct that the default algorithm looks for the highest SNR value in the uplink history, and that this highest value is used even when it occurs only once. In our case there was a single message with SNR = 6, while all other messages had SNR values around -10. Because of that one outlier, ADR changed the data rate from 4 to 5 and the transmit power index from 0 to 1. Afterwards we noticed a lot of missing messages, and it took 1-2 days until ADR could fix the problem again, presumably by setting tx_power back to 0.
One idea could be to change the implementation of the ADR algorithm so that it removes peak SNR values. For example: calculate the mean and standard deviation of the history and discard all values outside that window before picking the maximum.
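The filtering idea could be sketched roughly like this (a minimal Python sketch for illustration only, not the actual ADR implementation; the function name, the `k = 2` cutoff, and the 20-uplink history size are assumptions):

```python
import statistics

def max_snr_without_outliers(snr_history, k=2.0):
    """Return the max SNR after discarding values more than k standard
    deviations from the mean. Hypothetical outlier filter, not the
    real ADR code."""
    if len(snr_history) < 2:
        # Too few samples to estimate a spread; use the plain max.
        return max(snr_history)
    mean = statistics.mean(snr_history)
    stdev = statistics.pstdev(snr_history)
    filtered = [s for s in snr_history if abs(s - mean) <= k * stdev]
    # Fall back to the raw max if the filter removed everything.
    return max(filtered) if filtered else max(snr_history)

# 20-uplink history: mostly around -10 dB, plus one +6 dB spike.
history = [-10] * 10 + [-9] * 5 + [-11] * 4 + [6]
print(max(history))                     # plain max: 6 (the outlier)
print(max_snr_without_outliers(history))  # filtered max: -9
```

With the spike filtered out, ADR would base its margin calculation on -9 dB instead of 6 dB, so a single lucky uplink would no longer push the data rate up.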
We don't want to change only the installation_margin, because that would also influence sensors with no outliers.