ADR algorithm to handle outliers

We are currently analyzing ADR issues:
It is correct that the default algorithm looks for the highest SNR value in the uplink history, and that this maximum is used even when it occurs only once. In our case there was one message with SNR = 6, while all other messages had SNR values between -3 and -10. Because of that single value of 6, ADR changed the data rate from DR4 to DR5 and the transmit power index from 0 to 1. After that we noticed a lot of missing messages, and it took 1-2 days until ADR fixed the problem again, presumably by setting the tx_power index back to 0.
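As a rough sanity check, here is a back-of-the-envelope version of the step calculation. The formula (nStep = margin / 3 dB), the SF8/DR4 demodulation floor of -10 dB, and the 10 dB installation margin are assumptions based on the commonly described ADR algorithm, not necessarily ChirpStack's exact defaults:

```go
package main

import "fmt"

// Back-of-the-envelope ADR step calculation for the case described above.
// The formula (nStep = margin / 3 dB), the SF8/DR4 demodulation floor and
// the 10 dB installation margin are assumptions, not ChirpStack's exact code.
func main() {
	const (
		requiredSNR        = -10.0 // assumed demodulation floor for SF8 (DR4 in EU868)
		installationMargin = 10.0  // assumed default installation margin in dB
	)

	maxSNR := 6.0 // the single outlier dominates the max of the uplink history

	margin := maxSNR - requiredSNR - installationMargin // 6 dB
	steps := int(margin / 3)                            // 2 steps

	// One step raises the data rate (DR4 -> DR5), the next lowers the
	// transmit power (index 0 -> 1), matching the observed behaviour.
	fmt.Printf("margin = %.1f dB, steps = %d\n", margin, steps)
}
```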

One idea could be to change the implementation of the ADR algorithm so that it removes peak SNR values, for example by calculating the mean and standard deviation of the history and dropping all values outside that window (see the sketch below).
We don't want to just change installation_margin, because that would also affect sensors that have no outliers.
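A minimal sketch of that idea could look like the following. This is not ChirpStack's actual ADR code; the function name, the 1-sigma window used in the example, and the example history are illustrative assumptions:

```go
package main

import (
	"fmt"
	"math"
)

// filteredMaxSNR returns the maximum SNR after discarding values that lie
// more than k standard deviations away from the mean of the history.
// This is a sketch of the proposed outlier filter, not ChirpStack's
// actual ADR implementation.
func filteredMaxSNR(history []float64, k float64) float64 {
	if len(history) == 0 {
		return 0
	}

	// mean of the history
	var sum float64
	for _, v := range history {
		sum += v
	}
	mean := sum / float64(len(history))

	// standard deviation of the history
	var sq float64
	for _, v := range history {
		sq += (v - mean) * (v - mean)
	}
	std := math.Sqrt(sq / float64(len(history)))

	// keep only values inside mean ± k*std, then take the maximum of the rest
	best := math.Inf(-1)
	for _, v := range history {
		if math.Abs(v-mean) <= k*std && v > best {
			best = v
		}
	}
	if math.IsInf(best, -1) {
		// everything was filtered out; fall back to the plain mean
		return mean
	}
	return best
}

func main() {
	// SNR history similar to the case described above: one outlier of +6 dB
	history := []float64{-5, -7, -3, -10, -6, 6, -8, -4}
	fmt.Println(filteredMaxSNR(history, 1.0)) // prints -3: the outlier is discarded
}
```

With this example history the +6 dB sample falls outside the 1-sigma window and is discarded, so the margin calculation would be based on a realistic maximum of about -3 dB instead of +6 dB.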


I think that makes a lot of sense. I have also observed that ChirpStack is much more optimistic with ADR than The Things Stack. I think using the mean SNR value would yield much better results. By the way, is it really only the SNR that is used, and not the RSSI?
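To illustrate the point, here is a rough comparison using the same assumed constants as in the back-of-the-envelope calculation above (again an assumption, not ChirpStack's actual code); the mean SNR leaves no margin, so no upward step would be taken:

```go
package main

import "fmt"

// Compares the margin derived from the maximum SNR with the margin derived
// from the mean, using the same assumed constants as the earlier example.
func main() {
	const (
		requiredSNR        = -10.0 // assumed SF8/DR4 demodulation floor
		installationMargin = 10.0  // assumed installation margin in dB
	)

	maxSNR := 6.0   // dominated by the single outlier
	meanSNR := -4.6 // rough mean of an example history like the one described

	fmt.Printf("margin from max SNR:  %.1f dB\n", maxSNR-requiredSNR-installationMargin)  // 6.0 dB -> 2 steps up
	fmt.Printf("margin from mean SNR: %.1f dB\n", meanSNR-requiredSNR-installationMargin) // -4.6 dB -> no step up
}
```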

