Downlink Gateway Selection - 2018 vs now

Hi everyone.

According to @brocaar back in 2018, that version of ChirpStack used slightly different logic: it selected the downlink gateway based on SNR, like today, but it would switch to RSSI past a certain point. The behaviour changed with GitHub PR #464 (Implement downlink gateway randomization option · brocaar/chirpstack-network-server).

I also read in the original GitHub issue that RSSI could be a better metric past some point, since SNR readings become saturated at higher signal levels: Choosing the right gateway on Join or Confirm (Issue #78 · brocaar/chirpstack-network-server).
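To make sure I have understood the old behaviour correctly, here is a minimal Go sketch of the pre-2018 selection as I read it from issue #78. The rxInfo struct, the selectDownlinkGateway function, and the requiredSNR/margin values are all my own illustrative assumptions, not the actual chirpstack-network-server code.

```go
package main

import (
	"fmt"
	"sort"
)

// rxInfo holds the uplink metadata reported by one gateway.
// Field names are illustrative, not ChirpStack's actual structs.
type rxInfo struct {
	GatewayID string
	RSSI      int     // dBm
	SNR       float64 // dB
}

// selectDownlinkGateway sketches the pre-2018 style selection as I
// understand it: rank candidates by SNR, but once SNR is comfortably
// above the minimum required for the uplink data rate (requiredSNR +
// margin) it stops being informative, so fall back to RSSI instead.
func selectDownlinkGateway(rx []rxInfo, requiredSNR, margin float64) rxInfo {
	sort.Slice(rx, func(i, j int) bool {
		// When both gateways exceed the SNR threshold, compare RSSI.
		if rx[i].SNR >= requiredSNR+margin && rx[j].SNR >= requiredSNR+margin {
			return rx[i].RSSI > rx[j].RSSI
		}
		// Otherwise prefer the higher SNR.
		return rx[i].SNR > rx[j].SNR
	})
	return rx[0]
}

func main() {
	gateways := []rxInfo{
		{GatewayID: "gw-a", RSSI: -40, SNR: 10.0},
		{GatewayID: "gw-b", RSSI: -35, SNR: 9.5},
		{GatewayID: "gw-c", RSSI: -90, SNR: 2.0},
	}
	// Assumed values: requiredSNR = -7.5 dB (roughly the SF7 limit)
	// and a 10 dB margin, so the threshold is 2.5 dB.
	best := selectDownlinkGateway(gateways, -7.5, 10)
	fmt.Println("selected:", best.GatewayID)
}
```

With these example values, gw-a and gw-b are both above the SNR threshold, so the stronger RSSI (gw-b) wins even though gw-a has the higher SNR, while gw-c is still ranked purely on SNR. That is the behaviour I am asking about.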

So why was this design reversed in 2018? Are we losing out on something important?


The reason I bring this up is that I have been facing a curious problem in my deployment: a street lighting project with many lamps in a single compound, each fitted with a Class C lamp controller.
Because the site is heavily urbanized, there are many gateways installed, all in close proximity. The positions of the nodes cannot be changed.

A handful of nodes cannot receive downlinks from most gateways, which I believe is linked to the nodes' receive sensitivity. Unfortunately, the gateway with the highest uplink SNR (which can be as high as 10 dB) is consistently the wrong choice for reaching those nodes. The project team added a gateway closer to the affected nodes and it works well, but the existing downlink gateway selection algorithm doesn't seem to be a good fit: only 2 of the 5 gateways can actually deliver downlinks to these nodes, so it has a good chance of making the wrong choice.

The pre-2018 algorithm could possibly help my case, but I don't understand its history well enough to say for sure whether it is a good or bad idea.

I don't really have an RF background either; I took a module covering some RF theory, but clearly I don't know enough. I'm a software developer who has spent considerable time with LoRaWAN.