RX1_DELAY - please clarify

Hello,
I am using OTAA join on the EU433 band and everything is working correctly.
Next, I am trying to change the RX1 delay from the default of 1 second to 2 seconds.
This is what I found:

  1. RX parameter (re)configuration - ChirpStack open-source LoRaWAN® Network Server

  2. [network_server.network_settings]

     Class A RX1 delay
     0=1sec, 1=1sec, … 15=15sec. A higher value means ChirpStack Network Server has more
     time to respond to the device as the delay between the uplink and the
     first receive-window will be increased.

     rx1_delay=1
What I did:
I set
[network_server.network_settings]
rx1_delay=2
and restarted the LoRa Server.
I expected that an RxTimingSetupReq would be sent to the nodes during the OTAA join, but it is not.
This is my join-request log from the device:
7c6dd531557ebe200a1af26c99b8c91

Any help would be greatly appreciated.

This is a MAC command. MAC commands are not sent as part of the OTAA join-accept. Please note that the join-accept message does contain the RX1 delay value.
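To illustrate that last point, here is a minimal sketch (not ChirpStack code) of reading the RX1 delay out of a LoRaWAN 1.0.x join-accept, assuming the standard layout AppNonce(3) | NetID(3) | DevAddr(4) | DLSettings(1) | RxDelay(1) | optional CFList(16) on an already-decrypted payload; the function name and example payload are mine.

package main

import (
	"errors"
	"fmt"
	"time"
)

// rxDelayFromJoinAccept extracts the RxDelay field from an already-decrypted
// LoRaWAN 1.0.x join-accept payload (MHDR and MIC stripped). Assumed layout:
// AppNonce(3) | NetID(3) | DevAddr(4) | DLSettings(1) | RxDelay(1) | CFList(0 or 16 bytes).
func rxDelayFromJoinAccept(p []byte) (time.Duration, error) {
	if len(p) != 12 && len(p) != 28 {
		return 0, errors.New("unexpected join-accept payload length")
	}
	// Only the lower 4 bits of the RxDelay octet are used; 0 and 1 both mean 1 second.
	d := p[11] & 0x0f
	if d == 0 {
		d = 1
	}
	return time.Duration(d) * time.Second, nil
}

func main() {
	// Hypothetical decrypted payload with RxDelay = 2 (all other fields zeroed).
	payload := make([]byte, 12)
	payload[11] = 0x02

	delay, err := rxDelayFromJoinAccept(payload)
	if err != nil {
		panic(err)
	}
	fmt.Println("RX1 delay announced in join-accept:", delay) // 2s
}

So a device that applies the join-accept correctly should already use the new rx1_delay for the session, even before any RxTimingSetupReq is sent.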

Thank you for your reply.
When will the RxTimingSetupReq be sent to the device?


What could be the “first opportunity”?

For class A devices (which is probably all of your devices initially), downlinks can only be sent in response to uplinks. So the “first opportunity” would be when the device sends an ordinary uplink in the new session. I’d expect it in response to one of the first few uplinks of the joined session, or, in the case of a configuration change, in response to one of the first few uplinks after the change was committed.

Even if the server “wants” to send it, it cannot do so until the device polls the server by sending an uplink (hopefully a useful uplink and not just an empty poll).
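To make the “piggybacked on a class A downlink” part concrete, here is a hedged sketch (again, not ChirpStack’s actual implementation) of encoding the RxTimingSetupReq MAC command as defined in LoRaWAN 1.0.x (CID 0x08 plus one settings octet); the function name and error handling are mine.

package main

import "fmt"

// rxTimingSetupReq encodes the RXTimingSetupReq MAC command (CID 0x08) for the
// given RX1 delay in seconds (1..15). At two bytes it fits in the FOpts field of
// an ordinary class A downlink, so the network server queues it and sends it in
// the receive window that follows the device's next uplink.
func rxTimingSetupReq(delaySeconds uint8) ([]byte, error) {
	if delaySeconds < 1 || delaySeconds > 15 {
		return nil, fmt.Errorf("RX1 delay out of range: %d", delaySeconds)
	}
	// Settings octet: bits 3..0 carry the delay (a value of 0 would also mean 1 second).
	return []byte{0x08, delaySeconds & 0x0f}, nil
}

func main() {
	cmd, err := rxTimingSetupReq(2)
	if err != nil {
		panic(err)
	}
	// The device is expected to confirm with RxTimingSetupAns in a following uplink.
	fmt.Printf("queued MAC command: % X\n", cmd) // 08 02
}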