Estimating Limits & Costs of Azure

Hi all,

We’re currently using an Azure VM to run ChirpStack and a MING stack for data handling and retention. Right now we’re pretty bare-bones on a Standard B1ms (1 vCPU, 2 GiB memory), and it’s been stable. My question is: how do we estimate the capacity and limiting factors of our setup? I know Azure has a cost estimator, but I’m not sure how soon RAM becomes the limiting factor versus CPU, versus the cost of network traffic to/from the server.

We currently have 15 devices uploading data every 10 minutes, and the payloads are lightweight (1.16 kB). I’ve included some screen caps of the server load showing network traffic and free RAM; CPU stayed below 10% throughout. Around 11 am (indicated by the vertical line), we changed the sample rate from 10 minutes to 1 minute as a rough proxy for ten times as many devices. Everything held up fine during that window, so my best guess after simulating the higher load is that we can handle at least 150 devices.
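For what it’s worth, the raw message and bandwidth numbers behind that test are easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming the 1.16 kB payload from above and ignoring protocol overhead and downlinks:

```python
# Back-of-envelope uplink load estimate.
# Assumptions: fixed payload size (1.16 kB as measured above),
# uniform reporting interval, no protocol overhead or downlinks.

def load_estimate(devices: int, interval_s: float, payload_bytes: float = 1160.0):
    """Return (messages per second, bytes per second) for the fleet."""
    msgs_per_sec = devices / interval_s
    bytes_per_sec = msgs_per_sec * payload_bytes
    return msgs_per_sec, bytes_per_sec

# 15 devices every 10 minutes (the baseline):
baseline = load_estimate(15, 600)
# 15 devices every 1 minute -- equivalent load to 150 devices at 10 minutes:
stress = load_estimate(15, 60)

print(baseline)  # ~0.025 msg/s, ~29 B/s
print(stress)    # ~0.25 msg/s, ~290 B/s
```

Even the stress case is only a few hundred bytes per second, which is consistent with CPU and RAM barely moving during the test.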


Is there a good way to estimate the limits of our server and the number of devices we can support? I’ve been tasked with producing an estimated cost to operate this server but don’t know how to make an educated guess. I was originally planning to request upgrades as we needed them, but now I need to present an estimated cost per device.
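One way to get a defensible first number is to amortize the fixed VM cost over the fleet and add estimated egress. A hedged sketch, where both price constants are placeholders you’d replace with the actual rates from the Azure pricing calculator for your region and SKU:

```python
# Rough cost-per-device model.
# VM_MONTHLY_USD and EGRESS_USD_PER_GB are PLACEHOLDERS, not real Azure
# prices -- look up the B1ms (or upgraded SKU) rate and outbound data
# transfer rate for your region, and add disk/backup charges as needed.

VM_MONTHLY_USD = 15.0      # placeholder: flat monthly VM cost
EGRESS_USD_PER_GB = 0.09   # placeholder: outbound data transfer per GB

def monthly_cost_per_device(devices: int,
                            payload_kb: float = 1.16,
                            interval_min: float = 10.0) -> float:
    """Estimated monthly USD per device: amortized VM cost plus egress.

    Pessimistically assumes every uplink is forwarded back out of the VM
    (e.g. to a dashboard or downstream consumer), so all traffic is egress.
    """
    msgs_per_month = devices * (30 * 24 * 60) / interval_min
    egress_gb = msgs_per_month * payload_kb / 1e6
    total = VM_MONTHLY_USD + egress_gb * EGRESS_USD_PER_GB
    return total / devices

print(f"150 devices: ${monthly_cost_per_device(150):.2f}/device/month")
```

Because the VM is a fixed cost, per-device cost drops as the fleet grows, so it’s worth quoting the figure at a few fleet sizes (15, 50, 150) rather than a single number.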

You could try using a LoRaWAN simulator, which simulates devices and gateways, to measure the actual cost at different device counts. Obviously you’d have to pay that cost for the duration of the test, but it should give you fairly accurate results.