From Bytes array to InfluxDB

I’m quite new to this forum, working with a Raspberry Pi 3 and an iC880A SPI concentrator board on the US frequency plan. After figuring out how to make it work, I found the part in Applications where I must define a codec or use CayenneLPP.
With CayenneLPP everything goes smoothly and my nodes send the data perfectly, but I have no idea how to use JavaScript to decode my custom payload. To introduce the problem properly, I’ll detail it here:

  • The payload is an array of values:
    [a0, a1, a2, a3, a4, a5, a6, a7, b0, b1, b2, b3, c0]
    Every value with the letter “a” represents a single sensor; the same goes for “b” and “c”.
    Example values:
    [121, 164, 9, 110, 152, 213, 55, 244, 59, 55, 60, 61, 5]
  • The output I want is an array for each group of values, as defined by me.
    Example (following the idea above), I would like to get:
    a = { 121, 164, 9, 110, 152, 213, 55, 244 }
    b = { 59, 55, 60, 61 }
    c = 5
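
The grouping described above can be sketched directly with array slicing (a minimal sketch; the slice boundaries are assumed from the example payload, and the function name is just illustrative):

```javascript
// Sketch: split the 13-byte payload into the three sensor groups.
// Offsets 0-7 = "a", 8-11 = "b", 12 = "c" (assumed from the example above).
function splitPayload(bytes) {
  return {
    a: bytes.slice(0, 8),   // eight "a" sensor values
    b: bytes.slice(8, 12),  // four "b" sensor values
    c: bytes[12]            // single "c" value
  };
}

var decoded = splitPayload([121, 164, 9, 110, 152, 213, 55, 244, 59, 55, 60, 61, 5]);
// decoded.a → [121, 164, 9, 110, 152, 213, 55, 244]
// decoded.b → [59, 55, 60, 61]
// decoded.c → 5
```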

The problem is that I don’t know JavaScript at all. My question is: can anyone help me build the JavaScript codec function to get an output like the above and send it to InfluxDB?
The full idea is the next, just to share the project:

  1. Build a node with analog sensors to check the voltage on the line, the temperature of a DC motor, and the RPM of that motor, like this:
  • Take several measurements of the line voltage to check its stability (that’s why an array of values) and, if necessary, turn off the system to protect the assets.
  • Take several temperature measurements to check the motor isn’t overheating.
  • Take the average RPM over the interval between radio transmissions.
  2. Send the array of values (51 bytes maximum, to comply with the LoRaWAN specs) to the LoRa Server and decode the data in the array.
  3. Push each array to InfluxDB to visualize the state of the asset and trigger alerts if an anomaly is detected or a temperature is higher than expected.

Hope you can help me, I’ll keep trying and digging to learn Javascript.
Thanks in advance! Regards,


After a few hours of reading a little more and doing some tests and failures, I’ve found a way to implement my idea. Here’s the code:

function Decode(fPort, bytes) {
  var i;
  var line_read_length = 32;  // first 32 bytes: 16 little-endian line readings
  var line_raw = [];
  var temp_raw = [];
  var rpm = bytes[48];
  for (i = 0; i < line_read_length; i += 2) {
    line_raw[i / 2] = bytes[i] + bytes[i + 1] * 256;
  }
  for (i = 0; i < 16; i += 2) {  // next 16 bytes: 8 temperature readings
    temp_raw[i / 2] = bytes[i + line_read_length] + bytes[1 + i + line_read_length] * 256;
  }
  return {"Voltage Line Data": line_raw, "Temperature Data": temp_raw, "RPM Data": rpm};
}

and this puts the decoded data in the uplink info like:
Voltage Line Data:
0: 492
1: 525
2: 545
3: 515
4: 491
5: 503
6: 540
7: 522
8: 493
9: 486
10: 523
11: 543
12: 507
13: 484
14: 503
15: 541
Temperature Data:
0: 10
1: 10
2: 11
3: 12
4: 10
5: 10
6: 8
7: 9

RPM: 0

Now the next step is to figure out how I must integrate this with InfluxDB to show it with Grafana.
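
For reference, InfluxDB 1.x ingests points through its HTTP `/write` endpoint using the “line protocol” format (`measurement,tag=value field=value timestamp`). A tiny helper to format one point, just to illustrate what the integration sends under the hood (the measurement and tag names here are only examples):

```javascript
// Sketch: format one InfluxDB 1.x line-protocol point.
// Timestamp is in nanoseconds, per the line protocol.
function toLineProtocol(measurement, tags, fields, nanoTimestamp) {
  var tagStr = Object.keys(tags).map(function (k) {
    return ',' + k + '=' + tags[k];
  }).join('');
  var fieldStr = Object.keys(fields).map(function (k) {
    return k + '=' + fields[k];
  }).join(',');
  return measurement + tagStr + ' ' + fieldStr + ' ' + nanoTimestamp;
}

var line = toLineProtocol('vac', { dev_eui: 'ff00089006437039' }, { value: 492 }, 1500000000000000000);
// → "vac,dev_eui=ff00089006437039 value=492 1500000000000000000"
```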

After a few hours I got the RPM working, but not the array. Does anyone have an idea how to pass the array to Influx? Thanks

Have you checked the InfluxDB integration docs?

Hi Ignacio,
Thanks for the reply. Yes, I’ve checked the documentation, and among the measurements in InfluxDB I see the RPM, but I have no way to do the same with the other objects, because I get an error message in the lora-app-server log; it seems it can’t handle an array of values.
So, just to update the situation: I’m done with the integration, but I’m having problems sending an array of values to InfluxDB. If anyone knows how to do this, I’d appreciate the help.
Thanks in advance.

I don’t believe you can store an array in InfluxDB. I have to look into what LoRa App Server does in this case (I don’t think it’s handled), but potentially it could store this as measurement_name_INDEX, thus turning a temperature: [20, 22, 21] into temperature_0 = 20, temperature_1 = 22, … You could also do this in the custom JS function.
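
The suggested flattening can be done inside the codec itself; a minimal sketch (the helper name is illustrative, not part of LoRa App Server):

```javascript
// Sketch: flatten an array into indexed keys, e.g.
// temperature: [20, 22, 21] → {temperature_0: 20, temperature_1: 22, temperature_2: 21},
// which InfluxDB can store as individual measurements.
function flattenArray(name, values) {
  var out = {};
  for (var i = 0; i < values.length; i++) {
    out[name + '_' + i] = values[i];
  }
  return out;
}

var flat = flattenArray('temperature', [20, 22, 21]);
// → { temperature_0: 20, temperature_1: 22, temperature_2: 21 }
```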

Thanks for the reply. I checked the code inside the Influx handler and figured it out: it can’t handle arrays of values, so I must modify it to make this work. I have two options:

  1. Modify the Go file to add a case for when it gets an array, pushing the values to Influx one by one.
  2. Send the values tagged as you said, and then work out how to make the Grafana side show them in order and in the right way.

For now I’m reading how to build lora-app-server from source so I can try to modify the Go file; I’ll probably post on GitHub if I run into problems, since at the moment I’m having trouble building from source while I already have a running app.
Thanks for the help; any suggestions are welcome.

Okay, I finally have a solution for this problem.
What I’ve done is not exactly the best solution, but it’s the simplest.
If you don’t want to mess with the Go code, you can take the payload and convert each array into an object with indexed keys that “tag” the measurements; once that’s done, the integration will push the data to Influx as separate measurements. The code here:

function Decode(fPort, bytes) {
  var i;
  var vac_read_length = 32;  // first 32 bytes: 16 little-endian VAC readings
  var vac_raw = {};          // objects with indexed keys instead of arrays
  var temp_raw = {};
  var rpm = bytes[48];
  for (i = 0; i < vac_read_length; i += 2) {
    vac_raw[i / 2] = bytes[i] + bytes[i + 1] * 256;
  }
  for (i = 0; i < 16; i += 2) {  // next 16 bytes: 8 temperature readings
    temp_raw[i / 2] = bytes[i + vac_read_length] + bytes[1 + i + vac_read_length] * 256;
  }
  return {VAC: vac_raw, Temperature: temp_raw, RPM: rpm};
}

Under these conditions, after decoding, you’ll get measurements such as device_frmpayload_data_VAC_0 … device_frmpayload_data_VAC_15 in your InfluxDB.
But the order of the elements should be:

device_frmpayload_data_VAC_0, element 0
device_frmpayload_data_VAC_1, element 0
device_frmpayload_data_VAC_2, element 0
device_frmpayload_data_VAC_15, element 0
device_frmpayload_data_VAC_0, element 1
device_frmpayload_data_VAC_1, element 1
device_frmpayload_data_VAC_15, element 1
device_frmpayload_data_VAC_0, element N
device_frmpayload_data_VAC_15, element N

Well, to do this I implemented a Python script that reads, e.g., all the VAC (line voltage sensor) data of a specific device and puts it in the right order into another measurement in the DB with the same DevEUI tag. An example of the code in Python 3:
import time
from datetime import datetime
from datetime import timedelta
import pytz
from influxdb import InfluxDBClient

cliente = InfluxDBClient(host='localhost', port=8086)
# NOTE: select the database your integration writes to before querying,
# e.g. cliente.switch_database('your_database')
json_body = []
readSize = 256
aux = [0.0] * readSize  # holds the readSize values in chronological order

# Gather the points from the 16 split measurements and interleave them back
# into chronological order (VAC_0[0], VAC_1[0], ..., VAC_15[0], VAC_0[1], ...)
for i in range(0, 16, 1):
    consultar = 'SELECT "value" FROM "autogen"."device_frmpayload_data_VAC_'
    consultar = consultar + str(i) + '"'
    results = cliente.query(consultar)
    points = results.get_points()
    e = 0
    for point in points:
        idx = e * 16 + i
        if idx < readSize:
            aux[idx] = point['value']
        e += 1

now = datetime.now(pytz.timezone("America/Argentina/La_Rioja"))

# Write the reordered values into a single measurement, spacing timestamps 2 ms apart
for m in range(0, readSize, 1):
    d = timedelta(hours=3, milliseconds=2 * m)
    instante = (now + d).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    valor = float(aux[m] / 1024 * 220)  # scale raw ADC counts to volts
    escritura = {"measurement": "VAC_ff00089006437039",
                 "tags": {"DeviceID": "ff00089006437039", "GW": "b000ebfffefff627"},
                 "time": instante,
                 "fields": {"value": valor}}
    json_body.append(escritura)

cliente.write_points(json_body)

The script just uses the InfluxDB Python client API with datetime objects to write JSON-formatted values, collecting 256 measurements with the same tag from 16 different tables and putting them all together. It runs as a service every 5 minutes, when it expects to have around 256 values, some of them new and some older. You can improve the query if you want, e.g. take only the last 256 values, wait for new values, or drop the data after copying it into the new measurement.
Anyway, I think it’s enough to give some idea to someone with the same problem, and it needs to be improved for better performance.
Suggestions are welcome, and I repeat: this is NOT the best way to solve this. I’m working on the Go file, but without success yet.

I’d encourage you to do it directly from lora-app-server using a Go client, but if you’re more comfortable with Python or don’t want to get into the source, I think at this point it’s better to just use the HTTP integration to get your data out. That way, your Python service can receive the data, format it however you need, and then insert it into InfluxDB, keeping the data simple and with only the needed measurements.