Reading data from the Core comes back in a weird format

When using the code examples from the documentation, I tried this:

int loopCount = 0;

void setup()
{
    Spark.variable("loopCount", &loopCount, INT);
}

void loop()
{
    loopCount++;
}

But when I call the API to get the value of loopCount, I get this JSON back:

{
  "cmd": "VarReturn",
  "name": "loopCount",
  "result": [  0,0,1,12 ],
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-16T04:22:32.223Z",
    "connected": false,
    "deviceID": "xxxxxxxxxxxxxxxxxxx"
  }
}

The problem is that the value is shown as 4 bytes instead of a single integer. The value you see above is actually the byte array 0,0,1,12, which stands for 268. Why don't I get just an integer back?
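
For reference, this is how I'm combining the four big-endian bytes myself for now. A minimal standalone C++ sketch (the helper name is just illustrative):

#include <cstdint>
#include <cstdio>

// Combine four big-endian bytes (the "result" array above) into a 32-bit value.
uint32_t bytesToUint32(const uint8_t b[4]) {
    return (uint32_t(b[0]) << 24) | (uint32_t(b[1]) << 16) |
           (uint32_t(b[2]) << 8)  |  uint32_t(b[3]);
}

int main() {
    uint8_t result[4] = {0, 0, 1, 12};
    std::printf("%u\n", bytesToUint32(result));  // prints 268
    return 0;
}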

Yup, we know—super weird, eh? This one’s already on my sprint—you can expect it to be fixed by the end of this week.

Cool, just wanted to make sure you guys were aware. At first I thought it was totally wrong before I realized that it was a byte array. I also noticed that if you use a negative number as the return value of the Spark.function() call, you get some super large number. Can it only be positive?

Thanks, we know about that one too. The -1 is being read as an unsigned integer, so it shows up as 4294967295 instead of the -1 it ought to display. Same basic issue, different face: we just have to properly translate the binary messages sent by the Core through the Cloud into user-facing JSON.
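
For illustration only (a standalone snippet, not the actual Cloud code), this is the reinterpretation that's happening:

#include <cstdint>
#include <cstdio>

int main() {
    int32_t ret = -1;
    // Read back as unsigned, the same 32 bits come out as 4294967295,
    // which is why a negative Spark.function() return shows up as a huge number.
    uint32_t asUnsigned = static_cast<uint32_t>(ret);
    std::printf("%u\n", asUnsigned);  // prints 4294967295
    return 0;
}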

Yeah I figured, just trying to document what I see, good luck!

Just wanted to add that I hit this same issue too. I was about to start doing power-of-2 stuff on the resulting array (since 0,0,1,12 = 0 + 0 + 256 + 12 = 268) but will wait till the end of the week. :smile:

Looks like it’s fixed!

Not exactly, but we put a quick temporary solution in to help people out until we do it right. At least you don’t have to convert that array yourself!

@zachary, quick band-aid or not, it's good enough for me!

I have a light-dependent resistor (LDR) set up on my Core. When I use cURL to access the LDR from the REST API, I get the following result returned. Is this the result I should expect? What format is the result field, and how would I parse it?

$ curl -G https://api.spark.io/v1/devices/DEVICEID/ldr -d access_token=ACCESSTOKEN
{
  "cmd": "VarReturn",
  "name": "ldr",
  "allTypes": {
    "string": "\u0000\u0000\u0005�",
    "uint32": 1474,
    "number": 1474,
    "double": null,
    "raw": "\u0000\u0000\u0005�"
  },
  "result": "\u0000\u0000\u0005�",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-22T21:35:53.803Z",
    "connected": false,
    "deviceID": "DEVICEID"
  }
}

I get something similar when I pull the value from my analog thermistor. I simply grab allTypes.number and use that.

{
  "cmd": "VarReturn",
  "name": "temp_raw",
  "allTypes": {
    "string": "\u0000\u0000\bL",
    "uint32": 2124,
    "number": 2124,
    "double": null,
    "raw": "\u0000\u0000\bL"
  },
  "result": "\u0000\u0000\bL",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-22T20:42:31.590Z",
    "connected": false,
    "deviceID": "50ff71065067545628110287"
  }
}
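
If you ever want to sanity-check the raw string yourself, the four characters are just the big-endian bytes of the value. A minimal standalone C++ snippet (byte values copied from the response above) shows the same packing:

#include <cstdint>
#include <cstdio>

int main() {
    // "\u0000\u0000\bL" is the bytes 0x00 0x00 0x08 0x4C ('\b' = 8, 'L' = 76).
    const unsigned char raw[4] = {0x00, 0x00, 0x08, 0x4C};
    uint32_t value = (uint32_t(raw[0]) << 24) | (uint32_t(raw[1]) << 16) |
                     (uint32_t(raw[2]) << 8)  |  uint32_t(raw[3]);
    std::printf("%u\n", value);  // prints 2124, matching allTypes.number
    return 0;
}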

Okay. Good to know at least that what was returned seems legitimate. I’ll have to review the docs for the LDR to make some sense of the integer returned. I had expected a number between 0 and 1023, whereas the values in the allTypes.number field are in the 1000 to 3500 range.

Thanks @wgbartley.

I believe the numbers returned will be in the range of 0 - 4095 (4,096 possible values, which is 4x the 1,024 you may be accustomed to on an Arduino). The Spark has higher-precision (12-bit) analog inputs and runs at 3.3V as well, so a voltage variance could also play into some calculations, though I'm not sure what or how off the top of my head. If you want to stick with 0 - 1023, you can use the map() function like this: map(your_variable, 0, 4095, 0, 1023).
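
For example, a minimal sketch along the lines of the loopCount example above (pin A0 and the variable name are just placeholders):

int ldr = 0;

void setup()
{
    Spark.variable("ldr", &ldr, INT);
}

void loop()
{
    // analogRead() on the Core returns 0-4095; map() rescales to the Arduino-style 0-1023.
    ldr = map(analogRead(A0), 0, 4095, 0, 1023);
}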

Ahh, that makes sense. I didn't know/realize that the Spark had higher-precision analog inputs. I had suspected the 3.3V may play a part in the variance, but I don't know enough about electrical engineering to understand why/how. Interestingly, I had to invert the map function. My LDR readings were increasing with less light. Not sure why that would be.

ldr = map(analogRead(A0), 0, 4095, 1023, 0);

In any case, it “works”.

And just to be extra clear @benddennis and @wgbartley — that allTypes key in the response will go away when we fix this problem for real, so do not depend on it long term. We just knew it would take us a while, and it was easy to add a little help for people in the short term. In fact, just in case, I plan to rename that variable to something like TEMPORARY_allTypes for those who don’t read the forum. :smile:

Aaaaand, I just deployed that change—the JSON key is now TEMPORARY_allTypes.

Ok, this explains why I’m seeing this as my response today… good to know!

{
  "cmd": "VarReturn",
  "name": "startstate",
  "TEMPORARY_allTypes": {
    "string": "\u0000\u0000\u0000\u0001",
    "uint32": 1,
    "number": 1,
    "double": null,
    "raw": "\u0000\u0000\u0000\u0001"
  },
  "result": "\u0000\u0000\u0000\u0001",
  "coreInfo": {
    "last_app": "foo",
    "last_heard": "2013-12-28T01:18:46.245Z",
    "connected": false,
    "deviceID": "xxxxx"
  }
}

TEMPORARY_allTypes.number it is then… for now :wink:

@benddennis how did you connect the LDR to the Core? Thank you

@d82k see below. I believe the circuit is from Adafruit. If you Google "LDR Arduino tutorial", I believe you'll find a walk-through.