Um, sorry, what exactly does this mean?
I cannot subscribe because the CLI tells me to log in (which is the login to the Spark cloud)… so what exactly did I do wrong?
If you have the CLI pointed at your local cloud, then try the following:
# new feature! :)
# fix normal apiUrl
spark config spark apiUrl https://api.spark.io
# setup your local api url
spark config local apiUrl http://mylocalcloud:8080
# switch to your local setup
spark config local
# start the setup so you can create a new user, but hit ctrl-c after your user account is set up. :)
spark setup
# listen for local events
spark subscribe mine
~ clyde$ spark subscribe mine
Subscribing to all events from my personal stream (my cores only)
Listening to: /v1/devices/events
Now it just stops. I have tried Ctrl+C and then typed the curl command again, with the same result. Then I tried subscribing inside a screen session and running the curl command outside of it… same result…
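For comparison, this is roughly what a publish test against the local cloud could look like. Note the assumptions: whether spark-server implements the same HTTP publish endpoint as the public cloud API is not confirmed here, and the host, port, and ACCESS_TOKEN are placeholders taken from the setup earlier in this thread:

```shell
# publish a test event over HTTP so "spark subscribe mine" has something to show
# (endpoint modeled on the public cloud API's POST /v1/devices/events;
#  replace ACCESS_TOKEN with the access token of the user created by "spark setup")
curl http://mylocalcloud:8080/v1/devices/events \
  -d "name=test-event" \
  -d "data=hello" \
  -d "access_token=ACCESS_TOKEN"
```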
This game repeats a dozen times whenever I switch my Core on. Most of the time it connects successfully to the WiFi, then searches for the cloud, blinks red, resets, and starts over from the beginning. I have found in the forum that I am not alone with this problem. After 20-30 tries the Core connects and the program starts?! I can't find the problem; as you can see, the spark-server log says "Core online" every time, and there is no disconnect message. But something must be missing?
Is that the "deep update" problem we have discussed here?
Sure, I can put that code in the setup call of my project, but why? I thought the update was supposed to run out of the box? Give me a bit more detail on what I have to do; it feels like you have left me at the side of the road, if you know what I mean…
And currently I have the problem that I have to flash via USB, not OTA, which means I have to unplug my project and bring it to my computer, which is a bit annoying.
Should I keep it there, or is this a one-timer?
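For reference: if this really is the CC3000 "deep update", the CLI ships the patch as a known app. A sketch of the USB route (Core in DFU mode, recent spark-cli assumed; as far as I know the patch itself cannot be applied OTA):

```shell
# put the Core into DFU mode (hold MODE + RST, release RST,
# wait for flashing yellow), then apply the CC3000 patch:
spark flash --usb cc3000
# when the patch finishes, restore a working firmware:
spark flash --usb tinker
```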
Uninstalled the current version with npm uninstall -g spark-cli
Reinstalled it with npm install -g spark-cli
To be absolutely safe, I updated with npm update -g spark-cli
Info from the Readme: 0.3.95 (/usr/local/lib/node_modules/spark-cli/Readme.md)
I connected the Core to my Raspberry Pi and started the spark-server in a screen session
I put the Core into DFU mode by pressing MODE + RST, releasing RST, and waiting for flashing yellow
Now I tried spark flash --factory tinker
/usr/local/lib/node_modules/spark/bin/spark:14
netBinding = process.binding('net'),
             ^
Error: No such module
    at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3
That's it. I tried to locate "tinker" or "cc3000" as a file but did not find either.
Sorry for causing so much trouble… it's not my intention!
Some checks later I found that the /binaries folder of spark-cli contains the files we need, I think. So I first tried moving to the correct folder:
cd /usr/local/lib/node_modules/spark-cli/
spark flash --factory tinker
Again nothing. Last try:
cd /binaries
spark flash --factory spark-tinker.bin
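Two side checks that might help here; neither fixes the "No such module" error, but they rule things out. The USB ID below is the Spark Core's DFU-mode ID, and the path is the install location mentioned above:

```shell
# 1) confirm the Core is actually in DFU mode:
#    a device with USB ID 1d50:607f should show up in the list
dfu-util -l

# 2) the factory binaries live inside the spark-cli install,
#    not at /binaries (which is a folder at the filesystem root)
cd /usr/local/lib/node_modules/spark-cli/binaries
ls
```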
Hmm, it seems like your install of the CLI on your Raspberry Pi is really off. Can you try making sure you have Node.js installed properly by following this guide?
Why do things twice?
I have my local cloud installation on the Raspberry Pi, and therefore also the CLI. An update should not be the problem, as far as I understand it. So I guess I don't understand why I should use my Mac to install the latest Node.js, dfu-util, and spark-cli. I have all of that up and running on my Pi. In the future I hope to only send my new code to the Core OTA; I don't do anything with the Core and the Pi otherwise.
Nothing has changed? So what can I really do? Reconnect my Core to the cloud? Update it and then bring it back to my local cloud instance? It would be awesome if one of my problems could be fixed.
Hmm, something is still really wrong with your setup, but I'm not sure what yet. Can you paste the contents of the file the error message referenced earlier, /usr/local/lib/node_modules/spark/bin/spark? There shouldn't be a node module called "spark", so I think maybe you didn't install the CLI at all… but this? spark - npm
/usr/local/lib/node_modules/spark/bin/spark:14
netBinding = process.binding('net'),
             ^
Error: No such module
    at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3
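Given that path, it looks like the unrelated "spark" package from npm got installed at some point, and its binary is shadowing the CLI. A cleanup sketch, assuming npm's default global prefix:

```shell
# remove the unrelated "spark" npm package that shadows the CLI
npm uninstall -g spark
# make sure the real CLI package is (still) installed
npm install -g spark-cli
# the "spark" binary should now be a symlink into .../node_modules/spark-cli
ls -l "$(command -v spark)"
```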