Run the flash command: spark flash --usb deep_update_2014_06
This installs the deep update from a binary that is packaged with the Spark CLI, so you don’t have to download it.
Update - nothing … same error. It did not find the file.
But now I know, thanks to you, that it's in the spark-cli package and is not downloaded at the moment I try to flash the core.
So I did a quick
find / -name deep_update_2014_06.bin
and found the file (Raspbian) under:
/usr/local/lib/node_modules/spark-cli/binaries/deep_update_2014_06.bin
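For anyone whose binary lands somewhere else, a narrower search than `find /` is possible. This is only a sketch: the `SPARK_CLI_DIR` default below assumes a global npm install under /usr/local/lib, which matches the path above but may differ on your system.

```shell
# Search the spark-cli install directory for bundled deep_update binaries.
# SPARK_CLI_DIR defaults to the usual global npm location on Raspbian;
# override it if your install lives elsewhere.
SPARK_CLI_DIR="${SPARK_CLI_DIR:-/usr/local/lib/node_modules/spark-cli}"
find "$SPARK_CLI_DIR" -name 'deep_update*.bin' 2>/dev/null || true
```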
Now I'm finally able to do the deep-flash-dance with:
…
DfuSe interface name: "Internal Flash "
Downloading to address = 0x08005000, size = 93636
Download [=========================] 100% 94208 bytes
Download done.
File downloaded successfully
Transitioning to dfuMANIFEST state
Flashed!
The last thing I'd like to know: the link from my first post has a troubleshooting section (or something like it) for checking whether the deep_update applied successfully. But my output looks quite different, and there is no “version”.
If you ran the command spark flash --usb deep_update_2014_06 and it didn't work as expected, then you have the wrong version of the CLI and the wrong version of deep_update. The local spark-protocol module doesn't store the cc3000 patch version, but if you're subscribed to your events you should see it when your core starts up; try a spark subscribe mine.
Hmm, the deep_update works by optionally patching the cc3000, then connecting to the cloud and doing an auto-upgrade. That feature might not be working on the local cloud. It sounds like you're using the local cloud; can you try flashing the latest tinker via USB after the patch is done?
Um, sorry, what exactly does this mean?
I cannot subscribe because the CLI tells me to log in (which is the login to the Spark cloud)… so what exactly did I do wrong?
If you have the CLI pointed at your local cloud, then try the following:
# new feature! :)
# fix normal apiUrl
spark config spark apiUrl https://api.spark.io
# setup your local api url
spark config local apiUrl http://mylocalcloud:8080
# switch to your local setup
spark config local
# start the setup so you can create a new user, but hit ctrl-c after your user account is set up. :)
spark setup
# listen for local events
spark subscribe mine
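If you want to double-check what the profile switch actually saved, the CLI keeps its settings as JSON under ~/.spark/. The `<profile>.config.json` filename pattern below is an assumption for this CLI version; look at what's actually in that directory if nothing shows up.

```shell
# Print every saved spark-cli profile override, if any exist.
# The ~/.spark/<profile>.config.json layout is an assumption -- check
# your own ~/.spark directory if the glob matches nothing.
for cfg in "$HOME"/.spark/*.config.json; do
  [ -e "$cfg" ] || continue   # glob didn't match: no profiles saved yet
  echo "== $cfg =="
  cat "$cfg"
done
```

After the `spark config local apiUrl …` step above, the local profile's file should show your local cloud's apiUrl rather than https://api.spark.io.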
~ clyde$ spark subscribe mine
Subscribing to all events from my personal stream (my cores only)
Listening to: /v1/devices/events
Now it stops. I tried Ctrl+C and then typed the curl command again, with the same result. Then I tried subscribing inside a screen session and running the curl command outside it… same result…
This game repeats a dozen times whenever I switch my core on. Most of the time it connects to the WiFi successfully, then searches for the cloud, blinks red, resets, and starts over from the beginning. I found in the forum that I'm not the only one with this problem. After 20-30 tries the core connects and the program starts?! I can't find the problem; the spark-server log says “Core online” every time, as you can see, and there is no disconnect message. But something must be missing?
Is that the “deep update” problem we have discussed here?
Sure, I can put that code in the setup call of my project, but why? I thought the update should run out of the box? Give me a bit more detail on what I have to do; it feels like you've left me at the side of the road, if you know what I mean…
And currently I have the problem that I have to flash over USB, not OTA, which means I have to unplug my project and bring it to my computer; that's a little bit annoying.
Should I keep it there, or is this a one-timer?
Uninstalled the current version with npm uninstall -g spark-cli
Reinstalled it with npm install -g spark-cli
To be absolutely safe, I updated with npm update -g spark-cli
Info from the Readme: 0.3.95 (/usr/local/lib/node_modules/spark-cli/Readme.md)
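Rather than reading the version out of the Readme, you can also ask npm which version it actually resolved for the global package. A sketch, guarded in case npm isn't on the PATH:

```shell
# Ask npm which spark-cli version is installed globally.
# Guarded so the snippet degrades gracefully where npm is missing.
if command -v npm >/dev/null 2>&1; then
  npm ls -g spark-cli 2>/dev/null || true
else
  echo "npm not found on PATH"
fi
```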
I connected the core to my Raspberry and started the spark-server in a screen session.
I set the core into DFU mode by pressing MODE + RST, releasing RST, and waiting for flashing yellow.
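Before flashing, it's worth confirming the Core actually enumerated in DFU mode. A sketch using dfu-util; the 1d50:607f USB IDs below are what I'd expect for the Core's DFU mode, but treat them as an assumption and verify against your own `dfu-util -l` output:

```shell
# List DFU-capable USB devices and look for the Core's DFU IDs.
# 1d50:607f is an assumption -- check your own `dfu-util -l` output.
if command -v dfu-util >/dev/null 2>&1; then
  dfu-util -l 2>/dev/null | grep -i '1d50:607f' \
    && echo "Core is in DFU mode" \
    || echo "Core not found in DFU mode"
else
  echo "dfu-util is not installed"
fi
```

If the Core doesn't show up here, `spark flash --factory …` has nothing to talk to, which would explain a silent failure.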
Now I tried spark flash --factory tinker
/usr/local/lib/node_modules/spark/bin/spark:14
netBinding = process.binding('net'),
^
Error: No such module
at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
at startup (node.js:119:16)
at node.js:906:3
That's it. I tried to locate “tinker” or “cc3000” as a file but did not find them.
Sorry for causing such trouble… it's not my intention!
Some checks later, I found that the /binaries folder of the spark-cli contains the files we need, I think. So I first tried moving to the correct folder:
cd /usr/local/lib/node_modules/spark-cli/
spark flash --factory tinker
Again nothing; last try:
cd /binaries
spark flash --factory spark-tinker.bin