Deep Update (source file for spark-cli)

Hi @Dave

Sorry, I tried, but it seems like exactly the same result as before:

factory:
DfuSe interface name: "SPI Flash : SST25x"
Downloading to address = 0x00020000, size = 76808

File downloaded successfully
Flashed!

usb:
DfuSe interface name: "Internal Flash "
Downloading to address = 0x08005000, size = 76808

File downloaded successfully
Transitioning to dfuMANIFEST state
Error during download get_status
Flashed!

curl with access_token:
[
  {
    "id": "mycoreid",
    "name": "ArwenUndomiel",
    "last_heard": "2014-07-23T19:25:30.797Z",
    "connected": true
  }
]
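
(For reference, a device list like the one above comes from the cloud's /v1/devices endpoint; a sketch of the call, with a placeholder token, and swapping in the local cloud URL from later in this thread if you're pointed there:)

# list all cores visible to this access token (YOUR_ACCESS_TOKEN is a placeholder)
curl "https://api.spark.io/v1/devices?access_token=YOUR_ACCESS_TOKEN"

# against a local cloud, swap the host, e.g.:
curl "http://mylocalcloud:8080/v1/devices?access_token=YOUR_ACCESS_TOKEN"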

curl with coreid and access_token:
{
  "id": "mycoreid",
  "name": "ArwenUndomiel",
  "connected": true,
  "variables": {},
  "functions": [
    "digitalread",
    "digitalwrite",
    "analogread",
    "analogwrite"
  ]
}
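
(Likewise, the per-core call above is /v1/devices/:id, and the Tinker functions it lists can be invoked with a POST; a sketch with placeholder token and pin values:)

# details for one core
curl "https://api.spark.io/v1/devices/mycoreid?access_token=YOUR_ACCESS_TOKEN"

# call one of the listed Tinker functions, e.g. digitalwrite on pin D7
curl https://api.spark.io/v1/devices/mycoreid/digitalwrite \
  -d access_token=YOUR_ACCESS_TOKEN \
  -d params=D7,HIGH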

:frowning:

Hi @clyde,

Sorry, the new Tinker will publish the version as an event, so you’d need to be subscribed when the core connects initially.

Thanks,
David
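
(A sketch of what "subscribed when the core connects" means in practice: the CLI's spark subscribe mine listens to the /v1/devices/events stream shown later in this thread, and you can also watch that stream directly with curl; the token is a placeholder:)

# open the server-sent-events stream first, then reset the core;
# the spark/cc3000-patch-version event should show up here
curl "https://api.spark.io/v1/devices/events?access_token=YOUR_ACCESS_TOKEN"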

Um, sorry, what exactly does this mean?
I cannot subscribe because the CLI tells me to log in (which is the login to the Spark cloud)… so what exactly did I do wrong?

Hi @clyde,

If you have the CLI pointed at your local cloud, then try the following:

# new feature! :) 
# fix normal apiUrl
spark config spark apiUrl https://api.spark.io

# setup your local api url
spark config local apiUrl http://mylocalcloud:8080

# switch to your local setup
spark config local

#start the setup so you can create a new user, but hit ctrl-c after your user account is setup.  :) 
spark setup

# listen for local events    
spark subscribe mine

Thanks!
David

~ clyde$ spark subscribe mine
Subscribing to all events from my personal stream (my cores only)
Listening to: /v1/devices/events

Now it just stops. I tried Ctrl-C and then ran the curl command another time, with the same result. Then I tried subscribing inside a screen session and running the curl command outside the screen… same result…

Hi @clyde,

Ahh, hmm, sorry! Can you try throwing this code chunk in your setup call?

// Read the CC3000 service pack (patch) version from the module's non-volatile memory
unsigned char patchver[2];
nvmem_read_sp_version(patchver);

// Format it as "major.minor" and publish it as a private event with a 60s TTL
char patchstr[8];
snprintf(patchstr, 8, "%d.%d", patchver[0], patchver[1]);
Spark.publish("spark/cc3000-patch-version", patchstr, 60, PRIVATE);
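
(To actually catch that event, using only commands already shown in this thread, have the subscription running before the core boots, then reset it:)

# terminal 1: start listening before the core connects
spark subscribe mine

# terminal 2 (or just tap RST): reboot the core; once setup() runs, a
# spark/cc3000-patch-version event with the version as data should appear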

Thanks,
David

@Dave

Sadly, I have more problems since I started using the local cloud; I hoped the deep update would clear up some of them.

  1. Connecting to the cloud takes about 20-30 tries.

    Connection from: 192.168.178.xx, connId: 1
    on ready { coreID: 'coreid',
      ip: '192.168.178.xx',
      product_id: 65535,
      firmware_version: 65535,
      cache_key: undefined }
    Core online!

This game repeats a dozen times when I switch my core on. Most of the time it connects successfully to the WiFi, then searches for the cloud, blinks red, resets, and starts from the beginning. I have found in the forum that I am not alone with this problem. After 20-30 tries the core connects and the program starts?! I can't find the problem; as you can see, the spark-server log says "Core online" every time, and there is no disconnect message. But something must be missing?

  2. A previously working program that uses Spark.publish() won't run anymore. My website does not get any results. I sent my information to the git issue that @kennethlimcp has opened (https://github.com/spark/spark-server/issues/229)

  3. There is the "deep update" problem we have discussed here.
    Sure, I can put that code in the setup call of my project, but why? I thought the update was supposed to run out of the box? Give me a bit more detail on what I have to do; it feels like you have left me at the side of the road, if you know what I mean…
    And currently I have the problem that I have to flash over USB, not OTA, which means I have to unplug my project and bring it to my computer, which is a little bit annoying.
    Should I keep that code there, or is this a one-timer?

Hi @clyde,

Deep_update doesn't work the same way on the local cloud. If you want to deep_update your core locally, try the following (a one-time thing):

1.) Update your copy of the spark-cli: npm update -g spark-cli

2.) Connect your core in DFU mode:

spark flash --factory tinker
spark flash --usb cc3000

#when your core starts flashing yellow again
spark flash --usb tinker
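
(For what it's worth, those commands write the same regions shown in the dfu-util output at the top of this thread: the factory image goes to external SPI flash at 0x00020000, and running firmware goes to internal flash at 0x08005000. If the CLI itself won't run, roughly equivalent raw dfu-util calls would look like this; the .bin file names are placeholders for the binaries shipped in the CLI's /binaries folder:)

# core in DFU mode (flashing yellow); the Spark Core's DFU device is 1d50:607f
# alt setting 1 = external SPI flash (factory reset image)
dfu-util -d 1d50:607f -a 1 -s 0x00020000 -D factory-image.bin
# alt setting 0 = internal flash; :leave reboots into the new firmware
dfu-util -d 1d50:607f -a 0 -s 0x08005000:leave -D firmware.bin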

If you’re building locally, make sure you’ve pulled the latest code from the repositories as well.

– A few other people have mentioned they're not seeing the local cloud API show published messages properly ( https://github.com/spark/spark-server/issues/22 ), so I'm looking into that.

– 20-30 connection attempts sounds pretty nuts; what else is happening? Does it take 20-30 tries with stock Tinker?

Thanks!
David


Ok @Dave: Thanks for your advice; it seems that sometimes I need a helping hand.
I will try this out ASAP and write the results to this thread.


No worries, thanks for troubleshooting with us :slight_smile:


@Dave

Here are my results:

  1. Uninstalled the current version with npm uninstall -g spark-cli
    Reinstalled it with npm install -g spark-cli
    To be absolutely safe, I updated with npm update -g spark-cli
    Info from the Readme: 0.3.95 (/usr/local/lib/node_modules/spark-cli/Readme.md)

  2. I connected the core to my Raspberry Pi and started the spark-server in a screen session.
    I put the core in DFU mode by pressing MODE + RST, releasing RST, and waiting for flashing yellow (a quick DFU check is sketched below).
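
(The quick check mentioned above: assuming dfu-util is installed, the core should show up with the Spark Core IDs while it is blinking yellow, before any flash command is run.)

# list DFU-capable devices; a core in DFU mode appears as [1d50:607f]
dfu-util -l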

Now I tried spark flash --factory tinker

/usr/local/lib/node_modules/spark/bin/spark:14
    netBinding = process.binding('net'),
                         ^
Error: No such module
    at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3

That's it. I tried to locate "tinker" or "cc3000" as a file but did not find them.
Sorry for causing so much trouble… :frowning: it's not my intention!
Some checks later, I found that the /binaries folder of the spark-cli contains the files we need, I think. So I first tried to move to the correct folder:

cd /usr/local/lib/node_modules/spark-cli/
spark flash --factory tinker

Again nothing, last try:

cd /binaries
spark flash --factory spark-tinker.bin

Same with --usb cc3000 ;(

Hi @clyde,

Hmm, it seems like your install of the CLI on your Raspberry Pi is really off. Can you try making sure you have Node.js installed properly by following this guide?

Thanks!
David

Hi @Dave

node -v

v0.10.29

npm update -g spark-cli

Rotating cursor, then nothing.

spark flash --factory tinker

Same error as above…

Here are the locations of spark-cli on my system:

/usr/local/lib/node_modules/spark-cli
/usr/local/lib/node_modules/spark-cli/node_modules/serialport/build/Release/.deps/usr/local/lib/node_modules/spark-cli
/home/clyde/.npm/registry.npmjs.org/spark-cli
/root/.npm/spark-cli
/root/.npm/registry.npmjs.org/spark-cli

Are those OK? Or did I mess things up?
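
(A quick way to see which of those copies the shell actually executes, and what npm believes is installed globally; these are standard shell/npm commands, nothing spark-specific:)

# which executable answers to "spark", and where does it point?
which spark
ls -l $(which spark)

# what is installed at the top level of the global node_modules?
npm ls -g --depth=0 | grep -i spark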

@clyde,

Are you on Mac OS X? It's weird that you don't require sudo.

Can you do the following:

  1. Uninstall spark-cli

  2. CD to the desktop directory before running sudo npm install -g spark-cli

I would try spark flash --usb tinker instead. I've never tested the --factory flag, but I know --usb works fine.

@kennethlimcp
this runs on my Raspberry Pi, and I have already removed spark-cli and installed it once again…

I would suggest that you do this on a laptop instead, unless you reallyyyyy need spark-cli on the RPi…

I treated my RPi like a server and used spark-cli from my laptop to connect to it, as I'm not going to be plugging my cores into the RPi and doing stuff there.


Why do things twice?
I have my local cloud installation on the Raspberry Pi, and therefore also the CLI. An update should not be the problem, as far as I understand it. So I guess I don't understand why I should use my Mac to install the latest Node.js, dfu-util, and spark-cli. I have all of that up and running on my Pi. In the future I hopefully only send my new code to the core OTA; I don't do stuff with the core and the Pi.

Nothing has changed? So what can I really do? Reconnect my core to the :spark: cloud, update it, and then bring it back to my local cloud instance? It would be awesome if one of my problems could be fixed.

Hi Clyde,

Hmm, something is still really wrong with your setup, but I'm not sure what yet. Can you paste what is in the file the error message referenced earlier, /usr/local/lib/node_modules/spark/bin/spark? There shouldn't be a node module called "spark", so I think maybe you didn't install the CLI at all… but perhaps this? spark - npm

Thanks,
David
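
(A sketch of how to grab that file's header; the first couple of dozen lines are usually enough to identify which package it belongs to:)

# show the start of the script that is being run as "spark"
head -n 20 /usr/local/lib/node_modules/spark/bin/spark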

/usr/local/lib/node_modules/spark/bin/spark:14
    netBinding = process.binding('net'),
                         ^
Error: No such module
    at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3

http://pastebin.com/VrxbsPBt

Hi @Clyde,

You have something else installed that’s using the spark name:

 * Sencha Spark
 * Copyright(c) 2010 Sencha Inc.
 * MIT Licensed

Can you uninstall that, and reinstall the cli?

Something like:

npm uninstall -g spark
npm uninstall -g spark-cli
sudo npm install -g spark-cli
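
(After that, a quick sanity check that the right package now owns the spark name; standard commands whose output should point into the spark-cli package:)

# should resolve to the spark-cli package's executable now
which spark
npm ls -g spark-cli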

Thanks,
David
