Deep Update (source file for spark-cli)

Hi @clyde,

Hmm, it seems like your install of the CLI on your Raspberry Pi is really off. Can you try making sure you have Node.js installed properly by following this guide?

Thanks!
David

Hi @Dave,

node -v

v0.10.29

npm update -g spark-cli

Just a rotating cursor, then nothing.

spark flash --factory tinker

Same error as above…

Here are my locations of the spark-cli

/usr/local/lib/node_modules/spark-cli
/usr/local/lib/node_modules/spark-cli/node_modules/serialport/build/Release/.deps/usr/local/lib/node_modules/spark-cli
/home/clyde/.npm/registry.npmjs.org/spark-cli
/root/.npm/spark-cli
/root/.npm/registry.npmjs.org/spark-cli

Are they OK? Or did I mess things up?

@clyde,

Are you on Mac OS X? It’s weird that you don’t require sudo.

Can you do the following:

  1. Uninstall spark-cli

  2. cd to your desktop directory before running sudo npm install -g spark-cli
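
A sketch of those two steps on the Pi (the desktop path is illustrative; any writable working directory should do):

```shell
# 1. Remove any existing global install of the CLI
sudo npm uninstall -g spark-cli

# 2. Change into a known-writable directory, then reinstall globally
cd ~/Desktop
sudo npm install -g spark-cli
```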

I would try spark flash --usb tinker instead. I’ve never tested the --factory flag, but I know --usb works fine.

@kennethlimcp
this runs on my Raspberry Pi, and I have already removed spark-cli and installed it again…

I would suggest that you do this on a laptop instead, unless you reallyyyyy need spark-cli on the RPi…

I treated my RPi like a server and used spark-cli from my laptop to connect to it, since I’m not going to be plugging my Cores into the RPi to work on them there.

Why do things twice?
I have my local-cloud installation on the Raspberry Pi, and therefore the CLI as well. As far as I understand it, an update should not be a problem. So I guess I don’t understand why I should use my Mac to install the latest Node.js, dfu-util, and spark-cli; I have all of that up and running on my Pi. In the future I hope to only send my new code to the Core OTA. I don’t do anything between the Core and the Pi directly.

Nothing has changed? So what can I really do? Reconnect my Core to the :spark: cloud, update it, and then bring it back to my local cloud instance? It would be awesome if one of my problems could be fixed.

Hi Clyde,

Hmm, something is still really wrong with your setup, but I’m not sure what yet. Can you paste the contents of the file the error message referenced earlier, /usr/local/lib/node_modules/spark/bin/spark? There shouldn’t be a node module called "spark", so I think maybe you didn’t install the CLI at all… But perhaps this? spark - npm

Thanks,
David

/usr/local/lib/node_modules/spark/bin/spark:14
    netBinding = process.binding('net'),
                         ^
Error: No such module
    at Object.<anonymous> (/usr/local/lib/node_modules/spark/bin/spark:14:26)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Function.Module.runMain (module.js:497:10)
    at startup (node.js:119:16)
    at node.js:906:3

http://pastebin.com/VrxbsPBt

Hi @Clyde,

You have something else installed that’s using the spark name:

 Sencha Spark
 * Copyright(c) 2010 Sencha Inc.
 * MIT Licensed

Can you uninstall that, and reinstall the CLI?

Something like:

npm uninstall -g spark
npm uninstall -g spark-cli
sudo npm install -g spark-cli

Thanks,
David

root@cloudberry:/home/clyde# npm uninstall spark
npm WARN uninstall not installed in /home/clyde/node_modules: "spark"
root@cloudberry:/home/clyde# npm uninstall -g spark-cli
unbuild spark-cli@0.3.95
npm ERR! Refusing to delete: /usr/local/bin/spark not in /usr/local/lib/node_modules/spark-cli
File exists: /usr/local/bin/spark
Move it away, and try again. 

npm ERR! System Linux 3.12.22+
npm ERR! command "/usr/local/bin/node" "/usr/local/bin/npm" "uninstall" "-g" "spark-cli"
npm ERR! cwd /home/clyde
npm ERR! node -v v0.10.29
npm ERR! npm -v 1.4.14
npm ERR! path /usr/local/bin/spark
npm ERR! code EEXIST
npm ERR! 
npm ERR! Additional logging details can be found in:
npm ERR!     /home/clyde/npm-debug.log
npm ERR! not ok code 0
root@cloudberry:/home/clyde# cd /usr/local/bin/spark 
bash: cd: /usr/local/bin/spark: Not a directory
root@cloudberry:/home/clyde# cp /usr/local/bin/spark /home/clyde/Backup/spark
root@cloudberry:/home/clyde# rm /usr/local/bin/spark 
root@cloudberry:/home/clyde# npm uninstall -g spark-cli
unbuild spark-cli@0.3.95
root@cloudberry:/home/clyde# sudo npm install -g spark-cli
 
> serialport@1.4.0 install /usr/local/lib/node_modules/spark-cli/node_modules/serialport
> node-pre-gyp install --fallback-to-build

gyp WARN EACCES user "root" does not have permission to access the dev dir "/root/.node-gyp/0.10.29"
gyp WARN EACCES attempting to reinstall using temporary dev dir "/usr/local/lib/node_modules/spark-cli/node_modules/serialport/.node-gyp"
make: Entering directory `/usr/local/lib/node_modules/spark-cli/node_modules/serialport/build'
  CXX(target) Release/obj.target/serialport/src/serialport.o
  CXX(target) Release/obj.target/serialport/src/serialport_unix.o
  CXX(target) Release/obj.target/serialport/src/serialport_poller.o
  SOLINK_MODULE(target) Release/obj.target/serialport.node
  SOLINK_MODULE(target) Release/obj.target/serialport.node: Finished
  COPY Release/serialport.node
  COPY /usr/local/lib/node_modules/spark-cli/node_modules/serialport/build/serialport/v1.4.0/Release/node-v11-linux-arm/serialport.node
  TOUCH Release/obj.target/action_after_build.stamp
make: Leaving directory `/usr/local/lib/node_modules/spark-cli/node_modules/serialport/build'
/usr/local/bin/spark -> /usr/local/lib/node_modules/spark-cli/bin/spark.js
spark-cli@0.3.95 /usr/local/lib/node_modules/spark-cli
├── xtend@3.0.0
├── when@3.4.2
├── request@2.39.0 (json-stringify-safe@5.0.0, forever-agent@0.5.2, aws-sign2@0.5.0, qs@0.6.6, oauth-sign@0.3.0, stringstream@0.0.4, tunnel-agent@0.4.0, node-uuid@1.4.1, mime-types@1.0.1, form-data@0.1.4, tough-cookie@0.12.1, http-signature@0.10.0, hawk@1.1.1)
├── hogan.js@2.0.0
├── moment@2.7.0
└── serialport@1.4.0 (bindings@1.1.1, sf@0.1.6, async@0.1.18, nan@0.7.1, optimist@0.3.7, node-pre-gyp@0.5.19)

Hi @clyde,

How about

sudo npm uninstall -g spark

earlier you only ran npm uninstall spark (without -g), which wouldn’t be a ‘global uninstall’.
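
The distinction Dave is pointing at: without -g, npm only operates on the node_modules directory relative to the current working directory; with -g it operates on the global prefix. A minimal sketch (the paths shown are npm’s typical defaults, not verified on this particular Pi):

```shell
# Local uninstall: only removes ./node_modules/spark relative to the cwd
npm uninstall spark

# Global uninstall: removes the package from npm's global prefix
# (usually /usr/local/lib/node_modules) along with its bin link
sudo npm uninstall -g spark
```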

Thanks,
David

@Dave:
First line :slight_smile: I have edited the post, and now I seem to be able to do the job:

spark flash --factory tinker
Apparently I didn't find a DFU device? util said  dfu-util 0.7

Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
Copyright 2010-2012 Tormod Volden and Stefan Schmidt
This program is Free Software and has ABSOLUTELY NO WARRANTY
Please report bugs to dfu-util@lists.gnumonks.org


Error writing firmware... no dfu device found.

And this is absolutely correct, because currently I have no device attached… :slight_smile:

@Dave THX!!! Dude!!!

This looks pretty good now, one last question:
I used @suda’s Atom IDE package for Spark. I want to send my user code to the Core again. Could that be a problem, i.e. could it write old firmware back? I don’t want to lose the progress we just made :slight_smile: But let me think: in my understanding, the deep update consists of two parts, a firmware update for the CC3000 WiFi chip and one for the Core’s main chip. All code which I write and send via the :spark: Cloud IDE (or anything else) is just sent to the “user space”, right?

  1. spark flash --factory tinker
    FOUND DFU DEVICE 1d50:607f
    checking file  /usr/local/lib/node_modules/spark-cli/binaries/spark_tinker.bin
    spawning dfu-util -d 1d50:607f -a 1 -i 0 -s 0x00020000 -D /usr/local/lib/node_modules/spark-cli/binaries/spark_tinker.bin
    dfu-util 0.7
    
    Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
    Copyright 2010-2012 Tormod Volden and Stefan Schmidt
    This program is Free Software and has ABSOLUTELY NO WARRANTY
    Please report bugs to dfu-util@lists.gnumonks.org
    
    dfu-util: Invalid DFU suffix signature
    dfu-util: A valid DFU suffix will be required in a future dfu-util release!!!
    Opening DFU capable USB device...
    ID 1d50:607f
    Run-time device DFU version 011a
    Claiming USB DFU Interface...
    Setting Alternate Setting #1 ...
    Determining device status: state = dfuERROR, status = 10
    dfuERROR, clearing status
    Determining device status: state = dfuIDLE, status = 0
    dfuIDLE, continuing
    DFU mode device DFU version 011a
    Device returned transfer size 1024
    DfuSe interface name: "SPI Flash : SST25x"
    Downloading to address = 0x00020000, size = 78828
    Download    [=========================] 100%        78848 bytes
    Download done.
    File downloaded successfully
    Flashed!
  2. spark flash --usb cc3000
    FOUND DFU DEVICE 1d50:607f
    checking file  /usr/local/lib/node_modules/spark-cli/binaries/cc3000-patch-programmer.bin
    spawning dfu-util -d 1d50:607f -a 0 -i 0 -s 0x08005000:leave -D /usr/local/lib/node_modules/spark-cli/binaries/cc3000-patch-programmer.bin
    dfu-util 0.7
    
    Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
    Copyright 2010-2012 Tormod Volden and Stefan Schmidt
    This program is Free Software and has ABSOLUTELY NO WARRANTY
    Please report bugs to dfu-util@lists.gnumonks.org
    
    dfu-util: Invalid DFU suffix signature
    dfu-util: A valid DFU suffix will be required in a future dfu-util release!!!
    Opening DFU capable USB device...
    ID 1d50:607f
    Run-time device DFU version 011a
    Claiming USB DFU Interface...
    Setting Alternate Setting #0 ...
    Determining device status: state = dfuIDLE, status = 0
    dfuIDLE, continuing
    DFU mode device DFU version 011a
    Device returned transfer size 1024
    DfuSe interface name: "Internal Flash  "
    Downloading to address = 0x08005000, size = 25068
    Download    [=========================] 100%        25600 bytes
    Download done.
    File downloaded successfully
    Transitioning to dfuMANIFEST state
    Flashed!
  3. LED comes back from magenta to flashing yellow
    spark flash --usb tinker
    FOUND DFU DEVICE 1d50:607f
    checking file  /usr/local/lib/node_modules/spark-cli/binaries/spark_tinker.bin
    spawning dfu-util -d 1d50:607f -a 0 -i 0 -s 0x08005000:leave -D /usr/local/lib/node_modules/spark-cli/binaries/spark_tinker.bin
    dfu-util 0.7
    
    Copyright 2005-2008 Weston Schmidt, Harald Welte and OpenMoko Inc.
    Copyright 2010-2012 Tormod Volden and Stefan Schmidt
    This program is Free Software and has ABSOLUTELY NO WARRANTY
    Please report bugs to dfu-util@lists.gnumonks.org
    
    dfu-util: Invalid DFU suffix signature
    dfu-util: A valid DFU suffix will be required in a future dfu-util release!!!
    Opening DFU capable USB device...
    ID 1d50:607f
    Run-time device DFU version 011a
    Claiming USB DFU Interface...
    Setting Alternate Setting #0 ...
    Determining device status: state = dfuERROR, status = 10
    dfuERROR, clearing status
    Determining device status: state = dfuIDLE, status = 0
    dfuIDLE, continuing
    DFU mode device DFU version 011a
    Device returned transfer size 1024
    DfuSe interface name: "Internal Flash  "
    Downloading to address = 0x08005000, size = 78828
    Download    [=========================] 100%        78848 bytes
    Download done.
    File downloaded successfully
    Transitioning to dfuMANIFEST state
    Flashed!
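
Stripped of the spark-cli wrapping, the three flashes above reduce to these dfu-util calls (arguments copied from the spawned commands in the log; the binary paths are shortened from /usr/local/lib/node_modules/spark-cli/binaries/):

```shell
# 1. Factory-reset firmware to external SPI flash (alternate setting 1)
dfu-util -d 1d50:607f -a 1 -i 0 -s 0x00020000 -D spark_tinker.bin

# 2. CC3000 patch programmer to internal flash, then leave DFU mode to run it
dfu-util -d 1d50:607f -a 0 -i 0 -s 0x08005000:leave -D cc3000-patch-programmer.bin

# 3. Tinker as the application firmware, to internal flash
dfu-util -d 1d50:607f -a 0 -i 0 -s 0x08005000:leave -D spark_tinker.bin
```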

The problem with the 20-30 connects to the local cloud still exists…

I did a spark setup and entered my WiFi credentials.
Then I have to wait for the breathing cyan…

flashing green
fast flashing white
flashing red around 10 times

breathing white 2 times
and again flashing green

log from spark-server console:

Connection from: 192.168.178.55, connId: 18
on ready { coreID: 'coreid',
  ip: '192.168.178.xx',
  product_id: 0,
  firmware_version: 6,
  cache_key: undefined }
Core online!
Connection from: 192.168.178.xx, connId: 19
on ready { coreID: 'coreid',
  ip: '192.168.178.55',
  product_id: 0,
  firmware_version: 6,
  cache_key: undefined }
Core online!

This goes on and on… I also tried to stop the server, delete all pub/der/pem files, restart the server, and let it create a new cert. Then I followed the spark-server tutorial up to the “key doctor” step. And it connects and connects and connects… What’s wrong with my setup? Am I too blind to see what goes wrong here?

I followed the tutorial from the git repo exactly, but this “spark keys server default_key.der myIP” did not work. I can provide you with a full log if anyone has the time to read it. I would try testing from the beginning with no new cert created by the server…
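
For reference, the local-cloud key steps the tutorial describes are roughly the following. The Core must be in DFU mode, and IP_ADDRESS / YOUR_CORE_ID are placeholders for the server’s LAN IP and the Core’s device ID; this is a sketch of the documented spark-cli commands, not verified against this exact setup:

```shell
# Write the local cloud's public server key (and its IP) onto the core
spark keys server default_key.der IP_ADDRESS

# Regenerate the core's own key pair and upload it to the cloud
spark keys doctor YOUR_CORE_ID
```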

Hi @clyde,

Cool, I’m glad things are working for you again! :slight_smile:

The updates you ran replace the factory reset firmware, and updated the patch on the CC3000 radio itself. You can keep developing in whatever environment you want and flashing any code, and it shouldn’t undo the patch to the CC3000 radio. If you’re building locally, just be sure to pull repo changes as they come through, or if the ATOM IDE is using the build server, those changes will be merged in automatically. :slight_smile:

Thanks,
David

I’ll update core-firmware and its dependent libs today :smile: (I probably should do that at least once a week)

Update:
Done! 0.2.6 now has core-common-lib and core-firmware from today :slight_smile:

Hi David,

I still fail to install the Spark CLI onto my Raspberry Pi. The problem is:
npm ERR! Failed at the serialport@1.6.3 install script 'node-pre-gyp install --fallback-to-build'.

Various versions:
npm ERR! Linux 3.18.7+
node-pre-gyp ERR! node-pre-gyp -v v0.5.19
npm ERR! node v0.12.0
npm ERR! npm v2.5.1

However, the following completes successfully:
sudo npm install -g serialport

Hope you can shed some light on what I have done wrong.

Cheers,
Michael Leung

Hi @micl,

Sorry about the slow reply. It looks like the CLI should be using serialport version 1.5.0 now which I suspect fixes the issue you’re seeing, but the version released is not upgraded yet. ( https://github.com/spark/spark-cli/blob/master/package.json ) So nothing you’ve done wrong! :slight_smile:

@nexxy has been diving into the CLI recently, so I’ll see if she can test and push out a new version soon which I think will fix the issue.

If you want to get it going in the meantime, can you try:

# installing the spark-cli from source
git clone https://github.com/spark/spark-cli.git
cd spark-cli
npm install
npm link
spark

Thanks!
David

I’ve pushed version 1.1.0 to npm, which includes the dependency update for node-serialport@1.5.0 as well as @Dave’s webhooks!

Hi, it seems like the link is not working anymore. Thanks!