Can I Turn the CC3000 OFF? [solved]

Can I Turn the CC3000 OFF and still run my sketch?

Right now the sketch will not run until there is a successful connection to the web. During the flashing green phase at startup the sketch does not run; only once the breathing cyan indicates a successful connection to the Cloud does my sketch start running. So if there is no internet connection when I power up the Spark Core, my sketch never starts running at all.

I would like to use the Spark Core in products to allow data from those products to be pushed out to the web. But a lot of the time these devices will not be near a WiFi connection, which basically freezes up the Spark Core because it can never complete a WiFi connection.

What is the best way to address this issue? Are they planning on letting the main loop run while the CC3000 does its own thing? Or will the CC3000 always be tied to the main loop?

@RWB you need this!

Actually my code only works if you have a Cloud connection first.

If you build locally you can comment out #define SPARK_WLAN_ENABLE in platform_config.h.
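That is, turn that line into a comment:

    //#define SPARK_WLAN_ENABLE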

But then you basically have a souped up Arduino.

And you are also dead in the water with no way to use Spark.connect(); afterwards.

What we really need is a way to start up without the Cloud, and then try Spark.connect() at various points and see if we can connect to the Cloud. Do our business online, then Spark.disconnect();
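Something like this is the pattern I'm picturing for the user sketch (just a sketch, untested; it assumes loop() actually keeps running while the Core is offline, which is exactly the piece that's missing today, and that Spark.connect() may take a while before Spark.connected() goes true):

    #include "application.h"

    unsigned long lastAttempt = 0;
    bool tryingCloud = false;

    void setup() {
        pinMode(D7, OUTPUT);           // normal offline setup -- no Cloud required
    }

    void loop() {
        digitalWrite(D7, HIGH);        // do our normal offline work here

        // every 5 minutes, ask for a Cloud connection
        if (!tryingCloud && millis() - lastAttempt > 300000UL) {
            lastAttempt = millis();
            Spark.connect();           // may take some time to actually come up
            tryingCloud = true;
        }

        // once we are actually online, do our business and drop back off
        if (tryingCloud && Spark.connected()) {
            // ...publish or post our data here...
            Spark.disconnect();
            tryingCloud = false;
        }
    }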

@BDub Damn. So the Spark Core is worthless in applications where you will be away from WiFi connections.

There is no way to run the main loop without a good internet connection and WiFi right?

Couldn’t you simply tell it not to connect by default with a:

SPARK_SOCKET_HANDSHAKE = 0;

Spark.disconnect() essentially does that, Dave… SPARK_SOCKET_HANDSHAKE = 0;

What you need is a way to undefine SPARK_WLAN_ENABLE but I haven’t found a way that works from the Sparkulator. If you find one let me know.

EDIT: I agree @RWB… we need an easy way to program the Core to drop off the Cloud by default… then enable us to write some code that can force the Core to connect again. We essentially have ways of calling Spark.connect() and Spark.disconnect() already… just need a good hook that keeps us off the Cloud from the start.

I know I am in dangerous waters here, but it occurred to me that with @satishgn's recent changes to run the constructors for the user's global objects before the cloud setup (so that you can define pin modes, etc.), one could have an object whose constructor did Spark.disconnect() before the cloud starts up.

Could this work?

I know, I know, this is a sketchy way to do it, but I am trying to come up with a webIDE-friendly way to make this happen. If I could then call Spark.connect() when I wanted to, that would be perfect.


Interesting BKO, that just might work. Super hacky, but I like hacking :slight_smile:

Totally! :slight_smile: This is what I was trying to get at, but the latter half of my post got cut off. Given that new commit, that code should be executed after the core is initialized, but before it tries to connect to the cloud. So my suspicion is that this would be effective in changing the default behavior, but I haven’t tested it with the patch yet.

@bko @Dave @BDub

Sounds good. So are you guys gonna give this a shot and let me know if it works or not?

It would be nice to have a switch that could basically allow the main loop to run regardless of the WiFi & Cloud connection status.

Is there any chance that you could give a quick code example of what you mean by this? I'm not sure if I follow as I don't have a strong background in C.

Hi @mbeasley

The basic idea is that when some recent changes make their way into the webIDE you would be able to have a class with a .h like this:

#include "application.h"

class NoCloud {
 public:
  NoCloud();
};

And a .cpp like so:

#include "NoCloud.h"

NoCloud::NoCloud() {
  Spark.disconnect();
}

Then in your application/sketch, you would have as a global (i.e. before the void setup() ):

NoCloud myNoCloud;

So that when the NoCloud object is constructed, it calls Spark.disconnect() at a time before the cloud has had a chance to get started. It is really an order-of-operations thing.

I would also like to have something like this to restart the cloud based on a pin or the mode switch etc.

    bool enableCloud = digitalRead(A7);      // e.g. a switch or jumper on pin A7
    if (enableCloud && !Spark.connected()) { // logical NOT here, not bitwise ~
        Spark.connect();
    }

None of this works right now in the webIDE; it needs that change to be tested and moved forward to the web compile branch.


@bko, your NoCloud class should disable the default cloud connection at startup when the latest commit is made available via the WebIDE, but I guess setup() and loop() won't run. I will update the conditional code in main.cpp which executes the wiring code to accommodate this request. @zachary, @Dave, what do you say?

Oh, I see: you don't run the user setup() and loop() unless the cloud connection succeeds, which seems generally reasonable.

Maybe this needs more design. I feel like this is important, and I would hate to see an ad hoc solution get pushed out if you guys can think of a better way. [EDIT] In particular, this feels like a job for a template method design pattern, where you have pre-cloud, post-cloud, etc. hook methods that users could subclass.
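A rough sketch of what I mean, with made-up names (nothing like this exists in the firmware today, so treat it purely as an illustration):

    #include "application.h"

    // Hypothetical template-method skeleton -- the firmware would own the fixed sequence
    class StartupTemplate {
      public:
        void runStartup() {
            preCloud();
            // ...firmware would attempt the Cloud connection here...
            postCloud();
        }
      protected:
        virtual void preCloud()  {}   // hook: runs before the Cloud is attempted
        virtual void postCloud() {}   // hook: runs after the handshake finishes
    };

    // A user subclass that keeps the Core off the Cloud from the start
    class NoCloudStartup : public StartupTemplate {
      protected:
        virtual void preCloud() { Spark.disconnect(); }
    };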


I think from my point of view as an end user I want the main loop to run all the time regardless of the status of the CC3000 chip.

I would like the ability to turn the CC3000 ON & OFF via code to save on power when not needed. Is this possible?

Preventing the main loop from running just because there is no WiFi signal or Cloud connection seems like a Bad idea. Does the CC3000 have to have so much control over the main loop’s ability to run or not?

If I’m harvesting data I don’t want that data harvesting to stop just because I briefly lost my WiFi connection or because I’m off grid and there is no WiFi connection.

Just thinking out loud here.


@satishgn and I talked about this tonight. We will make setup() and loop() run even if Spark.disconnect() has been called in a static constructor. I think your description of user expectations, @RWB, will be the most common one.

Cheers!


Please check this commit and the comment section: https://github.com/spark/core-firmware/commit/cec3b6ad3c6991591d1ee8cf4931b8be10d68fdd#commitcomment-5569325

To test the execution of setup() and loop() without cloud at startup, create a global instance of type SparkClass as shown below:
SparkClass SparkConnect(false);
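A minimal test sketch could look like this (assuming the commit above is in your build and that setup()/loop() now run without the Cloud; D7 is the onboard blue LED):

    #include "application.h"

    // global instance -- passing false should keep the Core off the Cloud at startup
    SparkClass SparkConnect(false);

    void setup() {
        pinMode(D7, OUTPUT);
    }

    void loop() {
        // if this keeps blinking with no Cloud connection, the change works
        digitalWrite(D7, HIGH);
        delay(250);
        digitalWrite(D7, LOW);
        delay(250);
    }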

The more I play with this application, where I'm tracking total consumed power, the more frustrated I get at how the Spark Core basically will not run the main loop if there is anything but a good WiFi connection. All data logging stops and all data is lost as soon as there is a network issue.

I can't use the Spark Core for stuff like this until the main loop runs independently of the CC3000 status.

So if the CC3000 has a good connection, I can push data from my device out to the cloud; if it does not, no data gets sent out, but my main program keeps running regardless.

Kinda like my cell phone and laptop, which keep running whether they have a cell or WiFi signal or not.
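Something like this is all I'm really after (just an illustration; logLocally() and pushToCloud() are placeholders for whatever the application actually does with the readings):

    #include "application.h"

    // placeholder stubs -- in the real app these would store and send the readings
    void logLocally(int value)  { /* e.g. append to a buffer or an SD card */ }
    void pushToCloud(int value) { /* e.g. whatever web push the app uses */ }

    void setup() {
        // nothing needed for this example
    }

    void loop() {
        int reading = analogRead(A0);   // harvest data no matter what

        logLocally(reading);            // the local log should never stop

        if (Spark.connected()) {
            pushToCloud(reading);       // only push when the Cloud is actually there
        }
        // if we are offline, just carry on -- no blocking, no lock-up

        delay(1000);
    }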

It seems that @david_s5's latest updates for auto-resetting the Spark Core after WiFi issues are maybe not working like they used to? When my data connection to the net drops out but the WiFi stays on and the Spark stays connected to the WiFi network, I get the flashing dark blue and the whole Core locks up. I'm getting this on 2 Spark Cores at the exact same time.

Using the Spark Core to just push sensor data to the web doesn't cause much of an issue when the Core locks up, since I just lose a few sensor data points between the time the Core freezes up and when it's reset.

But when I have an LCD screen displaying live data, and that data stream and processing stop because a WiFi signal is not present, that causes issues that are simply not acceptable. I look at the WiFi as a bonus feature that allows some cool web-based data interaction, but it shouldn't keep the main application from running normally.

Sorry if this is redundant but I think this is the best place to express the issues I am facing when it comes to trying to design around the Spark Core at the moment.

From what I understand, the Spark Team is working to keep the main loop running regardless of the CC3000 WiFi status, right?


Hmm… Any reason why we couldn’t just add a flag or something for “run setup/loop regardless of whether or not the wifi is connected”?

https://github.com/spark/core-firmware/blob/master/src/main.cpp#L163
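Purely to illustrate the shape of it (made-up names, not real symbols from main.cpp), the firmware loop around that line could branch on such a flag roughly like this:

    // hypothetical sketch of the idea -- none of these names exist in main.cpp today
    bool RUN_USER_CODE_ALWAYS = true;   // the flag a user (or a static constructor) could set

    void manageCloudConnection() { /* handshake, keep-alives, reconnects */ }
    bool cloudIsReady()          { return false; /* stand-in for the real check */ }
    void runUserSetupAndLoop()   { /* calls setup() once, then loop() forever after */ }

    void mainFirmwareLoop() {
        manageCloudConnection();
        if (RUN_USER_CODE_ALWAYS || cloudIsReady()) {
            runUserSetupAndLoop();      // user code no longer waits on the CC3000
        }
    }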
