Hi. I’m a newbie to Spark but have lots of experience with microcontrollers, programming in C, etc. I’m trying to write a program in Spark Dev and “compile in the cloud”. The code is below. When I try to compile, it says that ‘Serial’ is undeclared. However, the documentation mentions no #includes. Am I supposed to include something? I tried #include “application.h” and #include “serial.h”, but neither worked. What am I missing? Here’s the incredibly simple code:
I’ve edited your post to properly format the code. Please check out this post, so you know how to do this yourself in the future. Thanks in advance! ~Jordy
I can build and flash using the Web IDE, but not the Spark Dev (Atom-based) tool.
The source code from the documentation won’t compile via Spark Dev either. For instance, the digitalRead example (below) produces 7 errors, starting with ‘D0’ was not declared in this scope. Something is just wrong with my setup, but I have no clue what it might be…
// EXAMPLE USAGE
int button = D0;  // button is connected to D0
int LED = D1;     // LED is connected to D1
int val = 0;      // variable to store the read value

void setup()
{
    pinMode(LED, OUTPUT);            // sets pin as output
    pinMode(button, INPUT_PULLDOWN); // sets pin as input
}

void loop()
{
    val = digitalRead(button); // read the input pin
    digitalWrite(LED, val);    // sets the LED to the button's value
}
Yes. It worked today, but it did not work last night (see above). However, last night I was also having the weird problem where my files were getting “stuck” in the cloud compiler (see my other post on that), so maybe that’s why adding the #include didn’t help at the time.
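For anyone who lands on this thread with the same errors: my understanding (and what matches the behavior above) is that the Web IDE preprocesses your sketch and pulls in the firmware header for you, while Spark Dev compiles the file as plain C++, so `application.h` has to be included explicitly. A minimal sketch of the fix, assuming an offline/Spark Dev cloud build:

```cpp
// Assumption: this file is compiled via Spark Dev, where the Web IDE's
// automatic preprocessing does not run.
#include "application.h"  // declares D0/D1, pinMode(), digitalRead(), Serial, etc.

void setup()
{
    Serial.begin(9600);           // 'Serial' resolves once application.h is included
    pinMode(D0, INPUT_PULLDOWN);  // button input
    pinMode(D1, OUTPUT);          // LED output
}

void loop()
{
    digitalWrite(D1, digitalRead(D0)); // mirror the button state onto the LED
}
```

With that include in place, both the ‘Serial’ undeclared error and the ‘D0’ was not declared in this scope errors should go away.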