Conditionally including Spark headers: #define SPARK_CORE isn't viable, so what is?

It looks like the USE_SPARK_CORE_XXX defines have been removed. I've seen some threads that say to use #ifdef SPARK_CORE, but that doesn't work because it's defined in application.h, and I'm trying to conditionally include application.h.

What are people using?

Yes, that's a problem I experience too. I want to conditionally include application.h. It seems to me there ought to be a predefined C preprocessor symbol, in much the same way as there is one for ANSI C or for particular architectures, so that one can write portable code. #ifdef SPARK_CORE does not work because SPARK_CORE is defined in application.h.

Sounds like we should change the application.h define to something like SPARK_WIRING, and move SPARK_CORE to something like hw_config.h? https://github.com/spark/core-common-lib/blob/master/SPARK_Firmware_Driver/inc/hw_config.h

I would defer to @zachary here, since he’s way more familiar with the firmware.

Thanks!
David

Totally! We should move that #define! I created an issue here:

@jjrosent @psb777 I just started to do this really quickly, but it doesn’t solve your problem. You have to include something in order to get any symbols at all to use with #ifdef.

When you’re conditionally including application.h, what else are you always including? Tell us more about what you’re trying to build where you don’t want to include application.h.

I have Arduino libraries I already maintain. I'd like to alter them to be Spark compatible. For example:

#if defined(ARDUINO) && ARDUINO >= 100
	#include "Arduino.h"
#elif defined(SPARK_CORE)
	#include "application.h"
#endif

Please also refer to https://community.spark.io/t/code-portability-uno-leonardo-sparkio-and-language-definition/3747 and https://community.spark.io/t/unified-libraries/2785.

I am writing code for Arduino and Spark. One main reason for needing a predefined C (OK, "wiring") preprocessor symbol is for the code to know whether to include application.h or, instead, the usual header files that any C programmer would expect to have to include.

Another of the many reasons is that the code needs to know whether the A/D conversion is 10-bit (Arduino) or 12-bit (Spark).

And the LED on an Arduino is usually D13, but it's D7 here. I want to have one source file that just knows these differences.
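A minimal sketch of what I mean, assuming a SPARK_CORE symbol existed (which is exactly what's missing; ADC_MAX and LED_PIN are just names I made up):

#if defined(SPARK_CORE)
	#include "application.h"
	#define ADC_MAX 4095   // 12-bit A/D on the Spark Core
	#define LED_PIN D7     // onboard LED
#else
	#include "Arduino.h"
	#define ADC_MAX 1023   // 10-bit A/D on most AVR Arduinos
	#define LED_PIN 13     // onboard LED
#endif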

Your bootstrap assertion is not correct: you don't necessarily have to include anything to have and use a C preprocessor symbol. Some symbols are predefined, and it is easy to predefine a CPP symbol yourself.

E.g., in the 1980s and '90s, when I was writing code which had to compile on both ANSI and K&R compilers, I used the predefined symbol __STDC__ to allow me to have both styles of function prototypes in the same source file.
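Something along these lines (a from-memory sketch; add is just a placeholder function):

#ifdef __STDC__
int add(int a, int b);   /* ANSI prototype */
#else
int add();               /* K&R declaration */
#endif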

E.g., when programming for different architectures, the C compiler usually (if not always) knows what the target is - it has to - and it exposes this to the programmer through a predefined C preprocessor macro. On my Ubuntu laptop there are many, many symbols defined by default by the C compiler. I have found the unix symbol particularly useful at times, but note the __amd64__ symbol also:

% rm foo.h
% touch foo.h
% cpp -dM foo.h | grep -i amd
#define __amd64 1
#define __amd64__ 1
% cpp -dM foo.h | grep -i unix
#define __unix__ 1
#define __unix 1
#define unix 1
% cpp -dM foo.h | grep -i linux
#define __linux 1
#define __linux__ 1
#define __gnu_linux__ 1
#define linux 1
% cpp -dM foo.h 2>&1 | grep -i word
#define __FLOAT_WORD_ORDER__ __ORDER_LITTLE_ENDIAN__

When you tell the gcc compiler to cross-compile, to compile for a foreign target, it also pre-defines CPP macros automatically so that the target can be detected by #ifdef lines in the code.
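For example, a cross-toolchain does the same (assuming the arm-none-eabi tools used to build for the Core are installed):

% arm-none-eabi-cpp -dM foo.h | grep -w __arm__
#define __arm__ 1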

There ought to be a spark_core, __spark_core__, SPARK_CORE, or similar predefined symbol. There might already be one, but the documentation is lacking.

If there isn't already one, then the cpp C preprocessor and most C compilers support the -D option. All you need do is include -D SPARK_CORE (or whatever symbol you choose) in the compile command invoked via the Web IDE and via the spark-cli. It may be only a one-line amendment to the build.
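To illustrate with the same cpp -dM trick as above - no #include is needed for the symbol to exist:

% cpp -dM -D SPARK_CORE foo.h | grep SPARK
#define SPARK_CORE 1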

Thanks for the feedback!

First, if you're building a .ino file in the Arduino IDE or the Spark web IDE, this will already work.

#if defined(ARDUINO) && ARDUINO >= 100
#include "Arduino.h"
#elif defined(SPARK_CORE)
#include "application.h"
#endif

That’s because each IDE already includes the appropriate header files without you explicitly adding the #include line. However, if you’re making generic libraries (.cpp files) it won’t—there’s your rub.

Symbols like __STDC__, __amd64, __unix, and __cplusplus (which core-firmware uses often to differentiate C from C++) are added by the compiler, which we can’t modify.
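For instance, here's the standard guard built on __cplusplus that lets one header be consumed from both C and C++ (a generic illustration, not a particular core-firmware file; the function name is made up):

#ifdef __cplusplus
extern "C" {
#endif

void spark_do_something(void);   /* declaration usable from C and C++ */

#ifdef __cplusplus
}
#endif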

However @psb777 is right that if we add it as a -D compiler flag, then nothing has to be included beforehand. I went to check whether that’s how Arduino defines ARDUINO, and in fact (though it took me a while to find it), it is:

I’ll make that change momentarily. This also makes it clear that what the macro represents is not :spark: Core hardware—the build has no way of knowing where you’re going to install this—but rather building for the :spark: platform. So, just as Arduino uses ARDUINO, not ARDUINO_UNO, it makes more sense for this macro to be SPARK rather than SPARK_CORE.
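So once that flag lands, the guard from earlier in this thread would presumably become:

#if defined(ARDUINO) && ARDUINO >= 100
#include "Arduino.h"
#elif defined(SPARK)
#include "application.h"
#endif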

Cheers!

I submitted a pull request here:

Go add a :+1: there if the solution works for you, or continue the discussion there if there’s a better way. Thanks!

I don’t have the time to install the tool chain and build locally. I hope my comments are still appreciated. And I look forward to the new SPARK symbol being available for cloud compiles. Thanks!

Your comments are valuable and appreciated @psb777. Thanks for helping us make :spark: exactly what enables you to do great things!

(yup, reviving an old, dead thread - hope it is OK…)

This thread (and the others referenced by it) leave me feeling like there is an us-or-them mindset in play when there really shouldn't be… As good as I'm finding the Photon to be, I'm still happy with my Teensys, Unos and Pro Minis - and I need a way to use them all in a cooperative ecosystem. (yeah, I'm OK with dropping those PIC 16Fs off a cliff somewhere, but the others still give good value…)

In light of "ARDUINO" being more than just AVR (Zero, Due, Edison, yadda…), what does the define really mean? In the same sense, what is SPARK supposed to mean?

To me, ARDUINO is indicative of the environment - if it isn't defined, I'm probably using a traditional gcc/avr environment with main(); with it, I'm in the world of setup() and loop()…

With that in mind, it seems "wrong" that "ARDUINO" and "SPARK" are either-or alternatives - I would have expected ARDUINO to be defined by all Arduino-compatible environments, including the build.particle.io one. In addition, I'd also expect to find SPARK (now PARTICLE, I'm sure :smile: ) defined by build.particle.io when building code for my Photon, as well as in the Arduino IDE with a Spark/Particle core in use, but not when using the Arduino IDE for an Uno or Due or Edison board…

Part of living in a world where ARDUINO is defined is an expectation of how that world works - including Arduino.h, the stupid pseudo-language pre-parsing for libraries, setup() and loop(), etc. Inasmuch as you present a compatible experience, I shouldn't need to care about #ifdef SPARK at this level.

Of course, once I get beyond the trivial, I need to be able to determine specifics about what core and what arch and …, where feature-enabling predefines for PHOTON and PARTICLE_IOT_WEB and the like become important.

example:

my generic I2C helper library for managing IO expanders should be completely agnostic of everything, as long as I2C/TWI is supported by a core. My library code should be able to presume that if #ifdef ARDUINO succeeds and #include <Wire.h> works, it will work.

  • this is a clue that there is a problem - that simple things that should work, don’t.
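A minimal sketch of the guard I mean (nothing board-specific assumed):

#if defined(ARDUINO)
#include <Wire.h>   // I2C/TWI support, expected on any Wiring-style core
#else
#error "This library expects a Wiring-style environment"
#endif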

When I added support in my lib for the onboard pins on the processor itself, I found myself needing to know what core and what board instance ('328 vs. Leonardo vs. R1 vs. R3 shield…) were being used; as I added support for my new Photon, I started looking for an #ifdef PHOTON "hardware core ID" trigger.

I’d like to see

  • the web-based build.particle.io define ARDUINO
  • application.h renamed to Arduino.h
  • a bunch of stuff that's missing put into it, like bitRead…
  • PARTICLE_CORE, PARTICLE_PHOTON or PARTICLE_ELECTRON defined in addition, as appropriate
  • in the stand-alone world, a Particle-supported downloadable core that would work in the Arduino IDE - that environment would obviously have ARDUINO defined in addition to one of the PARTICLE_ defines…
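For illustration, a library using that layered scheme might look like this (the PARTICLE_ names are hypothetical, per the wish list above; ARDUINO_AVR_UNO is real today):

#if defined(ARDUINO)
#include "Arduino.h"           // common Wiring-style environment
#endif

#if defined(PARTICLE_PHOTON)
#define ONBOARD_LED D7         // Photon onboard LED
#elif defined(ARDUINO_AVR_UNO)
#define ONBOARD_LED 13         // Uno onboard LED
#endif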

This is hard, but not unique to Spark/Particle - the whole 'duino community is grappling with this as well as they try to span all the way from AVRs to ARMs and Atoms…

-John

Thanks John — we're definitely all about the inclusiveness of the whole maker world. We strive to be as easily compatible across architectures as we can. That said, we don't want to define something that claims a kind of compatibility we don't actually have across architecture differences. Library writers commonly use the ARDUINO define to determine hardware compatibility, so defining it in our firmware would often break such libraries.

Regarding the specifics of which defines should be supported where, I’ll leave it to @mdma to respond. He’s master of the firmware domain at Particle these days. I really appreciate your perspective! Cheers!

I think a year ago, "ARDUINO" meant "UNO" or something for many of us, but now with multi-architecture support in 1.5.x and 1.6.x, it no longer can or does. Today, anyone who naively presumes "ARDUINO" implies anything about processor or board architecture, pinouts or any hardware compatibility is already screwed :smile: Those libraries are already broken and it's not your fault!

As a concrete example, the elapsedMillis library works on Gemmas (ATtiny), Leos, Unos and Megas, as well as Teensy and the rest of the usual suspects, yet it happily does

#if ARDUINO >= 100
#include "Arduino.h"
#else
#include "WProgram.h"
#endif

With this in mind, “ARDUINO” is not a processor architecture determinant, which is obvious if you look at what it is defined as:

-DARDUINO=10603 -DARDUINO_AVR_LEONARDO -DARDUINO_ARCH_AVR ... -DUSB_PRODUCT="Arduino Leonardo"

This is the version of the IDE compile environment in use, encoded numerically (10603 above corresponds to 1.6.3).

There are other defines provided for processor architecture ("AVR") and processor/board ("LEONARDO"), not to mention the avr-gcc defines like __AVR_ATmega32U4__. In particular, many of these defines originate in the boards.txt file:

  • The XXX.build.board property is used to set a compile-time variable ARDUINO_{build.board} to allow use of conditional code between #ifdefs. The Arduino IDE automatically generates a build.board value if not defined. In the Uno's case, the variable defined at compile time will be ARDUINO_AVR_UNO.
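Those per-architecture and per-board defines are what multi-architecture libraries key on today, e.g.:

#if defined(ARDUINO_ARCH_AVR)
// AVR-specific code (direct port registers, avr-libc headers, …)
#elif defined(ARDUINO_ARCH_SAM)
// Due / SAM3X-specific code
#else
#error "Unsupported architecture"
#endif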

The Arduino 1.5.x hardware definition page below has a ton of detail that is relevant here; I’d suggest starting there for a good understanding of what #defines are needed/used.

-John


Interesting insights there @JohnP, thanks for sharing. I must admit that my Arduino skills are probably at least a year out of date.

We'd have to look carefully at what this would mean, and be sure it does more good than harm. To test the waters, individual apps and libraries are free to define their own Arduino.h as

#define ARDUINO 100
#include "application.h"

That's a quick and easy way to emulate the Arduino environment to see if it helps with the build.

We'd gladly take PRs for this, as well as documentation to guide library writers on how to port libraries from Arduino to the Particle ecosystem.

This would be great!