Expose Interrupt Function with Spark.Function?

I have a function that is called by an interrupt

void startRace()
{
  showStartMsg = true; //indicate that the race has started
}

When I try to expose this to the API with Spark.function("startrace", startRace); I get this compiler error

powertarget.cpp: In function 'void setup()':
powertarget.cpp:129:40: error: invalid conversion from 'void (*)()' to 'int (*)(String)' [-fpermissive]
NVIC_DisableIRQ(EXTI3_IRQn); // D4 "Lane 4"
^
In file included from ../inc/spark_wiring.h:34:0,
from ../inc/application.h:29,
from powertarget.cpp:38:
../inc/spark_utilities.h:107:14: error: initializing argument 2 of 'static void SparkClass::function(const char*, int (*)(String))' [-fpermissive]
static void function(const char *funcKey, int (*pFunc)(String paramString));
^
make: *** [powertarget.o] Error 1

Your function is of type void and takes no arguments.

Have a look at the documentation on Spark functions: they must take a String and return an int.

http://docs.spark.io/firmware/#spark-function


What you can do is create another function to expose to the cloud, and have that function call your startRace function, like this:

int cloudStartRace(String args) { // wrapper with the int(String) signature the API expects
    startRace();
    return 0; // return value is reported back to the cloud caller
}

Spark.function("startrace", cloudStartRace);