Compiling a folder using spark-cli cloud compile

I have my code under RCS, so there is a sub-directory for that. Then there is a Makefile, for reasons which will become clear below, but also for pre-processing or auto-generation of code: e.g. when I generate a const C array from an underlying CSV table, the CSV file and the awk program that produces the C are also in the directory (a sketch of that step follows the Makefile below). Then there is a doc file saying what must be connected to which pins and how to use the app. I also need to build for different Arduino targets, not only the Spark.io, so there are files such as main.cpp.old, xyz.h.french and xyz.h.german. And there are dummy make target files and files holding the spark compile output.

Better, in other words, to be able to exclude every file except those named on the command line. The default action in the Makefile could then be 'spark cloud compile foo.c bar.cpp wiz.h', which is what I would prefer. Currently the default action is as follows:

default: foo.flashed

foo.flashed: foo.c bar.cpp wiz.h
	rm -fr build  # spark flash needs a dir with just the code
	mkdir build
	cd build && ln -s ../foo.c ../bar.cpp ../wiz.h .  # the cd only applies to this line
	spark cloud flash $(core_id) build | tee foo.out  # save the pages of compile errors
	! grep -qi error foo.out  # fail here if the output mentions errors
	touch foo.flashed  # stamp file: make never gets this far after an error
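
Since I mentioned the CSV-to-C step: a minimal sketch of what that looks like, where csv2c.awk, colours.csv and its name,value column layout are all invented for illustration:

  # csv2c.awk: turn "name,value" CSV rows into a const C array
  # usage: awk -f csv2c.awk colours.csv > colours.h
  BEGIN { FS = ","; print "const int colour_values[] = {" }
  NR > 1 { printf "    %s, /* %s */\n", $2, $1 }  # skip the header row
  END { print "};" }

A rule like 'colours.h: colours.csv csv2c.awk' in the Makefile keeps the header current, and of course colours.csv and csv2c.awk then also sit in the very folder that spark cloud compile sweeps up.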

Save me from that!
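
For contrast, here is what I would prefer the default action to shrink to. A sketch only, since as things stand the command takes a whole folder rather than a list of files:

default:
	spark cloud compile foo.c bar.cpp wiz.h  # just the files named here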