Subdirectories not compiling in CLI [Fixed]

Hi @Coffee,

I believe the default behavior (without a spark.include file) is to grab any source files (*.h, *.c, *.cpp, *.ino) in the directory provided. If there is a spark.include file in that directory, the CLI parses it and interprets any wildcards.

  • spark compile . / spark compile /some/directory

    • looks for spark.include / spark.ignore files in that folder; otherwise it just grabs any source files from there
  • spark compile foo.cpp foo.h main.ino

    • sends exactly those files to be compiled

I think this has always been the default behavior, but maybe I’m confused? Here’s a generic spark.include file that should include everything in the current directory and its subdirectories:

#spark.include 
# from current dir
*.h
*.ino
*.cpp
*.c

# and from any subdirectories
**/*.h
**/*.ino
**/*.cpp
**/*.c
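To illustrate why both sets of patterns are needed: plain patterns like `*.cpp` only match the top-level directory, while the `**/` prefix descends into subdirectories. The sketch below demonstrates this with Python's `glob` module on a hypothetical project layout (the file names are made up for illustration); the CLI uses its own glob implementation, but the wildcard semantics should be the same.

```python
import glob
import os
import tempfile

# Hypothetical project layout, created just for this demo.
root = tempfile.mkdtemp()
for path in ["main.ino", "foo.cpp", "lib/helper.h", "lib/helper.cpp"]:
    full = os.path.join(root, path)
    os.makedirs(os.path.dirname(full), exist_ok=True)
    open(full, "w").close()

os.chdir(root)

# "*.cpp" matches only files in the current directory.
print(sorted(glob.glob("*.cpp")))

# "**/*.cpp" (with recursive matching) also descends into subdirectories.
print(sorted(glob.glob("**/*.cpp", recursive=True)))
```

So a spark.include file that lists only `*.cpp` would silently skip `lib/helper.cpp`, which matches the symptom in this thread's title.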

I don’t believe I changed this behavior in the last release, but it’s possible a submodule got updated and broke something. Please let me know if this doesn’t seem right, or if anyone has more info.

Thanks,
David
