Spark Core's program memory size

Hi,

I’m running into problems with my app, which seems to be too big when compiled. When I try to flash it via the web IDE, nothing happens at all; with the Spark CLI it just says “too big”.
That brought me to trying to figure out how much space I actually have on the device, but I couldn’t really find that information. Somewhere it says “128kb”, but I’m pretty sure my app is under that (compiled directly for Arduino it comes out at around 32kb). Is there a way to check how big my sketch actually is, and how much space is left on the Core?

Thanks,
Mad

Hi @Maddimax,
this seems to be a common question to which there is no easy answer, since a lot depends on your code.
Maybe you’d have to plough through some of these threads

Hi @Maddimax,

You should have about 100K for your app (about 20K is reserved for the bootloader). Sometimes including certain libraries can bump your app size way up; are you including system libraries that you could do without?

Thanks,
David

Thanks for the links! I will try to compile my stuff locally to get a better feel of the code size.

Does anyone know whether “PANIC_ONLY” is still a thing (as it says here: https://community.spark.io/t/understanding-local-compile-output/3234/8)?

Regarding the libraries, I don’t think I can leave anything out (I’m using SPI, UART and Print, and will need Wire/I2C in the future as well). Is there something like “PROGMEM” on the Spark? From what I understand, “static const” is enough on the Spark.
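
Just to illustrate what I mean (and this is only my assumption about how the Core’s ARM GCC toolchain behaves): a file-scope const table should end up in flash (.rodata) and be readable directly, with no PROGMEM or pgm_read_byte() dance like on the AVR Arduinos:

// Minimal sketch of the idea; assumes the toolchain places file-scope const
// data in .rodata (flash), which is memory-mapped on the STM32 and can be
// read like normal memory.
static const uint8_t lookupTable[8] = { 3, 14, 15, 92, 65, 35, 89, 79 };

void setup() {
    Serial.begin(9600);
}

void loop() {
    // Read straight from the table; no pgm_read_byte() needed.
    Serial.println(lookupTable[millis() % 8]);
    delay(1000);
}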

Thanks,
Marcus

By the way, is there a place for feature requests / ideas? I think a size output in the CLI and Cloud IDE would be hugely beneficial, and I’d like to propose that to the Spark Team.

Hi @Maddimax,

Most of the projects have a Github repo at the moment, so you can always hit up those issue pages to post a feature request / bug: https://github.com/spark/spark-cli/issues

Or if you can’t find it, you can always start a thread on the forums :slight_smile: There are a few feature request threads floating around. I think the feature you’re talking about is something we’ve discussed earlier here: https://community.spark.io/t/how-to-know-how-much-ram-flash-i-am-using/2150/5

We’ve started to pass the binary size output around internally; I think we’re waiting on a nice interface for it on the build site. :slight_smile: Adding it as CLI output would be great too, I think.

Thanks,
David

You can get the command line build to output the size with these changes:

Add size to the all target:

all: elf bin hex size

Then define this rule (note that the recipe lines must be indented with a tab):

size: $(TARGET).elf
	@echo Invoking: ARM GNU Print Size
	$(GCC_PREFIX)size --format=berkeley $<
	@echo

Finally, add size to the list of phony targets

.PHONY: all clean check_external_deps elf bin hex size
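
With those changes, the end of a successful make run should print a Berkeley-format size summary. The numbers and filename below are only illustrative; roughly speaking, text + data is your flash footprint and data + bss is your static RAM usage:

Invoking: ARM GNU Print Size
   text    data     bss     dec     hex filename
  72344    1200    4568   78112   13120 core-firmware.elf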


@Dave: Having the output in the CLI would definitely be great, but at least the CLI tells you something is wrong, while the Cloud IDE just does nothing.

@Mdma: I will definitely start looking into local builds tonight.


As of today’s deployment of firmware to the web IDE, you don’t need to worry about PANIC_ONLY. That’s now the default, and an empty user app (so only the Spark system code) compiles to a 67k binary. The total available flash space is 108k, which leaves roughly 41k of headroom for your own code on top of the system firmware.

Another issue fixed today: hitting the flash button in the web IDE used to not show errors, so you had to hit the verify button to see compile errors. As of today, errors are shown on flash as well.

And the core-firmware README will get you started with local builds.

Cheers!


I don’t know if this has been posted somewhere already…
When I verified my code today, it showed how much program memory and RAM were used on my Core, and how much was left. So yes, you can now see the space left right in the web IDE, without having to do a local build. Yay, go :spark:!


@Moors7,

Did you just mention you’ll send me some Starbucks credit?! :smiley:
