Ah, I'm afraid that's not quite how it works.
In C/C++, variables can be stored in one of three places (a quick sketch of all three follows the list):
1. In memory allocated by the compiler. Variables that go here include global variables, static variables inside functions, and static class variables.
2. On the stack (more on this later). Variables that fall into this category are non-static local variables (ones defined in a function, such as your char y=0;). The max size of the stack is typically fixed and predefined (I have no idea what it is on the Spark).
3. On the heap. These are variables allocated by new/malloc(). The size of the heap is generally whatever memory is left over after accounting for the storage needed to hold the compiled code, the variables allocated by the compiler (item #1, above), and the space reserved for the stack (its max size).
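To make that concrete, here's a minimal sketch (plain C++, not Spark-specific; all of the names are made up for illustration) with one variable in each of the three places:

#include <cstdlib>

int g_total = 0;                    // #1: allocated by the compiler (global variable)

void example(void)
{
    static int s_calls = 0;         // #1: also compiler-allocated (static local)
    char y = 0;                     // #2: on the stack, created fresh for each call
    char *buf = (char *)malloc(100);  // #3: on the heap, allocated at runtime

    s_calls++;
    y = (char)(g_total + s_calls);
    if (buf != NULL)
    {
        buf[0] = y;                 // use the heap block...
        free(buf);                  // ...and give it back when done
    }
}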
Variables on the stack:
As I mentioned, non-static local variables (ones defined in a function) are stored "on the stack". Memory for these is allocated at runtime, from the stack, not at compile time; the compiler cannot allocate memory for them. Now, you might expect that the compiler could allocate memory for a char y=0; defined inside a function, but it can't. Because functions can be called recursively, there can be multiple "versions" of char y=0; -- one for each recursive call.
In other words, if you have a function like:
double f(double x)
{
    char y[100];

    if (x <= 1.0)
    {
        return (1.0);
    }

    return (x * f(x - 1.0));
}
And you call it like f(5);, then at some point you will end up with five copies of char y[100] allocated on the stack at once. Note how the compiler cannot possibly allocate this space at compile time. Given enough recursive calls, or large enough local variables (ones inside a function), it's possible to overflow the stack, which is really bad. Embedded systems like the Spark generally have no way of detecting a stack overflow, which can lead to tragic results -- it's been hypothesized that the unintended acceleration in some cars was caused by a stack overflow.
(Note for the nitpickers: yes, a good compiler will warn you that char y[100]; is unused, and yes, a good optimizer will eliminate it entirely.)
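By the way, if a function really does need a big buffer like that, one common workaround (just a sketch, and only appropriate when each call does not need its own private copy) is to declare it static, so it's allocated by the compiler (item #1, above) rather than eating 100 bytes of stack on every recursive call:

double f(double x)
{
    static char y[100];   // one compiler-allocated copy shared by all calls
                          // (still unused here, just like the original example)

    if (x <= 1.0)
    {
        return (1.0);
    }

    return (x * f(x - 1.0));
}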
So, if you're trying to see how much space the compiler can allocate, you need to use a global variable, not one inside a function. However, this assumes the Spark tools know how much memory the Spark actually has; it's possible (and I have no idea) that the tools will happily build a huge program that cannot possibly be loaded and run on the Spark (perhaps someone else here knows).
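One crude way to find out (purely a sketch, assuming the usual setup()/loop() structure of a Spark application) is to declare a big global array and keep increasing its size until the build or the flash step complains:

// Hypothetical probe: bump BIG_SIZE up until the tools refuse to build/load it.
#define BIG_SIZE 10000

volatile char big_buffer[BIG_SIZE];    // item #1: allocated by the compiler

void setup()
{
    // Touch the array so an aggressive toolchain can't just discard it.
    big_buffer[0] = 1;
    big_buffer[BIG_SIZE - 1] = 1;
}

void loop()
{
}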