String vs Array of Char

I’ve read a lot about the String implementation for Spark.

Some of it sounds good (dynamic allocation),

some of it sounds hard (heap fragmentation),

and some of it sounds strange (garbage collection) for just 20 KB of RAM.

On the other hand, firmware functions like publish etc. support the String implementation.

So, finally:

Is it safe to use String (also in the long run) or not?
Are there functions to get the current heap size?

Thx to all for supporting this great platform

As for the current heap size, you might like to have a look at this thread and the referenced source files.

As for the risks connected with the use of String, that greatly depends on your particular usage. But to know what might prove risky and what would be safe, a look into the current implementation of String might give you a feeling:

https://github.com/spark/firmware/blob/master/src/spark_wiring_string.cpp

When I mentioned garbage collection in the other thread, I should more precisely have said something like "heap defragmentation/compaction".
If you use a lot of String objects, and some of them (or their implicit reallocations) get freed while others don't, you might run out of blocks big enough to create/modify a String object despite having enough free space in total, just scattered all over the heap.
And since user memory, stack and heap share the Core's 20 KB, you'll sometimes hit the limits seemingly without reason. But knowing about the possible pitfalls will save you some head-scratching :wink:
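To make that concrete, here's a minimal sketch of a fragmentation-prone pattern and the usual mitigation. It assumes the Wiring String API from spark_wiring_string.cpp (linked above), including reserve() and the numeric constructor; the function and variable names are made up for illustration:

String buildReport(int reading)
{
    String msg;
    msg.reserve(64);            // one up-front allocation; without it, every
                                // += that outgrows the buffer reallocates
    msg += "reading=";
    msg += String(reading);     // temporary String: a small block is
                                // allocated, copied, then freed again
    msg += ",unit=C";
    return msg;
}

Many such temporaries of assorted sizes, freed in a different order than they were allocated, are exactly what leaves the heap riddled with holes. Calling reserve() once with a generous size keeps each String in a single block and takes most of the realloc traffic out of the picture.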

Thanx for your answer. I've tried the _sbrk function and also this one:

uint32_t freeMemoryAvailable(void)
{
    extern char _end;                       // linker symbol: end of static data = start of the heap, fixed at link time
    char *current_heap_end = &_end;         // note: this never moves, it does not follow malloc()/_sbrk()
    char *current_stack_pointer = (char *)__get_MSP();  // CMSIS intrinsic: current main stack pointer

    return (current_stack_pointer - current_heap_end);  // distance from heap *start* (not heap top) to the stack
}

but they always return the same number of bytes (the same value the compiler reports). We use String very often to store data and send it to a server using the HTTPClient lib. But the reported memory size does not change over time?

Free mem estimation is going on here
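In the meantime, a likely reason the function above always returns the same value: _end is a linker symbol marking where the heap starts, not where it currently ends, so its distance to the stack pointer never reflects malloc() usage. Here is a minimal sketch of an alternative, assuming the firmware's _sbrk stub behaves like the usual newlib one (calling it with 0 returns the current program break); the name freeMemoryEstimate is just for illustration:

extern "C" char *_sbrk(int incr);   // newlib heap stub; _sbrk(0) returns the current top of the heap

uint32_t freeMemoryEstimate(void)
{
    char *current_heap_end = _sbrk(0);                  // moves up as malloc() grabs memory
    char *current_stack_pointer = (char *)__get_MSP();  // CMSIS intrinsic: main stack pointer

    return (uint32_t)(current_stack_pointer - current_heap_end);
}

Note this only measures the contiguous gap between the top of the heap and the stack; blocks already freed inside the heap don't show up here, which ties back to the fragmentation point above.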