There has been much conversation about updating the documentation. This thread will act as an ongoing list of things that need to be documented or need their documentation improved. I invite any Spark Elite or Spark employees to edit this post as needed when new documentation needs arise or are completed. All users are invited to contribute pull requests to the official documentation repository hosted on GitHub.

Please keep replies to this topic as concise as possible. If debate is needed, please start a new thread (a la “Documentation - UDP Issues”). I think prepending new topic subjects with "Documentation - " will help everyone visually filter the list of topics. I will propose an official Documentation category to the Spark Team, which would eliminate the need to add "Documentation - " to topic subjects.
To-do
- UDP issues - `UDP.parsePacket()` does not function as expected - no packet boundaries per se
- Spark CLI special characters - https://community.spark.io/t/how-to-send-hash-character-to-spark-function/4548
- Supplemental documentation for `Spark.publish()` and `Spark.subscribe()` - https://community.spark.io/t/publish-subscribe-semantics-documentation/4403
- Toolchain installation guides
- Windows
- OS X
- Ubuntu - https://community.spark.io/t/how-to-install-the-spark-toolchain-in-ubuntu-14-04/4139
- Spark CLI installation guides
- Windows
- OS X
- Ubuntu - https://community.spark.io/t/how-to-install-spark-cli-on-ubuntu-12-04/3474
- Installing DFU-UTIL and driver for Core
- Windows
- OS X
- Ubuntu
- Bug where `delay()` does NOT call `SPARK_WLAN_Loop()` when the `SPARK_WLAN_SLEEP` flag is set by `Spark.sleep()` - https://community.spark.io/t/can-i-turn-the-cc3000-off/3233/124
- Access token expiration (there is already doc on this–maybe it needs to be polished)
- RAM - Why there is so little, tips and tricks to reduce usage
- Better/more visible panic code documentation (this is under troubleshooting now)
- Adding `#include "application.h"` to user code
- Cannot claim core - already-claimed cores, other issues
- Cannot connect to wifi - common scenarios and solutions
- Improve documentation regarding connecting to wifi that routes through a splash page
- How to get order and/or shipping information
- Make core firmware changelog accessible/visible to members on the IDE and via documentation
- Web server code example - https://community.spark.io/t/tiny-webserver-code/3297
- Web client code example, specifically POST, GET, and how to buffer all of the data completely (since the TCP buffer is now 128 bytes - it was 512) - https://community.spark.io/t/making-a-get-or-post-request-from-the-core/2288
- `itoa()`, `utoa()`, `atoi()`, etc. - https://community.spark.io/t/how-can-i-use-the-itoa-function/2487
- `#include <math.h>` issues
- RSSI
- [ON-HOLD] WiFi on/off, cloud on/off (this may be getting a firmware makeover soon)
- [ON-HOLD] New Spark CLI commands (document when released)
- [ON-HOLD] Simplified local toolchain build environment - https://community.spark.io/t/tweak-request-to-core-firmware-src-build-mk/2900/13 (this may make it directly into the core-firmware soon)
- CFOD TI patch
- Add `tone()`/`noTone()` functions to Core Firmware documentation (make sure to list all pins on which `tone()` is available) - https://community.spark.io/t/tone-function-missing/1116 and more info at https://github.com/spark/core-firmware/pull/177#issuecomment-44966380
- Add/expand recommendations for safe and unsafe ways of accessing the Spark Core with device ID and access_tokens. See Dave’s comments here: https://community.spark.io/t/putting-access-token-to-rest-inside-the-core-most-secure/5046/15
- Add a section called MACROs to the Firmware Reference. Include all standard macros and any extra goodies that might be useful. [FYI: I added the first MACRO reference to the DOCs… arraySize(). Perhaps MACROs should be abstracted as just other random Functions that can be used? -BDub]
- Document the use of `SPARK_WLAN_Loop();` in code: how, why, and when. FYI: this is how you call the entire background process that manages the connection to the Cloud. You can do this from user code, or let your code return from `loop()` and it will be called automatically. There is currently an issue logged that `Spark.process()` is not calling `SPARK_WLAN_Loop()` and should be.
- Document how `#pragma SPARK_NO_PREPROCESSOR` is used to keep the Cloud compiler from preprocessing your `*.ino` files. `*.ino` is the default file naming in the Web IDE, but when using Spark CLI or Spark Dev you can simply name your main file `*.cpp` and no preprocessing will occur. Document what is needed in your code if you choose not to use the preprocessor.
- Return codes: many of the functions described in the firmware docs have undocumented return codes. For example, most of the functions in the UDP library return codes that are not documented; I read in the forum that `UDP.write()` returns -1 or -2 on error, but this is undocumented. Knowing which return codes exist for each function, and which values signify success or failure, would be very useful. The convention in C is that functions return -1 for failure (setting the global `errno` to identify the error) and 0 for success. Wherever return codes exist, they should please be documented.
Completed
- [@peekay123] Fix `Spark.time` documentation bugs
- [@wgbartley] Add `Content-length` header to TCPClient sample code
- [@wgbartley] Clarify DST and offsets for Spark.time - https://community.spark.io/t/time-zone-and-daylight-savings-and-spark-synctime/4568/2
- [Already Documented] Spark.publish() TTL - https://community.spark.io/t/meaning-of-ttl-parameter-in-spark-publish/4549
- [@BDub] Onboard RGB LED control, other tips and tricks - https://community.spark.io/t/disabling-status-led/3656