API limits for Spark Cloud

Thanks for the great question @sjunnesson!

Right now API rate limiting is not at the top of the priority list, but we’ll add it soon. We haven’t yet determined what the limits will be. In setting them, we’ll look for a balance among a few factors: the overall load on the Cloud as more people get their Cores, our shifting estimates of ongoing bandwidth costs, and our commitment to keeping the user experience as high quality as we can.

As you alluded to, the right way to get sensor data from a Core is with events: either by opening an SSE stream or by registering callbacks so we hit your servers as events are generated (both features are in our backlog). Events are on our high priority list, but they didn’t get added to the current sprint, which goes through December 20th. I expect there’s a good chance they’ll at least be started in the sprint after that (12/23–1/3) and probably fleshed out and debugged in the following one (1/6–1/17).
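
In the meantime, here’s a rough sketch of what consuming an SSE stream of Core events might look like once it exists. To be clear, none of this API is built yet; the endpoint URL, the `access_token` parameter, and the event payload shape below are placeholders I made up for illustration, not a spec:

```python
import json
import requests

# Hypothetical endpoint and auth -- the events API doesn't exist yet,
# so treat both of these as placeholders, not a spec.
STREAM_URL = "https://api.spark.io/v1/events"
ACCESS_TOKEN = "your-access-token"

def stream_events():
    """Open a Server-Sent Events stream and yield (event name, payload) pairs.

    This is a deliberately simplified SSE parser: it assumes one `data:` line
    per event and that the payload is JSON, neither of which is guaranteed.
    """
    resp = requests.get(
        STREAM_URL,
        params={"access_token": ACCESS_TOKEN},
        stream=True,
    )
    resp.raise_for_status()
    event_name = None
    for line in resp.iter_lines(decode_unicode=True):
        if line.startswith("event:"):
            event_name = line[len("event:"):].strip()
        elif line.startswith("data:"):
            payload = json.loads(line[len("data:"):].strip())
            yield event_name, payload
        elif not line:
            event_name = None  # a blank line ends one SSE message

for name, data in stream_events():
    print(name, data)
```

The callback option would be the inverse of this: instead of you holding a connection open, you’d give us a URL and we’d POST each event to your server as it happens.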

So my estimate: In early January you’ll have some kind of access to events generated by your Cores, and by mid-January they’ll be more stable and full-featured.

Thoughts?