We’ve been working on porting the Oscuino library to the Spark Core, and we’ve faced the same problems as the ones mentioned in that thread by jfenwick.
Instead of rewriting the UDPSend() function of the OSCMessage and OSCBundle classes, we identified that the issues came from the Spark’s core-firmware code (the UDP class in spark_wiring_udp.cpp).
We’ve overloaded the malfunctioning functions (i.e. beginPacket(), endPacket() and write()) and added the proper includes to the Oscuino files, as explained below.
Our library is provided in our GitHub repository with Puredata and Max/MSP example patches to test the functionalities of our code.
This code will only work with the most recent versions of the core-firmware, since we use the CFLAGS += -DSPARK flag recently added to the makefile.
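The -DSPARK define lets platform-specific code be selected at compile time. A minimal sketch of the pattern (the function name and the "generic" fallback are ours, for illustration; on the Core build this is where Spark-specific headers such as application.h would be pulled in):

```cpp
#include <cstring>

// Reports which platform branch the preprocessor selected.
// Built with CFLAGS += -DSPARK (as in the recent core-firmware
// makefile), the SPARK branch is taken; otherwise the generic one.
inline const char* buildTarget() {
#ifdef SPARK
    return "spark";    // Spark Core build: include Spark headers here
#else
    return "generic";  // plain Arduino/desktop build
#endif
}
```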
1. What we have changed in the Spark Core’s code:
We have overloaded the malfunctioning UDP functions of the Spark Core (myUDP class in application.cpp)
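We don’t have the authors’ exact myUDP code, but the overriding pattern they describe can be sketched as follows. The UDP base class here is a minimal stand-in for the one in spark_wiring_udp.cpp, and the method names and signatures are assumptions for illustration (the real bugs in the firmware class differed in detail):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Minimal stand-in for the Spark UDP class (spark_wiring_udp.cpp).
// The base implementations here just mimic "misbehaving" functions
// by reporting failure; the real class has many more members.
class UDP {
public:
    virtual ~UDP() {}
    virtual int beginPacket(const char* ip, uint16_t port) { return 0; }
    virtual int endPacket() { return 0; }
    virtual size_t write(const uint8_t* buf, size_t size) { return 0; }
};

// Override the malfunctioning functions in a subclass, as the port
// does with its myUDP class, buffering outgoing data correctly.
class myUDP : public UDP {
public:
    int beginPacket(const char* ip, uint16_t port) override {
        outBuffer.clear();   // start a fresh outgoing packet
        return 1;            // corrected behavior: report success
    }
    size_t write(const uint8_t* buf, size_t size) override {
        outBuffer.insert(outBuffer.end(), buf, buf + size);
        return size;         // corrected: report bytes actually buffered
    }
    int endPacket() override {
        // A real implementation would flush outBuffer to the socket here.
        return outBuffer.empty() ? 0 : 1;
    }
    std::vector<uint8_t> outBuffer;
};
```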
2. What we have changed in the Oscuino code:
Proper includes have been added to OSCData.h and OSCMessage.h
We do not use the OSCTiming class or OSCBoards.h (both would need to be rewritten to work with the Spark)
All modifications are labelled with the following comment: // Simon+Emilien-OSCUINO
3. Setup (we’re compiling our code with make, as explained in the documentation):
Put OSCBundle.cpp, OSCData.cpp, OSCMatch.c, OSCMessage.cpp and build.mk into core-firmware/src/OSC
Put OSCBundle.h, OSCData.h, OSCMatch.h and OSCMessage.h into core-firmware/inc/OSC
Put the IP address of your computer into application.cpp (search for the comment // put the IP address of your computer here)
4. Use (test application):
Wire LEDs to the D0 and D1 pins
Send OSCMessages and OSCBundles to the Spark Core on port 8888
Receive OSCMessages and OSCBundles from the Spark Core on port 9999
The IP of the Core is automatically sent at startup as an OSCMessage (OSC address = /coreip)
The Spark Core will manage either OSCMessages or OSCBundles (not both at the same time)
5 OSC addresses are taken into account by our test application:
/manageMessages (no arguments) will switch the Core to managing OSCMessages (OSCBundles will be ignored)
/manageBundles (no arguments) will switch the Core to managing OSCBundles (OSCMessages will be ignored)
/sendTestMsg (no arguments) will ask the Core to send an OSCMessage for test (its address is /testmessage)
/sendTestBndl (no arguments) will ask the Core to send an OSCBundle for test (its address is /testbundle/*)
/led followed by two ints will switch on/off the two LEDs
[The 1st int is the pin number (0 for D0, 1 for D1). The 2nd int is the LED state (0 for LOW, 1 for HIGH)]
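For reference, a message such as /led 1 1 has a fixed binary layout on the wire (OSC 1.0: null-padded address string, a type-tag string like ",ii", then big-endian int32 arguments). Here is a small sketch that builds such a packet by hand; in practice the Puredata/Max patches or the OSC library do this for you, and the function names are ours:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Append a string null-terminated and padded to a 4-byte boundary (OSC 1.0).
static void appendPadded(std::vector<uint8_t>& out, const char* s) {
    size_t len = std::strlen(s) + 1;              // include the null terminator
    out.insert(out.end(), s, s + len);
    while (out.size() % 4 != 0) out.push_back(0); // pad with nulls
}

// Append a 32-bit int in big-endian (network) byte order.
static void appendInt32(std::vector<uint8_t>& out, int32_t v) {
    out.push_back((v >> 24) & 0xFF);
    out.push_back((v >> 16) & 0xFF);
    out.push_back((v >> 8) & 0xFF);
    out.push_back(v & 0xFF);
}

// Build the raw bytes of an OSC message "/led <pin> <state>".
std::vector<uint8_t> buildLedMessage(int32_t pin, int32_t state) {
    std::vector<uint8_t> pkt;
    appendPadded(pkt, "/led"); // address: 4 chars + null -> padded to 8 bytes
    appendPadded(pkt, ",ii");  // type tags: two int32 arguments
    appendInt32(pkt, pin);
    appendInt32(pkt, state);
    return pkt;                // 8 + 4 + 4 + 4 = 20 bytes total
}
```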
Not only did you port a really awesome library, you also documented how you did it in a gloriously useful way for others to follow along and for Spark to know where you ran into issues...and documented how to test it. Sweet!
A few comments on the Spark Core code changes:
Nice. That's the way to go. Note, we'll be releasing firmware to the IDE that will make this kind of thing possible there too, without having to do local compiling.
@zachary has been working on identifying + addressing the concerns folks have raised about our UDP implementation this sprint. Until this work completes, though, it's quite useful to have an illustrated pathway to overriding functionality like this both for this project and others in the future.
As mentioned by @sneakthief, the only change needed to use this library and example is to replace Network.localIP() with WiFi.localIP() on line 73 of application.cpp
I think the other problems are likely related to your compiler or IDE. Have you searched for your problem on the forum? Have you been able to compile other examples using libraries? (note that we are compiling locally using make, not the online compiler)
We haven’t updated our library yet, since we have been experiencing major problems with our Cores over the past weeks: there have been problems in the makefile of the core firmware, the new WiFi and Spark classes do not work as intended and as specified in the documentation, the UDP class has not been corrected yet, and we are unable to use our Cores locally although local use has been heavily advertised here, …
We will post GitHub issues today, see how they are taken into account by the Spark team, and then we will know whether it is worth correcting and improving our library at the moment
We are correcting the current version to take into account the new architecture of the core firmware and the new makefile structure
We will write new guidelines for installing, compiling and using this library, and we will flag the remaining problems on GitHub very soon
Don’t hesitate to share your experiences with the Spark Core and OSC. We will get back to you as soon as possible (today or tomorrow)
I’m preparing to build a couple of Spark Cores into Lego bricks to give to my kids for Christmas. I’m hoping to let them use TouchOSC on iOS so they can control servos and LEDs from the Spark Core. It looks like there’s some current incompatibility between Oscuino and Spark, and I don’t see any changes since June on GitHub - should I give up on using Oscuino with new Cores in the next week or two?
I am having trouble building the OSC library on Windows and Mac (after carefully following the steps and changing to WiFi.localIP()).
I keep getting a long list of these messages and the build fails (these are the last few lines of the error):
./obj/src/OSC/OSCBundle.o:C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:222: first defined here
./obj/src/OSC/OSCBundle.o: In function `OSCBundle::fill(unsigned char*, int)':
C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:225: multiple definition of `OSCBundle::fill(unsigned char*, int)'
./obj/src/OSC/OSCBundle.o:C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:225: first defined here
collect2.exe: error: ld returned 1 exit status
make: *** [core-firmware.elf] Error 1
BUILD FAILED (exit value 2, total time: 11s)
Could it work if I use an old firmware to build with all the OSC library assets? And if so, what version should I use then until you manage to fix the issues?
*****EDIT
I got it to work with firmware version v0.2.1. Not sure what other issues can occur on this version, but I can send and receive OSC messages when following the tutorial
Just download the packages under the v0.2.1 header and build the firmware with the added OSC files…
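As a general note on the “multiple definition” linker errors quoted above: they mean the linker found the same function body in more than one object file. One common cause is the same .cpp being compiled twice by the makefile (e.g. picked up from both src/ and src/OSC/); another is a function being defined, rather than just declared, in a header included by several translation units. The latter is fixed with inline (or by moving the definition into a single .cpp). A hypothetical header, for illustration only:

```cpp
// osc_helpers.h -- hypothetical header, not part of the actual library.
#ifndef OSC_HELPERS_H
#define OSC_HELPERS_H

// Defined in a header, so it must be 'inline': without it, every .cpp
// that includes this header would emit its own copy of the function,
// and the linker would report "multiple definition of ...".
inline int oscPaddedLength(int stringLength) {
    // OSC strings are null-terminated and padded to a 4-byte boundary,
    // so e.g. the 4-character address "/led" occupies 8 bytes.
    return (stringLength + 4) & ~3;
}

#endif // OSC_HELPERS_H
```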
If there are no changes to the system files, then I don’t think it’s necessary to clone the entire Particle firmware repo - just putting the OSC files into a directory would be sufficient, which will make getting updates from either repo simpler.
I’ve been working on a relatively simple OSC implementation (though not complete)
For now it supports:
-sending floats
-sending ints
-sending strings
-receiving floats
-receiving ints
It’s relatively easy to use and to implement.
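A minimal sketch of what the receive side of such an implementation involves: given the raw bytes of a one-argument message, skip the padded address and type-tag strings, then read the big-endian int32 (the function names are ours, not from the posted library):

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Size of a null-terminated OSC string including its 4-byte padding.
static size_t oscStringSize(const uint8_t* p) {
    size_t len = std::strlen(reinterpret_cast<const char*>(p)) + 1;
    return (len + 3) & ~static_cast<size_t>(3);
}

// Extract the first int32 argument from an OSC message of the form
// "<address> ,i <int32>". Returns false if the type tag doesn't match.
bool readFirstInt(const std::vector<uint8_t>& pkt, int32_t* out) {
    size_t pos = oscStringSize(pkt.data());             // skip the address
    const char* tags = reinterpret_cast<const char*>(pkt.data() + pos);
    if (tags[0] != ',' || tags[1] != 'i') return false; // expect ",i..."
    pos += oscStringSize(pkt.data() + pos);             // skip the type tags
    // OSC int32s are big-endian on the wire.
    *out = (int32_t(pkt[pos]) << 24) | (int32_t(pkt[pos + 1]) << 16) |
           (int32_t(pkt[pos + 2]) << 8) | int32_t(pkt[pos + 3]);
    return true;
}
```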
Feedback is welcome