OSC library (Oscuino) for the Spark Core (with Puredata and Max/MSP example patches)

We’ve been working on porting the Oscuino library to the Spark Core, and we’ve faced the same problems as those mentioned by jfenwick in that thread.

Rather than rewriting the UDPSend() function of the OSCMessage and OSCBundle classes, we identified that the issues come from the Spark core-firmware code (the UDP class in spark_wiring_udp.cpp).

We’ve overridden the malfunctioning functions (i.e. beginPacket(), endPacket() and write()) and added the proper includes to the Oscuino files, as explained below.

Our library is available in our GitHub repository, with Puredata and Max/MSP example patches to test the functionality of our code.

This code will only work with the most recent versions of the core-firmware, since we use the CFLAGS += -DSPARK flag recently added to the makefile.

1. What we have changed in the Spark Core’s code:

  • We have overridden the malfunctioning UDP functions of the Spark Core (myUDP class in application.cpp)

2. What we have changed in the Oscuino code:

  • Proper includes have been added to OSCData.h and OSCMessage.h

  • We do not use the OSCTiming class and OSCBoards.h (both must be rewritten to work with the Spark)

  • All modifications are labelled with the following comment: // Simon+Emilien-OSCUINO

3. Setup (we’re compiling our code with make, as explained in the documentation):

  • Put OSCBundle.cpp, OSCData.cpp, OSCMatch.c, OSCMessage.cpp and build.mk into core-firmware/src/OSC

  • Put OSCBundle.h, OSCData.h, OSCMatch.h and OSCMessage.h into core-firmware/inc/OSC

  • Put the IP address of your computer into application.cpp (search for the comment // put the IP address of your computer here)

4. Use (test application):

  • Wire LEDs to the D0 and D1 pins

  • Send OSCMessages and OSCBundles to the Spark Core on port 8888

  • Receive OSCMessages and OSCBundles from the Spark Core on port 9999

  • The IP of the Core is automatically sent at startup as an OSCMessage (OSC address = /coreip)

  • The Spark Core will manage either OSCMessages or OSCBundles (not both at the same time)

  • Our test application handles five OSC addresses:

    • /manageMessages (no arguments) will switch the Core to managing OSCMessages (OSCBundles will be ignored)
    • /manageBundles (no arguments) will switch the Core to managing OSCBundles (OSCMessages will be ignored)
    • /sendTestMsg (no arguments) will ask the Core to send an OSCMessage for test (its address is /testmessage)
    • /sendTestBndl (no arguments) will ask the Core to send an OSCBundle for test (its address is /testbundle/*)
    • /led followed by two ints will switch on/off the two LEDs
      [The 1st int is the pin number (0 for D0, 1 for D1). The 2nd int is the LED state (0 for LOW, 1 for HIGH)]
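For anyone debugging on the wire: a message like /led 1 1 is just a small binary packet. Below is a minimal sketch of the standard OSC 1.0 encoding such a message would have when it reaches the Core on port 8888 (the helper names are ours, not part of Oscuino):

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

// Append a string plus its NUL terminator, zero-padded to a 4-byte
// boundary, as required by the OSC 1.0 specification.
static void appendPadded(std::vector<uint8_t>& out, const std::string& s) {
    out.insert(out.end(), s.begin(), s.end());
    out.push_back('\0');
    while (out.size() % 4 != 0) out.push_back('\0');
}

// Append a 32-bit integer in big-endian byte order (OSC is big-endian).
static void appendInt32(std::vector<uint8_t>& out, int32_t v) {
    out.push_back((v >> 24) & 0xFF);
    out.push_back((v >> 16) & 0xFF);
    out.push_back((v >> 8) & 0xFF);
    out.push_back(v & 0xFF);
}

// Build the raw bytes of an OSC message "/led <pin> <state>".
std::vector<uint8_t> buildLedMessage(int32_t pin, int32_t state) {
    std::vector<uint8_t> packet;
    appendPadded(packet, "/led"); // address pattern
    appendPadded(packet, ",ii");  // type tag string: two int32 arguments
    appendInt32(packet, pin);     // 0 for D0, 1 for D1
    appendInt32(packet, state);   // 0 for LOW, 1 for HIGH
    return packet;
}
```

The resulting datagram is 20 bytes: 8 for the padded address, 4 for the padded type tags, and 8 for the two integers.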

5. PD and Max/MSP patches:

  • To test our Max/MSP patch, download and install the Max OSC library made by CNMAT

  • Start the patch and reboot the Spark Core (short-press the “RST” button)

  • The IP address of the Core is set up automatically (it is sent as an OSCMessage when the Core boots and passed to the udpsend object)

  • By default, the Core will manage OSCMessages (OSCBundles are ignored at startup)

  • Test the features


[Simon-L & Emilien-G for trublion]

Let’s keep in touch:
-http://www.trublion.org
-https://www.facebook.com/trubliondotorg
-https://twitter.com/trubliondotorg
-contact_AT_trublion_DOT_org


Wow guys, thanks for putting this together!

Not only did you port a really awesome library, you also documented how you did it in a gloriously useful way for others to follow along and for Spark to know where you ran into issues...and documented how to test it. Sweet!

A few comments on the Spark Core code changes:

Nice. That's the way to go. Note, we'll be releasing firmware release to the IDE that will make this kind of thing possible there too without having to do local compiling.

@zachary has been working on identifying + addressing the concerns folks have raised about our UDP implementation this sprint. Until this work completes, though, it's quite useful to have an illustrated pathway to overriding functionality like this both for this project and others in the future.

Cheers!

-joe



Has anyone else had problems compiling the firmware?

Hi @Louis !

As mentioned by @sneakthief, the only thing you need to do to use this library and example is to replace Network.localIP() with WiFi.localIP() on line 73 of application.cpp.
I think the other problems are related to your compiler or IDE. Have you searched for your problem on the forum? Have you been able to compile other examples that use libraries? (Note that we are compiling locally using make, not the online compiler.)

We haven’t updated our library yet, since we have been experiencing huge problems with our Cores over the past weeks (there have been problems in the core-firmware makefile, the new WiFi and Spark classes do not work as intended and documented, the UDP class has not been corrected yet, and we are unable to use our Cores locally although local use has been heavily advertised here, …)

We will post GitHub issues today, see how the Spark team takes them into account, and then decide whether it is worth correcting and improving our library at the moment.

Best,

We are updating the current version to take into account the new core-firmware architecture and the new makefile structure.
We will write new guidelines for installing, compiling and using this library, and we will flag the remaining problems on GitHub very soon :slight_smile:
Don’t hesitate to share your experiences with the Spark Core and OSC. We will get back to you as soon as possible (today or tomorrow).

I’m preparing to build a couple of Spark Cores into Lego bricks to give to my kids for Christmas. I’m hoping to let them use TouchOSC on iOS so they can control servos and LEDs from the Spark Core. It looks like there’s some current incompatibility between Oscuino and Spark, and I don’t see any changes since June on GitHub. Should I give up on using Oscuino with new Cores in the next week or two?

Great, sounds good @trublion!
Any further news though? :smile:

I am having trouble building the OSC library on Windows and Mac (after carefully following the steps and making the WiFi.localIP() change).

I keep getting a long list of these messages and the build fails (these are the last few lines of the error):


./obj/src/OSC/OSCBundle.o:C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:222: first defined here
./obj/src/OSC/OSCBundle.o: In function `OSCBundle::fill(unsigned char*, int)':
C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:225: multiple definition of `OSCBundle::fill(unsigned char*, int)'
./obj/src/OSC/OSCBundle.o:C:\Spark\core-firmware\build/../src/OSC/OSCBundle.cpp:225: first defined here
collect2.exe: error: ld returned 1 exit status
make: *** [core-firmware.elf] Error 1

BUILD FAILED (exit value 2, total time: 11s)


Could it work if I use an old firmware to build with all the OSC library assets? And if so, which version should I use until you manage to fix the issues?

EDIT:
I got it to work with firmware version v0.2.1. I’m not sure what other issues can occur on this version, but I can send and receive OSC messages when following the tutorial :slight_smile:

Just download the packages under the v0.2.1 header and build the firmware with the added OSC files…



Managed to compile locally with NetBeans using firmware 0.2.1.
Compiling on the web IDE I get lots of errors.

Has anyone successfully compiled with the latest firmware?

I am also wondering whether someone has managed to compile the library with the recent firmware.

I am getting this error:

collect2: error: ld returned 1 exit status
make: *** [core-firmware.elf] Error 1

Any ideas on what might be causing such an issue?

I have managed to compile it with the older firmware as suggested by Jelle above, but when I flash it, the Core remains in DFU mode.

Any ideas on what I might be doing wrong? :smile:

Bump!
Was just wondering if anyone was able to get this going with the latest firmware.
Thanks!

Bump! I will try to get it running, but if anybody else has done it already, I’d rather not waste my time :smiley:

I have OSC working with the current firmware, only in local mode (no cloud).

You can download my branch here:

If there are no changes to the system files, then I don’t think it’s necessary to clone the entire Particle firmware repo - just putting the OSC files into a directory would be sufficient, which will make getting updates from either repo simpler.

Hi everyone,

I’ve been working on a relatively simple OSC implementation (though not complete).
For now it supports:
-sending floats
-sending ints
-sending strings
-receiving floats
-receiving ints

It’s relatively easy to use and to implement.
Feedback is welcome!

More info: https://github.com/ameisso/simple-OSC

Great!
How would I send analog sensor readings over OSC, say on the analog 0 and analog 1 inputs, and would this work over the cloud?

cheers!
Anton

Hello,

This works on the local network only. Over the cloud, use the Publish function from Spark.
To send a sensor reading, just do something like:

// assumes udp, outIp and outPort have already been set up in your sketch
OSCMessage outMessage("/pong");
outMessage.addFloat(analogRead(A0));
outMessage.send(udp, outIp, outPort);

Regards
AntoineM*