Since my average day involves last-minute deployments on never-seen-before, often remote systems, I like my software self-contained.

When working with third-party OS X dylib libraries, embedding everything you need in your bundle sometimes proves problematic and, as soon as you try to launch your app, you get an ugly:
dyld: Library not loaded
This happens because the library is not where its install name says it should be; to see where your library wants to be placed, you simply have to open a terminal and run:
$ otool -L /pathToLib/yourLib.dylib
The output will look something like this:
/usr/lib/yourLib.dylib (compatibility version 1.0.0, current version 1.0.0)
which means that if you put yourLib.dylib in /usr/lib/, your software will magically start to work.

If you can’t recompile the library, there’s not much you can do to convince it to live happily in a different folder; luckily, you can still tell your application to ignore what the library says and look for it in a different place. Back to your terminal:
$ install_name_tool -change /usr/lib/yourLib.dylib /pathToLib/yourLib.dylib /pathToApp/AppName.app/Contents/MacOS/AppName
Now, when you double-click your app, it will run correctly.
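If you want to double-check that the patch worked, you can run otool against the app’s executable itself:
$ otool -L /pathToApp/AppName.app/Contents/MacOS/AppName
The entry for yourLib.dylib should now show the new path instead of /usr/lib/yourLib.dylib.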

Cool: this trick lets you put a dylib wherever you wish, so you can use it to embed a library in your bundle. I’m going to show the steps needed to do it in an OpenFrameworks project, but it’s going to be more or less the same for any Xcode project:

  • open your Xcode project and drag your dylib library into the frameworks/3rd party frameworks group
  • go to the Build Phases page, scroll down to Copy Files, set Destination to Frameworks and add your dylib to the list: this will copy the library into the Frameworks folder in your bundle.
  • still in the Build Phases page, scroll up to Run Script and add the line that will tell your app where to look (@executable_path resolves at runtime to the folder containing your app’s executable, so the path below points into the bundle’s Frameworks folder):
    install_name_tool -change /usr/lib/yourLib.dylib @executable_path/../Frameworks/yourLib.dylib "$TARGET_BUILD_DIR/$PRODUCT_NAME.app/Contents/MacOS/$PRODUCT_NAME";

I strongly believe in R&D and I strongly believe in the importance of academic research, so I’m kind of happy to share a little preview of my contribution to the experimental work carried out by Parma University’s Neuroscience Department (yes! those rockstars who discovered mirror neurons!).
Here’s a short video demoing a software tool to be used in experiments on the perception of self. I’ll write more about it as the research progresses; for now, let’s just say that the idea is to measure precisely how much a face needs to change before you stop recognizing it.


Function pointers have probably the ugliest syntax in the whole C++ language: almost every coder I know has cursed them at least once.
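To show what I mean, here’s a tiny self-contained example of the raw syntax (a toy snippet of plain C++, not code from the addon):

    #include <iostream>

    // a callback taking an int and returning a float
    float half(int x) { return x / 2.0f; }

    // raw syntax: 'cb' is a pointer to a function taking an int and returning a float
    float (*cb)(int) = &half;

    // and it gets worse: a function that returns such a pointer
    float (*pickCallback(bool useHalf))(int) {
        return useHalf ? &half : nullptr;
    }

    int main() {
        std::cout << cb(4) << std::endl;                 // prints 2
        std::cout << pickCallback(true)(6) << std::endl; // prints 3
        return 0;
    }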

In order to make my function-pointing experience a little more comfortable, I made ofxFunctionPointer, an OpenFrameworks addon; OF 0.9.0 will probably have proper C++11 support, so I kept things simple and basic, but it’s easy to use, easily extendable and comes with a well-commented example.

If you think it could be useful, it’s on GitHub.


Vomitino is a tongue-in-cheek firmware for the HexBright flashlight.
The word "vomitino", in Italian, means something like "little puker" and was chosen for two reasons:

– it follows the -ino suffix tradition, so common in Arduino-based projects

– it refers to the fact that the firmware implements a "stroboscopic dazzle mode", inspired by the LED Incapacitator, a security device rumoured to be able to stop a hypothetical attacker by inducing vomiting and disorientation.

While I used the same frequency range, the Vomitino lacks some of the features of its original inspiration (e.g. multicoloured light). It’s meant to be just a joke and should not be considered a "self-defence tool" at all: in the end it’s just a very bright and somewhat annoying strobe light, just like the ones you find in discos :)
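For the curious, the heart of a dazzle mode is just a randomised strobe loop. Here’s a minimal Arduino-style sketch of the idea (not the actual Vomitino source; the pin number and the 7-20 Hz range below are placeholders, since the real HexBright board has its own pin mapping):

    // Illustrative randomised strobe loop; LED_PIN and the
    // 7-20 Hz range are placeholders, not the Vomitino values.
    const int LED_PIN = 13;

    void setup() {
        pinMode(LED_PIN, OUTPUT);
        randomSeed(analogRead(0)); // seed from a floating analog pin
    }

    void loop() {
        // pick a random strobe frequency between 7 and 20 Hz
        long halfPeriodMs = 1000 / (2 * random(7, 21));
        digitalWrite(LED_PIN, HIGH);
        delay(halfPeriodMs);
        digitalWrite(LED_PIN, LOW);
        delay(halfPeriodMs);
    }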
If you’re interested in the original device, go check Lady Ada’s complete replica. If you’re curious but don’t want to invest too much time and money into it, you can grab my humbler version on GitHub and flash it onto your HexBright.

Keep in mind that some people have adverse reactions to flashing lights, some simply do not like them, and I’m not responsible if you annoy people with my code :)

Expression-Swapping demo

A few weeks ago I met a young designer interested in digitally augmented mirrors: in particular, he was interested in messing with people’s faces. Since this is the kind of stuff I have some experience with, we ordered a couple of drinks and brainstormed about how he could do this and that.
I ended up writing a little demo showing how to easily change parts of people’s faces in real time and, since I think it could be helpful for other people too, I wanted to share it and quickly explain how it works.
Basically, I track the user’s face with Jason Saragih’s library and create a mesh that can be overlaid on the lower part of the tracker’s face mesh; I can then use this "partial mesh" to create a UV map from the user’s mouth expression, or to blend a saved mouth expression into the live feed.
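If you’d rather get a feel for the structure before opening the repo, here’s a condensed sketch of the update/draw logic. It assumes the ofxFaceTracker addon as the wrapper around Saragih’s tracker, and mouthIndices is a hypothetical stand-in for the actual lower-face vertex indices; the real source differs in the details:

    #include "ofMain.h"
    #include "ofxCv.h"
    #include "ofxFaceTracker.h"

    class ofApp : public ofBaseApp {
    public:
        ofVideoGrabber cam;
        ofxFaceTracker tracker;
        ofMesh mouthMesh;                // the "partial mesh", rebuilt every frame
        std::vector<int> mouthIndices;   // hypothetical: indices of the lower-face vertices

        void setup() {
            cam.setup(640, 480);
            tracker.setup();
        }

        void update() {
            cam.update();
            if (!cam.isFrameNew()) return;
            tracker.update(ofxCv::toCv(cam));
            if (!tracker.getFound()) return;

            // copy only the lower-face vertices out of the full face mesh
            ofMesh face = tracker.getImageMesh();
            mouthMesh.clear();
            mouthMesh.setMode(OF_PRIMITIVE_TRIANGLES);
            for (int i : mouthIndices) {
                ofVec3f v = face.getVertex(i);
                mouthMesh.addVertex(v);
                // reusing the image-space positions as texture coordinates
                // turns the live camera frame into a UV map for the mouth
                mouthMesh.addTexCoord(ofVec2f(v.x, v.y));
            }
        }

        void draw() {
            cam.draw(0, 0);
            if (tracker.getFound()) {
                // bind a saved frame here instead to blend a stored
                // mouth expression into the live feed
                cam.getTexture().bind();
                mouthMesh.draw();
                cam.getTexture().unbind();
            }
        }
    };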

You can find the source code on my GitHub, and here’s a video showing how it works:

Recently I found some time to fiddle a little bit with CSS3D and I took the chance to restyle the company home page a little; normally I don’t do web-based stuff, but I admit that all this super-performant new stuff (I’m mainly thinking of WebGL and the possibility of using some GPU power in a webpage) is making JavaScript interesting again :)

While I’m here, I’ll also make a quick summary of all the things that happened in the last few months and that I was too busy (maybe also a little lazy?) to write about:

– During Milan’s Design Week I had the opportunity to give a hand with the setup of the MIT Tangible Media Group‘s Transform; needless to say, it was an interesting toy to play with and a great way to spend a week with stimulating people;

– a good part of the spring was spent on art projects, including a software tool for Eileen Cowin, a couple of dynamic sculptures for George Theonas and some experimental real-time visuals for an upcoming Empress Stah show; a couple of SPECTRE installations also travelled to a lovely gallery in Paris.

Right now I’m finishing a VJ tool making use of laser projectors and working on some new computer vision stuff.

Performing with Robots

Yesterday Marco Tempest was on TED‘s stage again, but this time he was not alone; he performed together with our new creation (creature?): EDI, the Magic Robot!

EDI is a heavily customised Baxter robot: we created a set of custom manipulators for him, a few top-secret hardware attachments (you know, for magic ;) ), and a complete "software brain" that enabled him to display a personality, to train with humans and to learn from them.

While waiting for the official video to be released, you can read the post on the TED blog.

A final note: as you can imagine, giving life to EDI was an exciting but complex job, so I think it’s not surprising that Marco, David (our robotics guru, borrowed directly from MIT’s Media Lab) and I were a little apprehensive about this first live performance. Given the common belief that machines are cold and unfeeling, it might surprise you that EDI was anxious too: if you don’t believe me, have a look at the following behind-the-scenes video ;)

