Tag Archives: Thinking out loud

Next conference: RecPad 2010

22 Oct

I will be presenting a poster at RecPad 2010, alongside Guilherme Fernandes. We will present our trainable DTW-classifier for feet gesture recognition, which we built on a foot-controller device that allows the user to control the Mt-Djing application, as shown below.


(Controlling Mt-Djing with feet gestures)

The program is now available here.

Singleton in AS3

4 Jul

There's lots of discussion and loads of different versions out there; I'm using this one and still haven't noticed any "flaws":

// ActionScript file
package
{
  public class MyTestClass
  {
    private static var _instance : MyTestClass;
    private static var _singleton : Boolean;

    /* singleton: static initialiser builds the single instance with the guard flag raised */
    {
      trace ( "Initialising MyTestClass singleton" );
      _singleton = true;
      _instance = new MyTestClass ( );
      _singleton = false;
    }

    public var strTestVariable : String = "Was the singleton test successful?";

    // note: AS3 constructors must not declare a return type
    public function MyTestClass ( )
    {
      trace ( "Constructor" );

      if ( ! _singleton )
      {
        throw new Error ( "Error: Instantiation failed\n       Please use MyTestClass.getInstance ( )" );
      }
    }

    public static function getInstance ( ) : MyTestClass
    {
      return _instance;
    }
  }
}

Building OpenSG in Ubuntu / Linux

29 May

Grab OpenSG 1.8.X

For Debian-based systems, we can use the PPA as documented here (http://www.opensg.org/wiki/Releases)

deb     http://ppa.launchpad.net/opensg/ubuntu gutsy main restricted universe multiverse

deb-src http://ppa.launchpad.net/opensg/ubuntu gutsy main restricted universe multiverse

For Ubuntu 9.10 (Karmic Koala) the package can be found here (https://launchpad.net/ubuntu/+source/opensg); the simplest way to add it is:

> sudo add-apt-repository ppa:opensg

OpenSG 1.8.X can be built on Linux either with configure (http://www.opensg.org/wiki/BuildLinuxAutoTools) or with GCC and scons (http://www.opensg.org/wiki/BuildLinuxScons).
If you wish to compile OpenSG from source (note: a full-scale build took about an hour and a half on my netbook; there's a parallel build if you have a multicore machine), check out from CVS first using:

cvs -d:pserver:anonymous@opensg.cvs.sourceforge.net:/cvsroot/opensg login
cvs -z3 -d:pserver:anonymous@opensg.cvs.sourceforge.net:/cvsroot/opensg co -P modulename

(where modulename is OpenSG)

Then, make sure you have build-essential (the basic build meta-package on Debian-based systems).

The recommended package dependencies are:
sudo apt-get install g++ libglut3-dev zlib1g-dev bison flex libjpeg62-dev libpng12-dev libtiff4-dev libqt4-dev
GLUT is essential; I've also installed support for both Qt3 and Qt4 by manually locating the include directories on my system (use find or locate; if configure complains, try different directories).

Easy install (via Synaptic or apt-get)

The easiest way is to add the PPA repository as shown above, and then simply use your preferred way to add new software. On Ubuntu I like to use Synaptic (a graphical frontend that works nicely for newbie users). Type opensg in the search box and install what you need (I installed everything because I'm still in the development stage; the glut-related packages are important). Here's a screenshot:

Screenshot (larger version by clicking here or on the image)

Building the source code

Go to the OpenSG source folder and type:

./configure --enable-glut --enable-jpg --enable-png --enable-qt3 --with-qt3=/usr/lib --enable-qt4 --with-qt4=/usr/share/qt4

make >> make.log

sudo make install >> make-install.log

make on a Compaq Mini (compiling with only one core, 1.6 GHz Atom CPU) took:

real    96m52.034s
user    82m28.065s
sys     7m13.607s

(I'd recommend going off to do something else, or working on a different core or computer.)

Checking the installation

In theory, you're done. But here's a way to test whether your OpenSG installation was successful:

  • Check whether GLUT support succeeded (this is crucial!). For that we use osg-config, a small executable that reports how OpenSG was configured:
> osg-config --libs GLUT
-g -L/usr/local/lib/dbg -lOSGWindowGLUT -lOSGSystem -lOSGBase
-ljpeg -lpng -lz -lglut -lGLU -lGL -lXmu -lXi -lXt -lX11 -lpthread -ldl -lm -L/usr/X11R6/lib

And as you can see, GLUT is configured with OpenSG (hooray!) – otherwise osg-config would give a warning saying that GLUT was not configured.

  • Go to the Tutorial folder (in OpenSG/Tutorial) to build and test a demo:
make
If the demos are running well, you can get a pretty TIE Fighter in 18opengl_slave.cpp – take a look at the image below.
Screenshot-1

Typical FAQ, when compiling the Tutorials

Now you can hit all sorts of issues, so I'll post them here:
  • Cannot find osg-config or the OSG root directory:
Edit the Makefile and set the OSG_ROOT variable to your root folder for OSG (mine was /usr/local). The OSGCONFIG variable should also point to the binary: OSGCONFIG := /usr/local/bin/osg-config
  • Cannot find GLUT stuff (C++ errors) – check again whether GLUT support was installed. This is necessary! (see the previous paragraphs on how to check with osg-config). If the check fails, you must rebuild OpenSG with GLUT enabled.
  • ld (the linker) complains that it cannot find dynamic libraries like libOSGWindowGLUT.so (or libOSGWindowX.so, libOSGBase.so, etc.):
This is very common: your program compiles fine, but when you launch it the system cannot find the libs. Just copy them to the system folder that holds your libraries, i.e.:
> sudo cp /usr/local/lib/dbg/* /usr/lib
(or add /usr/local/lib/dbg to /etc/ld.so.conf and run sudo ldconfig)

A more extended FAQ (with other issues)

  • If you're not getting it to work, you can try changing the Makefile a bit (for me it was setting LIBTYPE ?= dbg instead of opt) to compile the tutorials. Of course, if you built OpenSG with both opt and dbg, both work.
  • To check GLUT with a simple GL application, without OpenSG: > gcc cube.c -o cube -lglut; this should build a cube application from that source file. Test it.

Working with OSC inside C/C++

28 May

For those searching for OSC libs for either C or C++, here are my top two:

  • liblo is a lightweight OSC implementation, written entirely in C and targeted at POSIX-compliant systems.

It is very simple to install (Debian users can enjoy apt-get install liblo0ldbl or liblo0-dev) or to build from source (see SourceForge to browse the code). A while ago I made a test that used liblo to interface an OpenGL game with PureData as the sound engine; here's the video – it's just a proof of concept:

Space Invaders with OSC (Pure Data + Libl0) #1 from PedroLopes on Vimeo.

  • OSCpack is the big brother of liblo, because it is C++ driven and easier to build on Windows, thus making it cross-platform (still POSIX-compliant under *UNIX, it seems…)

It is very simple to use. I downloaded the latest release from this link and compiled it. It seems that with the latest gcc it gives errors about missing headers for memcpy, atoi and memset; just edit the offending *.cpp files and add #include &lt;cstring&gt;, #include &lt;cstdlib&gt; or whichever else seems needed. It will compile smoothly afterwards (well… some warnings are shown).

p.s.: I've uploaded OSCpack to my GitHub as part of the previous project; take a look at the files if you have build issues with gcc 4.4.1 or so.

After building it, just go to the bin folder and type ./OscUnitTests; if all goes well:

> 75 tests run, 75 passed, 0 failed.

  • So, depending on the nature of your project, you can go either way: if you're looking for a more C++-oriented version that is easier to build beyond Linux, I'd go for OscPack; if not, liblo is an excellent lib!
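Whichever library you pick, they both speak the same wire format. As a rough illustration of what they build for you under the hood, here's a minimal sketch of OSC 1.0 message encoding in pure Python (function names are my own, not from liblo or OSCpack):

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC 1.0 message: address, type-tag string, then arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode("ascii"))
        else:
            raise TypeError("unsupported OSC argument type")
    return _pad(address.encode("ascii")) + _pad(tags.encode("ascii")) + payload

# a message like the ones the Space Invaders demo could send to PureData
# (the address is made up for illustration):
msg = osc_message("/invader/hit", 3, 0.5)
```

With liblo the equivalent is a single lo_send call; the library takes care of the type tags and padding for you.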

Recognizing gestures by the sound they make (DTW and Puredata)

20 May

This prototype follows the idea of [1], where Harrison uses a microphone to record and recognize a set of defined gestures. In this implementation we follow the same approach by using the Dynamic Time Warping algorithm [2,3], but propose it in a different setting: the PureData programming environment [4].

This research for the IMMI course at IST will result in a small decoupled system that is able to recognize a set of gestures and send the resulting gesture as an OSC-formatted message, to be received by any connected system via the network (either locally or remotely) – thus achieving modularity. For our experiments we will be using it to recognize gestures performed by a DJ with his foot to control a DJ setup, but it can be used for many more applications (such as those proposed by Harrison in [1]).

Currently there is no implementation of the DTW algorithm for the PureData environment; one has been proposed by Todoroff and Bettens [5], but it was not ported to PureData. Our implementation is based on Andrew Slater and John Coleman's DTW [6], ported to PureData with several needed modifications. The DTW object is still in alpha phase, but it is already working and available for public use here [7]; the official release will be published later, once the API is fully defined.

Dynamic Time Warping in PureData (alpha) from PedroLopes on Vimeo.

This shows a possible implementation of DTW as a pd external, written in ANSI C. For this demo there are 8 samples that are used as gesture patterns; the algorithm tries to find the best matching gesture for the sampled input.
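The external itself is ANSI C, but the core dynamic-programming recurrence is small enough to sketch. Here's a rough pure-Python illustration of DTW matching over 1-D feature sequences (hypothetical names; this is not the actual external's code):

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    cost[i][j] = |a[i-1] - b[j-1]| + min(insertion, deletion, match),
    so time-stretched versions of the same gesture still score low.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(sample, patterns):
    """Pick the stored gesture pattern with the lowest DTW distance."""
    return min(patterns, key=lambda name: dtw_distance(sample, patterns[name]))
```

The demo does essentially this against its 8 stored patterns, with audio features instead of raw numbers; FastDTW [3] trades a little accuracy for linear time and space.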

Hardware: lo-fi built-in microphone (very bad!)
Software: Puredata 0.41 (works with Pd-extended too); Ubuntu 9.10; Jack Audio Server

Work by: Pedro Lopes and Guilherme Fernandes

References

[1] Harrison, Chris and Hudson, Scott E. Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile finger Input Surfaces. In Proceedings of the 21st Annual ACM Symposium on User interface Software and Technology. UIST ’08. ACM, New York, NY, 205-208.

[2] Toward Accurate Dynamic Time Warping in Linear Time and Space. S. Salvador and P. Chan. Intelligent Data Analysis, 11(5):561-580, 2007.

[3] FastDTW: Toward Accurate Dynamic Time Warping in Linear Time and Space. S. Salvador & P. Chan. KDD Workshop on Mining Temporal and Sequential Data, pp. 70-80, 2004

[4] Puckette, Miller Smith (2007). The Theory and Technique of Electronic Music. World Scientific Press, Singapore. ISBN 978-9812705419.

[5] Todor Todoroff, Frédéric Bettens. Real-Time DTW-Based Gesture Recognition External Object for Max/MSP and PureData. Sound and Music Computing 2009, Oporto. Faculty of Engineering (FPMs) – TCTS Lab

[6] Andrew Slater and John Coleman's DTW: http://www.phon.ox.ac.uk/files/slp/Extras/dtw.html

[7] Github Repository for Pedro Lopes http://github.com/PedroLopes/PD-externals

About the interface concepts

28 Apr

After some mind struggles with the interface concepts, it's now more or less defined:

Realtime Setup Creation

The concept is similar to patching programming languages (PureData, Max/MSP) and patching interfaces such as Reactable, AudioPad, BlockJam, etc., so the user is able to build his own DJ system as he plays. The video below is a mock-up that shows the creation of a couple of objects on the interface.

Interface Concepts MockUp: connecting stuff from PedroLopes on Vimeo.

This is the mock-up/draft version of the idea of dynamic patching. I use the reacTIVision TUIO simulator to send OSC-TUIO commands to the interface layer, thus controlling the objects.

The objects are movable, scalable, rotatable and linkable, as you see in this mock-up.

The need for more objects is rapidly satisfied by dragging and dropping a new object from the canvas; the possibilities of connecting it with the current patch are endless, resulting in an expressive virtual system that ultimately empowers the end user.

User Objects

The objects in MtDjing are organized as follows (see the mock up diagram below):

MockUpObjects

Interface Objects – these are the objects that the user manipulates actively within the canvas, in a multitouch fashion. The objects are all related to DJ systems, and strive to be a visual representation that carries such concepts. All of the interface objects can be manipulated by the user: moved, rotated, resized, created, altered and deleted in runtime. They are organized in classes:

Sound Sources – are the objects that output sound, such as a virtual turntable.

DSP Objects – these object-types can perform some sort of audio manipulation, so they have at least one audio input and one audio output. A simple example is a sliding fader that can control the sound volume.

Audio Cables – the links between the objects can be changed in realtime by the DJ-user, by connecting objects with a virtual audio cable, that will let the audio flow.

Static Objects – are part of the interface static view, such as the “Canvas”, “Side Bar”, the “master audio output”,  etc… these cannot be moved, resized or altered by the user during runtime.

Audio Flow

The connectivity style is derived from typical DJ systems. As we see in typical DJ diagrams (actually it's derived from stage setups), the flow of the audio tends to work in the upwards direction… thus the sound sources sit below, and the chain of processing goes up until sound is delivered into the "Sound System" (which is slang for sound output).

These diagrams (as we see below) represent the mental model that we expect users to hold while interacting; because this is very familiar to them, we gain ease in the interface comprehension phase.

DjsystemFlow

On newer setups (those that make use of MIDI control devices), the devices tend to be on a lower level than the sound sources (CDs, turntables, etc.) or at least on the same level – in ergonomic terms they are always within "arm's reach distance" (a very important concept that our large multitouch interface must capture). The image above illustrates this concept on a DJ setup.

Object Handling

In multitouch interfaces there is a recursive problem when designing objects that can perform a large set of operations that can interfere with each other. In our case, our dynamic Interface Objects have the following set of operations:

  • Move (free move within an X-Y canvas)
  • Rotate (constrained rotation in 90º multiples’ angles)
  • Resize (constrained to a maximum/minimum values defined in each object)
  • Link (constrained to the rules on input connects to output and output to input)
  • Sub-set of DJ actions (actions that the object performs: click, slide, whatever…)
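As a loose sketch of how the constraints listed above could be enforced (in Python, with made-up names; the actual prototype is not written this way):

```python
from dataclasses import dataclass

@dataclass
class InterfaceObject:
    x: float = 0.0
    y: float = 0.0
    angle: int = 0          # degrees
    scale: float = 1.0
    min_scale: float = 0.5  # per-object resize limits
    max_scale: float = 2.0

    def move(self, dx, dy):
        """Free move within the X-Y canvas."""
        self.x += dx
        self.y += dy

    def rotate(self, degrees):
        """Rotation constrained to multiples of 90 degrees (snapped)."""
        snapped = round(degrees / 90) * 90
        self.angle = (self.angle + snapped) % 360

    def resize(self, factor):
        """Resize clamped to the object's own min/max values."""
        self.scale = max(self.min_scale, min(self.max_scale, self.scale * factor))

def link(out_port, in_port):
    """Links obey the rule: an output connects to an input only."""
    if out_port["kind"] != "out" or in_port["kind"] != "in":
        raise ValueError("can only link an output to an input")
    return (out_port["id"], in_port["id"])
```

The sub-set of DJ actions would then live inside each object subclass, which is exactly where the interference problem discussed below shows up.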

So the issue is: how do we create objects that are rotatable-draggable-scalable and still allow you to use them (by means of direct touch manipulation)? One possible way to go is using Object Handles, as proposed and tested by Nacenta in 2009 [1]. These object handles are shown in a diagram below, from the original article (available to the public).

ObjectHandles

Some issues I will test soon: I will deploy a version of my current prototype with object handles so all of these features can be tested. I propose that the object handles around the object are Link, Move, Rotate & Scale, and that inside the object sits the sub-set of its DJ actions.

[1] Nacenta, M.A., Baudisch, P., Benko, H., Wilson, A. 2009. Separability of Spatial Manipulations in Multi-touch Interfaces. In Graphics Interface, Kelowna, B.C., Canada. 175-182.

Combining Objects

The objects on the interface are of a lower granular level than the typical DJ setup allows; in the physical world one cannot add an extra channel or EQ knob to our mixer, so in order to build a setup fast, many objects must be created by the user. To speed up the process and allow some customization, the "Group Object" is added to the metaphor. This is an object that allows the creation of sub-patches (once again reminiscent of PureData's internal workings as a visual programming language). A Group Object is a visual rectangle area that combines all objects within its boundaries into a new object; thus we can easily create "presets", such as a two-channel mixer, with this group concept.

Open Source Flash#8: Get your Adobe Flash Builder 4 Licence (students)

21 Apr

Adobe has moved Flex Builder 3 into Flash Builder 4, except on Linux, which is still running Flex Builder 3 (alpha stage, with some issues). The platform is built on the Eclipse IDE and offers some nice features: auto-build, syntax highlighting, code completion, and bla bla…

The program is available as a trial version, but as a student you can ask for an official serial number by providing Adobe with your student data. If it suits you, it's better than using a trial version, of course.

Note: if you are really hardcore, there’s the Flex Open Source SDK with a command line mxmlc compiler.

OSC (Open Sound Control) changes to Open Show Control?

2 Apr

I received this today on the OSC mailing list. The name is questionable (a protocol to "put on a show"), although it is a new way to show that OSC has gone much further than "sound" or a "new MIDI". We use it here for blob tracking (on CCV).

someone (-at-) cnmat.berkeley.edu

to osc_dev


In light of the ever broadening scope of application of Open Sound Control
we have decided to change  the name of the OSC encoding to Open Show
Control.

Here are some examples of OSC applications motivating the name change:

– Motion capture data from cameras by Qualisys
– Lighting and theatrical control at the Topological Media Lab.
– VRAS and other acoustic rendering systems at Meyer Sound
– Solenoids at Disneyland
– Motor and Servo control on the Make Controller Kit
– Timely media device control and discovery in IEEE AVB

______________________________

Update: The name seems to be an open discussion (I hope it is), but the main point is "OSC delivers more than sound-related computing". That's what's important.

Multitouch Pong (off-thesis 2 hour project)

1 Apr

Multitouch Pong (Processing and CCV) from PedroLopes on Vimeo.

I was tired of my AS3/C++/Pd project (the multitouch DJ table), so I turned to Processing for a bit. The first version of this code was done by myself (Pedro Lopes) and Miguel Jerónimo in a multitouch workshop; this is my new version, slightly altered from the initial one.

The finger-tracking of the table was not calibrated, so I used some color marker pens as pads for the game (sort of tangible-Pong).

The code: I will post it as soon as I have a final version. It uses Processing 1.1 + the TUIO object for Processing + CCV (as the tracker); everything is open source and easy to start learning/developing with. Maybe I can patch it into PureData or SuperCollider for the sound synthesis.

p.s.: This is just for some relaxation. Also, I have a version where you can shoot at the opponent!!!

The scratch mat

16 Mar


An interesting project (appearing on The Fun Theory website) is this Scratch Mat:

Take a look!