After some struggling with the interface concepts, it's now more or less defined:
Realtime Setup Creation
The concept is similar to patching programming languages (PureData, Max/MSP) and patching interfaces such as Reactable, AudioPad, BlockJam, etc… thus the user is able to build his own DJ system as he plays. The video below is a mockup that shows the creation of a couple of objects on the interface.
Interface Concepts MockUp: connecting stuff from PedroLopes on Vimeo.
This is the mockup / draft version of the idea of dynamic patching. I use the reacTIVision TUIO simulator to send OSC-TUIO commands to the Interface layer, thus controlling the objects.
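As a rough illustration of what the Interface layer does with those commands, here is a minimal sketch of routing decoded TUIO "set" messages to object state. The class and method names are my own assumptions (not the actual prototype code); the `/tuio/2Dobj` address and the order of the "set" arguments follow the TUIO 1.1 object profile, and the OSC decoding itself is omitted.

```python
# Hypothetical sketch: route decoded TUIO object messages to interface objects.
# Assumes the OSC message already arrives as (address, argument list).

class InterfaceLayer:
    def __init__(self):
        self.objects = {}  # session_id -> object state

    def handle_tuio(self, address, args):
        # TUIO 1.1 object profile: /tuio/2Dobj "set" s i x y a X Y A m r
        if address == "/tuio/2Dobj" and args and args[0] == "set":
            session_id, class_id = args[1], args[2]
            x, y, angle = args[3], args[4], args[5]
            self.objects[session_id] = {
                "class": class_id,
                "pos": (x, y),
                "angle": angle,
            }

layer = InterfaceLayer()
layer.handle_tuio("/tuio/2Dobj", ["set", 1, 42, 0.5, 0.25, 0.0, 0, 0, 0, 0, 0])
print(layer.objects[1]["pos"])  # (0.5, 0.25)
```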
The objects are movable, scalable, rotatable and linkable, as you can see in this mockup.
The need for more objects is rapidly satisfied by dragging and dropping a new object onto the canvas; the possibilities of connecting it with the current patch are endless, resulting in an expressive virtual system that ultimately empowers the end user.
The objects in MtDjing are organized as follows (see the mockup diagram below):
Interface Objects – these are the objects that the user manipulates actively within the canvas, in a multitouch fashion. The objects are all related to DJ systems, and strive to be visual representations that carry such concepts. All of the interface objects can be manipulated by the user: moved, rotated, resized, created, altered and deleted at runtime. They are organized in classes:
Sound Sources – these are the objects that output sound, such as a virtual turntable.
DSP Objects – these object types can perform some sort of audio manipulation, so they have at least one audio input and one audio output. A simple example is a sliding fader that controls the sound volume.
Audio Cables – the links between objects can be changed in realtime by the DJ-user, by connecting objects with a virtual audio cable that lets the audio flow.
Static Objects – these are part of the interface's static view, such as the “Canvas”, “Side Bar”, the “master audio output”, etc… These cannot be moved, resized or altered by the user during runtime.
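A minimal sketch of how these object classes might be organized in code. All names here are illustrative assumptions, not the actual MtDjing implementation; the point is only the taxonomy described above.

```python
# Hypothetical object hierarchy mirroring the classes described above.

class InterfaceObject:
    """Base class: anything the user can manipulate on the canvas."""
    movable = True

class SoundSource(InterfaceObject):
    """Outputs audio, e.g. a virtual turntable."""
    def output(self):
        return [0.0] * 64  # one silent audio block, for illustration

class DSPObject(InterfaceObject):
    """Has at least one audio input and one audio output."""
    def process(self, block):
        return block

class Fader(DSPObject):
    """The sliding-fader example: scales the sound volume."""
    def __init__(self, level=1.0):
        self.level = level
    def process(self, block):
        return [s * self.level for s in block]

class AudioCable:
    """Link that lets audio flow from one object's output to another's input."""
    def __init__(self, src, dst):
        self.src, self.dst = src, dst

class StaticObject:
    """Canvas, side bar, master audio output: fixed during runtime."""
    movable = False

half = Fader(level=0.5)
print(half.process([1.0, 0.5]))  # [0.5, 0.25]
```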
The connectivity style is derived from typical DJ systems (actually, it's derived from stage setups). As we see in typical DJ diagrams, the flow of the audio tends to work in the upwards direction: the sound sources sit below, and the chain of processing goes up until the sound is delivered to the “Sound System” (which is slang for sound output).
These diagrams (as we see below) represent the mental model that we expect users to bring to the interaction; because it is already very familiar to them, interface comprehension becomes easier.
On newer setups (those that make use of MIDI control devices), the devices tend to be on a lower level than the sound sources (CDs, turntables, etc.) or at least on the same level – in ergonomic terms they are always within “arm's reach” (a very important concept that our large multitouch interface must capture). The image above illustrates this concept on a DJ setup.
In multitouch interfaces there is a recurring problem when designing objects that must support a large set of operations that can interfere with each other. In our case, our dynamic Interface Objects have the following set of operations:
- Move (free movement within an X-Y canvas)
- Rotate (constrained to multiples of 90º)
- Resize (constrained to minimum/maximum values defined in each object)
- Link (constrained to the rule that inputs connect to outputs and outputs to inputs)
- Sub-set of DJ actions (actions that the object performs: click, slide, etc.)
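The constraints listed above can be sketched as small helper functions. These are illustrative only (the function names are mine); they just encode the 90º snapping, the min/max resize clamp, and the input-to-output linking rule.

```python
def snap_rotation(angle_deg):
    """Constrain rotation to multiples of 90 degrees."""
    return round(angle_deg / 90.0) % 4 * 90

def clamp_size(size, min_size, max_size):
    """Constrain resizing to each object's min/max values."""
    return max(min_size, min(size, max_size))

def can_link(src_port, dst_port):
    """Only output-to-input (or input-to-output) connections are allowed."""
    return {src_port, dst_port} == {"output", "input"}

print(snap_rotation(134))            # 90
print(clamp_size(500, 50, 300))      # 300
print(can_link("output", "input"))   # True
print(can_link("output", "output"))  # False
```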
So the issue is: how do we create objects that are Rotatable, Draggable and Scalable and still allow the user to operate them (by means of direct touch manipulation)? One possible way to go is using Object Handles, as proposed and tested by Nacenta et al. in 2009. These object handles are shown in a diagram below, from the original article (available to the public).
Some issues I will test soon: I will deploy a version of my current prototype with object handles so all of these features can be tested. I propose that the handles around the object be Link, Move, Rotate & Scale, while the inside of the object holds the sub-set of its DJ actions.
Nacenta, M.A., Baudisch, P., Benko, H., Wilson, A. 2009. Separability of Spatial Manipulations in Multi-touch Interfaces. In Proceedings of Graphics Interface 2009, Kelowna, B.C., Canada, 175–182.
The objects on the interface are of a finer granularity than the typical DJ setup allows – in the physical world one cannot add an extra channel or EQ knob to a mixer – so in order to build a setup quickly, many objects must be created by the user. To speed up the process and to allow some customization, the “Group Object” is added to the metaphor. This is an object that allows the creation of sub-patches (once again reminiscent of PureData's internal workings as a visual programming language). A Group Object is a rectangular visual area that collapses all objects within its boundaries into a new object; with this group concept we can easily create “presets” such as a two-channel mixer.
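A sketch of the grouping idea (class and field names are hypothetical): objects whose position falls inside the group's rectangle are absorbed into one composite object, much like a PureData sub-patch.

```python
# Hypothetical Group Object: collapses everything inside a rectangle
# into a single composite, e.g. a two-channel mixer preset.

class GroupObject:
    def __init__(self, x0, y0, x1, y1):
        self.bounds = (x0, y0, x1, y1)
        self.children = []

    def absorb(self, objects):
        """Take ownership of every object whose position lies in bounds."""
        x0, y0, x1, y1 = self.bounds
        for obj in objects:
            x, y = obj["pos"]
            if x0 <= x <= x1 and y0 <= y <= y1:
                self.children.append(obj)
        return self.children

mixer = GroupObject(0, 0, 100, 100)
objs = [{"name": "fader_L", "pos": (10, 20)},
        {"name": "fader_R", "pos": (60, 20)},
        {"name": "turntable", "pos": (200, 300)}]
mixer.absorb(objs)
print([o["name"] for o in mixer.children])  # ['fader_L', 'fader_R']
```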