Pedro Lopes, Ricardo Jota, Joaquim Jorge
[Published at ACM ITS 2011, Kobe, Japan / paper]
Recognizing how a person actually touches a surface has generated strong interest within the interactive surfaces community. Although we agree that touch is the main source of information, user intention might not be accurately recognized unless other cues are accounted for. We propose to expand the expressiveness of touch interfaces by augmenting touch with acoustic sensing. In our vision, users can naturally express different actions by touching the surface with different body parts, such as fingers, knuckles, fingernails, punches, and so forth – not always distinguishable by touch technologies but recognizable by acoustic sensing. Our contribution is the integration of touch and sound to expand the input language of surface interaction.
Update: some media coverage on the press [Portuguese press / Exame Informática]
To be presented at NIME 2011 (New Interfaces for Musical Expression, Oslo). The remainder of the program is very interesting, so please feel free to look around here.
What about DJing?
How does it fit within an HCI perspective?
What forms of interaction exist with DJ gear?
How can one classify those interactions/gear?
The following paper addresses these questions, attempting to create a framework of thought concerning DJ interactions – a very idiosyncratic type of “music interface” – and proposes an evaluation of Traditional (analogue turntables, mixers and CD players), Virtual (software), Hybrid (traditional and software in synergy) and Multitouch (virtual and traditional in synergy) setups.
The term multitouch here defines not a categorization but rather an implementation of a DJing prototype on a touch-sensing surface. Exploration of new interfaces for DJing has been happening for at least 10 years, but an evaluation from this perspective (interactions and devices) can help researchers and developers (as well as DJs) understand which tools fit which style.