Pedro Lopes, Ricardo Jota, Joaquim Jorge
[Published at ACM ITS 2011, Kobe, Japan / paper]
Recognizing how a person actually touches a surface has generated strong interest within the interactive surfaces community. While we agree that touch is the main source of information, user intention may not be accurately recognized unless other cues are taken into account. We propose to expand the expressiveness of touch interfaces by augmenting touch with acoustic sensing. In our vision, users can naturally express different actions by touching the surface with different body parts, such as fingers, knuckles, fingernails, punches, and so forth; these are not always distinguishable by touch technologies alone but can be recognized by acoustic sensing. Our contribution is the integration of touch and sound to expand the input language of surface interaction.
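To make the idea concrete, here is a minimal sketch of how an acoustic cue could separate touch types. This is not the paper's actual classifier; the function names, the spectral-centroid feature, and the 1 kHz threshold are illustrative assumptions. The intuition is that a hard impact (e.g. a fingernail) concentrates acoustic energy at higher frequencies than a soft finger-pad tap:

```python
import numpy as np

def spectral_centroid(signal, sample_rate=44100):
    """Frequency-weighted mean of the magnitude spectrum (in Hz)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

def classify_touch(signal, threshold_hz=1000.0):
    # Hypothetical rule: hard impacts (fingernail, knuckle) push the
    # spectral centroid higher than soft finger-pad contacts.
    return "fingernail" if spectral_centroid(signal) > threshold_hz else "finger pad"

# Synthetic stand-ins for recorded impact sounds (damped sinusoids):
t = np.linspace(0, 0.05, 2205, endpoint=False)          # 50 ms at 44.1 kHz
soft = np.sin(2 * np.pi * 200 * t) * np.exp(-t * 80)    # low-frequency thud
hard = np.sin(2 * np.pi * 3000 * t) * np.exp(-t * 80)   # high-frequency click

print(classify_touch(soft))  # finger pad
print(classify_touch(hard))  # fingernail
```

A real system would replace the single threshold with a trained classifier over several acoustic features, but the pipeline shape (capture impact, extract spectral features, label the body part) stays the same.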
Update: some media coverage on the press [Portuguese press / Exame Informática]
It reminds me of the American artist Jackson Pollock. Cool, awesome!
You mean Pollock’s action painting?
p.s.: glad you liked it.
Nice implementation! Didn’t get to ITS Japan this year but have published on a similar topic before at ITS 2010: http://www.mee.tcd.ie/~lmosulli/pubs.html.
Let’s get tabletops listening to us, as well as watching us! :}
L.
Thanks, I will look up your publications. Hopefully this type of work will enable more expressiveness on tabletops.