Tag Archives: Multitouch

#ITS2011: Augmenting Touch Interaction Through Acoustic Sensing

9 Nov

Pedro Lopes, Ricardo Jota, Joaquim Jorge
[Published at ACM ITS 2011, Kobe, Japan / paper]

Recognizing how a person actually touches a surface has generated a strong interest within the interactive surfaces community. Although we agree that touch is the main source of information, unless other cues are accounted for, user intention might not be accurately recognized. We propose to expand the expressiveness of touch interfaces by augmenting touch with acoustic sensing. In our vision, users can naturally express different actions by touching the surface with different body parts, such as fingers, knuckles, fingernails, punches, and so forth – not always distinguishable by touch technologies but recognized by acoustic sensing. Our contribution is the integration of touch and sound to expand the input language of surface interaction.

Update: some media coverage in the press [Portuguese press / Exame Informática]

Presentation of paper at Interacção 2010

22 Oct

Pedro Lopes presents MtDjing 1

(download the paper)

PS3 Eye in Ubuntu Tutorial

25 Sep

Tutorial kindly sent by Dino Magri (See his blog)

PS3 Eye

The PS3 Eye works in Ubuntu, although, as happens with other cameras, CCV does not recognize it out of the box (the same happens with the built-in USB camera on both my laptops). To fix that we have to install the unicap library and the gspca driver. First we download the latest version:

cd /home/user/
wget http://unicap-imaging.org/downloads/unicap-0.9.5.tar.gz
Uncompress the files:
tar -xvzf unicap-0.9.5.tar.gz
Download the patch (that we need to apply):
wget http://kaswy.free.fr/sites/default/files/download/ps3eye/unicap/unicap-gspca.patch
Before applying it we need the patch package installed, so just type:
sudo apt-get install patch
patch -p0 < unicap-gspca.patch
Now configure, compile and install:
cd /home/user/unicap-0.9.5
./configure
make
sudo make install
Done – CCV should now be able to get frames from the PS3 Eye driver.

Allowing DJs to share their patches!

3 Jun

Multitouch Djing: Loading Objects via XML from PedroLopes on Vimeo.

This just shows the objects in the interface being defined in XML, which can be read at start-up to create the patch for the DJ.

This will allow users of the interface to share patches, save, load, etc…
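The actual patch format isn't shown in the post, so here is a purely illustrative sketch of what such a start-up loader could look like, written in Processing (the real table is an AS3/C++/Pd project; the patch.xml layout, the attribute names and the loadXML API from recent Processing versions are all assumptions, not the project's real format):

// Hypothetical patch.xml layout (assumed, not the project's real schema):
// <patch>
//   <object type="slider" x="0.2" y="0.5" param="volume"/>
//   <object type="knob"   x="0.7" y="0.3" param="filter-cutoff"/>
// </patch>

XML patch;

void setup() {
  size(800, 600);
  patch = loadXML("patch.xml");               // read the shared patch at start-up
  for (XML obj : patch.getChildren("object")) {
    String type  = obj.getString("type");     // which widget to build
    float  x     = obj.getFloat("x");         // normalised position on the table
    float  y     = obj.getFloat("y");
    String param = obj.getString("param");    // which DJ parameter it controls
    println("creating " + type + " for " + param + " at " + x + "," + y);
    // here the real application would instantiate the corresponding interface object
  }
}

Saving a patch would simply be the reverse: serialise each on-screen object back into one of those <object> entries and write the file out for another DJ to load.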

Multitouch Pong (off-thesis 2 hour project)

1 Apr

Multitouch Pong (Processing and CCV) from PedroLopes on Vimeo.

I was tired of my AS3/C++/Pd project (the multitouch DJ table), so I turned to Processing for a bit. The first version of this code was written by myself (Pedro Lopes) and Miguel Jerónimo in a multitouch workshop. This is my new version, slightly altered from the initial one.

The finger-tracking of the table was not calibrated, so I used some colour marker pens as paddles for the game (a sort of tangible Pong).

The code: I will post it as soon as I have a final version. It uses Processing 1.1 + the TUIO library for Processing + CCV (as the tracker); everything is open source and easy to start learning and developing with. Maybe I can patch it into PureData or SuperCollider to handle the sound synthesis. In the meantime, a rough sketch of the TUIO side is below.
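Until the final version is up, here is a minimal sketch of how the TUIO side can be wired up with the TuioProcessing client that ships with CCV/reacTIVision. The paddle and ball logic, the sizes and the mapping of blobs to paddles are illustrative guesses, not the code from the video:

import TUIO.*;

TuioProcessing tuioClient;
float leftPaddleY, rightPaddleY;
float ballX, ballY, ballVX = 3, ballVY = 2;

void setup() {
  size(800, 600);
  leftPaddleY = rightPaddleY = height / 2;
  ballX = width / 2;
  ballY = height / 2;
  tuioClient = new TuioProcessing(this);  // listens for CCV's TUIO/OSC stream (port 3333 by default)
}

void draw() {
  background(0);
  fill(255);
  rect(20, leftPaddleY - 50, 10, 100);           // left paddle
  rect(width - 30, rightPaddleY - 50, 10, 100);  // right paddle
  ellipse(ballX, ballY, 15, 15);                 // ball

  // trivial ball physics: bounce off top/bottom walls and off the paddle lanes
  ballX += ballVX;
  ballY += ballVY;
  if (ballY < 0 || ballY > height) ballVY = -ballVY;
  if (ballX < 35 && abs(ballY - leftPaddleY) < 60) ballVX = abs(ballVX);
  if (ballX > width - 35 && abs(ballY - rightPaddleY) < 60) ballVX = -abs(ballVX);
  if (ballX < 0 || ballX > width) { ballX = width / 2; ballY = height / 2; }  // point lost, reset
}

// each tracked blob (a finger or, here, a colour-marker pen cap) arrives as a TuioCursor;
// blobs on the left half of the table drive the left paddle, the right half drives the right one
void movePaddle(TuioCursor tcur) {
  float x = tcur.getScreenX(width);
  float y = tcur.getScreenY(height);
  if (x < width / 2) leftPaddleY = y;
  else rightPaddleY = y;
}

// TUIO callbacks expected by the library
void addTuioCursor(TuioCursor tcur)    { movePaddle(tcur); }
void updateTuioCursor(TuioCursor tcur) { movePaddle(tcur); }
void removeTuioCursor(TuioCursor tcur) { }
void addTuioObject(TuioObject tobj)    { }
void updateTuioObject(TuioObject tobj) { }
void removeTuioObject(TuioObject tobj) { }
void refresh(TuioTime bundleTime)      { }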

P.S.: This is just for some relaxation. I also have a version where you can shoot at the opponent!

Computer Vision – Tbeta first analysis…

12 Jan

As previously noted on the blog, I've already built a prototype of a multitouch table (with a very, very DIY look and components), using Tbeta as the vision software.

Community Core Vision, or CCV for short (also known as Tbeta), is an open-source, cross-platform solution for computer vision and machine sensing.

It takes a video input stream and outputs tracking data (e.g. coordinates and blob size) and events (e.g. finger down, moved and released) that are used in building NUI-aware applications.

CCV can interface with various web cameras and video devices, connect to various TUIO/OSC/XML-enabled applications, and supports many multi-touch lighting techniques including FTIR, DI, DSI, and LLP, with expansion planned for future vision applications (custom modules/filters).

Features

  • Simple GUI – The new interface is more intuitive and easier to understand and use.
  • Filters (dynamic background subtraction, high-pass, amplify/scaler, threshold) – This means it'll work with all optical setups (FTIR, DI, LLP, DSI). More filters can be added as modules.
  • Camera Switcher – Have more than one camera on your computer? Now you can press a button and switch to the next camera on your computer without having to exit the application.
  • Input Switcher – Want to use test videos instead of a live camera? Go ahead, press a button and it'll switch to video input.
  • Dynamic Mesh Calibration – For people with small or large tables, now you can add calibration points (for large displays) or create less points (smaller displays) while maintaining the same speed and performance.
  • Image Reflection – Now you can flip the camera vertically or horizontally if it's the wrong way.
  • Network Broadcasting – You can send OSC TUIO messages directly from the configapp for quick testing.
  • Camera and application FPS details viewer – Now you can see the framerate that you're getting from both the tracker and the camera.
  • GPU Mode – Utilize your GPU engine for accelerated tracking.
  • Cross-platform – This works on Windows, Mac, and Linux.

Software License: CCV is currently released under the GPL license; however, we are still considering alternatives that may be helpful for code reusability and development, such as LGPL, MIT or BSD.

Members

Founders: Christian Moore, Seth Sandler
Developers: Artem Titoulenko, Charles Lo, citi zen, Daniel Gasca S., Ian Stewart, Kim ladha, Mathieu Virbel, Taha Bintahir
Observers: Anirudh Sharma, Boyan Burov, David Wallin, Davide Vittone, Gabriel Soto, Gorkem Cetin, Guilherme Sette, Paolo Olivo, Sashikanth Damaraju, Sharath Patali, Thiago de Freitas Oliveira Araujo, Thomas Hansen, Tiago Serra

NUI design talk

3 Dec

From a thread in the NUI Group forum, there's this nice video talk by Darren David and Nathan Moody – Designing Natural User Interfaces, from the Interaction Design Association on Vimeo.
http://vimeo.com/4420794

A brief summary of its core topics, and how they apply to my MTdjing application for music control on multitouch surfaces:

  • NUI goal: “eliminate proxy control to increase immersion” -> Computer DJs can get rid of the mouse (an interaction proxy) and manipulate the digital medium with their hands, like they would on their usual hardware (mixers, turntables, CD decks, FX processors, vinyl records, etc.).
  • “NUI’s emotional connotation is for play, instead of work (GUI)” -> this means the usual comment heard about computer DJs (“they’re not that physical / they don’t move that much / they’re just checking e-mail”) collapses, because with an MTdjing application a DJ can perform all the tasks of their favourite DJing software with their hands moving around a table, like a regular DJ.
  • “Manage the user expectation issue” – luckily, when we think about a DJing application this is an easy situation because we have an advantage: “this is the user’s old method”. Table + hand interaction is the way everyone DJ’ed until recently (when software like Traktor and others reached the general public).
  • “Expectations of GUI” – this can be a problem, because DJs who use current virtual solutions (Traktor DJ and other software to spin records) are used to a “software application” metaphor – and on a touch table we’re talking about a NUI, where the interaction is a bit different from an application running inside a computer: no mouse, no keyboard, no windows nor operating system, etc. So it’s hard to develop a metaphor that will serve both the traditional DJs and the computer DJs.
  • “Multi-user paradigm” – this is where it really gets tricky. For the MTdjing prototypes we are not considering the detection of each user – that is one of the core challenges of multitouch nowadays – because DJing is a task you can do with multiple users, but there’s no immediate need to understand who’s who. Of course we can think of cases where a multi-user multi-touch system for DJing matters, and I’ll write a post soon with some thoughts on “if we knew whose hand this is”.
  • “Predictable, guessable interface” – because we’re aiming for a mix of metaphors (somewhere between what computer DJs see on their laptop screen and what hardware DJs use with their hands), the interface will be inherently discoverable and natural for the users.
  • “Real gestures matter” – nothing needs to be said about this; it just goes back to the previous bullet.

Final note: watch the video.

Gesture Research #1 (multitouching…)

28 Oct

So I gathered a collection of gestures (which will later turn into my pool of user tests to analyse user behaviour, prior to defining the gestures for my DJ system) by observing multiple systems and multitouch demos:

a) Reactable

(note: the Reactable relies mainly on tangible objects, but parameter changes are done via gestures/touch, so it’s an interface worth researching)

[images: Reactable gesture sketches 1 and 2]

Gathered info from: Luckily, I’ve used the Reactable once – after a concert/showcase at a festival where I played with Whit – and got first-person experience with the incredible interface. I’ve also seen it performed live twice in Portugal, and there’s a bunch of videos out there for analysis. On a more technical note, you can access the articles on the Reactable project at the UPF archive page.


b) tbeta demos

(e.g. the very popular photo demo, Google Maps navigation, etc.)

[images: Tbeta demo gesture sketches 1–3]

Gathered info from: the demos package available on the NUI Group page.

c) surface

(note: the Surface has pretty much the same gestures as the rest of the MT tables)

[image: Surface gesture sketch]

Gathered info from: Microsoft’s is a closed platform, but I tried the Surface recently at the MIX event – where I saw the talk by August de los Reyes (director of the project) – and had my first-person experience with the product there. There are also a lot of videos to analyse the types of gestures mainly used, but there’s a void of technical information available.


d) iPhone and similar PDA/mobile devices

[images: mobile gesture sketches 1 and 2]

Gathered info from: Recently I participated in the Future Places festival, and on the final day I played with THE FUTURE PLACES IMPROMPTU ALL-STARS ORCHESTRA, which gathers many artists to improvise music. Luckily there were two guys playing with iPhone apps – one using sounds to feed an analogue synth and another using a touch app that sequenced music and sounds. Watching them interact with those portable “instruments” told me a little something about the gestures.

And: I’ve also seen and tried iPhone apps that use gesture recognition to extend the capabilities of touch.

Some thoughts on MT inspired by August de los Reyes (pt.2)

24 Oct

After the first few thoughts (pt.1 on the blog) we arrive at a new idea:

Interacting with a table: what’s new about that?

1) First, August brings out an important point (which of course is relevant to the Surface project): the nature of the “table” element. Let’s recall that August is very much into product design – so he would be the right person to mention this – and tables are indeed a special object in our society. For instance, what sort of tasks would you perform on a table? Let’s brainstorm on this:

Table objects: dinner table, conference table, work table, play table, surgeon’s table, and so on.

Table usages: dinner, conversation, having a drink, discussion, meetings, etc.

What’s the underlying aspect? It’s social.

So even if I’m not directly proposing a tool for multiple DJs, I have to address that question of “social play” in the interface. First, because since the dawn of DJing, real hardware systems have been used by more than one DJ at a time. Secondly, addressing this question brings out some of “the big challenges” of NUI/multitouch, for instance: “Whose hand is this?” – how to identify players in certain contexts. Of course (luckily or not), in my context here we don’t have the imminent urge to define/identify which player is interacting with the DJ app.

2) The mapping between a table and a physical tool (also recalling a bit of the previous post on NUI as contextual interaction, rather than the exploratory interaction of GUI).

The really important key here is maintaining a sort of visual cognitive map of the interface: allowing the user to almost guess how the gestural system will work.

I will build more on this later, after the next topic (still on August’s talk), and then we can start discussing the user-testing methods for the proposed system.

Let’s give it a first try… (PureData and TUIO)

6 Oct

First Try of PureData and TUIO (homemade multitouch) from PedroLopes [turn up the volume because it’s very low]

This documents my first 10 minutes of PureData alongside a really DIY multitouch table (done with a 16 fps camera, and in absolutely no conditions!). It is simply a synth made with two oscillators that are fed from the positions of the tracked blobs. That’s why they change frequency values so fast and sound glitchy (I might just use them for music soon…)

Software: Tbeta (sending TUIO OSC), PureData
Hardware: terrible webcam, a Toshiba laptop, a piece of glass and a box.
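The Pd patch itself isn’t posted, so just to illustrate the idea of mapping blob positions to oscillator frequencies, here is a rough equivalent written in Processing with the Minim signals library instead of PureData – the two-oscillator setup, the 100–1000 Hz pitch range and the blob-to-oscillator mapping are all assumptions, not the patch from the video:

import TUIO.*;
import ddf.minim.*;
import ddf.minim.signals.*;

TuioProcessing tuioClient;
Minim minim;
AudioOutput out;
SineWave[] oscs = new SineWave[2];

void setup() {
  size(400, 300);
  minim = new Minim(this);
  out = minim.getLineOut(Minim.STEREO);
  // two oscillators with arbitrary starting pitches
  oscs[0] = new SineWave(220, 0.4, out.sampleRate());
  oscs[1] = new SineWave(330, 0.4, out.sampleRate());
  out.addSignal(oscs[0]);
  out.addSignal(oscs[1]);
  tuioClient = new TuioProcessing(this);  // Tbeta/CCV streams TUIO over OSC (port 3333 by default)
}

void draw() {
  background(0);  // nothing visual; the blobs only drive the sound
}

// each tracked blob re-tunes one of the two oscillators:
// the normalised vertical position (0..1) is mapped to a 100-1000 Hz pitch range
void tune(TuioCursor tcur) {
  SineWave osc = oscs[tcur.getCursorID() % 2];
  osc.setFreq(map(tcur.getY(), 0, 1, 100, 1000));
}

// TUIO callbacks expected by the library
void addTuioCursor(TuioCursor tcur)    { tune(tcur); }
void updateTuioCursor(TuioCursor tcur) { tune(tcur); }
void removeTuioCursor(TuioCursor tcur) { }
void addTuioObject(TuioObject tobj)    { }
void updateTuioObject(TuioObject tobj) { }
void removeTuioObject(TuioObject tobj) { }
void refresh(TuioTime bundleTime)      { }

In the real setup the same positions simply arrive in PureData as TUIO/OSC messages and feed the two oscillator objects there.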

The construction method is the one listed on the Tbeta webpage/forum. It’s quite simple, but to get great results you need a better camera and lighting conditions (which I don’t have at home right now) – luckily I’ll have a proper surface lit with a Laser Light Plane to get the correct blobs.

The home solution looks like this:

[photos: the home-made setup]
