[Last but not least: still inspired by, and following up on, the previous two posts.]
August gave a small hint that has been very helpful to Microsoft's Surface Research Team (which he leads).
Creating an interface for multitouch?
Of course, when creating any sort of multitouch interface for digital interaction on a table, several questions have to pop into our minds:
What are the interaction metaphors?
What is the environment where it's used?
Who will be using it, and what accessibility extensions are needed?
And in the end… all we want is to understand what would feel natural to people using this interface. How do we accomplish this? By giving people a pen-and-paper test.
Show them picture A and picture B. What they need to tell you is which gesture they would use to transform A into B. Simple.
Probable gesture: “Drag with their hand”…
Probable gesture: “rotate with finger” or “rotate with many fingers of the same hand” or “one finger each hand”
Probable gesture: “scale with one finger of each hand”
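Once you collect those answers, the interesting part is seeing where participants agree. Here is a minimal sketch, assuming hypothetical response data (the gesture strings and task names below are made up for illustration, not results from my tests), of how the answers from such a paper test could be tallied per transformation:

```python
from collections import Counter

# Hypothetical responses: for each A-to-B transformation, each
# participant wrote down the gesture they would use.
responses = {
    "move":   ["drag with hand", "drag with hand", "one finger drag"],
    "rotate": ["rotate with finger", "one finger each hand",
               "rotate with many fingers", "one finger each hand"],
    "scale":  ["one finger of each hand", "one finger of each hand",
               "pinch"],
}

def tally(responses):
    """Count how often each proposed gesture appears per transformation."""
    return {task: Counter(gestures) for task, gestures in responses.items()}

for task, counts in tally(responses).items():
    gesture, n = counts.most_common(1)[0]
    total = sum(counts.values())
    print(f"{task}: most common gesture = {gesture!r} ({n} of {total})")
```

The gestures that most participants converge on are the ones worth building into the interaction layer; the outliers tell you where the interface will need to be forgiving.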
And it leads here… this is an example of a test built for a DJ vertical fader; I will use these tests in my research for the interface and interaction layers. Even so, the earlier dot and square examples work very well for understanding how people move around inside multitouch interfaces and how they think of simple, atomic moves.
Example: vertical fader test
Possible Gesture: “one finger” or “one hand” or “two fingers and thumb” (this is a tricky one!)
Once again, thanks to August de los Reyes for the talk at the Remix 09 conference and for the inspiration in multitouch apps. I'll just leave a final link here to an interview with him.