Ballagas et al. summarized prior work and constructed a design space for using mobile phones as a ubiquitous input device.
First, they discussed how mobile phones have been used to accomplish different subtasks (based on Foley et al.'s work). They then developed and proposed several metrics for evaluating such input techniques. Finally, all the prior work is situated in a design space with a five-part spatial layout.
* James Foley, Victor Wallace, and Peggy Chan's classic article structures a taxonomy of input devices around the graphics subtasks that they can perform: position, orient, select, path, quantify, and text entry.
** Position: during a position task, you specify a position in application coordinates;
** Orient: the orienting subtask involves specifying an orientation (not a position) in a coordinate system;
** Select: in the selection subtask, you select from a set of alternatives, such as a set of commands;
** Path: pathing involves specifying a series of positions and orientations over time;
** Quantify: quantifying involves specifying a value or number within a range of numbers;
** Text: text entry for mobile phones is a well-studied area…
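The taxonomy above can be sketched as a lookup structure: which surveyed techniques can perform which subtask. This is a hypothetical illustration (the technique names and mapping are my examples, not the paper's full survey):

```python
from enum import Enum, auto

class Subtask(Enum):
    """Foley et al.'s graphics subtasks, as listed above."""
    POSITION = auto()
    ORIENT = auto()
    SELECT = auto()
    PATH = auto()
    QUANTIFY = auto()
    TEXT = auto()

# Illustrative mapping from phone interaction techniques to the
# subtasks they can support (example entries only).
TECHNIQUES = {
    "camera optical flow": {Subtask.POSITION, Subtask.PATH},
    "keypad text entry": {Subtask.TEXT},
    "menu on phone display": {Subtask.SELECT},
}

def techniques_for(subtask):
    """Return the techniques in the table that support a given subtask."""
    return sorted(name for name, tasks in TECHNIQUES.items() if subtask in tasks)

print(techniques_for(Subtask.POSITION))  # → ['camera optical flow']
```

This is the kind of query the design space is meant to answer: given an interaction scenario's subtasks, find candidate techniques.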
* Positioning techniques can be continuous, with the object position changing continually, or discrete, with the position changing at the end of the task.
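The continuous/discrete distinction can be shown with a toy drag handler (names are mine, not from the paper): continuous positioning updates the object on every intermediate position, while discrete positioning commits only the final position at the end of the task.

```python
class DraggedObject:
    """Toy target whose position is set continuously or discretely."""
    def __init__(self):
        self.pos = (0, 0)
        self.updates = 0  # how many times the on-screen position changed

    def move_to(self, pos):
        self.pos = pos
        self.updates += 1

def continuous_drag(obj, positions):
    # Continuous: the object follows every intermediate position.
    for pos in positions:
        obj.move_to(pos)

def discrete_drag(obj, positions):
    # Discrete: the object jumps once, to the final position.
    if positions:
        obj.move_to(positions[-1])

path = [(1, 1), (2, 3), (5, 8)]
a, b = DraggedObject(), DraggedObject()
continuous_drag(a, path)
discrete_drag(b, path)
print(a.pos, a.updates)  # → (5, 8) 3
print(b.pos, b.updates)  # → (5, 8) 1
```

Both end at the same position; they differ only in how many intermediate states the user sees.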
* Evaluation of techniques
** Perceptual, cognitive, and motor load.
** Distance sensitivity – the range of distances the interaction will support… Both display size and interaction distance influence perspective size.
* Examining these new mobile phone interaction techniques within the structure of a taxonomy and design space helps designers understand the relationships between the different techniques and identify which techniques are most suited for a particular interaction scenario.
* This paper starts with the vision that the smart phone is going to be the mouse/keyboard of ubiquitous computing (devices);
* Try to read all the related work;
* To track the smart phone, there are two frameworks:
** The phone is a receiver: Madhavapeddy et al.'s tagged GUIs –> the environment displays information, and the phone senses it;
** The phone is an emitter: Patel and Abowd augmented smart phones with laser pointers –> the phone sends out signals, the environment senses them.
* Optical flow: by analyzing the changes (deltas) between consecutive camera frames, the phone can estimate its own movement (independent of what it is pointing at).
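A minimal sketch of that frame-delta idea, assuming we only want the dominant global shift between two grayscale frames (real phone implementations use proper optical-flow algorithms; this brute-force block matching is just for intuition):

```python
import numpy as np

def estimate_motion(prev, curr, max_disp=3):
    """Estimate the global (dy, dx) shift of the scene between two
    grayscale frames by testing every small displacement and keeping
    the one with the smallest sum of absolute pixel differences."""
    h, w = prev.shape
    m = max_disp
    best, best_err = (0, 0), np.inf
    # Compare only the central region so every candidate shift stays in bounds.
    core = curr[m:h - m, m:w - m].astype(int)
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            cand = prev[m - dy:h - m - dy, m - dx:w - m - dx].astype(int)
            err = np.abs(core - cand).sum()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Synthetic frame pair: the second frame is the first shifted down 1 px, right 2 px.
rng = np.random.default_rng(0)
frame0 = rng.integers(0, 256, (40, 40), dtype=np.uint8)
frame1 = np.roll(frame0, shift=(1, 2), axis=(0, 1))
print(estimate_motion(frame0, frame1))  # → (1, 2)
```

The recovered shift is the scene's apparent motion; the phone itself moved in the opposite direction, which is what the Ballagas et al. techniques exploit for relative positioning.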