Gustafson et al. presented “Imaginary Phone” as a technique for transferring knowledge of mobile device use to an imaginary interface on the palm. Essentially, instead of directly holding and using the device, people imagine the interface is on their non-dominant hand and use the dominant hand to perform touch gestures on it. This is enabled by tracking the two hands with a depth camera to determine touch positions within the palm space. Three assumptions make Imaginary Phone work: 1) users learn the spatial layouts of interface elements; 2) such knowledge can be transferred to imaginary interfaces; and 3) using imaginary interfaces can be accurate. Two studies supported these assumptions: first, they found that most people can recall app locations both from an iPhone prop and from their palm; second, they measured the accuracy of touching the palm to select particular apps and found it sufficient to operate the device.
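The core mapping the paper relies on can be sketched roughly: a touch position reported in normalized palm coordinates is mapped onto a grid of interface elements (the imagined home screen). This is a minimal illustrative sketch, not the paper's actual implementation; the grid size, function name, and placeholder app names are all assumptions.

```python
def palm_touch_to_app(x, y, cols=4, rows=4, apps=None):
    """Map a normalized palm touch (x, y in [0, 1]) to an app-grid cell.

    Hypothetical sketch: a depth-camera tracker is assumed to report the
    fingertip position in normalized palm coordinates; the imagined home
    screen is assumed to be a cols x rows grid of apps.
    """
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        return None  # touch landed outside the imaginary interface
    col = min(int(x * cols), cols - 1)  # clamp so x == 1.0 stays in-grid
    row = min(int(y * rows), rows - 1)
    idx = row * cols + col
    return apps[idx] if apps else idx

# Example: a 4x4 home-screen layout with placeholder app names
home = [f"app{i}" for i in range(16)]
selected = palm_touch_to_app(0.8, 0.1, apps=home)  # a touch near the top-right
```

The interesting part is not the arithmetic but what it presumes: the user's spatial memory of the grid must line up with the system's grid, which is exactly what the two studies test.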
* “The findings in  indicate that proximity to landmarks – in that case the tip of index finger and thumb – helps acquire targets; yet the empty space design is all but void of landmarks.”
* Keywords: spatial memory, knowledge transfer
* Effective use of figures:
** Figure 2 shows a typical usage scenario;
** Figure 3b shows that the technology is not affected by brightness;
** Figure 4 easily explains the algorithm;
** Figures 8 and 10 sufficiently show the design ideas and variations;
** Both Figure 14 and Figure 15 are very effective and non-boring ways of visualizing data and results;
** Figure 18 is a very strong demonstration of how mobile interfaces have evolved.
* Relations to my work
** First, the idea itself is relevant to my work. When designing body-centric interaction, how can we transfer knowledge? (What is the knowledge? How can we transfer it?)
** Since this paper proposed a “transfer” of knowledge, a related thought is not to transfer such knowledge but to reform it. Questions could be: why do we have to stick to how we use a mobile device; why do we have to be confined to a small palm space, ignoring the rest of our body space; could the next step of Imaginary Phone extend to the whole body?
** A way to question (or critique) body-centric interaction is that there is no knowledge transfer (pockets + phone interface elements?). One way to frame the knowledge might be: we know what a mobile interface is like – there are applications, whether the primitive “call”, “contacts”, “alarm”, etc., or the current hundreds of thousands of apps. Can we understand the body as a canvas? Another way to frame the knowledge is: we work in the peripersonal space – consider typing on a laptop: you lean in to look at the screen, you place your hands on the keyboard in front of you, and when you lean back you might take the machine with you, etc. But when we use mobile devices, everything happens in that one device, regardless of the nature of the task. How about transferring this peripersonal-space knowledge to the scenario of mobile interaction?
** The tech part, the design discussion, and the studies are all ways to support the central idea. The central idea is that spatial memory helps us learn where the apps are, so that knowledge can be transferred to our palms (thus the imaginary phone) with enough accuracy to enable interaction.