SideSight used infra-red (IR) sensors to detect finger gestures off the two sides of a mobile device, thus enabling multi-“touch” interaction beyond the device’s screen.
In the prototype system, a linear array of IR sensors was attached along the side of a mobile device and returned a depth image indicating at which sensor a finger was closest to the device.
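The sensing step can be sketched roughly as follows. This is my own illustration, not the authors' implementation: the sensor count, threshold value, and peak-picking rule are all invented for the sake of the example.

```python
# Hypothetical sketch of SideSight-style sensing (not the paper's code):
# a linear IR array yields one proximity reading per sensor (higher =
# closer), and each finger is reported at a local peak in the readings.

def finger_positions(readings, threshold=40):
    """Return indices of sensors where a finger is detected.

    A finger is reported at each local maximum above `threshold`, so
    two fingers beside the device yield two positions (multi-"touch").
    """
    positions = []
    for i, r in enumerate(readings):
        if r < threshold:
            continue
        left = readings[i - 1] if i > 0 else -1
        right = readings[i + 1] if i < len(readings) - 1 else -1
        if r >= left and r > right:  # local peak in the depth image
            positions.append(i)
    return positions

# One finger near sensor 2, another near sensor 7:
print(finger_positions([5, 30, 80, 35, 4, 10, 50, 90, 45, 8]))  # [2, 7]
```

Tracking these positions over successive frames would then give the motion data needed for gestures such as dragging or pinching beside the device.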
* Noted purely for writing style:
** Despite the flexibility of touchscreens, using such an input mode carries a number of tradeoffs.
** (Note how this sentence leads the flow of the intro) This paper describes the concept, implementation and potential application of SideSight.
** LucidTouch supports a rich repertoire of interaction possibilities…
** … to divert the user input region to the areas on either side of the device;
** Given these issues we have opted for an approach based around…
* “… it is typically first placed onto a flat surface such as a table or desk.” – but is this really the typical use case for mobile devices? Not addressed in the paper;
* Shift focuses on stylus input – what happens with multi-touch (and multiple occlusions)?
* Writing plus:
** The introduction follows a fluid, nicely progressive structure;
** In the related work, let your own work lead the discussion, e.g., “SideSight does blah blah blah. This approach is found in prior work, …”;
* Question for the work: must the hands stay clear of the device’s sides when not interacting, or when using only one or two fingers?
* Note that this paper doesn’t include a user study – instead the authors present “three different interaction scenarios…”.