This mini-post explores the use of deformation as a medium for human-computer interaction. The question is this: can we use the shape of an object, and how it changes over time, to encode information? Here I present a deep learning approach to this problem and show that we can use a balloon to control a computer.
A detailed description of OrbTouch can be found in my recent paper entitled “OrbTouch: Recognizing Human Touch in Deformable Interfaces with Deep Neural Networks” (the code can be found here). The gist of this work is captured in the two videos below.
MOVIE 1: Real-time output from the software running on the controller's embedded computer. It uses convolutional neural networks, running on the embedded hardware, to learn and then predict user-defined touch gestures.
MOVIE 2: Users can define their own gestures and train OrbTouch to recognize them. Here I am using the gestures shown in Movie 1 to control a game of Tetris running on a host laptop. The controller communicates with its host via Bluetooth.
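As a rough illustration of the kind of computation behind the gesture predictions in Movie 1, here is a minimal sketch of a convolutional feature extractor feeding a gesture classifier. This is not the architecture from the paper: the sensor frame size (8x8), kernel size, number of gesture classes (4), and random weights are all illustrative assumptions, and a real model would be trained rather than randomly initialized.

```python
import numpy as np

def conv2d(frame, kernel):
    """Valid-mode 2D cross-correlation over a single sensor frame."""
    kh, kw = kernel.shape
    h, w = frame.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
frame = rng.random((8, 8))                 # one 8x8 frame of sensor readings (illustrative)
kernel = rng.standard_normal((3, 3))       # one learned 3x3 filter (here: random)
features = relu(conv2d(frame, kernel)).ravel()      # 6x6 feature map -> 36 features
W = rng.standard_normal((4, features.size)) * 0.1   # linear head over 4 gesture classes
probs = softmax(W @ features)              # class probabilities for this frame
pred = int(np.argmax(probs))               # predicted gesture label
```

In practice the network would consume a sequence of frames, since the gestures are defined by how the deformation evolves over time, but the per-frame convolution above is the core operation.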