Artificial skin creates first ticklish devices
The Skin-On interface, developed by researchers at the University of Bristol in partnership with Télécom ParisTech and Sorbonne University, mimics human skin not only in appearance but also in sensing resolution.
The researchers adopted a bio-driven approach to developing a multi-layer silicone membrane that mimics the layers present in human skin. It is made up of a surface textured layer, an electrode layer of conductive threads and a hypodermis layer. Not only is the interface more natural than a rigid casing, it can also detect a plethora of gestures made by end-users. As a result, the artificial skin allows devices to ‘feel’ the user’s grasp, including its pressure and location, and can detect interactions such as tickling, caressing, twisting and pinching.
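As a rough illustration only (not the authors' actual pipeline), distinguishing such gestures from the electrode layer's readings might look something like the sketch below. The frame format, thresholds and gesture labels are assumptions made for the example.

```python
import numpy as np

# Hypothetical sketch: a "frame" is a 2D array of pressure readings sampled
# from the conductive-thread electrode grid; thresholds are invented values.
LIGHT, FIRM = 0.1, 0.6  # assumed pressure thresholds (normalised units)

def classify_gesture(frames: list[np.ndarray]) -> str:
    """Very naive classification of a short window of pressure frames."""
    peak = max(frame.max() for frame in frames)
    contact_area = np.mean([(frame > LIGHT).sum() for frame in frames])
    # Count how often the point of maximum pressure jumps between cells,
    # a crude proxy for the rapid, scattered contact of tickling.
    centres = [np.unravel_index(frame.argmax(), frame.shape) for frame in frames]
    movement = sum(a != b for a, b in zip(centres, centres[1:]))

    if peak > FIRM and contact_area > 10:
        return "grip"      # broad, strong contact
    if peak < FIRM and movement > len(frames) // 2:
        return "tickle"    # light, fast-moving touches
    if len(frames) < 5 and peak > LIGHT:
        return "tap"       # brief, isolated contact
    return "caress"        # light, slow sliding contact
```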
“This is the first time we have the opportunity to add skin to our interactive devices. The idea is perhaps a bit surprising, but skin is an interface we are highly familiar with so why not use it and its richness with the devices we use every day?” said Dr Anne Roudaut, Associate Professor in Human-Computer Interaction at the University of Bristol, who supervised the research.
“Artificial skin has been widely studied in the field of Robotics but with a focus on safety, sensing or cosmetic aims. This is the first research we are aware of that looks at exploiting realistic artificial skin as a new input method for augmenting devices,” said Marc Teyssier, lead author.
In the study, the researchers created a phone case, computer touchpad and smartwatch to demonstrate how touch gestures on the Skin-On interface can convey expressive messages for computer-mediated communication with humans or virtual characters.
“One of the main uses of smartphones is mediated communication, using text, voice, video, or a combination. We implemented a messaging application where users can express rich tactile emotions on the artificial skin. The intensity of the touch controls the size of the emojis. A strong grip conveys anger, while tickling the skin displays a laughing emoji and tapping creates a surprised emoji,” said Marc Teyssier.
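A minimal sketch of how that mapping might look at the application layer, assuming a gesture label and a normalised touch intensity have already been produced by the skin (the function names, gesture labels and sizing are illustrative, not taken from the paper):

```python
# Hypothetical mapping from a detected gesture to an emoji, with the touch
# intensity (0.0-1.0) scaling the emoji's display size, as in the demo app.
GESTURE_TO_EMOJI = {
    "grip": "😠",     # a strong grip conveys anger
    "tickle": "😂",   # tickling the skin displays a laughing emoji
    "tap": "😮",      # tapping creates a surprised emoji
}

def emoji_message(gesture: str, intensity: float) -> tuple[str, int]:
    """Return the emoji and a font size derived from touch intensity."""
    base_size, max_extra = 16, 48  # assumed sizing in points
    emoji = GESTURE_TO_EMOJI.get(gesture, "🙂")
    size = base_size + round(max_extra * max(0.0, min(intensity, 1.0)))
    return emoji, size

# Example: a firm grip at 80% intensity produces a large angry emoji.
print(emoji_message("grip", 0.8))  # ('😠', 54)
```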
“This work explores the intersection between man and machine. We have seen many works trying to augment humans with parts of machines; here we look at it the other way around and try to make the devices we use every day more like us, i.e. human-like,” said Dr Roudaut.
It may not be long before these tactile devices become the norm. The paper offers all the steps needed to replicate this research, and the authors are inviting developers with an interest in Skin-On interfaces to get in touch.
Researchers say the next step will be making the skin even more realistic. They have already started looking at embedding hair and temperature features which could be enough to give devices — and those around them — goose-bumps.
Paper: Marc Teyssier, Gilles Bailly, Catherine Pelachaud, Eric Lecolinet, Andrew Conn and Anne Roudaut. ‘Skin-On Interfaces: A Bio-Driven Approach for Artificial Skin Design to Cover Interactive Devices’. UIST 2019.