UBC researchers have designed a new, inexpensive and flexible sensor that can be used to make foldable screens and highly sensitive artificial skins. As a bonus, it might just keep us from getting harmed by a robot.
Sensor technology has been advancing by leaps and bounds over the last few years. These innovations use materials such as carbon-based nanofibres and nanowires, which offer excellent electrical conductivity and can accurately measure environmental conditions like humidity, pH, chemical composition and mechanical force.
At the same time, flexible technology is also booming, with applications in wearable electronics, smart prosthetic devices, robotics and electronic skins. But so far, the challenge has been to combine accurate sensing and data collection with materials that allow for more natural movement, creating a more functional and comfortable user experience. That’s where Mirza Saquib Sarwar, a PhD student in electrical and computer engineering at UBC, sees his new creation coming in.
“There are sensors that can detect pressure, such as the iPhone’s 3D Touch, and some that can detect a hovering finger, like Samsung’s AirView. There are also sensors that are foldable, transparent and stretchable,” said Sarwar in a press release. “Our contribution is a device that combines all those functions in one compact package.”
The result is a transparent sensor made of two widely available, low-cost materials: a stretchable, highly conductive hydrogel held between thin layers of silicone. As described in a new study published in the journal Science Advances, the research team created a prototype 4 x 4 cross-grid sensor capable of detecting a finger (and its electrical charge) hovering a few centimetres above it. Even when bent or folded, the material retains its ability to detect finger pressure, touches and swipes.
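To illustrate the idea behind a cross-grid sensor like this one, here is a minimal sketch of how readings from a 4 x 4 capacitive grid might be interpreted. All numbers (baseline capacitance, touch and hover thresholds) and the `classify` function are illustrative assumptions, not values from the UBC study: the principle is simply that a nearby finger disturbs the capacitance at a row-column intersection, with a light disturbance read as a hover and a large one as a touch.

```python
# Hypothetical values, chosen for illustration only (not from the study).
BASELINE_PF = 1.20    # assumed no-touch capacitance at each cell, in picofarads
TOUCH_DROP_PF = 0.30  # assumed drop indicating a direct touch
HOVER_DROP_PF = 0.05  # assumed smaller drop from a finger hovering above

def classify(readings):
    """Find the cell with the largest capacitance drop in a 4x4 grid
    of measurements (pF) and label it 'touch', 'hover', or None."""
    best = None  # (row, col, drop)
    for r, row in enumerate(readings):
        for c, cap in enumerate(row):
            drop = BASELINE_PF - cap
            if best is None or drop > best[2]:
                best = (r, c, drop)
    r, c, drop = best
    if drop >= TOUCH_DROP_PF:
        return (r, c, "touch")
    if drop >= HOVER_DROP_PF:
        return (r, c, "hover")
    return None

# Example: a finger hovering over the cell at row 1, column 2
grid = [[BASELINE_PF] * 4 for _ in range(4)]
grid[1][2] = 1.12  # small dip in capacitance from the hovering finger
print(classify(grid))  # -> (1, 2, 'hover')
```

A real controller would scan the rows and columns electrically and filter out noise, but the core logic, comparing each intersection against a baseline, is the same.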
The new material could have applications for robotic skins, making human-machine interactions safer, says John Madden, co-author of the study and professor in UBC’s Faculty of Applied Science. “Currently, machines are kept separate from humans in the workplace because of the possibility that they could injure humans,” says Madden. “If a robot could detect our presence and be ‘soft’ enough that they don’t damage us during an interaction, we can safely exchange tools with them, they can pick up objects without damaging them, and they can safely probe their environment.”
With these and other advances in robotic skins and transparent nanosensors, the dream (if it happens to be your dream, that is) of producing lifelike, “Westworld-calibre” humanoid robots seems perhaps not so far away. The basic elements, at least, are already available in rudimentary form: a bit of artificial skin here, a whole lot of artificial intelligence there. Consider AI robots like the Bosch-backed Kuri, a smart home bot that can detect facial expressions and is programmed to tap into its owner’s emotions. Then there’s Nadine, the personal assistant with lifelike skin and facial expressions from Singapore’s Nanyang Technological University, a “social robot” that can mimic gestures and is being touted as a future companion for those with dementia.
But big hurdles remain in artificial intelligence, say the experts, who note that current AI technologies are a long way from matching the complexity of everyday social interactions.