Researchers at the Creative Machines Lab at Columbia Engineering have developed a new autonomous robot named EVA. EVA has a soft, expressive face that lets it respond to nearby humans by matching their expressions. The research is set to be presented at the ICRA conference on May 30, 2021.
Video Credits: Columbia Engineering, YouTube
The team of researchers has been working on this robot for the past five years, and the robot's blueprints are open-sourced on HardwareX. Robots are increasingly used across many sectors to carry out a wide range of activities. People usually rely on facial expressions to build trust with one another, but robots typically present a blank, expressionless face.
Because of this, there is rising demand for robots with realistic, more responsive faces. According to Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab, “The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes.”
Giving robots facial expressions is a major challenge. Traditionally, robotic body parts are made of hard plastic or metal, materials too stiff to move as freely as human tissue. EVA instead contains artificial muscles, made of cables and motors, that pull on specific points of the robot’s face. With these, EVA can express emotions such as anger, disgust, fear, joy, surprise, and sadness.
Zanwar Faraj, who developed the robot’s physical machinery as an undergraduate student, said, “The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions.”
EVA’s parts were created with 3D printing, which produced complex shapes that integrated easily with the robot’s skull. The team worked for weeks to get the robot to produce a range of expressions, and noticed that it could elicit emotional responses from the researchers themselves.
Hod Lipson said, “I was minding my own business one day when EVA suddenly gave me a big, friendly smile. I knew it was purely mechanical, but I found myself reflexively smiling back.”
The software phase of the project was led by PhD student Boyuan Chen. Chen and a team of students created the robot’s brain using several deep learning neural networks.
According to the researchers, EVA still has a long way to go before it can learn and replicate the complex ways in which humans communicate through facial expressions. However, they believe such technology will someday be useful in real-world applications.
Hod Lipson said, “There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers. Our brains seem to respond well to robots that have some kind of recognizable physical presence.”
Adding to that, Chen said, “Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important.”