The robot Digit stands approximately five feet, four inches high, with a metallic torso the teal color of a hospital worker’s scrubs. It can walk up and down staircases and around corners on two legs, and lift, carry, and stack boxes up to 40 pounds with arms whose hinges evoke the broad shoulders of a swimmer.
Agility Robotics, Digit’s manufacturer, shipped roughly 30 of these robots earlier this year to industrial and academic clients. The robot is designed to labor alongside human workers in industrial spaces like warehouses and factories, and the Albany, Oregon-based company expected that the initial feedback would focus on Digit’s mobility and functionality.
It did not anticipate a swift early consensus that the robot gave people the creeps.
“We’ve unfortunately gone a little bit into the uncanny valley there, with something that people identify as being wrong,” said Agility co-founder and chief technology officer Jonathan Hurst, a professor of robotics at Oregon State University.
Digit is meant to navigate spaces originally constructed for human workers, and so its form follows that of the human body — at least, from the “shoulders” down. Rising above its torso like a human neck is a black cylinder that houses the robot’s lidar sensor. Above that, there’s nothing. Digit has no head. And if it’s going to work alongside people and not make them uncomfortable, Agility now believes it needs to have one.
“We don’t need [a head] for functionality,” Hurst said. “But… it effectively is necessary for functionality if you consider humans working with it and accepting it as part of its necessary function.”
The oversight underscores a compelling challenge in robot design: The way a robot makes people feel can matter at least as much as its ability to complete its tasks. And given that humans read an extraordinary amount of nonverbal information in one another’s faces, a robot’s head or face can have an outsize impact on the way users perceive it.
When creating a robot that will interact with people, be they residents of a nursing home, customers at a restaurant, or co-workers in a warehouse, designers have to walk a fine line: the robot needs features anthropomorphic enough that humans feel comfortable, yet not so realistic that it tumbles into the “uncanny valley” of creepily lifelike tech. They have to create robots that remind humans of themselves but that also make clear they’re not human, just machines with discrete and limited functions. A recognizable head or face can offer an approaching human all kinds of nonverbal clues about the extent of a robot’s abilities and what its next move will be. And should the designer intend it, those same design principles can also be used to deceive.
Even if a robot isn’t intentionally designed with a face, people will seek one out. Though Nicholasville, Kentucky-based Badger Technologies deliberately made its aisle-roving, shelf-scanning grocery store robots as simple and unobtrusive as possible, observers have pointed out that its glowing twin information lights (blue when operating normally, red when the robot is bumped) bear an unfortunate resemblance to evil eyes. (Think Buzz Lightyear’s nemesis Emperor Zurg, from Toy Story.) The company’s main client, the supermarket chain Stop & Shop, attempted to solve that problem by outfitting their robots with googly eyes.
“It’s really hard for people not to see faces,” said Maya Cakmak, an associate professor of computer science and engineering and director of the Human-Centered Robotics Lab at the University of Washington. “Our vision system is designed for detecting faces, so we detect them everywhere, even when they aren’t there.”
This phenomenon is called pareidolia: the unconscious human tendency to organize ambiguous visual information into something recognizable, like seeing a shape in a cloud or a face grinning back at you from a plank of wood, a soap dispenser, or a robot’s cameras.
Humans derive an enormous amount of nonverbal information from faces. We infer emotional states from people’s expressions, and follow their gaze to determine their intent or take instructions. People make unconscious assumptions about others’ personality traits from their faces alone.
When it comes to robots, that unconscious expectation of seeing a recognizable face — and the subtle sense of frustration or unease when it’s not met — becomes more pronounced the more humanoid a robot is. Plenty of robots used in industrial settings don’t have a recognizable equivalent to a head or face, and human workers don’t care. A robot that evokes as much of the human form as Digit does, however, doesn’t get that same pass.
Robots have no emotional states or personalities, but that doesn’t stop us from projecting onto them. (And given the difficulty and expense of mechanically reproducing the flexibility of a human face, most robot faces are rendered on screens.) In a 2018 study, Cakmak and her colleagues found that people rated robots as more “friendly” the more human-like facial features they had, and cited those without mouths or pupils as significantly more “creepy” and “untrustworthy” — or “soulless,” as one participant put it. The presence of robot eyelids also triggered strong feelings, with subjects describing the robots as “sly” and “smug.”
A robot face can have practical uses. A pair of eyes gives people a sense of the robot’s