My favorite style of human-robot interaction is minimalism. I've met a lot of robots, and some of the ones that have captured my heart most effectively are those that express themselves through their fundamental simplicity and purity of purpose. What's great about simple, purpose-driven robots is that they encourage humans to project needs, wants, and personality onto them, letting us do much of the heavy lifting in human-robot interaction (HRI).
In terms of simple, purpose-driven robots, you can't do much better than a robotic trash can (or trash barrel, or bin, or whatever you call it). In a paper presented this week at HRI 2023, researchers from Cornell explored what happened when random strangers interacted with a pair of remotely operated trash barrel robots in New York City, with intermittently delightful results.
What's particularly cool about this is how much HRI is going on around robots that have essentially no obvious HRI features, because they're literally just trash cans on wheels. They don't even have googly eyes! However, as the video points out, they're remotely controlled by humans, so a lot of the motion-based expression they display likely comes from a human source, whether intended or not. These remote-controlled robots move very differently from how an autonomous robot would. People who know how autonomous mobile robots work would expect such machines to perform slow, measured movements along smooth paths. But as a previous paper on trash barrel robots described, most people expect the opposite:
One of the features we discovered is that people seem to have a low estimation of autonomy, associating poor mobility and social errors with autonomous operation. In other words, people were more likely to believe a robot was computer-controlled if they noticed it getting stuck, bumping into obstacles, or ignoring people's attempts to get its attention.
We first stumbled upon this perception when a less experienced robot driver was experimenting with controls, actively moving the robot in strange patterns. A nearby observer confirmed that the robot “must be autonomous. It is very strange that someone is controlling it!”
A lot of extrapolated character can come from robots that make mistakes or need help. That would be a problem in many contexts, but for simple social robots whose purpose is easily understood, it can turn out to be a welcome feature:
Due to the irregular sidewalk surface, the robots sometimes got stuck. People were eager to help the robots when they were in trouble. Some observers preemptively moved chairs and obstacles to clear a path for the robots. Moreover, people interpreted the robots' back-and-forth wobble as nodding in approval, even when that motion was caused only by uneven surfaces.
Another interesting thing going on here is how people assumed the robots wanted to be "fed" trash and recycling:
Sometimes people thought the robots were expecting trash from them and felt obligated to give the robots something. When a robot passed by and was stopped in front of the same person a second time, she said, "I think he knows I've been sitting here long enough, I have to give him something." Some people would even find an excuse to generate litter in order to "please" the trash barrel robots, digging into a bag or picking up litter off the ground.
The previous paper goes into more detail on what this leads to:
It appears that people naturally attribute intrinsic motivation (or the desire to satisfy some need) to the robot's behavior, and that this mental model encourages them to interact with the robot socially by "feeding" it and expecting a social response in kind. Interestingly, the role passersby assign to the robot is reminiscent of a panhandler, who asks for donations and is expected to be grateful for them. This contrasts sharply with human counterparts such as waitstaff or cleaners, who provide assistance and whose bystanders are the ones expected to express gratitude.
I wonder how much of this social interaction depends on the novelty of meeting trash barrel robots for the first time, and whether (if these robots were to become full-time employees) humans would start treating them more like janitors. I'm also not sure how well these robots would do if they were actually autonomous. If part of the magic comes from having a human in the loop managing what seem like (but may not be) relatively simple human-robot interactions, then turning that into effective autonomy could be a real challenge.
"Trash Barrel Robots in the City," by Fanjun Bu, Ilan Mandel, Wen-Ying Lee, and Wendy Ju, presented this week at HRI 2023 in Stockholm, Sweden.