You won’t even need to dress yourself in the future — thanks to robots like this

Luke Dormehl

Chalk it up to a misspent youth watching The Jetsons if you want, but we’ve always kind of liked the idea of a home robot that can dress you in the morning. Forget the mundane chore of pulling on our own pants and shirt; this is one job we’d happily hand over to machines without a second thought. Sadly, while cooking, cleaning, and even ironing robots are a real thing, breakthroughs in dressing robots have been in disappointingly short supply.

We may not be there yet, but research coming out of the Georgia Institute of Technology suggests that it’s not out of the question for the near future. Engineers at Georgia Tech have developed a robot that is able to assist users in putting on a hospital gown. Although it’s still at a very early stage (for now it can pull just one sleeve of the gown onto a person’s arm), the technology it uses to do this is actually pretty darn smart.

“At a high level, we have taught the robot, a PR2, how to consider what people physically feel when receiving assistance,” Zackory Erickson, lead researcher on the project, told Digital Trends. “In a sense, the robot is able to take on a human’s perspective when providing dressing assistance. In doing so, we observed emergent behaviors in which the robot was able to fully dress a person’s arm and mitigate the chance of the garment getting caught on the person’s body.”


The robot learned to perform the task by carrying out thousands of virtual dressing attempts in a physics-based simulation. By making mistakes in simulation, the robot could learn from scenarios such as cloth getting caught on a body part without putting users at risk; in the real world, that would mean suffering through thousands of mornings of failed dressing attempts before the robot finally got it right. From this simulation data, the robot’s recurrent neural networks were trained to estimate the forces a garment exerts on a person during dressing.
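To get a feel for what a recurrent force-estimation model looks like, here is a minimal sketch in Python. This is not the researchers’ actual model or training setup (which the article doesn’t detail); the input dimensions, network size, and random weights are all illustrative assumptions. A simple Elman-style recurrent network reads a sequence of haptic and kinematic measurements and outputs a per-timestep estimate of the force on the person:

```python
import numpy as np

# Illustrative sketch, NOT the Georgia Tech model: a tiny Elman-style RNN
# that maps a sequence of measurements (e.g. end-effector position and
# velocity) to an estimated garment force at each timestep. In the real
# system, weights would be trained on thousands of simulated dressing
# rollouts; here they are random, just to show the structure.

rng = np.random.default_rng(0)

INPUT_DIM = 6    # assumed: 3D end-effector position + 3D velocity
HIDDEN_DIM = 16  # assumed hidden-state size
OUTPUT_DIM = 1   # predicted force magnitude on the person

W_xh = rng.normal(scale=0.1, size=(HIDDEN_DIM, INPUT_DIM))
W_hh = rng.normal(scale=0.1, size=(HIDDEN_DIM, HIDDEN_DIM))
W_hy = rng.normal(scale=0.1, size=(OUTPUT_DIM, HIDDEN_DIM))

def predict_forces(sequence):
    """Run the RNN over a (T, INPUT_DIM) measurement sequence and
    return a (T,) array of per-timestep force estimates."""
    h = np.zeros(HIDDEN_DIM)
    outputs = []
    for x in sequence:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state update
        outputs.append(float(W_hy @ h))   # force estimate for this step
    return np.array(outputs)

# Example: a 50-step simulated dressing trajectory (random stand-in data).
trajectory = rng.normal(size=(50, INPUT_DIM))
forces = predict_forces(trajectory)
print(forces.shape)  # one force estimate per timestep: (50,)
```

The key design point the sketch illustrates is that the hidden state `h` carries history forward, so the force estimate at each step can depend on how the garment has moved so far, not just on the current measurement.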

Rather than relying on vision, the robot uses haptics to predict how clothing will interact with a person’s body during assistance. That’s because clothing typically hides a person’s body from view, making it difficult for a robot to see what is happening.

“Robot-assisted dressing is an application that merits investigation due to the number of people who could benefit from it,” Professor Charlie Kemp told DT. “We’re making progress, but there is still much to be done. In the meantime, we’re optimistic that aspects of our work will be broadly applicable.”

For now, the researchers are continuing to work with PR2 — including simulated demonstrations showing how it could be used to put on a person’s T-shirt, shoes, and even a cardigan.