Monday, 27 September 2010, 21:40

Quadriplegics Prefer Robot Arms on Manual, not Automatic


Aman Behal’s automated robotic arm functioned perfectly. Outfitted with sensors, it could “see” objects, grasp them with enough force to hold but not crush them, and return them to the user. It easily outperformed the same arm under manual control on every quantitative measurement.

Except one. The arm’s users — patients with spinal cord injuries in an Orlando hospital — didn’t like it. It was too easy.

“Think about the Roomba,” Behal told Wired.com. “People like robots, and they like them to work automatically. But if you had to watch and supervise the Roomba while it worked, you’d get frustrated pretty quickly. Or bored.”

This wasn’t what Behal had expected. This was the new sensor system’s first time in the field; the user satisfaction survey was supposed to be one more data point, secondary to measuring the performance of the device itself. But it made his team rethink their entire project.

Behal’s arm is just one in a long line of robotic arms aimed at giving paraplegics and quadriplegics greater freedom and mobility. Recent advances have made robot arms far more sensitive, powerful, and realistic than ever before. In many cases the enhancements depend on software that allows the robot arms to take simple commands (or even signals from the user’s brain) and translate them into complex movements involving multiple motors without requiring their users to specify the exact movements of each servo. But in this study, Behal found that there’s such a thing as too much automation.

Behal, an assistant professor at the University of Central Florida, had initially used the arm in a 2006 study at the University of Pennsylvania funded by the National Science Foundation and the National Multiple Sclerosis Society. In addition to weakening physical control, MS often impairs attention and memory. At that time, the arm’s sensors and AI were much more limited, and its complicated controls overwhelmed and frustrated those users.

For these patients, according to Behal, something that might seem as simple as scratching their heads was a prolonged struggle. They needed something that took the guesswork of movement, rotation, and force out of the equation.

The quadriplegics at Orlando Health were the opposite. They were cognitively high-functioning, and some had experience with computers or video games. All had ample experience using assistive technology. Regardless of the extent of their disability or whether they were using a touchscreen, mouse, joystick, or voice controls, they preferred using the arm on manual. The more experience they had with tech, the happier they were.

It didn’t matter that the arm performed faster and more accurately when it was fully automated. Users were actually more forgiving of the arm when they were piloting it. If the arm made a mistake on automatic mode, they panned it. Harshly. (“You see a big vertical spike downward” when that happened, Behal said.) On manual mode, the users learned how to operate it better — and how to explain their problems with the device to someone else.

To users accustomed to navigating the world in a wheelchair — and frequently having to explain how their chair worked to others — this made the arm both more familiar and more useful. It felt less like an alien presence, and more like a tool: a natural extension of the body and the will.

This feeling is essential for anyone’s satisfaction using technology, but particularly so for disabled users, according to John Bricout, Behal’s collaborator and the associate dean for Research and Community Outreach at the University of Texas at Arlington School of Social Work.

“If we’re too challenged, we get angry and frustrated. But if we aren’t challenged enough, we get bored,” said Bricout. He’s seen this repeatedly with both disabled and older adults.

In an interview with Wired.com, he expanded on this, drawing on psychologist Mihály Csíkszentmihályi’s theory of flow: “We stay engaged when our capabilities are matched by our challenges and our opportunities,” Bricout said. If that balance tilts too far in one direction, we get anxious; if it tilts too far in the other, we get bored. Match them, and we’re at our happiest, most creative, and most productive.

Behal and Bricout hadn’t anticipated, for example, that users operating the arm using the manual mode would begin to show increased physical functionality.

“There’s rehabilitation potential here,” Bricout said. Thinking through multiple steps to coordinate and improve physical actions “activated latent physical and cognitive resources… It makes you rethink what rehabilitation itself might mean.”

For now, Behal, Bricout and their team plan on repeating their study with a larger group of users to see if they can replicate their results. They’re also going back to users with MS, and perhaps traumatic brain injuries, early next year. Colleagues at other institutions are experimenting with the arms with even more diverse disabled populations.

The engineering team has already given the robotic arm a “voice” that announces its actions, making it feel more responsive and less alien, even on automatic mode. They’re also revamping the software interface again, including exploring the possibility of adding haptic feedback, so users can feel when the robotic arm grasps an object — or touches the user’s own body. If you’re going to scratch your head, the fingertips benefit from a sense of touch almost as much as the scalp does.

“You have to listen to users,” Behal said. “If they don’t like using the technology, they won’t. Then it doesn’t matter how well it does its job.”

Robotic arm’s big flaw: Patients say it’s ‘too easy’ [UCF Press Release]


Author: Tim Carmody
