New bionic hand can ‘see’, grasp objects quickly

LONDON:  Scientists have developed a new bionic hand that can ‘see’ objects and allow amputees to grasp things ten times faster than currently available prosthetics.

The bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand.

Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand ‘sees’ and reacts in one fluid movement.

A small number of amputees have already trialled the new technology developed by researchers at Newcastle University in the UK.

“Prosthetic limbs have changed very little in the past 100 years – the design is much better and the materials are lighter weight and more durable but they still work in the same way,” said Kianoush Nazarpour, from Newcastle University.

“Using computer vision, we have developed a bionic hand which can respond automatically – in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction,” said Nazarpour.

“Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison,” he said.

“Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking,” he added.

Current prosthetic hands are controlled via myoelectric signals – that is, the electrical activity of the muscles recorded from the skin surface of the stump. Controlling them takes practice, concentration and time.

Using neural networks – a technique that underpins much of modern artificial intelligence – the researchers showed the computer numerous object images and taught it to recognise the ‘grip’ needed for different objects.

“We would show the computer a picture of, for example, a stick,” said Ghazal Ghazaei, from Newcastle University.

“But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds, and eventually the computer learns what grasp it needs to pick that stick up,” she said.
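To make the idea concrete, the training step described above amounts to supervised image classification: many photographs of each object, each labelled with the grasp that suits it. The sketch below is illustrative only and is not the Newcastle team’s code; the grasp categories, image size and network layout are assumptions chosen for a minimal PyTorch example.

```python
# Illustrative sketch: a small convolutional network trained to map object
# images to one of a handful of grasp types. Labels and architecture are
# assumptions for this example, not the published system.
import torch
import torch.nn as nn

# Hypothetical grasp categories an object image might be mapped to.
GRASPS = ["palmar_wrist_neutral", "palmar_wrist_pronated", "tripod", "pinch"]

class GraspNet(nn.Module):
    def __init__(self, num_grasps: int = len(GRASPS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # 64x64 input halved twice by pooling -> 16x16 feature maps.
        self.classifier = nn.Linear(32 * 16 * 16, num_grasps)

    def forward(self, x):  # x: batch of 3x64x64 images
        h = self.features(x)
        return self.classifier(h.flatten(1))

def train_step(model, images, labels, optimiser, loss_fn=nn.CrossEntropyLoss()):
    """One supervised update: many views of each object, one grasp label."""
    optimiser.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimiser.step()
    return loss.item()

if __name__ == "__main__":
    model = GraspNet()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Stand-in data: random tensors in place of the photographed objects.
    images = torch.randn(8, 3, 64, 64)
    labels = torch.randint(0, len(GRASPS), (8,))
    print("loss:", train_step(model, images, labels, optimiser))
```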

Using the camera fitted to the prosthesis, the hand ‘sees’ an object, picks the most appropriate grasp and sends a signal to the hand – all within a matter of milliseconds and ten times faster than any other limb currently on the market.
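The “glance and grasp” loop described above can be sketched as: capture a frame, classify the grasp, command the hand. Continuing the hypothetical GraspNet example from earlier, the camera and hand interfaces below are stand-ins, not the actual prosthesis API.

```python
# Illustrative inference loop built on the GraspNet sketch above.
import torch

@torch.no_grad()
def choose_grasp(model, frame):
    """Pick the most likely grasp for a single camera frame (3x64x64 tensor)."""
    logits = model(frame.unsqueeze(0))  # add a batch dimension
    return GRASPS[int(logits.argmax(dim=1))]

def glance_and_grasp(model, camera, hand):
    frame = camera.capture()            # assumed: returns a 3x64x64 tensor
    grasp = choose_grasp(model, frame)
    hand.preshape(grasp)                # assumed: commands the prosthesis
    return grasp
```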

The study was published in the Journal of Neural Engineering. (AGENCIES)