Manipulation and sensing have long been regarded as two critical pillars for realizing the full potential of robotics, and the two increasingly overlap.
As grippers have become an essential component of industrial robots, those systems need the right mechanisms for interacting with their surroundings.
Vision has long been central to all of this, but companies are increasingly turning to tactile sensing as a complementary method of data collection. Touch gives a robot a far better sense of how much pressure to apply to a given object, whether it is a piece of produce or a human being.
Touchlab, an Edinburgh, Scotland-based startup, won the pitch-off at our TC Sessions: Robotics event a few months ago, despite stiff competition.
The judges agreed that the company's approach to robotic skin addresses a critical problem and can help unlock greater sensing potential. XPrize appears to agree as well: the company is currently a finalist in the $10 million XPrize Avatar competition.
The company is currently collaborating with Schunk, a German robotics firm that is providing the gripper for the XPrize finals.
“Our mission is to make this electronic skin for robots to give machines the power of human touch,” co-founder and CEO Zaki Hussein said, speaking to TechCrunch from the company’s new office space.
“There are a lot of elements going into replicating human touch. We manufacture this sensing technology.
“It’s thinner than human skin and it can give you the position and pressure wherever you put it on the robot. And it will also give you 3D forces at the point of contact, which allows robots to be able to do dexterous and challenging activities.”
To begin, the company is investigating teleoperation applications (hence the XPrize Avatar), specifically using the system to remotely operate robots in understaffed hospitals.
On one end, a TIAGo++ robot outfitted with the company's sensors provides human workers with an extra pair of hands; on the other, an operator wears a haptic VR bodysuit that translates all of that touch data. However, such technologies currently have limitations.
“We have a layer of software that translates the pressure of the skin to the suit. We’re also using haptic gloves,” says Hussein. “Our skin gathers a lot more data than we can currently transmit to the user over haptic interfaces.
“So there’s a little bit of a bottleneck. We can use the full potential of the best haptic interface of the day, but there is a point where the robot is feeling more than the user is able to.”
Additional data gathered by the robot is translated via a variety of channels, including visual data via a VR headset. The company is almost ready to start real-world pilots with the system.
“It will be in February,” says Hussein. “We’ve got a three-month hospital trial with the geriatric patients in the geriatric acute ward. This is a world-first, where this robot will be deployed in that setting.”