Robots with Feeling: How Tactile AI Could Transform Human-Robot Relationships

Sentient robots have been a staple of science fiction for decades, raising tantalizing ethical questions and shining a light on the technical barriers to creating artificial consciousness. Much of what the tech world has achieved in artificial intelligence (AI) today is thanks to recent advances in deep learning, which allows machines to learn useful features automatically during training.

This breakthrough eliminates the need for painstaking, manual feature engineering—a key reason why deep learning stands out as a transformative force in AI and tech innovation. 

Building on this momentum, Meta — which owns Facebook, WhatsApp and Instagram — is diving into bold new territory with advanced “tactile AI” technologies. The company recently introduced three new AI-powered tools—Sparsh, Digit 360, and Digit Plexus—designed to give robots a form of touch sensitivity that closely mimics human perception. 

The goal? To create robots that don’t just mimic tasks but actively engage with their surroundings, similar to how humans interact with the world. 

Sparsh, aptly named after the Sanskrit word for “touch,” is a general-purpose AI model that allows robots to interpret and react to sensory cues in real time. Likewise, Digit 360 is an artificial fingertip sensor that can perceive physical sensations as minute as a needle’s poke or subtle changes in pressure. Digit Plexus acts as a bridge, providing a standardized framework for integrating tactile sensors across various robotic designs and making it easier to capture and analyze touch data. Meta believes these AI-powered tools will allow robots to tackle intricate tasks requiring a “human” touch, especially in fields like healthcare, where sensitivity and precision are paramount.
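To make that division of labor concrete, here is a minimal, hypothetical Python sketch of how such a tactile stack could fit together. The class names (Digit360Sensor, SparshEncoder, PlexusBus) and their interfaces are invented for illustration, not Meta’s actual APIs; they simply mirror the roles described above: a fingertip sensor emitting raw touch signals, a general-purpose encoder turning them into features, and a standardized bridge routing tactile data to whatever component needs it.

```python
from dataclasses import dataclass
from typing import Callable, List

# All names below are hypothetical stand-ins for the roles Meta describes,
# not real Meta APIs.

@dataclass
class TouchFrame:
    """One raw reading from a fingertip sensor: per-taxel pressure values."""
    pressures: List[float]  # normalized to [0, 1]

class Digit360Sensor:
    """Illustrative fingertip sensor that emits raw touch frames."""
    def read(self) -> TouchFrame:
        # A real sensor would stream high-resolution tactile data here;
        # we fabricate a sharp, localized poke for demonstration.
        return TouchFrame(pressures=[0.02, 0.03, 0.91, 0.04])

class SparshEncoder:
    """Illustrative general-purpose touch encoder: raw frames -> features."""
    def encode(self, frame: TouchFrame) -> dict:
        peak = max(frame.pressures)
        return {"peak_pressure": peak, "contact": peak > 0.1}

class PlexusBus:
    """Illustrative standardized bridge routing tactile data to subscribers."""
    def __init__(self) -> None:
        self.subscribers: List[Callable[[dict], None]] = []

    def subscribe(self, callback: Callable[[dict], None]) -> None:
        self.subscribers.append(callback)

    def publish(self, features: dict) -> None:
        for callback in self.subscribers:
            callback(features)

def react(features: dict) -> None:
    # Downstream controller: ease grip force when sharp contact is detected.
    if features["contact"] and features["peak_pressure"] > 0.9:
        print("Sharp contact detected (needle-like poke): easing grip.")

if __name__ == "__main__":
    sensor, encoder, bus = Digit360Sensor(), SparshEncoder(), PlexusBus()
    bus.subscribe(react)
    bus.publish(encoder.encode(sensor.read()))  # sense -> encode -> route -> react
```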

Yet the introduction of sensory robots raises larger questions: could this technology unlock new levels of collaboration, or will it introduce complexities society may not be equipped to handle?

“As robots unlock new senses, and gain a high degree of intelligence and autonomy, we will need to start considering their role in society,” Ali Ahmed, co-founder and CEO of Robomart, told me. “Meta’s efforts are a major first step towards providing them with human-like senses. As humans become exceedingly intimate with robots, they will start treating them as life partners, companions, and even going so far as to build a life exclusively with them.”

A Framework for Human-Robot Harmony: The Future?

Alongside its advancements in tactile AI, Meta also unveiled the PARTNR benchmark, a standardized framework for evaluating human-robot collaboration at scale. Designed to test interactions that require planning, reasoning, and collaborative execution, PARTNR evaluates robots as they navigate both structured and unstructured environments alongside humans. By integrating large language models (LLMs) to guide these interactions, PARTNR can assess robots on critical elements like coordination and task tracking, shifting them from mere “agents” to genuine “partners” capable of working fluidly with human counterparts.
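To give a flavor of what “benchmarking collaboration” can mean in practice, the short Python sketch below scores a toy human-robot episode on two of the dimensions mentioned above, coordination and task tracking. The episode structure and scoring rules are invented for illustration and are not taken from the PARTNR paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Step:
    """One step in a collaborative episode (structure invented for illustration)."""
    assigned_to: str      # "human" or "robot" per the plan
    executed_by: str      # who actually performed the step
    subtask_done: bool    # did the step complete its subtask?

def score_episode(steps: List[Step]) -> dict:
    """Toy metrics in the spirit of coordination and task tracking."""
    coordination = sum(s.assigned_to == s.executed_by for s in steps) / len(steps)
    task_tracking = sum(s.subtask_done for s in steps) / len(steps)
    return {"coordination": round(coordination, 2),
            "task_tracking": round(task_tracking, 2)}

episode = [
    Step("robot", "robot", True),   # robot fetches the object as planned
    Step("human", "human", True),   # human holds the container
    Step("robot", "human", False),  # plan broke down: human had to intervene
]
print(score_episode(episode))  # {'coordination': 0.67, 'task_tracking': 0.67}
```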

“The current paper is very limited for benchmarking, and even in Natural Language Processing (NLP), it took a considerable amount of time for LLMs to be perfected for the real world. It will be a huge exercise to generalize for the 8.2 billion population with a limited lab environment,” Ram Palaniappan, CTO of TEKsystems, told me. “There will need to be a larger dedicated effort to boost this research paper to get to a workable pilot.”

To bring these tactile AI advancements to market, Meta has teamed up with GelSight Inc. and Wonik Robotics. GelSight will be responsible for producing the Digit 360 sensor, which is slated for release next year and will provide the research community access to advanced tactile capabilities. Wonik Robotics, meanwhile, will handle the production of the next-generation Allegro Hand, which integrates Digit Plexus to enable robots to carry out intricate, touch-sensitive tasks with a new level of precision. Yet, not everyone is convinced these advancements are a step in the right direction. 

“Although I still believe that adding sensing capabilities could be meaningful for robots to understand the environment, I believe that current use cases are more related to robots for mass consumers and improving on their interaction,” Agustin Huerta, SVP of Digital Innovation for North America at Globant, told me. “I don’t believe we are going to be close to giving them human-level sensations, nor that it’s actually needed. Rather, it will act more as an additional data point for a decision-making process.”

Meta’s tactile AI developments reflect a broader trend in Europe, where countries like Germany, France, and the UK are pushing boundaries in robotic sensing and awareness. For instance, the EU’s Horizon 2020 program supports a range of projects spanning tactile sensing, environmental awareness, and decision-making capabilities. Moreover, the Karlsruhe Institute of Technology in Germany recently introduced ARMAR-6, a humanoid robot designed for industrial environments. ARMAR-6 is equipped to use tools like drills and hammers and features AI capabilities that allow it to learn how to grasp objects and assist human co-workers.

But Dr. Peter Gorm Larsen, Vice-Head of Section at the Department of Electrical and Computer Engineering at Aarhus University in Denmark and coordinator of the EU-funded RoboSAPIENS project, cautions that Meta might be overlooking a key challenge: the gap between virtual perceptions and the physical reality in which autonomous robots operate, especially regarding environmental and human safety.

“Robots do NOT have intelligence in the same way that living creatures do,” he told me. “Tech companies have a moral obligation to ensure that their products respect ethical boundaries. Personally, I’m most concerned about the potential convergence of such advanced tactile feedback with 3D glasses as compact as regular eyewear.”

Are We Ready for Robots to “Feel”?

Dr. Larsen believes the real challenge isn’t the tactile AI sensors themselves, but rather how they’re deployed in autonomous settings. “In the EU, the Machinery Directive currently restricts the use of AI-driven controls in robots. But, in my view, that’s an overly stringent requirement, and we hope to be able to demonstrate that in the RoboSAPIENS project that I currently coordinate.” 

Of course, robots are already collaborating with humans in various industries across the world. For instance, Kiwibot has helped logistics companies deal with labor shortages in warehouses, and Swiss firm ANYbotics recently raised $60 million to bring more industrial robots to the US, according to TechCrunch. We should expect artificial intelligence to continue to permeate industries, as “AI accelerates productivity in repeatable tasks like code refactoring, addresses tech debt and testing, and transforms how global teams collaborate and innovate,” said Vikas Basra, Global Head, Intelligent Engineering Practice, Ness Digital Engineering.

At the same time, the safety of these robots, now as well as in their potentially “sentient” future, remains the main concern if the industry is to progress.

Matan Libis, VP of product at SQream, an advanced data processing company, told The Observer: “The next major mission for companies will be to establish AI’s place in society—its roles and responsibilities … We need to be clear about its boundaries and where it truly helps. Unless we identify AI’s limits, we’re going to face growing concerns about its integration into everyday life.”

As AI evolves to include tactile sensing, it raises the question of whether society is ready for robots that “feel.” Experts argue that pure software-based superintelligence may hit a ceiling; for AI to reach a true, advanced understanding, it must sense, perceive, and act within our physical environments, merging modalities for a more profound grasp of the world—something robots are uniquely suited to achieve. Yet, superintelligence alone doesn’t equate to sentience. “We must not anthropomorphize a tool to the point of associating it as a sentient creature if it has not proven that it is capable of being sentient,” explained Ahmed. “However if a robot does pass the test for sentience then they should be recognized as a living sentient being and then we shall have the moral, and fundamental responsibility to grant them certain freedoms and rights as a sentient being.”

The implications of Meta’s tactile AI are significant, but whether these technologies will lead to revolutionary change or cross ethical lines remains uncertain. For now, society is left to ponder a future where AI not only sees and hears but also touches—potentially reshaping our relationship with machines in ways we’re only beginning to imagine.

“I don’t think that increasing AI’s sensing capabilities crosses the line on ethics. It’s more related to how that sensing is later used to make decisions or drive others’ decisions,” said Huerta. “The robot revolution is not going to be different from the industrial revolution. It will affect our lives and leave us in a state that I think can make humanity thrive. In order for that to happen, we need to start educating ourselves and the upcoming generations on how to foster a healthy relationship between humans and robots.”
