Meta Is Creating a Robot Hand That Can Feel Touch at Human-Level Precision in Partnership With GelSight and Wonik Robotics

Meta has announced a partnership with GelSight, a leading sensor technology firm, and South Korea’s Wonik Robotics to develop a robot hand that uses tactile sensors to digitize touch with human-level precision for advanced AI applications. The collaboration aims to create the Digit 360, a state-of-the-art robotic fingertip equipped with human-like multimodal sensing capabilities, and the next-generation Allegro Hand, a robotic hand that integrates these advanced sensors.

Meta’s strategic decision to focus on tactile sensing technology reflects a growing recognition of the importance of touch perception in enhancing AI’s understanding of its environment. The Digit 360 is a significant upgrade from Meta’s previous Digit sensor, featuring an on-device AI chip and approximately 18 distinct sensing features that allow it to detect subtle changes in its surroundings. This technology is expected to support scientists and researchers in their efforts to develop AI that can interpret the physical world with unprecedented depth and accuracy.


In a recent blog post, Meta explained, “We developed a touch-perception-specific optical system with a wide field of view for capturing omnidirectional deformations on the fingertip surface.” The sensor not only measures pressure but can also perceive vibrations, sense heat, and even detect odors, providing a comprehensive sensory profile for each interaction.
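To give a concrete sense of what such a multimodal fingertip reading might look like on the receiving end, here is a minimal sketch of one possible data structure. The field names, shapes, and units are assumptions made for illustration; Meta has not published this data layout in the announcement.

```python
# Hypothetical sketch of a single multimodal tactile frame on the host side.
# All field names and shapes are illustrative assumptions, not Meta's format.
from dataclasses import dataclass, field
import numpy as np


@dataclass
class TactileFrame:
    timestamp_s: float                  # host-side capture time
    pressure_map: np.ndarray            # per-pixel contact/pressure estimate (H x W)
    vibration: np.ndarray               # short audio-rate window from a contact microphone
    temperature_c: float                # bulk fingertip temperature reading
    odor_channels: np.ndarray = field(  # hypothetical gas-sensor channel readings
        default_factory=lambda: np.zeros(4)
    )

    def peak_pressure(self) -> float:
        """Return the maximum pressure value in this frame."""
        return float(self.pressure_map.max())


# Example: build a dummy frame and query it.
frame = TactileFrame(
    timestamp_s=0.0,
    pressure_map=np.random.rand(64, 64),
    vibration=np.random.randn(2048),
    temperature_c=24.5,
)
print(f"peak pressure: {frame.peak_pressure():.3f}")
```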

Image Credit: Meta

While these devices are not intended for consumer use, Meta envisions them playing a crucial role in scientific research, allowing for deeper insights into robotics, AI, and their interactions with the physical realm. Digit 360 is set to be available for purchase next year, with Meta inviting researchers to propose projects for early access.

The Allegro Hand: A New Standard in Robotic Dexterity

In conjunction with the development of the Digit 360, Meta and Wonik Robotics are advancing the Allegro Hand, a robot hand designed to incorporate the tactile capabilities of the new sensor. This new iteration will utilize control boards that transfer tactile data from the Digit 360 directly to a host computer, allowing for more nuanced control and interaction.
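As a rough sketch of that data path, the example below shows one way a host computer could read a stream of tactile packets forwarded by a control board. The transport, address, and length-prefixed framing are assumptions invented for this illustration, not the documented Digit 360 or Allegro Hand protocol.

```python
# Minimal sketch of a host reading tactile packets from a control board.
# TCP transport, the address, and the 4-byte length prefix are assumptions.
import socket
import struct

HOST, PORT = "192.168.0.10", 9000   # hypothetical control-board address


def read_exactly(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket or raise on disconnect."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("control board closed the connection")
        buf += chunk
    return buf


def stream_tactile_packets():
    """Yield raw tactile payloads, assuming a big-endian length prefix per packet."""
    with socket.create_connection((HOST, PORT)) as sock:
        while True:
            (length,) = struct.unpack(">I", read_exactly(sock, 4))
            yield read_exactly(sock, length)


# Example usage (requires real hardware or a test server on HOST:PORT):
# for i, payload in enumerate(stream_tactile_packets()):
#     print(f"packet {i}: {len(payload)} bytes")
#     if i == 4:
#         break
```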

Image Credit: Meta

Meta’s collaboration with Wonik builds on the companies’ existing platform, which integrates various tactile sensors onto a single robot hand. The upcoming Allegro Hand is expected to set a new standard in robotic dexterity and could pave the way for applications in fields ranging from medicine to virtual reality.

Meta’s recent announcements also include the launch of Sparsh, a suite of encoder models for vision-based tactile sensing developed in collaboration with the University of Washington and Carnegie Mellon University. Sparsh aims to give robots the ability to perceive touch, an essential capability for delicate tasks such as handling fragile objects.

The researchers trained these models with a self-supervised learning (SSL) approach, eliminating the need for labeled datasets. The method has proven effective, achieving an average improvement of 95.1% over previous task- and sensor-specific models under limited labeled-data conditions. With more than 460,000 tactile images used for training, Sparsh represents a significant step forward in tactile perception technology.
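As a rough illustration of what label-free pretraining on tactile images can look like, here is a minimal sketch of a masked-reconstruction objective in PyTorch. It is not the Sparsh code; the model size, patch size, masking ratio, and training step are assumptions made for this example.

```python
# Simplified, hypothetical masked-reconstruction pretraining on unlabeled
# tactile images. Not the Sparsh implementation; sizes are illustrative.
import torch
import torch.nn as nn

PATCH, MASK_RATIO = 16, 0.75


class TinyTactileMAE(nn.Module):
    """Encode a random subset of image patches, then reconstruct all patches."""

    def __init__(self, img_size=64, dim=128):
        super().__init__()
        self.num_patches = (img_size // PATCH) ** 2
        patch_pixels = 3 * PATCH * PATCH
        self.embed = nn.Linear(patch_pixels, dim)                 # patch -> token
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.decoder = nn.Linear(dim, self.num_patches * patch_pixels)  # code -> all patches

    def patchify(self, imgs):
        b, c, h, w = imgs.shape
        p = PATCH
        x = imgs.reshape(b, c, h // p, p, w // p, p)
        return x.permute(0, 2, 4, 1, 3, 5).reshape(b, -1, c * p * p)  # (B, N, patch_pixels)

    def forward(self, imgs):
        patches = self.patchify(imgs)
        b, n, d = patches.shape
        keep = int(n * (1 - MASK_RATIO))                          # encoder sees only a few patches
        idx = torch.rand(b, n, device=imgs.device).argsort(dim=1)[:, :keep]
        visible = torch.gather(patches, 1, idx.unsqueeze(-1).expand(-1, -1, d))
        tokens = self.encoder(self.embed(visible))                # encode visible patches only
        pooled = tokens.mean(dim=1)                               # global code from visible patches
        recon = self.decoder(pooled).reshape(b, n, d)             # predict every patch, masked ones included
        return nn.functional.mse_loss(recon, patches)             # reconstruction loss, no labels needed


# Toy training step on a random batch standing in for unlabeled tactile images.
model = TinyTactileMAE()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
opt.zero_grad()
loss = model(torch.rand(8, 3, 64, 64))
loss.backward()
opt.step()
print(f"reconstruction loss: {loss.item():.4f}")
```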

In addition to Sparsh, Meta is introducing Digit Plexus, a hardware-software platform designed to streamline the integration of various tactile sensors into robotic applications. Digit Plexus encodes and transmits tactile data from multiple sensors to a single host computer, facilitating advanced research in robotic dexterity.

Alongside these innovations, Meta is releasing PARTNR (Planning And Reasoning Tasks in Human-Robot collaboration), a benchmark for evaluating how effectively AI models collaborate with humans on household tasks. The benchmark includes 100,000 natural language tasks set in simulated environments, providing a robust framework for assessing AI’s ability to follow human instructions in dynamic settings (a simplified sketch of such an evaluation loop appears below).

The release of these tools comes at a time when foundational models, including large language models (LLMs) and vision-language models (VLMs), are attracting renewed interest for their potential to enable complex human-robot interactions. Meta’s recent advancements signal a broader industry trend: AI is increasingly being designed to operate in the physical world, enabling robots to execute tasks that require advanced reasoning and planning.
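To make the idea of such a benchmark concrete, here is a minimal, hypothetical sketch of an evaluation loop for natural-language tasks in a simulated environment. The class and function names are placeholders invented for this example and are not taken from the PARTNR release.

```python
# Hypothetical evaluation loop: natural-language household tasks, a simulated
# environment, and a per-episode success check. Names are placeholders.
from dataclasses import dataclass
from typing import Callable, Iterable, Tuple


@dataclass
class Task:
    instruction: str          # e.g. "Put the mug in the dishwasher"
    max_steps: int = 50       # step budget before the episode counts as a failure


def evaluate(tasks: Iterable[Task],
             policy: Callable[[str, dict], str],
             simulator_step: Callable[[dict, str], Tuple[dict, bool]]) -> float:
    """Run every task once and return the fraction completed successfully.

    policy(instruction, observation) -> action name
    simulator_step(observation, action) -> (next observation, task_done flag)
    """
    total = successes = 0
    for task in tasks:
        total += 1
        obs, done = {"state": "initial"}, False
        for _ in range(task.max_steps):
            action = policy(task.instruction, obs)
            obs, done = simulator_step(obs, action)
            if done:
                successes += 1
                break
    return successes / max(total, 1)


# Toy usage with a dummy policy and a simulator that "finishes" every third step.
steps = {"n": 0}
dummy_tasks = [Task("Put the mug in the dishwasher"), Task("Fold the towel")]

def dummy_policy(instruction, obs):
    return "noop"

def dummy_sim(obs, action):
    steps["n"] += 1
    return obs, steps["n"] % 3 == 0

print(f"success rate: {evaluate(dummy_tasks, dummy_policy, dummy_sim):.2f}")
```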

The implications of Meta’s advancements in tactile sensing and robotics are vast. The Digit 360 and Allegro Hand could find applications in a wide range of sectors, including healthcare, where robotic assistance can improve patient outcomes, and in industries like telepresence and virtual reality, where the need for realistic interaction is paramount. By openly releasing the code and designs for the Digit 360 and Digit Plexus, Meta aims to foster community-driven research and innovation, inviting researchers worldwide to leverage this technology for further advancements in touch perception.

As Meta continues to push the boundaries of AI and robotics, the collaboration with GelSight and Wonik Robotics represents a pivotal step toward machines that can not only see and reason but also feel, bringing us closer to a future where robots integrate into daily life and work alongside us in increasingly complex ways. With the commercial launch of Digit 360 and the Allegro Hand on the horizon, researchers and industries alike are poised to explore how tactile sensing could reshape the landscape of robotics and artificial intelligence.
