Researchers from the GRAIL lab at NYU have developed AnySense, an app for gathering visual training data for robotics models.
The post AI researchers launch AnySense app for gathering visual training data appeared first on The Robot Report.
A group of researchers at New York University has developed and launched AnySense, an iOS app optimized for gathering visual training data for generalizable robotics models. Raunaq Bhirangi, Zeyu Bian, Venkatesh Pattabiraman, Haritheja Etukuru, Mehmet Enes Erciyes, Nur Muhammad Mahi Shafiullah, and Prof. Lerrel Pinto developed the AnySense app.
Late last year, Prof. Pinto and his team launched an open-source research project called Robot Utility Models (RUM) that aims to generalize robot training so that developers don’t have to collect thousands of demonstrations of a task. Instead, a trained policy can succeed zero-shot in previously unseen environments.
To support the capture of RUM training data, the Generalizable Robotics and AI Lab (GRAIL lab) at NYU created “the Stick.” It is an open-source, 3D-printed gripper device that employs an iPhone for visual feedback.

“The Stick” is an inexpensive gripper used for training RUM data. It uses an iPhone 12 Pro and a standard off-the-shelf “reacher.” | Credit: Robot Utility Models team
AnySense app released on Apple App Store
The AnySense app was designed for the robotics community for multi-sensory data collection and learning, and it is a direct result of the RUM project. Training such generalizable models for robotics is bottlenecked by the inability to collect diverse, high-quality data in the real world, the NYU researchers asserted.
One way to address this bottleneck is to create tools combining scalable, intuitive data-collection interfaces with cheap, accessible sensors, they said.
Here are three noteworthy features of the AnySense iPhone application:
- It integrates the iPhone’s sensors with external multisensory inputs via Bluetooth and wired interfaces.
- It can interface with AnySkin, a versatile tactile sensor capable of multi-axis contact force measurement.
- It is fully open-sourced and available to the robotics community for multi-sensory data collection and learning.
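Combining the iPhone’s camera with external sensors streaming over Bluetooth means recordings arrive at different rates and must be paired up by timestamp before training. The snippet below is a minimal illustrative sketch of that idea, not AnySense’s actual pipeline: the `Frame` and `TactileSample` types and the nearest-neighbor pairing are assumptions for demonstration.

```python
from dataclasses import dataclass

@dataclass
class TactileSample:
    t: float              # timestamp in seconds
    bxyz: tuple           # raw 3-axis magnetometer reading (hypothetical format)

@dataclass
class Frame:
    t: float              # timestamp in seconds
    path: str             # where the RGB image was saved

def align(frames, tactile):
    """Pair each camera frame with the tactile sample closest in time.

    A simple nearest-neighbor match; a real pipeline would also handle
    clock offsets and dropped samples.
    """
    paired = []
    for f in frames:
        nearest = min(tactile, key=lambda s: abs(s.t - f.t))
        paired.append((f, nearest))
    return paired
```

For example, a 30 Hz camera stream paired with a faster tactile stream simply keeps, for each frame, whichever tactile reading landed nearest to the frame’s capture time.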
The AnySense app is available for download in the Apple App Store.
AnySkin provides tactile feedback
The research team developed a touch-sensitive pad called AnySkin, which can be used with the AnySense app to provide feedback for gripping. AnySkin is designed to be easy to assemble, compatible with different robotic end effectors, and generalizable to new skin instances.
AnySkin senses contact through distortions in the magnetic field generated by magnetized iron particles in the sensing surface. The flexible surface is physically separated from its electronics, which allows for easy replaceability when damaged, explained the NYU researchers.
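The principle described above, pressing on the magnetized surface distorts the field seen by the electronics underneath, can be illustrated with a deliberately simplified model. The sketch below is a hypothetical 1-D version (scalar field-magnitude change mapped to normal force via a least-squares gain); the real sensor measures multi-axis forces with a richer calibration.

```python
def fit_gain(deltas, forces):
    """Fit a scalar gain mapping field-magnitude change to normal force.

    deltas: baseline-subtracted field magnitudes from calibration presses
    forces: the known loads (in newtons) applied during those presses
    Ordinary least squares for a one-parameter linear model.
    """
    num = sum(d * f for d, f in zip(deltas, forces))
    den = sum(d * d for d in deltas)
    return num / den

def estimate_force(gain, magnitude, baseline):
    """Estimate contact force from a new reading, given the fitted gain."""
    return gain * (magnitude - baseline)
```

Separating the magnetized skin from the electronics, as the researchers note, means a damaged surface can be swapped out; in a model like this, that swap would only require re-measuring the baseline and, ideally, refitting the gain.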