MIT’s new robot can identify things by sight and by touch
Jarvis
- June 17, 2019
The team took a KUKA robot arm and added a tactile sensor called GelSight, which was created by Ted Adelson’s group at CSAIL. The information collected by GelSight was then fed to an AI so it could learn the relationship between visual and tactile information.
To teach the AI how to identify objects by touch, the team recorded 12,000 videos of 200 objects, including fabrics, tools and household items, being touched. The videos were broken down into still images, and the AI used this dataset to connect tactile and visual data.
“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge,” says Yunzhu Li, CSAIL PhD student and lead author on a new paper about the system. “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating…
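The core idea, learning to predict one sense from paired recordings of the other, can be illustrated with a toy example. The sketch below is a heavily simplified stand-in, not MIT's actual system: the "visual" and "tactile" feature vectors are made up, and a plain linear model replaces the deep networks the researchers trained on their video dataset. It only shows how paired visual/tactile samples let a model learn the mapping between the two.

```python
# Toy sketch of cross-modal learning: NOT MIT's model. The 4-d "visual"
# and 2-d "tactile" vectors are hypothetical stand-ins for real frames
# and GelSight readings; a linear map replaces their deep networks.
import random

random.seed(0)

def make_pair():
    """Return a fake paired sample (visual features, tactile reading)."""
    v = [random.random() for _ in range(4)]
    # The tactile reading is a fixed function of the visual features,
    # mimicking the correlation the paired videos capture.
    t = [v[0] + 0.5 * v[1], v[2] - 0.3 * v[3]]
    return v, t

pairs = [make_pair() for _ in range(500)]

# Learn a 2x4 linear map W from visual -> tactile by stochastic
# gradient descent on squared error over the paired dataset.
W = [[0.0] * 4 for _ in range(2)]
lr = 0.1
for _ in range(5000):
    v, t = random.choice(pairs)
    pred = [sum(W[i][j] * v[j] for j in range(4)) for i in range(2)]
    for i in range(2):
        err = pred[i] - t[i]
        for j in range(4):
            W[i][j] -= lr * err * v[j]

# Given a new "view", the trained map predicts the expected "touch".
v, t = make_pair()
pred = [sum(W[i][j] * v[j] for j in range(4)) for i in range(2)]
worst = max(abs(p - x) for p, x in zip(pred, t))
print(worst)  # small prediction error after training
```

The same paired data could be used in the reverse direction (tactile to visual), which is what lets the robot "imagine" a touch from sight or infer its surroundings from touch alone.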
https://www.engadget.com/2019/06/17/robot-identify-sight-touch/