{"id":20650,"date":"2021-05-26T19:54:38","date_gmt":"2021-05-26T14:09:38","guid":{"rendered":"https:\/\/www.revoscience.com\/en\/?p=20650"},"modified":"2021-05-26T21:30:41","modified_gmt":"2021-05-26T15:45:41","slug":"slender-robotic-finger-senses-buried-items","status":"publish","type":"post","link":"https:\/\/www.revoscience.com\/en\/slender-robotic-finger-senses-buried-items\/","title":{"rendered":"Slender robotic finger senses buried items"},"content":{"rendered":"\n<p><strong><em>The technology uses tactile sensing to identify objects underground, and might one day help disarm land mines or inspect cables.<\/em><\/strong><\/p>\n\n\n\n<p><strong>By Daniel Ackerman<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img loading=\"lazy\" decoding=\"async\" sizes=\"auto, (max-width: 675px) 100vw, 675px\" src=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-675x450.jpg\" alt=\"\" class=\"wp-image-20651\" width=\"847\" height=\"565\" title=\"\" srcset=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-675x450.jpg 675w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-600x400.jpg 600w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-768x512.jpg 768w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-174x116.jpg 174w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg 900w\" \/><figcaption><em>MIT researchers developed a \u201cDigger Finger\u201d robot that digs through granular material, like sand and gravel, and senses the shapes of buried objects.<\/em> <\/figcaption><\/figure>\n\n\n\n<p>CAMBRIDGE, Mass( <em>MIT News Office<\/em>)&#8211;&nbsp;Over the years, robots have gotten&nbsp;<a 
href=\"http:\/\/mit.pr-optout.com\/Tracking.aspx?Data=HHL%3d8446A8-%3eLCE9%3b4%3b8%3f%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4334046&amp;Preview=False&amp;DistributionActionID=99690&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noreferrer noopener\">quite good<\/a>&nbsp;at identifying objects \u2014 as long as they\u2019re out in the open.<\/p>\n\n\n\n<p>Discerning buried items in granular material like sand is a taller order. To do that, a robot would need fingers that were slender enough to penetrate the sand, mobile enough to wriggle free when sand grains jam, and sensitive enough to feel the detailed shape of the buried object.<\/p>\n\n\n\n<p>MIT researchers have now designed a sharp-tipped robot finger equipped with tactile sensing to meet the challenge of identifying buried objects. In experiments, the aptly named&nbsp;<a href=\"http:\/\/mit.pr-optout.com\/Tracking.aspx?Data=HHL%3d8446A8-%3eLCE9%3b4%3b8%3f%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4334046&amp;Preview=False&amp;DistributionActionID=99689&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noreferrer noopener\">Digger Finger<\/a>&nbsp;was able to dig through granular media such as sand and rice, and it correctly sensed the shapes of submerged items it encountered. The researchers say the robot might one day perform various subterranean duties, such as finding buried cables or disarming buried bombs.<\/p>\n\n\n\n<p>The research will be presented at the next International Symposium on Experimental Robotics. The study\u2019s lead author is Radhen Patel, a postdoc in MIT\u2019s Computer Science and Artificial Intelligence Laboratory (CSAIL). 
Co-authors include CSAIL PhD student Branden Romero, Harvard University PhD student Nancy Ouyang, and Edward Adelson, the John and Dorothy Wilson Professor of Vision Science in CSAIL and the Department of Brain and Cognitive Sciences.<\/p>\n\n\n\n<p>Seeking to identify objects buried in granular material \u2014 sand, gravel, and other types of loosely packed particles \u2014 isn\u2019t a brand new quest. Previously, researchers have used technologies that sense the subterranean from above, such as Ground Penetrating Radar or ultrasonic vibrations. But these techniques provide only a hazy view of submerged objects. They might struggle to differentiate rock from bone, for example.<\/p>\n\n\n\n<p>\u201cSo, the idea is to make a finger that has a good sense of touch and can distinguish between the various things it\u2019s feeling,\u201d says Adelson. \u201cThat would be helpful if you\u2019re trying to find and disable buried bombs, for example.\u201d Making that idea a reality meant clearing a number of hurdles.<\/p>\n\n\n\n<p>The team\u2019s first challenge was a matter of form: The robotic finger had to be slender and sharp-tipped.<\/p>\n\n\n\n<p>In prior work, the researchers had used a tactile sensor called&nbsp;<a href=\"http:\/\/mit.pr-optout.com\/Tracking.aspx?Data=HHL%3d8446A8-%3eLCE9%3b4%3b8%3f%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4334046&amp;Preview=False&amp;DistributionActionID=99688&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noreferrer noopener\">GelSight<\/a>. The sensor consisted of a clear gel covered with a reflective membrane that deformed when objects pressed against it. Behind the membrane were three colors of LED lights and a camera. The lights shone through the gel and onto the membrane, while the camera collected the membrane\u2019s pattern of reflection. Computer vision algorithms then extracted the 3D shape of the contact area where the soft finger touched the object. 
The contraption provided an excellent sense of artificial touch, but it was inconveniently bulky.<\/p>\n\n\n\n<p>For the Digger Finger, the researchers slimmed down their GelSight sensor in two main ways. First, they changed the shape to be a slender cylinder with a beveled tip. Next, they ditched two-thirds of the LED lights, using a combination of blue LEDs and colored fluorescent paint. \u201cThat saved a lot of complexity and space,\u201d says Ouyang. \u201cThat\u2019s how we were able to get it into such a compact form.\u201d The final design featured a device whose tactile sensing membrane was about 2 square centimeters, similar to the tip of a finger.<\/p>\n\n\n\n<p>With size sorted out, the researchers turned their attention to motion, mounting the finger on a robot arm and digging through fine-grained sand and coarse-grained rice. Granular media have a tendency to jam when numerous particles become locked in place, making the material difficult to penetrate. So, the team added vibration to the Digger Finger\u2019s capabilities and put it through a battery of tests.<\/p>\n\n\n\n<p>\u201cWe wanted to see how mechanical vibrations aid in digging deeper and getting through jams,\u201d says Patel. \u201cWe ran the vibrating motor at different operating voltages, which changes the amplitude and frequency of the vibrations.\u201d They found that rapid vibrations helped \u201cfluidize\u201d the media, clearing jams and allowing for deeper burrowing \u2014 though this fluidizing effect was harder to achieve in sand than in rice.<\/p>\n\n\n\n<p>They also tested various twisting motions in both the rice and sand. Sometimes, grains of each type of media would get stuck between the Digger Finger\u2019s tactile membrane and the buried object it was trying to sense. When this happened with rice, the trapped grains were large enough to completely obscure the shape of the object, though the occlusion could usually be cleared with a little robotic wiggling. 
Trapped sand was harder to clear, though the grains\u2019 small size meant the Digger Finger could still sense the general contours of the target object.<\/p>\n\n\n\n<p>Patel says that operators will have to adjust the Digger Finger\u2019s motion pattern for different settings \u201cdepending on the type of media and on the size and shape of the grains.\u201d The team plans to keep exploring new motions to optimize the Digger Finger\u2019s ability to navigate various media.<\/p>\n\n\n\n<p>Adelson says the Digger Finger is part of a program extending the domains in which robotic touch can be used. Humans use their fingers amidst complex environments, whether fishing for a key in a pants pocket or feeling for a tumor during surgery. \u201cAs we get better at artificial touch, we want to be able to use it in situations when you\u2019re surrounded by all kinds of distracting information,\u201d says Adelson. \u201cWe want to be able to distinguish between the stuff that\u2019s important and the stuff that\u2019s not.\u201d<\/p>\n\n\n\n<p>Funding for this research was provided, in part, by the Toyota Research Institute through the Toyota-CSAIL Joint Research Center; the Office of Naval Research; and the Norwegian Research Council.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Over the years, robots have gotten quite good at identifying objects \u2014 as long as they\u2019re out in the 
open.<\/p>\n","protected":false},"author":2,"featured_media":20651,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14,17],"tags":[],"class_list":["post-20650","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-innovation","category-research"],"featured_image_urls":{"full":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",900,600,false],"thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-200x200.jpg",200,200,true],"medium":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-600x400.jpg",600,400,true],"medium_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-768x512.jpg",750,500,true],"large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-675x450.jpg",675,450,true],"1536x1536":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",900,600,false],"2048x2048":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",900,600,false],"ultp_layout_landscape_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",900,600,false],"ultp_layout_landscape":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",855,570,false],"ultp_layout_portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",600,400,false],"ultp_layout_square":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",600,400,false],"newspaper-x-single-post":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-pres
s_0-760x490.jpg",760,490,true],"newspaper-x-recent-post-big":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-550x360.jpg",550,360,true],"newspaper-x-recent-post-list-image":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0-95x65.jpg",95,65,true],"web-stories-poster-portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",640,427,false],"web-stories-publisher-logo":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",96,64,false],"web-stories-thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2021\/05\/MIT-digger-finger-01-press_0.jpg",150,100,false]},"author_info":{"info":["Daniel Ackerman"]},"category_info":"<a href=\"https:\/\/www.revoscience.com\/en\/category\/innovation\/\" rel=\"category tag\">Innovation<\/a> <a href=\"https:\/\/www.revoscience.com\/en\/category\/news\/research\/\" rel=\"category 
tag\">Research<\/a>","tag_info":"Research","comment_count":"0","_links":{"self":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/20650","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/comments?post=20650"}],"version-history":[{"count":0,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/20650\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media\/20651"}],"wp:attachment":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media?parent=20650"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/categories?post=20650"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/tags?post=20650"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}