{"id":3898,"date":"2015-04-09T05:10:20","date_gmt":"2015-04-09T05:10:20","guid":{"rendered":"http:\/\/revoscience.com\/en\/?p=3898"},"modified":"2015-04-09T05:10:20","modified_gmt":"2015-04-09T05:10:20","slug":"vest-helps-deaf-feel-understand-speech","status":"publish","type":"post","link":"https:\/\/www.revoscience.com\/en\/vest-helps-deaf-feel-understand-speech\/","title":{"rendered":"VEST helps deaf feel, understand speech"},"content":{"rendered":"<figure id=\"attachment_3899\" aria-describedby=\"caption-attachment-3899\" style=\"width: 600px\" class=\"wp-caption alignnone\"><a href=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg\" target=\"_blank\" rel=\"noopener\"><img loading=\"lazy\" decoding=\"async\" class=\"wp-image-3899\" src=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-1024x681.jpg\" alt=\"Showing prototypes at neuroscientist David Eagleman&#039;s Baylor office are, from left, students John Yan, Eric Kang, Abhipray Sahoo and Edward Luckett, graduate student Scott Novich, student Evan Dougal, advisers Eagleman and Gary Woods, and student Zihe Huang. (Credit: Jeff Fitlow\/Rice University)\" width=\"600\" height=\"399\" title=\"\" srcset=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-1024x681.jpg 1024w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-300x199.jpg 300w\" sizes=\"auto, (max-width: 600px) 100vw, 600px\" \/><\/a><figcaption id=\"caption-attachment-3899\" class=\"wp-caption-text\">Showing prototypes at neuroscientist David Eagleman&#8217;s Baylor office are, from left, students John Yan, Eric Kang, Abhipray Sahoo and Edward Luckett, graduate student Scott Novich, student Evan Dougal, advisers Eagleman and Gary Woods, and student Zihe Huang. 
(Credit: Jeff Fitlow\/Rice University)<\/figcaption><\/figure>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">HOUSTON \u2013 A vest that allows the profoundly deaf to &#8220;feel&#8221; and understand speech is under development by engineering students and their mentors at Rice University and Baylor College of Medicine.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">Under the direction of neuroscientist and best-selling author David Eagleman, Rice students are refining a vest with dozens of embedded sensors that vibrate in specific patterns to represent words. The vest responds to input from a phone or tablet app that isolates speech from ambient sound.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">Eagleman introduced VEST \u2013 Versatile Extra-Sensory Transducer \u2013 to the world at a <a style=\"color: #1155cc;\" href=\"http:\/\/rice.pr-optout.com\/Tracking.aspx?Data=HHL%3d8%2c%3b3%3b4-%3eLCE59.%3a0%40%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4344083&amp;Preview=False&amp;DistributionActionID=72260&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">TED Conference talk<\/span><\/a>\u00a0in March. He is director of the Laboratory for Perception and Action at Baylor College of Medicine and an adjunct assistant professor of electrical and computer engineering at Rice, of which he is also an alumnus. His lab studies the complex mechanisms of perception through psychophysical, behavioral and computational approaches as well as neuroscience and the law.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">The Rice students working on VEST, all electrical and computer engineering majors, call themselves the Eagleman Substitution Project (ESP) team. 
They include seniors Zihe Huang, Evan Dougal, Eric Kang and Edward Luckett and juniors Abhipray Sahoo and John Yan.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">They are aiding Scott Novich, a doctoral student in electrical and computer engineering at Rice who works in Eagleman&#8217;s lab. Novich devised the algorithm that enables the VEST to &#8220;hear&#8221; only the human voice and screen out distracting sounds.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">The low-cost, noninvasive vest collects sounds from a mobile app and converts them into tactile vibration patterns on the user&#8217;s torso. Haptic feedback supplants auditory input.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">The first VEST prototype put together by the team has 24 actuators sewn into the back. A second version, already in production, will include 40 of the actuators Eagleman calls &#8220;vibratory motors.&#8221; He described the experience, at least for a hearing person, as &#8220;feeling the sonic world around me.&#8221;<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">&#8220;Along with all the actuators, the system includes a controller board and two batteries,&#8221; said Gary Woods, the team&#8217;s adviser and a Rice professor in the practice of computer technology. &#8220;The actuators vibrate in a very complicated pattern based on audio fed through a smartphone. The patterns are too complicated to translate consciously.&#8221;<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">With training, the brains of deaf people adapt to the &#8220;translation&#8221; process, Eagleman said. Test subjects, some of them deaf from birth, &#8220;listened&#8221; to spoken words and wrote them on a white board. 
&#8220;They can start understanding the &#8216;language&#8217; of the vest,&#8221; he said.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">&#8220;We&#8217;ve already run some simple experiments with both hearing and deaf people,&#8221; Novich said. &#8220;As they use the vest more, they get feedback and know whether they are right or wrong and start to memorize patterns. People are able to identify words they have never encountered before.&#8221;<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">The project has also prompted students to learn skills they wouldn&#8217;t necessarily acquire in engineering classrooms. Huang became the team&#8217;s tailor when he learned to sew via YouTube. &#8220;I&#8217;m an electrical engineer,&#8221; he said. &#8220;I didn&#8217;t know anything about sewing.&#8221; But the teammates&#8217; quick-study abilities have paid dividends already.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">Last November, ESP placed second in the sixth annual\u00a0<a style=\"color: #1155cc;\" href=\"http:\/\/rice.pr-optout.com\/Tracking.aspx?Data=HHL%3d8%2c%3b3%3b4-%3eLCE59.%3a0%40%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4344083&amp;Preview=False&amp;DistributionActionID=72259&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Undergraduate Elevator Pitch Competition<\/span><\/a>\u00a0sponsored by the\u00a0<a style=\"color: #1155cc;\" href=\"http:\/\/rice.pr-optout.com\/Tracking.aspx?Data=HHL%3d8%2c%3b3%3b4-%3eLCE59.%3a0%40%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4344083&amp;Preview=False&amp;DistributionActionID=72258&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Oshman Engineering Design Kitchen<\/span><\/a>\u00a0at Rice. 
In February, the team placed third in the third annual\u00a0<a style=\"color: #1155cc;\" href=\"http:\/\/rice.pr-optout.com\/Tracking.aspx?Data=HHL%3d8%2c%3b3%3b4-%3eLCE59.%3a0%40%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4344083&amp;Preview=False&amp;DistributionActionID=72257&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Owl Open<\/span><\/a>, the Rice student startup competition sponsored by the\u00a0<a style=\"color: #1155cc;\" href=\"http:\/\/rice.pr-optout.com\/Tracking.aspx?Data=HHL%3d8%2c%3b3%3b4-%3eLCE59.%3a0%40%26SDG%3c90%3a.&amp;RE=MC&amp;RI=4344083&amp;Preview=False&amp;DistributionActionID=72256&amp;Action=Follow+Link\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Rice Alliance for Technology and Entrepreneurship<\/span><\/a>. The team will also present its work this month at the annual Design of Medical Devices conference in Minnesota.<\/span><\/p>\n<p style=\"text-align: justify;\"><span style=\"color: #000000;\">&#8220;We see other applications for what we&#8217;re calling tactile sensory substitution,&#8221; Sahoo said. &#8220;Information can be sent through the human body. It&#8217;s not just an augmentative device for the deaf. The VEST could be a general neural input device. You could receive any form of information.&#8221;<\/span><\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/JTSGX6RYuiM\" width=\"617\" height=\"358\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n","protected":false},"excerpt":{"rendered":"<p>HOUSTON \u2013 A vest that allows the profoundly deaf to &#8220;feel&#8221; and understand speech is under development by engineering students and their mentors at Rice University and Baylor College of Medicine. 
Under the direction of neuroscientist and best-selling author David Eagleman, Rice students are refining a vest with dozens of embedded sensors that vibrate in specific [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":3899,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[14,17],"tags":[],"class_list":["post-3898","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-innovation","category-research"],"featured_image_urls":{"full":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",2880,1917,false],"thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-150x150.jpg",150,150,true],"medium":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-300x199.jpg",300,199,true],"medium_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",750,499,false],"large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2-1024x681.jpg",750,499,true],"1536x1536":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",1536,1022,false],"2048x2048":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",2048,1363,false],"ultp_layout_landscape_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",1200,800,false],"ultp_layout_landscape":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",856,570,false],"ultp_layout_portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",600,399,false],"ultp_layout_square":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",600,399,false],"newspaper-x-single-post":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",736,490,false],"newspaper-x-recent-post-big":[
"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",541,360,false],"newspaper-x-recent-post-list-image":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",95,63,false],"web-stories-poster-portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",640,426,false],"web-stories-publisher-logo":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",96,64,false],"web-stories-thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/04\/unnamed-2.jpg",150,100,false]},"author_info":{"info":["Amrita Tuladhar"]},"category_info":"<a href=\"https:\/\/www.revoscience.com\/en\/category\/innovation\/\" rel=\"category tag\">Innovation<\/a> <a href=\"https:\/\/www.revoscience.com\/en\/category\/news\/research\/\" rel=\"category tag\">Research<\/a>","tag_info":"Research","comment_count":"0","_links":{"self":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/3898","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/comments?post=3898"}],"version-history":[{"count":0,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/3898\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media\/3899"}],"wp:attachment":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media?parent=3898"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/categories?post=3898"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/ta
gs?post=3898"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}