{"id":2726,"date":"2015-02-16T10:27:50","date_gmt":"2015-02-16T10:27:50","guid":{"rendered":"http:\/\/revoscience.com\/en\/?p=2726"},"modified":"2015-02-16T10:27:50","modified_gmt":"2015-02-16T10:27:50","slug":"bringing-texture-to-touchscreens-how-the-brain-makes-sense-of-data-from-fingers","status":"publish","type":"post","link":"https:\/\/www.revoscience.com\/en\/bringing-texture-to-touchscreens-how-the-brain-makes-sense-of-data-from-fingers\/","title":{"rendered":"Bringing Texture to Touchscreens: How the Brain Makes Sense of Data from Fingers"},"content":{"rendered":"<figure id=\"attachment_2727\" aria-describedby=\"caption-attachment-2727\" style=\"width: 300px\" class=\"wp-caption alignright\"><a href=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg\" target=\"_blank\" rel=\"noopener\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2727\" src=\"http:\/\/revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml-300x217.jpg\" alt=\"Researchers are reporting a fascinating discovery that provides insight into how the brain makes sense of data from fingers.Courtesy of Quinn Dombrowski\" width=\"300\" height=\"217\" title=\"\" srcset=\"https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml-300x217.jpg 300w, https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg 320w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><figcaption id=\"caption-attachment-2727\" class=\"wp-caption-text\">Researchers are reporting a fascinating discovery that provides insight into how the brain makes sense of data from fingers.Courtesy of Quinn Dombrowski<\/figcaption><\/figure>\n<p 
style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">What if the touchscreen of your smartphone or tablet could touch you back? What if touch was as integrated into our ubiquitous technology as sight and sound? Northwestern University and Carnegie Mellon University researchers now report a fascinating discovery that provides insight into how the brain makes sense of data from fingers.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">In a study of people drawing their fingers over a flat surface that has two \u201cvirtual bumps,\u201d the research team is the first to find that, under certain circumstances, the subjects feel only one bump when there really are two. Better yet, the researchers can explain why the brain comes to this conclusion.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">Their new mathematical model and experimental results on \u201chaptic illusions\u201d could one day lead to flat-screen displays featuring active touch-back technology, such as making your touchscreen\u2019s keyboard actually feel like a keyboard. Tactile information also could benefit the blind, users of dashboard technology in cars, players of video games and more.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">\u201cTouch is so important in our real world, but it is neglected in the digital world,\u201d said\u00a0<a style=\"color: #6a4985;\" href=\"http:\/\/segal.northwestern.edu\/people\/profiles\/colgate-j-edward.html\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">J. Edward Colgate<\/span><\/a>, an expert in touch-based (haptic) systems. 
He is the Allen and Johnnie Breed University Professor of Design at Northwestern\u2019s\u00a0<a style=\"color: #6a4985;\" href=\"http:\/\/www.mccormick.northwestern.edu\/\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">McCormick School of Engineering and Applied Science<\/span><\/a>. \u201cWe want to create something that will make touch a reality for people interacting with their screens, and this work is a step in that direction.\u201d<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">Forces felt by the fingers as they travel along a flat surface can lead to the illusion that the surface actually contains bumps. This so-called \u201cvirtual bump illusion\u201d is well known in the haptics field, Colgate said, and the researchers were able to make use of it.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">\u201cBy leveraging the virtual bump illusion, we were able to design a meaningful experiment that shed light on the way the brain integrates information from multiple fingers,\u201d Colgate said. 
\u201cOur big finding was \u2018collapse\u2019 \u2014 the idea that separate bumps felt in separate fingers are nonetheless experienced as one bump if their separation happens to match that of the fingers.\u201d<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">The study, which will be published the week of February 9, 2015, by the\u00a0<a style=\"color: #6a4985;\" href=\"http:\/\/www.pnas.org\/\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Proceedings of the National Academy of Sciences (PNAS)<\/span><\/a>, is about how the brain makes sense of data from the fingers.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">Colgate, the paper\u2019s corresponding author, and longtime Northwestern haptics collaborator\u00a0<a style=\"color: #6a4985;\" href=\"http:\/\/segal.northwestern.edu\/people\/profiles\/peshkin-michael.html\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Michael A. Peshkin<\/span><\/a>\u00a0joined forces with Carnegie Mellon\u2019s\u00a0<a style=\"color: #6a4985;\" href=\"http:\/\/www.psy.cmu.edu\/people\/klatzky.html\" target=\"_blank\" rel=\"noopener\"><span style=\"color: #000000;\">Roberta Klatzky<\/span><\/a>\u00a0to work on filling the digital world\u2019s functional gap by enabling flat screens to engage the haptic perceptual system. This is known as \u201csurface haptic\u201d technology.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">The research team\u2019s experiment presented two virtual bumps, with the distance between them varying across trials, to subjects participating in the study. When bump and finger spacing were identical, subjects reported feeling two bumps as one. 
In this case, the brain judges it too coincidental that two separate bumps would sit at exactly the spacing of the two fingers, so it registers them as a single bump.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">\u201cHow does your body and mind interpret something flat and \u2018see\u2019 it as having shape and texture?\u201d said Klatzky, a world-renowned expert in cognitive psychology and haptic perception. \u201cAn important step toward effective surface haptics is to understand what kinds of stimulation might lead you to feel something other than uniform flatness when you touch the surface of your device. Our study contributes to this understanding.\u201d<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">Klatzky is the Charles J. Queenan Jr. Professor of Psychology and Human-Computer Interaction at Carnegie Mellon.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">\u201cOur findings will help us and other researchers figure out how to design haptic technology to produce certain tactile effects,\u201d said Peshkin, a professor of mechanical engineering at the McCormick School. \u201cHaptics \u2014 giving a feel to objects \u2014 just enhances the physicality of a person\u2019s experience.\u201d<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">Steven G. Manuel, the study\u2019s first author and a Northwestern alumnus, developed the model of where the \u201cillusion of protrusion\u201d comes from. 
It describes how the brain constructs a mental depiction of the surface using sensory signals from two fingers as they explore a surface over time and space.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">A critical feature of the model, and one found in theories of perception more generally, is that it assumes the brain is biased toward inferring causes rather than registering coincidences. In essence, as the fingers encounter forces while they explore a flat surface, the brain creates virtual bumpiness that is most consistent with the physical bumps that would produce the same sensations.<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">The work was supported by a National Science Foundation Division of Information and Intelligent Systems grant, Surface Haptics via Tractive Forces (IIS0854100).<\/span><\/p>\n<p style=\"font-weight: normal; color: #000000; text-align: justify;\"><span style=\"color: #000000;\">The title of the paper is \u201cA Coincidence-Avoidance Principle in Surface Haptic Interpretation.\u201d Manuel, Colgate, Peshkin and Klatzky are authors of the paper.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>What if the touchscreen of your smartphone or tablet could touch you back? What if touch was as integrated into our ubiquitous technology as sight and sound? Northwestern University and Carnegie Mellon University researchers now report a fascinating discovery that provides insight into how the brain makes sense of data from fingers. 
In a study [&hellip;]<\/p>\n","protected":false},"author":6,"featured_media":2727,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[17],"tags":[],"class_list":["post-2726","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-research"],"featured_image_urls":{"full":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml-150x150.jpg",150,150,true],"medium":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml-300x217.jpg",300,217,true],"medium_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"1536x1536":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"2048x2048":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"ultp_layout_landscape_large":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"ultp_layout_landscape":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_th
e_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"ultp_layout_portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"ultp_layout_square":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"newspaper-x-single-post":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"newspaper-x-recent-post-big":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"newspaper-x-recent-post-list-image":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",90,65,false],"web-stories-poster-portrait":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",320,232,false],"web-stories-publisher-logo":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",96,70,false],"web-stories-thumbnail":["https:\/\/www.revoscience.com\/en\/wp-content\/uploads\/2015\/02\/Bringing_Texture_to_Touchscreens_How_the_Brain_Makes_Sense_of_Data_from_Fingers_ml.jpg",150,109,false]},"author_info":{"info":["Amrita Tuladhar"]},"category_info":"<a href=\"https:\/\/www.revoscience.com\/en\/category\/news\/research\/\" rel=\"category 
tag\">Research<\/a>","tag_info":"Research","comment_count":"0","_links":{"self":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/2726","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/comments?post=2726"}],"version-history":[{"count":0,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/posts\/2726\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media\/2727"}],"wp:attachment":[{"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/media?parent=2726"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/categories?post=2726"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.revoscience.com\/en\/wp-json\/wp\/v2\/tags?post=2726"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}