{"id":38724,"date":"2023-04-24T11:40:57","date_gmt":"2023-04-24T15:40:57","guid":{"rendered":"https:\/\/www.bu.edu\/cise\/?p=38724"},"modified":"2023-04-24T12:14:08","modified_gmt":"2023-04-24T16:14:08","slug":"advancing-assistive-technologies-with-ai","status":"publish","type":"post","link":"https:\/\/www.bu.edu\/cise\/advancing-assistive-technologies-with-ai\/","title":{"rendered":"Eshed Ohn-Bar: Advancing Assistive Technologies with AI"},"content":{"rendered":"<div class=\"responsive-video responsive-youtube\"><iframe loading=\"lazy\" title=\"[ECCV 2022] ASSISTER: Assistive Navigation via Conditional Instruction Generation\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/bcgD5wHZmSs?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe><\/div>\n<p><span style=\"font-weight: 400;\">For someone who is visually impaired, navigating an unfamiliar street can be challenging. Even going straight can be tough in an open space. Encountering obstacles, stairs, and intersections can potentially result in an unsafe situation. While aids such as white canes or guide dogs are helpful, they can\u2019t exactly tell someone who is visually impaired what is in front of them or where to go.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Professor Eshed Ohn-Bar (ECE), a Faculty Affiliate of CISE and the Hariri Institute for Computing, works on <\/span><span style=\"font-weight: 400;\">developing AI technologies that can seamlessly collaborate with and help humans. 
His most recent paper, titled <\/span><a href=\"https:\/\/eshed1.github.io\/papers\/assister_eccv2022.pdf\"><span style=\"font-weight: 400;\">ASSISTER: Assistive Navigation via Conditional Instruction Generation<\/span><\/a><span style=\"font-weight: 400;\">, <\/span><span style=\"font-weight: 400;\">introduces a language generation AI for providing intuitive, human-like assistance to individuals with visual impairments. <\/span><span style=\"font-weight: 400;\">The goal is for ASSISTER to determine what to tell a person who is visually impaired and when, such as what obstacles lie ahead, while planning a path and verbally directing the person to their destination.\u00a0<\/span><\/p>\n<figure id=\"attachment_38725\" aria-describedby=\"caption-attachment-38725\" style=\"width: 225px\" class=\"wp-caption alignleft\"><img loading=\"lazy\" src=\"\/cise\/files\/2023\/04\/ESHED-003-424x636.jpg\" alt=\"\" width=\"215\" height=\"323\" class=\" wp-image-38725\" srcset=\"https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-424x636.jpg 424w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-683x1024.jpg 683w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-768x1152.jpg 768w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-1024x1536.jpg 1024w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-1366x2048.jpg 1366w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/ESHED-003-scaled.jpg 1707w\" sizes=\"(max-width: 215px) 100vw, 215px\" \/><figcaption id=\"caption-attachment-38725\" class=\"wp-caption-text\">Professor Eshed Ohn-Bar (ECE) works on developing AI technologies that can seamlessly collaborate with and help humans.<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400;\">\u201cThere&#8217;s a grand engineering challenge of assisting individuals with visual impairments to navigate and get to their destination safely and seamlessly,\u201d Ohn-Bar said. 
\u201cIndividuals with disabilities often say that transportation and getting to places can affect their quality of life because they may want to go to restaurants or the gym. But they may end up staying home because they feel it&#8217;s too difficult to get to these places independently.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ohn-Bar and his team wanted the ASSISTER AI to act less like automated guidance and instead feel more natural, geared toward the person\u2019s own interpretation of directions and abilities.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">\u201cThe goal of ASSISTER is to mimic natural human interaction as opposed to providing feedback similar to an autocorrect system that gives you some recommendation you don&#8217;t want to use,\u201d Ohn-Bar said.\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To mimic more natural interactions, Ohn-Bar and his team hired orientation and mobility guides. <\/span><span style=\"font-weight: 400;\">These guides work with individuals who are visually impaired to teach them how to navigate to a destination.<\/span><span style=\"font-weight: 400;\"> While navigating, the travelers wore cameras, allowing the researchers to observe how the guides gave directions and how the travelers responded. They then took the video and audio data from those experiences and used it to train ASSISTER. Built on a speaker-follower model, ASSISTER was programmed to provide conversational instructions through a wearable assistive system.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ohn-Bar and his team tested ASSISTER first in simulation, across diverse scenarios and environments, and then with humans following its instructions in the real world. The simulation mimics what it\u2019s like to navigate without being able to see anything. The user moves around using keyboard keys, but all they see on the screen is a colored fan indicating the position of their cane. 
With just the visual of the fan on the screen, the user must find their rideshare vehicle.<\/span><\/p>\n<figure id=\"attachment_38726\" aria-describedby=\"caption-attachment-38726\" style=\"width: 575px\" class=\"wp-caption alignright\"><img loading=\"lazy\" src=\"\/cise\/files\/2023\/04\/user_assister-636x243.png\" alt=\"\" width=\"565\" height=\"216\" class=\"wp-image-38726 \" srcset=\"https:\/\/www.bu.edu\/cise\/files\/2023\/04\/user_assister-636x243.png 636w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/user_assister-1024x391.png 1024w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/user_assister-768x293.png 768w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/user_assister-1536x586.png 1536w, https:\/\/www.bu.edu\/cise\/files\/2023\/04\/user_assister-2048x782.png 2048w\" sizes=\"(max-width: 565px) 100vw, 565px\" \/><figcaption id=\"caption-attachment-38726\" class=\"wp-caption-text\">A user navigates the streets while wearing the assistive system.<\/figcaption><\/figure>\n<p><span style=\"font-weight: 400;\">Once they had fine-tuned the algorithm, they transferred the AI to the real world. Ohn-Bar and his team collaborated with the Carroll Center for the Blind in Boston, with users given instructions to reach an autonomous vehicle 200 feet away, past busy intersections, stairs, curbs, and pedestrians. The step-by-step assistive system was able to guide users all the way to the door handle of the ride.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ohn-Bar said that while he has always worked on human-machine interaction for assistive technologies, working with a blind computer scientist at Carnegie Mellon University showed him first-hand how technology can alleviate the difficulties of individuals who are blind. 
He also mentioned how his grandmother lost much of her sight a few years ago.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In general, AI and computer vision today have a difficult time understanding humans with disabilities. In a paper titled <\/span><a href=\"https:\/\/openaccess.thecvf.com\/content\/ICCV2021\/papers\/Zhang_X-World_Accessibility_Vision_and_Autonomy_Meet_ICCV_2021_paper.pdf\"><span style=\"font-weight: 400;\">X-World: Accessibility, Vision, and Autonomy Meet<\/span><\/a><span style=\"font-weight: 400;\">, Ohn-Bar and his team discovered that algorithms detected wheelchairs with 30% accuracy and white canes with less than 1% accuracy, showing a lack of research in teaching machines to recognize and interact with individuals with disabilities.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Ohn-Bar and his team recently received an NSF grant to continue the research with ASSISTER. They are currently testing a smartphone app version of ASSISTER, and Ohn-Bar said he wants it to grow into a complete trip-support system that anyone can use. Essentially, the app would guide anyone from one destination to another and then back home. He hopes his work will help people with disabilities navigate and accomplish everyday tasks with ease. <\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>For someone who is visually impaired, navigating an unfamiliar street can be challenging. Even going straight can be tough in an open space. Encountering obstacles, stairs, and intersections can potentially result in an unsafe situation. 
While aids such as white canes or guide dogs are helpful, they can\u2019t exactly tell someone who is visually impaired [&hellip;]<\/p>\n","protected":false},"author":19737,"featured_media":38738,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[204],"tags":[],"_links":{"self":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/38724"}],"collection":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/users\/19737"}],"replies":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/comments?post=38724"}],"version-history":[{"count":8,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/38724\/revisions"}],"predecessor-version":[{"id":38732,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/38724\/revisions\/38732"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/media\/38738"}],"wp:attachment":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/media?parent=38724"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/categories?post=38724"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/tags?post=38724"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}