{"id":28330,"date":"2019-02-25T21:31:06","date_gmt":"2019-02-26T01:31:06","guid":{"rendered":"https:\/\/www.bu.edu\/cise\/?p=28330"},"modified":"2021-09-21T19:53:56","modified_gmt":"2021-09-21T23:53:56","slug":"can-technology-eliminate-blind-spots-2","status":"publish","type":"post","link":"https:\/\/www.bu.edu\/cise\/can-technology-eliminate-blind-spots-2\/","title":{"rendered":"Can Technology Eliminate Blind Spots?"},"content":{"rendered":"<h3 style=\"text-align: center;\">New digital-camera-based system shows it\u2019s possible to \u201csee\u201d around corners<\/h3>\n<header>\n<figure id=\"attachment_33569\" aria-describedby=\"caption-attachment-33569\" style=\"width: 646px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" src=\"\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM-636x450.png\" alt=\"\" width=\"636\" height=\"450\" class=\"size-medium wp-image-33569\" srcset=\"https:\/\/www.bu.edu\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM-636x450.png 636w, https:\/\/www.bu.edu\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM-1024x725.png 1024w, https:\/\/www.bu.edu\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM-768x544.png 768w, https:\/\/www.bu.edu\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM-1536x1087.png 1536w, https:\/\/www.bu.edu\/cise\/files\/2019\/02\/Screen-Shot-2021-08-17-at-12.34.51-AM.png 1842w\" sizes=\"(max-width: 636px) 100vw, 636px\" \/><figcaption id=\"caption-attachment-33569\" class=\"wp-caption-text\">The \u201cpenumbra\u201d or partial shadow seen on the far wall\u2014created by a bright scene displayed on an LCD monitor (left) and a chair (center)\u2014gives enough light information that a computer program can reconstruct the original scene by analyzing a photograph of the wall taken by a digital camera (right) located around a 180-degree corner. 
Photo courtesy of the Goyal lab at Boston University<\/figcaption><\/figure>\n<p>What if your car possessed technology that warned you not only about objects in clear view of your vehicle\u2014the way that cameras, radar, and laser can do now in many standard and autonomous vehicles\u2014but also about objects hidden by obstructions? Maybe it\u2019s something blocked by a parked car, or just out of sight behind a building on a street corner. This ability to see things outside your line of sight sounds like science fiction, but researchers have made strides in the last decade to bring what\u2019s called non-line-of-sight imaging to reality.<\/p>\n<\/header>\n<p>Until now, they\u2019ve had to rely on expensive and stationary equipment. But<span>\u00a0<\/span><a href=\"https:\/\/www.bu.edu\/eng\/profile\/vivek-goyal\/\" target=\"_blank\" rel=\"noopener noreferrer\">Vivek Goyal<\/a><span>\u00a0<\/span>and a team of researchers from Boston University have developed a system employing a computer algorithm and a simple digital camera\u00a0that can give us a more affordable and agile look at what\u2019s around the corner.<\/p>\n<figure id=\"attachment_24733\" aria-describedby=\"caption-attachment-24733\" style=\"width: 256px\" class=\"wp-caption alignleft\"><img loading=\"lazy\" src=\"http:\/\/www.bu.edu\/systems\/files\/2019\/02\/Prof-Goyal.jpg\" alt=\"\" width=\"246\" height=\"246\" class=\"wp-image-24733\" srcset=\"http:\/\/www.bu.edu\/systems\/files\/2019\/02\/Prof-Goyal.jpg 426w, http:\/\/www.bu.edu\/systems\/files\/2019\/02\/Prof-Goyal-150x150.jpg 150w, http:\/\/www.bu.edu\/systems\/files\/2019\/02\/Prof-Goyal-300x300.jpg 300w, http:\/\/www.bu.edu\/systems\/files\/2019\/02\/Prof-Goyal-100x100.jpg 100w\" sizes=\"(max-width: 246px) 100vw, 246px\" \/><figcaption id=\"caption-attachment-24733\" class=\"wp-caption-text\">CISE Affiliate Faculty 
Prof. Vivek Goyal (ECE)<\/figcaption><\/figure>\n<p>\u201cThere\u2019s a bit of a research community around non-line-of-sight imaging,\u201d says Goyal, a College of Engineering associate professor of electrical and computer engineering. \u201cIn a dense urban area, if you could get greater visibility around the corner, that could be significant for safety. For example, you might be able to see that there\u2019s a child on the other side of that parked car. You can also imagine plenty of scenarios where seeing around obstructions would prove extremely useful, such as taking surveillance from the battlefield, and in search and rescue situations where you might not be able to enter an area because it\u2019s dangerous to do so.\u201d<\/p>\n<p>In a paper published January 23, 2019, in<span>\u00a0<\/span><a href=\"https:\/\/www.nature.com\/articles\/s41586-018-0868-6\" target=\"_blank\" rel=\"noopener noreferrer\"><em>Nature<\/em><\/a>, Goyal and a team of researchers say they are able to compute and reconstruct a scene from around a corner by capturing information from a digital photograph of a penumbra, which is the partially shaded outer region of a shadow cast by an opaque object.<\/p>\n<p>\u201cBasically, our technique allows you to see what\u2019s around the corner by looking at a penumbra on a matte wall,\u201d Goyal says.<\/p>\n<h3>When shadows turn ordinary walls into mirrors<\/h3>\n<p>Against a matte wall, he explains, light scatters evenly rather than being concentrated or reflected back in one direction like a mirror. Normally, that wouldn\u2019t give enough organized information for a computer program to translate what\u2019s happening in a scene around the corner. But Goyal\u2019s team discovered that when there is a known solid object around the corner, the partially obstructed scene creates a blurry penumbra. The object can really be anything as long as it\u2019s not see-through. 
In this case, the researchers opted to use an ordinary chair. To the human eye, the resulting penumbra may not look like much. For a computer program, it\u2019s highly informative.<\/p>\n<p>By inputting the dimensions and placement of the object, the team found that their computer program can organize the light scatter and determine what the original scene looks like\u2014all from a digital photograph of a seemingly blurry shadow on a wall.<\/p>\n<p>\u201cBased on light ray optics, we can compute and understand which subsets of the scene\u2019s appearance influence the camera pixels,\u201d Goyal says, and \u201cit becomes possible to compute an image of the hidden scene.\u201d Could the image of a human being standing around the corner, for example, be reconstructed using their approach? <span>Goyal says there\u2019s no conceptual barrier preventing it, but that they haven\u2019t tried it yet. They did, however, make additional scenes by cutting out colored pieces of construction paper and pasting them on foam board to see if their system could detect the shapes and colors. 
He says the system was indeed able to interpret their \u201ckindergarten art project\u201d scenes.<\/span><\/p>\n<h3>Seeing potential all around<\/h3>\n<figure id=\"attachment_29796\" aria-describedby=\"caption-attachment-29796\" style=\"width: 898px\" class=\"wp-caption alignleft\"><img loading=\"lazy\" src=\"http:\/\/www.bu.edu\/research\/files\/2019\/01\/calibration-reconstruction-grid1.jpg\" alt=\"\" class=\"wp-image-29796\" width=\"888\" height=\"482\" \/><figcaption id=\"caption-attachment-29796\" class=\"wp-caption-text\">These two series of images show the original scenes that Goyal\u2019s team displayed on the LCD monitor (left), the resulting penumbras as seen by the naked eye and captured by the digital camera (center), and the final reconstructed images created by the computer program (right). Images courtesy of the Goyal lab at Boston University<\/figcaption><\/figure>\n<p>The most fundamental limitation is the contrast between the penumbra and the surrounding environment, Goyal explains. 
\u201cThe results we present are for a relatively darkened room.\u201d When the team increased the level of ambient light in the lab, the penumbra became harder to see and the system\u2019s reconstruction of the around-the-corner scene grew less accurate.<\/p>\n<p>While real-world applications of non-line-of-sight imaging are still a ways off, he says, the breakthrough is in the proof of concept.<\/p>\n<p>\u201cIn the future, I imagine there might be some sort of hybrid method, in which the system is able to locate foreground opaque objects and factor that into the computational reconstruction of the scene.\u201d<\/p>\n<p>The most exciting aspect of their findings, Goyal says, is the discovery that so much information can be extracted from penumbras, which are found everywhere.<\/p>\n<p>\u201cWhen you realize how much light can be extracted from them, you just can\u2019t look at shadows the same way again,\u201d he says.<\/p>\n<p><em>This work was supported by the Defense Advanced Research Projects Agency (DARPA).<\/em><\/p>\n<p><em>This story was originally published in <a href=\"https:\/\/www.bu.edu\/articles\/2019\/can-technology-eliminate-blind-spots\/\" target=\"_blank\" rel=\"noopener noreferrer\">The Brink<\/a> on Feb. 12, 2019.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>New digital-camera-based system shows it\u2019s possible to \u201csee\u201d around corners What if your car possessed technology that warned you not only about objects in clear view of your vehicle\u2014the way that cameras, radar, and laser can do now in many standard and autonomous vehicles\u2014but also about objects hidden by obstructions? 
Maybe it\u2019s something [&hellip;]<\/p>\n","protected":false},"author":18605,"featured_media":28567,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[76],"tags":[147],"_links":{"self":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/28330"}],"collection":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/users\/18605"}],"replies":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/comments?post=28330"}],"version-history":[{"count":10,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/28330\/revisions"}],"predecessor-version":[{"id":33572,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/28330\/revisions\/33572"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/media\/28567"}],"wp:attachment":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/media?parent=28330"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/categories?post=28330"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/tags?post=28330"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}