{"id":12300,"date":"2013-03-21T10:02:16","date_gmt":"2013-03-21T14:02:16","guid":{"rendered":"http:\/\/www.bu.edu\/systems\/?p=12300"},"modified":"2021-02-09T12:48:53","modified_gmt":"2021-02-09T17:48:53","slug":"opening-movements","status":"publish","type":"post","link":"https:\/\/www.bu.edu\/cise\/opening-movements\/","title":{"rendered":"Opening Movements &#8211; J Konrad and P. Ishwar"},"content":{"rendered":"<p>Your next security ID may be a defining gesture<br \/>\nby Rich Barlow<\/p>\n<p><img loading=\"lazy\" src=\"\/cise\/files\/2013\/03\/thumb_l.jpg\" alt=\"\" width=\"550\" height=\"310\" class=\"size-full wp-image-30660\" \/><\/p>\n<p>To the casual passerby, Janusz Konrad seems a bit fanatical about tai chi: standing in his office, waving one arm to and fro, then spreading both arms and bringing them together. Duck inside, however, and you\u2019ll notice he\u2019s not stretching for his health; he\u2019s stretching for a camera, and images on a computer monitor are responding to each gesture\u2014zooming in and out of photos or leapfrogging through a photo series.<br \/>\nKonrad, a College of Engineering professor of electrical and computer engineering, and Prakash Ishwar, an associate professor, designed the computer\u2019s software to recognize specific body motions. They\u2019re not making video games. 
This, they hope, is the future security portal to your smartphone, tablet, laptop, or the locked door: software programmed to recognize a gesture, from your torso, your hand, or perhaps just your fingers.<br \/>\nArmed with an $800,000 grant from the National Science Foundation and collaborating with colleagues at the Polytechnic Institute of New York University, the BU duo is developing algorithms for ever-smarter motion sensors. In doing so, they have to thread a tricky technological needle. \u201cOn the one hand,\u201d says Ishwar, \u201cyou want security and privacy; nobody else should be able to authenticate on your behalf\u201d by aping your gesture. On the other hand, if the system demands a perfectly precise gesture, you may have to flail your arms or other parts 10 times to get into your own account. \u201cThat\u2019s annoying,\u201d says Ishwar. (And people may think you\u2019re either crazy or infested with lice.)<br \/>\nA workable system must be able to screen out distractions, like the motion of someone moving behind you or of the backpack you\u2019re wearing, or changes in ambient lighting.<br \/>\nYet using gestures as keys to cyber-locks would have some great advantages. A gesture, like a lateral swipe of your hand, has \u201csubtle differences in the way people do it,\u201d Ishwar says\u2014and people vary in arm length, musculature, and other traits that might help a detector distinguish between you and Arnold Schwarzenegger or Elle Macpherson. True, gestures aren\u2019t as unique as fingerprints or as irises or faces, for which there are authentication scanners. 
But unlike those traits, which theoretically are vulnerable if someone hacks the database storing them, an authenticating gesture that\u2019s been compromised by an impostor can be replaced immediately, whereas getting a new fingerprint\u2014well, \u201cyou wouldn\u2019t like it,\u201d says Ishwar.<br \/>\nSecurity passwords pose another problem: the most effective ones tend to be inconveniently complex. Konrad surveyed one of his classes and found that no one used a smartphone passcode longer than four digits. An effective motion sensor could \u201csimplify, make more secure and more pleasant the process of logging in,\u201d he says. He and Ishwar are working to develop gesture-based authentication software to be test-run on Microsoft\u2019s motion-sensing Kinect camera, used with the Xbox video game console and the Windows computer operating system. \u201cIt can track your body,\u201d says Ishwar, \u201cget some skeleton approximation for your body, and then that information is provided to you in some real-time format.\u201d<br \/>\nThey also hope to use start-up company Leap Motion\u2019s smaller motion-sensing device for notepads and laptops. The company claims that its device, the size of an iPod, will be able to read \u201cmicro-motions of your fingers,\u201d says Konrad. In the next three to four years, \u201cwe want to develop something that\u2019s extremely simple, inexpensive, and can be embedded into other products and could be used daily by millions of people.\u201d<br \/>\nOne thing is clear: certain body parts, like hands, lend themselves to identity authentication better than others. \u201cThe degree of freedom that you have with your hands is significantly higher,\u201d Ishwar says. \u201cMaybe if I\u2019m a yoga master, I can move my right leg and put it across my left shoulder, but most people can\u2019t do that.\u201d They\u2019d also like to experiment with the torso, says Konrad, since people\u2019s posture can vary. 
Then there\u2019s Leap Motion and its potential finger recognition.<br \/>\n\u201cWe plan to involve more and more body parts\u201d as the research progresses, Konrad says. If that sounds vaguely Frankenstein-ish, consider that today\u2019s security technology already involves fingerprints, iris scans, and face recognition. \u201cWouldn\u2019t it be nice,\u201d muses Ishwar, \u201cif we could do that using our everyday body language or gestures?\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Your next security ID may be a defining gesture by Rich Barlow In the video above, two College of Engineering professors explain, and demonstrate, the computer software they are developing to recognize a gesture, from your torso, your hand, or perhaps just your fingers. They hope this could be the future security portal to your [&hellip;]<\/p>\n","protected":false},"author":1500,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[26],"tags":[],"_links":{"self":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/12300"}],"collection":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/users\/1500"}],"replies":[{"embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/comments?post=12300"}],"version-history":[{"count":3,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/12300\/revisions"}],"predecessor-version":[{"id":30661,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/posts\/12300\/revisions\/30661"}],"wp:attachment":[{"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/media?parent=12300"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/categories?post=12300"},{"taxonomy":"post_tag","embeddable":true,"href":"https
:\/\/www.bu.edu\/cise\/wp-json\/wp\/v2\/tags?post=12300"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}