ECE prof advised US government on developing exascale computing
By Rich Barlow, BU Today
Some time ago, Roscoe Giles gave a talk to BU computer scientists where he used his iPad 3—a handheld, aging technology, he noted, that nevertheless “had the same arithmetic power” as BU’s first, $2 million supercomputer from the 1990s.
“What was this giant, aspirational thing in 1990 is like nothing now,” says Giles, a College of Engineering professor of electrical and computer engineering. “A Sony PlayStation 3 had the same power as the Cray supercomputers of 1985.”
The next step in this exponential expansion of computing is the exascale computer—for now, just a dream of computer scientists—that would run one billion billion (yes, two “billions”) calculations per second. It would enhance everything from analyzing climate change to developing better plane and car engines to strengthening national security, says Giles, a 30-year veteran of the BU faculty.
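For scale, the arithmetic behind the name is simple: "exa" denotes 10^18, a billion times a billion. A quick check in Python (the handheld comparison uses an illustrative round number, not a figure from the article):

```python
# "Exa" = 10**18: one billion billion operations per second.
billion = 10**9
exa = billion * billion
assert exa == 10**18

# Illustrative comparison: a hypothetical 10-gigaflop handheld chip
# versus a 1-exaflop machine (round numbers, for scale only).
handheld_flops = 10 * 10**9
ratio = exa / handheld_flops
print(f"1 exaflop = {exa:.0e} ops/s, about {ratio:.0e} times a 10-GFLOP handheld")
```

The hundred-million-fold gap is why Giles frames exascale as a technology you will encounter through devices and data centers long before anyone owns such a machine outright.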
He stepped down last October as a member and chairman of the US Department of Energy’s Advanced Scientific Computing Advisory Committee (ASCAC), which is helping advance President Obama’s executive order last year green-lighting development of exascale capability. ASCAC predicts that the United States will develop such a computer by 2023. When operational, the computer “is not going to be made up of a billion processors,” Giles says, “it’s going to be made up of fewer that are each more powerful and run at lower energy than most we have now.
“You probably won’t own an exascale computer—maybe in 40 years—but you’ll have devices that have exascale computing technology in them. Your smartphone will someday be a hundred times smarter than it is now, for not much more money.”
Giles discussed the brave new world of exascale with BU Today.
BU Today: What’s the technological challenge to developing exascale computing?
Giles: Up until about 2004, exponential growth in computing power meant that everything went faster. That stopped; computing didn’t get faster. Instead, you are getting computer chips with multiple processors on the chip. You can put more stuff on a chip; what hasn’t continued is the part that says you can make the chip go faster, which means the energy you need goes up faster.
Is it correct then that the main impediment is that exascale requires such immense energy?
That’s sort of true. It takes much too much power. But if you were willing to expend that power, it’s not clear that it makes sense to do it. What problems will I solve? If you took a ship filled with cell phones and said how much computing power is in that ship, it might be an exascale, but that doesn’t mean that they can work together to solve any particular problem.
So why are we pursuing exascale computing?
You may be able to solve new problems or old problems in new ways. For example, simulating how matter will behave inside a car engine to help engineer a better engine. You need to simulate millions or billions of molecules. Historically, people could never get close to the number of molecules in a real engine in a computer model. So what we do know tends to be based on artfully chosen, very tiny computational samples. Exascale computing enables larger simulations that can have greater fidelity to real systems.
On a larger scale, think about the simulation of climate and climate science. You have to represent individual patches of the Earth inside the computer. In the early days, you’d be dividing the world up into a large number of 100-kilometer squares. What is the temperature in that square? How much ice is there, and what is the reflectivity in that square? As computers got more powerful, they made the square smaller and smaller. The current number is, like, 20 kilometers. You get more understanding in the narrower regions, particularly things like the ocean flows, where at a greater level you can’t see what’s going on. That’ll give me a greater picture of the little eddies and flows. Exascale computing hopes to enable simulations at a one-kilometer scale.
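The resolution figures in that answer imply a steep computational cost: the number of surface cells grows with the inverse square of the grid spacing. A back-of-the-envelope sketch in Python (the surface-area figure is an approximation, and this toy cost model ignores vertical layers and time stepping):

```python
# Approximate number of surface grid cells at each resolution.
# Earth's surface area is roughly 510 million km^2 (approximate figure).
EARTH_SURFACE_KM2 = 510e6

def cell_count(resolution_km: float) -> float:
    """Cells needed to tile the surface with squares of the given edge length."""
    return EARTH_SURFACE_KM2 / resolution_km**2

for res_km in (100, 20, 1):
    print(f"{res_km:>3} km grid: about {cell_count(res_km):,.0f} cells")

# Going from a 20 km grid to a 1 km grid multiplies the cell count by 400,
# before counting the extra vertical layers and shorter time steps a finer
# grid also demands.
assert cell_count(1) / cell_count(20) == 400
```

That quadratic (in practice worse) blow-up is why each jump in climate-model resolution has historically required a new generation of supercomputer.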
Exascale also helps engineering companies doing production. The classic example is numerical wind tunnels. In design of airplanes, the test was that you build some mock-up of a plane or section, you put it in a wind tunnel, you measure its properties, you may tweak the engineering. The capability that computing already has offered is to replace some of those physical tests with computer simulation. Your ideal would be sitting at a drawing board and saying, “I want to make a wing with this shape out of this material,” and then you press a button and the computer tells you how well that would perform.
It’s the same thing with looking at the airflow around trucks, based on how you design the body of the truck, to save fuel. They design trucks with 20 to 30 percent better gas efficiency through computer modeling with the most powerful computers we have now. What we’re looking for in exascale is to include more science and understanding of materials in design and production.
What about search engines like Google?
One of the hottest things going is data science. There are problems where exascale computing, married to the right level of data science support, will lead to breakthroughs to do lots of cross-correlation of data.
A Google data center uses a unit of computing that’s basically a tractor-trailer truckful of computers. The data center is a roomful of these containers. Each of those processors is doing a share of the searching, but it’s a challenge to get 10,000 of those things to be communicating rapidly, back and forth, to solve my problem. That’s the kind of problem where exascale computing directly affects the data science, to bring lots of computing power to bear on a single problem.
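The pattern Giles describes, where each processor searches its share of the data and the partial answers are then combined, is the classic scatter/gather idiom. A minimal single-machine sketch in Python, with threads standing in for data-center nodes (the dataset and the search predicate are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy scatter/gather: split a dataset into shards, search each shard in
# parallel, then merge the partial results. In a real data center each
# shard would live on a different machine; here threads stand in for nodes.
data = list(range(1_000_000))
num_shards = 8
shards = [data[i::num_shards] for i in range(num_shards)]

def search_shard(shard):
    """One worker's share of the search: find multiples of 99_991."""
    return [x for x in shard if x % 99_991 == 0]

with ThreadPoolExecutor(max_workers=num_shards) as pool:
    partials = pool.map(search_shard, shards)   # scatter

hits = sorted(x for part in partials for x in part)  # gather
print(hits)
```

The hard part Giles points to is not this splitting step but the communication: for searches the shards barely need to talk, while the tightly coupled simulations exascale targets require constant, fast exchange between all the workers.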
Are governments and business the only entities likely to own an exascale computer?
I think that’s probably true. But you’d get a machine comparable to some of today’s midsize supercomputers that could be on your desk once exascale technology is around.
The other use is national security. The National Security Agency is really interested in this.
Because you could read more data from terrorists’ cell phones or computers?
Right. They want to be number one in the world at that. That actually parallels Google, in terms of it being a data-centric application, the ability to handle large amounts of data and make correlations.
Why does government need to be involved—why can’t Google or industry develop exascale on their own?
The economic incentives are not aligned to make it possible to develop the exascale machine. There’s not a big enough market. Companies would like better roads and faster railroads, but that doesn’t mean they’re going to create the railroad. They’ll contribute to help build the road, but by themselves, they won’t be able to do it.
What are other governments doing to develop exascale?
They intervene to make it happen, like China. There’s a European consortium that’s investing in exascale. That’s actually a software initiative.
I used to joke that we say, “You want to start a program that’s $200 million a year? Oh my God, where are we going to get the money?” My impression is in China, they say, “Oh, it only costs money? We’ll buy slightly less real estate in New York to pay for this.”
Is it conceivable that China or some other country might get exascale before us?
Oh, sure. It’s conceivable. I don’t know if it’s likely. We have the best ecosystem for scientific computing, meaning that not only do we have powerful computers, but we have people who develop the algorithms. We have the laboratories where the computers live.
Is there a debate in academia about the role of government in developing exascale?
There’s always debate. There’s a debate first about the role of universities versus the role of government labs. And there’s the overall question of how you spend government money.
Are there dangers from exascale computing?
A danger comes when one part of society has access to a technology that no one else has. We will have that technology distributed to more than just the government. We could not get industry to participate if the only end point was to make 10 machines for the government. Our committee had people from industry and from academia.
This goes with having stuff in data centers and cloud services. One vision is you deploy exascale technology and the market for it is going to be data centers. The way people benefit is from their access through the data centers. The iPad 200 will have stuff in it that is exascale.
Human society is incredibly adaptive. The internet itself—I was around before it existed at all. We adapted to that technology. That doesn’t mean the adaptation is not painful. The issues of privacy we’re addressing only arose after the technology came in. We’re going to figure it out; our institutions will adapt. Those of us who are older are horrified by what people put out on Facebook and that they take their cell phone to the bathroom with them and live-stream it. But it’s clear other people are not going to feel that way: “Everybody has naked pictures of themselves in mud on the internet from when they were in college. So what?”
This article originally appeared on BU Today.
Transforming Living Cells into Computers
By Sara Elizabeth Cody
Whether it’s artificial skin that mimics squid camouflage or an artificial leaf that produces solar energy, a common trend in engineering is to take a page out of biology to inspire design and function. However, an interdisciplinary team of BU researchers has flipped this idea, instead using computer engineering to inspire biology in a study recently published in Science.
“When you think about it, cells are kind of computers themselves. They have to communicate with other cells and make decisions based on their environment,” says Associate Professor Douglas Densmore (ECE, BME), who oversaw the BU research team. “By turning them into circuits, we’ve figured out a way to make cells that respond the way we want them to respond. What we are looking at with this study is how to describe those circuits using a programming language and to transform that programming language into DNA that carries out that function.”
Using a programming language commonly used to design computer chips, ECE graduate student Prashant Vaidyanathan created design software that encodes logical operations and bio-sensors right into the DNA of Escherichia coli bacteria. Sensors can detect environmental conditions, while logic gates allow the circuits to make decisions based on this information. These engineered cells can then act as mini processing elements, enabling the large-scale production of bio-materials or helping detect hazardous conditions in the environment. Former postdoctoral researcher Bryan Der facilitated the partnership between BU and the Massachusetts Institute of Technology to pursue this research study.
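Conceptually, each engineered cell evaluates a small boolean function of its sensor inputs, the same abstraction a chip designer works with. A toy Python sketch of that idea (the sensor names and the single NOR-gate circuit below are hypothetical illustrations, not circuits from the study):

```python
# Toy model of a genetic logic circuit: sensors are boolean inputs, and the
# "circuit" is a boolean function the design software compiles into DNA.
# The sensor names and this one-gate design are hypothetical examples.

def nor(a: bool, b: bool) -> bool:
    """NOR is a common primitive in genetic circuits: a repressor
    silences the output when either input promoter is active."""
    return not (a or b)

def report_ok(arabinose_present: bool, low_oxygen: bool) -> bool:
    """Fire the fluorescent reporter only when NEITHER condition
    is sensed (a single NOR gate)."""
    return nor(arabinose_present, low_oxygen)

# Truth table for the one-gate circuit:
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", report_ok(a, b))
```

In the actual workflow described in the article, a designer writes logic like this in a hardware description language, and the software's output is not machine code but a DNA sequence implementing the same truth table in the cell.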
“Here at BU we used our strength in computer-aided design for biology to actually design the software and MIT produced the DNA and embedded it into the bacterial DNA,” says Densmore. “Our collaboration is a result of sharing the same vision of standardizing synthetic biology to make it more accessible and efficient.”
Historically, building logic circuits in cells was both time-consuming and unreliable, so fast, correct results are a game changer for research scientists, who get new DNA sequences to test as soon as they hit the “run” button. This novel approach of using a common programming language opens up the technology to anyone, giving them the ability to program a sequence and generate a strand of DNA immediately.
“It used to be that only people with knowledge of computers could build a website, but then resources like WordPress came along that gave people a simple interface to build professional-looking websites. The code was hidden in the back end, but it was still there, powering the site,” says Densmore. “That’s exactly what we are doing here with our software. The genetic code is still there, it is just hidden in the back end and what people see is this simplified tool that is easy, effective and produces immediate results that can be tested.”
According to Densmore, this study is an important first step that lays the foundation for future research on transforming cells into circuits, and the potential for impact is global, with applications in healthcare, ecology, agriculture and beyond. Possible applications range from bacteria that can be swallowed to aid in the digestion of lactose to bacteria that can live on plant roots and produce insecticide if they sense the plant is under attack.
“The possibilities are endless, and I am excited about it because this is the crucial first step to reach that point where we can do those amazing things,” says Densmore. “We aren’t at that level yet, but this is a stake in the ground that shows us we can do this.”
The BU/MIT collaboration will continue under the Living Computing Project, which was recently awarded a $10 million grant from the National Science Foundation. Future studies will look to improve upon the circuits that were tested, add other computer elements like memory to the circuits, and expand into other organisms such as yeast, paving the way for implanting the technology into more complex organisms like plant and animal cells.
By Rich Barlow
The College of Engineering has earned its highest-ever ranking from US News & World Report, placing 35th among its peer American schools in the magazine’s latest rankings. It’s a two-slot advance from last year and a long jump from a decade ago, when the school placed 52nd, says Kenneth Lutchen, dean of ENG.
Additionally, ENG’s biomedical engineering instruction ranked ninth among such programs nationally. The ratings of 194 engineering schools considered peer assessments, student selectivity, student-faculty ratios, the number of doctoral degrees granted, and research funding, among other factors.
Lutchen attributes his school’s success to several strengths, starting with a commitment to interdisciplinary research across both the college and the University, “recruiting complementary faculty in areas such as photonics, information and cyber-physical systems, the intersection of engineering and biology, advanced materials, and nanotechnology.” That approach, he says, has garnered “tremendous extramural funding success among our faculty.”
Second, in recent years, ENG boosted research and educational partnerships with industry, using assessments of the school’s programs by these partners to improve them. Meanwhile, Lutchen says, the ENG faculty has matched prowess at securing funding with “scholarship in their field, and in how that scholarship eventually impacts societal challenges.”
Over the past decade, ENG’s rankings have marked “the largest single improvement of any engineering school in the country” among those that made the top 52 in 2006, he notes. Every one of its degree programs now scores in the top 20 in its discipline among private universities, he says, adding that that has real-world effects, helping “attract ever-higher quality in our faculty and our PhD students.”
A version of this story originally appeared on BU Today.
Leading Engineers Visit BU as Part of the ECE Distinguished Lecture Series to Discuss Research with Students and Faculty
By Rebecca Jahnke, COM ’17
BU’s Electrical & Computer Engineering department draws renowned leaders of the field to present as part of the ECE Distinguished Lecture Series. The topics presented are always changing, but consistently span diverse research areas. The Fall 2015 lineup included academics Daniel Fleetwood, Kevin Skadron and Ralph Etienne-Cummings.
Despite Fleetwood, Skadron and Etienne-Cummings’ varying research focuses, the trio has much in common. All are highly decorated IEEE Fellows with many accolades to their names. They hold ten patents between them. Through the groundbreaking publications they’ve authored, the group has effectively written the science today’s students are learning. Work conducted at posts throughout the country – and for some, on sabbatical abroad – further reflects the breadth of their influence.
Fleetwood kicked off this season’s series with a lecture entitled “Moore’s Law and Radiation Effects on Microelectronics” in September. Fleetwood is the Chair of Vanderbilt University’s Department of Electrical Engineering & Computer Science as well as the university’s Olin H. Landreth Professor of Engineering. His lecture, examining the effects of Moore’s Law size and voltage scaling, followed from his research in nanoscience and technology as well as risk and reliability. A Fellow of the American Physical Society and an IEEE Fellow, Fleetwood has also received the IEEE Nuclear and Plasma Sciences Society’s Merit Award. He has authored over 380 publications, received ten Outstanding Paper Awards, and seen his research cited upwards of 7,000 times.
The series continued with a lecture by Kevin Skadron, University of Virginia Department of Computer Science Chair and Harry Douglas Forsyth Professor. His October presentation, “Automata Processing: Massively-Parallel Acceleration for Approximate Pattern Matching,” provided an overview of the AP architecture and observations from accelerating its applications. Skadron describes his research as exploring processor design techniques for managing power, thermal and reliability constraints, with a focus on manycore and heterogeneous architectures. He holds two patents and has amassed over 100 peer-reviewed publications since his college summers spent interning at Microsoft and Intel.
Ralph Etienne-Cummings, Professor and Chair of Johns Hopkins University’s Department of Electrical and Computer Engineering, closed out this semester’s series in December. This final presentation – “I, Robot: Blurring the lines between Mind, Body and Robotics” – suggested new approaches to brain-machine interfaces (BMI). Etienne-Cummings’ research interests include systems and algorithms for biologically inspired and low-power processing, biomorphic robots, applied neuroscience, neural prosthetics, and computer-integrated surgical systems and technologies. His curiosity has been evident since childhood, when he repaired his own shortwave radio to listen to a soccer match. Now the holder of seven patents, Etienne-Cummings makes time for diversity and mentoring initiatives intended to awaken a similar curiosity in others.
Computer engineer Densmore and team aim to advance synthetic biology
By Michael G Seele
The rapidly growing field of synthetic biology has made long strides in recent years as researchers have modified the genetic makeup of living organisms to get them to behave in different ways — flagging the presence of toxins in the environment, for example. Researchers have done this by breaking down biology into basic building blocks. However, using these building blocks is difficult without a clear design methodology and the supporting quantitative metrics researchers need to make decisions.
Associate Professor Douglas Densmore (ECE, BME) would like to take the guesswork out of biological design and speed the development of synthetic biology in the process. Working under a new $10 million National Science Foundation “Expeditions in Computing” grant, Densmore will lead the Living Computing Project, a comprehensive effort to quantify synthetic biology, using a computer engineering approach to create a toolbox of carefully measured and catalogued biological parts that can be used to engineer organisms with predictable results. These parts will allow the entire field to better understand which computing principles can be applied repeatedly and reliably to synthetic biology.
Densmore and assistant professors Ahmad Khalil (BME) and Wilson Wong (BME), along with Research Assistant Professor Swapnil Bhatia (ECE), will take the lead on the project, partnering with colleagues at MIT and Lincoln Laboratory over the course of the five-year grant. The award marks the first time a project that explicitly explores computing principles in multiple living organisms, and openly archives the results, has been funded.
“This puts a stake in the ground to make synthetic biology more rigorous,” Densmore said. “We want to build a foundation that’s well understood, built to use software tools, and that can serve as an open-source starting place for many advanced applications.”
Synthetic biologists take snippets of DNA and combine them in novel ways to produce defined behavioral characteristics in organisms. For instance, Densmore envisions a day when one might engineer a cell to change state when it detects cancer. The cell could be introduced into a patient, retrieved after a time and read like the memory of a computer, enabling detection of disease much earlier and less invasively than is now possible. Engineering that cell could be far easier and faster if researchers had a detailed inventory of parts and corresponding software tools they could use to create it.
Densmore is a core member of — and the only computer engineer in — BU’s new Biological Design Center. He has long been applying the kinds of tools used in computer engineering to synthetic biology. His software aims to identify and characterize biological parts — segments of DNA — and assemble them into complex biological systems. The NSF Expeditions in Computing grant will allow for expansion of that effort, but there are significant challenges in applying computer engineering principles to natural systems.
“What is power consumption in biology?” Densmore cites as an example. “What are the metrics in biology that make sense, can be repeated, and are reliable? You can’t make decisions in engineering without metrics and quantifiable information.”
“Programming a flower to change color, a cell to repair damaged tissue, or a mosquito to defeat malaria, is likely to require a different computational model than programming an app for your laptop,” said Bhatia. “Discovering this new type of computational thinking in partnership with synthetic biologists is what I am most excited about.”
Densmore hopes this project will take synthetic biology from an artisanal endeavor to a true engineering discipline with a solid, quantified foundation.
“Computation is important for moving any field forward and that’s what we’re trying to do with synthetic biology,” Densmore said. “We’re trying to build a library based on computing principles for the whole community, an open-source repository of biological pieces that use those principles reliably, repeatedly, and with broad applicability.”
“The Expeditions in Computing program enables the computing research community to pursue complex problems by supporting large project teams over a longer period of time,” said Jim Kurose, NSF’s head for Computer and Information Science and Engineering. “This allows these researchers to pursue bold, ambitious research that moves the needle for not only computer science disciplines, but often many other disciplines as well.”
ECE Professor’s LED Discovery at Heart of Case
By Joel Brown, published in BU Today
A US District Court jury has awarded Boston University more than $13 million after finding that three companies infringed on a BU patent for blue LEDs (light emitting diodes), used in countless cell phones, tablets, laptops, and lighting products.
After a highly technical three-week trial in November, the 10-person jury unanimously found that the companies had willfully infringed on BU’s patent for the invention by 2013 Innovator of the Year Theodore Moustakas, College of Engineering Distinguished Professor of Photonics and Optoelectronics Emeritus. Because the jury found the infringement to be willful, the $13,665,000 award could be doubled or tripled by Judge Patti B. Saris. No date has yet been announced for further proceedings.
Despite the amount of damages awarded, “the best part of this is that it validates Professor Moustakas’ work,” says Michael Pratt (Questrom’12), interim managing director of BU’s Technology Development office. “The story is really not about the money. The first thing we want is recognition of his seminal contribution to this field.”
Moustakas, who became a professor emeritus when he retired in June but continues to conduct research at the Photonics Center, testified extensively at the trial and was present in court every day. When the judge read the jury’s verdict, “I put my head down,” he says. “I cried.” He describes the jury’s decision as “amazing…everything we asked,” saying also that his lifetime’s work was being challenged.
“Fundamental to our mission as a global research institution is nurturing an environment of discovery that supports our faculty and the incredibly important work they do,” says Jean Morrison, provost and chief academic officer. “We are delighted with the verdict in this case. Boston University has successfully fought, and will continue to fight, for our faculty members and the intellectual property they create here.”
The three primary defendants, all Taiwan-based, were Epistar Corporation, Everlight Electronics Co., Ltd., and Lite-On Technology Corporation, along with various subsidiaries, most located in the United States. Each is involved in manufacturing or packaging LEDs for use in consumer electronics. A number of big-name electronics manufacturers were initially part of the University’s case, but they avoided litigation by joining a settlement that includes licensing and confidentiality agreements.
The University was represented by Michael Shore, a partner at Shore Chan DePumpo LLP, in Dallas, specialists in intellectual property cases, and Erik Belt, a partner specializing in patent disputes at the Boston law firm McCarter and English LLC, which has represented BU before. While it is possible for the defendants to appeal the verdict, Belt says it would be difficult to overturn the jury’s clear finding of fact.
The University will receive less than half of the final award, after the attorneys, who took the case on a contingency basis, and previous patent licensees are paid. Moustakas will receive 30 percent of the University’s share.
Moustakas joined BU in 1987 and was named the University’s inaugural Distinguished Professor of Photonics and Optoelectronics in 2014. A search is under way for his successor, and the Distinguished Professorship will be renamed the Theodore Moustakas Professorship of Photonics and Optoelectronics.
Moustakas’ invention dates to June 22, 1990, when researchers in his lab were trying to produce microscopically thin layers of gallium nitride to be used in the LEDs, growing crystals of the substance at high temperatures. They discovered that a heater used in the experiment had malfunctioned and the material had cooled to 270 degrees Celsius, far below the intended 600 degrees. But instead of aborting the experiment, Moustakas told them to fix the heater and continue. The snafu led to the growth of a smoother, more translucent gallium nitride layer that also grew much faster when crystallized at the higher temperature, a result replicated—deliberately—the very next day.
The main patent for the LED was issued in 1997, based on an application first submitted in 1991. Since then, blue LEDs have become a key component in many products, because they can generate white light when coated with phosphor.
“The real story is the robustness of Moustakas’ technology,” says Pratt. “It really did become a personal story. There was an attack, an affront to his creation. They had two experts saying it didn’t exist…and the jury wasn’t buying that at all.”
“To infringe in patent law, you don’t have to know about the patent and you don’t have to have an intent,” says Belt. To prove willfulness, “you basically have to show the other side knew of the patent and they were perhaps recklessly disregarding the fact that they were infringing or willfully blind to it. There’s a lot of ways to say it, but you basically have to show that there was willful disregard for BU’s patent rights.
“I think this really validates Professor Moustakas’ scientific breakthrough and establishes him as one of the great scientists in his field,” Belt says.
By Rebecca Jahnke (COM ’17) and Bhumika Salwan (Questrom ’16)
Boston University hosted over 300 attendees November 12–15 at the Metcalf Trustee Center for the Students for the Exploration and Development of Space (SEDS) SpaceVision 2015 Conference. The conference is entirely student-run and space-centric. It bills itself as connecting present and future space leaders and is part of international nonprofit SEDS’ larger mission to empower students at the high school, undergraduate and graduate levels to impact space exploration.
BU Engineering seniors Mehmet Akbulut (ME ’16) and Dean De Carli (EE ’16) spearheaded conference planning. Both Akbulut and De Carli, who served as Chair of Operations and Chair of Programming, respectively, had attended the 2013 Arizona SpaceVision Conference. After pondering why the conference had yet to be hosted in a major city like Boston, the pair submitted a bid to host the conference at Boston University and successfully secured the 2015 venue.
Akbulut oversaw logistics, registration, personnel, and general operations of the event while De Carli took charge of programming and speakers. Together, they developed an agenda that featured industry speakers, panels, a business plan competition, and a first-ever peer mentor session. By bringing students together with leaders in the aerospace community, the conference offered attendees invaluable networking opportunities and the chance to view the future of space development through an interdisciplinary lens.
The SEDS, SpaceVision, Rocket Propulsion, and small satellite efforts at BU are all truly interdisciplinary and interdepartmental. This creates a forum for students in different concentrations to work as a team and further learning in fields such as space research. Both Akbulut and De Carli attribute their success running SpaceVision 2015 to the education and leadership opportunities they’ve had in the College of Engineering and Department of Electrical and Computer Engineering (ECE).
“ECE has prepared me to help with SpaceVision by giving me the opportunity to lead in student groups such as the Boston University Rocket Propulsion Group. It’s given me the leadership skills that I have been able to translate to a much larger scale, such as being Chair of this conference,” De Carli said.
The College of Engineering, Department of Electrical and Computer Engineering, Department of Mechanical Engineering and Center for Space Physics jointly sponsored the conference. Outside sponsors included Arizona State University’s School of Earth and Space Exploration and industry sponsors like Lockheed Martin.
ECE Assistant Prof is Rising Star in Machine Learning
By Michael S. Goldberg
To Brian Kulis, advances in machine learning and artificial intelligence bring with them the opportunity to mesh theory with real-world applications, like driverless cars and computers that can describe aloud the objects in front of them.
“You want computers to be able to recognize what they are seeing in images and video,” says Kulis, a College of Engineering assistant professor of electrical and computer engineering. “For instance, can it recognize all the objects in a picture? Or a more difficult problem would be, can it look at a video and describe in English what is happening in the video? That is a major application area for machine learning these days.”
Kulis’ expertise in machine learning, along with his research in computer vision systems and other applications, brought him to BU this fall and has earned him the University’s inaugural Peter J. Levine Career Development Professorship, which will be awarded annually to rising junior faculty in the electrical and computer engineering department. The professorship’s three-year stipend will support scholarly and laboratory work. It was established by a gift from Peter J. Levine (ENG’83), a partner at the Silicon Valley venture capital firm Andreessen Horowitz and a part-time faculty member at Stanford University’s Graduate School of Business.
Kulis is a rising star in the machine learning field and the Levine professorship speaks to BU’s recognition of his achievements thus far, says Kenneth R. Lutchen, dean of ENG, and is a commitment to helping Kulis build on his world-class research and teaching.
Lutchen adds that Kulis, who earlier this year also received a National Science Foundation Faculty Early Career Development (CAREER) Award for research into machine learning systems, will be a critical faculty member of ENG’s new master’s degree specialization in data analytics.
“We think it will be one of the most popular specializations we have, and it will be accessible not just to students in this department, but also to biomedical, mechanical engineering, and systems engineering students who will want to have this same specialization. Brian’s expertise is perfectly aligned with teaching this,” Lutchen says.
Also a College of Arts & Sciences assistant professor of computer science, Kulis earned a bachelor’s degree in computer science and mathematics at Cornell University and a doctorate in computer science at the University of Texas at Austin. He did postdoctoral work at the University of California, Berkeley, then spent three years on the faculty of Ohio State University before coming to BU.
Millions or billions of data points
Data science is about managing huge data sets—think millions or billions of data points, from an array of sources—and programming computers to analyze the data and make predictions based on identified patterns. Advances in storing and analyzing these growing collections of information have made Big Data a hot field in both academia and industry, with Harvard Business Review pronouncing data scientist “the sexiest job of the 21st century.”
Those advances include artificial intelligence and machine learning, and they are what enable Kulis to develop exciting connections between theory and action. “There is a nice combination between the mathematics and the theoretical aspects of machine learning. It’s a very applied field, trying to solve real problems,” he says. “That balance is pretty rare.”
He describes his specific area of research as scalable nonparametric machine learning. A traditional parametric approach fixes the complexity of the statistical model (say, the number of clusters) before the analysis begins; Kulis pursues a different method. In his research, the data itself determines how simple or complicated the model should be.
An example of this approach, he says, is analyzing a large collection of documents for the content they contain. A parametric model would establish 10 clusters of documents to analyze, one each on a set topic. A nonparametric model would instead analyze all of the documents and determine how many topics should be included in the analysis. “You want the data itself to guide the discovery process, and so if there is a lot to say, then you want your algorithm to reveal that structure,” he says. “It’s a more flexible way to do analysis.”
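A concrete algorithm in this spirit is DP-means, from a 2012 paper Kulis co-authored with Michael Jordan: it runs like k-means, except that a point too far from every existing cluster opens a new one, so the data decides the cluster count. Below is a minimal sketch; the threshold `lam` and the toy two-blob data are illustrative assumptions, not from the article.

```python
import numpy as np

def dp_means(X, lam, n_iters=20):
    """k-means-like loop, but a point whose squared distance to every
    centroid exceeds lam spawns a new cluster, so the cluster count
    is driven by the data rather than fixed in advance."""
    centroids = [X.mean(axis=0)]
    assignments = [0] * len(X)
    for _ in range(n_iters):
        assignments = []
        for x in X:
            d = [np.sum((x - c) ** 2) for c in centroids]
            if min(d) > lam:                  # too far from every cluster:
                centroids.append(x.copy())    # open a new one here
                assignments.append(len(centroids) - 1)
            else:
                assignments.append(int(np.argmin(d)))
        for k in range(len(centroids)):       # recompute each centroid
            members = X[[i for i, a in enumerate(assignments) if a == k]]
            if len(members):
                centroids[k] = members.mean(axis=0)
    used = sorted(set(assignments))           # drop empty clusters
    relabel = {k: i for i, k in enumerate(used)}
    centroids = np.array([centroids[k] for k in used])
    labels = np.array([relabel[a] for a in assignments])
    return centroids, labels

# Two well-separated blobs; lam controls how eagerly clusters form.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.05, (20, 2)),
               rng.normal(5.0, 0.05, (20, 2))])
centroids, labels = dp_means(X, lam=1.0)      # finds 2 clusters on its own
```

With more blobs (or a smaller `lam`), the same code discovers more clusters—the “data guides the discovery process” idea in miniature.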
The field is ripe for approaches that allow researchers from different fields–biology and business, for example–to apply machine learning techniques to develop new ways of looking at the data they collect. Kulis says he is looking forward to working with faculty and students from different BU departments both in research and in his courses. “Machine learning brings together a lot of fields that for a long time have been fairly disjoined. When it comes to teaching, a lot of my excitement is in trying to bridge these different disciplines and to teach courses that bring together people from different areas,” he says.
The curriculum, Lutchen says, has relevance to the world at large: “As an engineering faculty, we want people to understand how these new tools and techniques can help society.”
Michael S. Goldberg can be reached at firstname.lastname@example.org.
Using the strange laws of quantum mechanics to encrypt the world’s most secret messages
By Kate Becker, originally published in BU Research
Just outside Washington, DC, a heavily armored truck, protected by armed guards, rumbles toward the Pentagon. Its cargo is critical to keeping the most sensitive government communications secret. But it’s not what you might expect. That precious cargo is nothing but numbers.
Though the details are a government secret, according to Alexander Sergienko, a Boston University College of Engineering professor of electrical and computer engineering, trucks like this are one likely way that the United States government might transport the numbers that are at the heart of the only unbreakable encryption technique in the world: the one-time pad. The one-time pad is a string of random numbers, also called a key, which a sender uses to encrypt her message.
But the one-time pad has one big weakness: the random numbers that are the key to coding and decoding it have to be physically transported from one place to another. Sending them over the internet, encrypted by traditional security measures, would be like locking the keys to Fort Knox inside a child’s piggy bank. If the numbers are intercepted, the code is worthless. So, how can you get random numbers from place to place with absolute security? The answer isn’t more armed guards and armored trucks, says Sergienko, who also has an appointment as a professor of physics in BU’s College of Arts & Sciences. It’s quantum mechanics, the bizarre set of rules that governs the subatomic world, where the everyday norms we take for granted—that an object should have a well-defined location and speed, for instance, and that it can only be in one place at a time—go out the window.
The one-time pad encryption method dates back to before World War II, and was used to secure diplomatic communiqués and wartime dispatches. The sender encrypts her message by taking each letter, or bit, of the original message and combining it mathematically with successive random numbers from the key, transforming it into a sequence of totally random numbers. (The longer the message, the longer the key must be: a message that’s 100 letters long requires a key of at least 100 digits.) The encrypted message is now absolutely secure: the sender can broadcast it over a radio or even scream it from the rooftops, if she wants. Only someone with an identical copy of the key can crack the code, by subtracting a matching set of numbers from the broadcast to unlock it. But the key is unbreakable only if it is used just once; if used a second time, code breakers can begin to reverse-engineer the random-number list. With every additional use, the code gets weaker and weaker, so the bank of random numbers must be constantly refilled to keep secure government communication going. That means more numbers, more armored trucks—and more effort and expense.
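The combine-and-subtract arithmetic above is simple to sketch. Classic pads added digits modulo 10; for bytes, the standard equivalent is XOR, which is its own inverse, so the same operation encrypts and decrypts. The helper below is a hypothetical illustration, not any real implementation.

```python
import secrets

def otp_crypt(message: bytes, key: bytes) -> bytes:
    """One-time pad: combine each message byte with the matching key
    byte. XOR undoes itself, so this both encrypts and decrypts."""
    assert len(key) >= len(message), "key must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

key = secrets.token_bytes(100)            # random key, used exactly once
ciphertext = otp_crypt(b"ATTACK AT DAWN", key)
plaintext = otp_crypt(ciphertext, key)    # same key recovers the message
```

Note the key is as long as the message and must never be reused—exactly the property that keeps the armored trucks busy.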
Sergienko is one of a group of physicists and computer scientists at BU and beyond working to solve this problem with an encryption technique called secure quantum key distribution. They are harnessing cutting-edge technology to implement basic protocols that are some 30 years old. Quantum key distribution exploits the strange laws of quantum mechanics to create a truly random key that is totally secure from eavesdroppers.
Here’s how it works. Each “bit” of the key is encoded in the polarization of a single photon—essentially, the direction in which the light particle is “waving.” It can be up, down, or anything in between. In this case, though, each photon is prepared in only one of two “bases”—horizontal/vertical, where horizontal might represent a one and vertical a zero; or tilted at an angle, with 45 degrees up representing one and 45 degrees down representing zero.
Sergienko maps out how it works using three characters well known to physics students: Alice, who’s sending the message; Bob, who is receiving it; and Eve, an eavesdropper out to covertly intercept it. To read out the state of each incoming photon, Bob has to pick the correct base. Alice can’t tell him the bases in advance, so he guesses randomly. Later, Alice reports the bases she used for each photon, and Bob throws away the readings for which he picked the wrong base. The result: Bob and Alice end up with identical, random strings of ones and zeros that they can use as a fresh key for their future communications.
If eavesdropper Eve tries to intercept photons traveling from Alice to Bob, Bob will notice a shortage of incoming photons. Eve could attempt to hide the theft by copying the polarization of each stolen photon and sending it on to Bob, but the laws of quantum mechanics, which make it impossible to perfectly “clone” the quantum state of a photon, get in her way, so she is bound to make mistakes that betray her presence. So, not only do Alice and Bob have truly random keys in hand, they also have the ultimate security against eavesdroppers: the laws of physics.
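The sifting step Alice and Bob perform can be simulated in a few lines. This is a toy sketch under simplifying assumptions (a noiseless channel, no Eve): when Bob guesses the wrong base, quantum mechanics gives him a 50/50 coin flip, and discarding those positions leaves the two parties with identical random keys.

```python
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

n = 1000
alice_bits  = random_bits(n)   # the raw key material
alice_bases = random_bits(n)   # 0 = horizontal/vertical, 1 = tilted 45 degrees
bob_bases   = random_bits(n)   # Bob guesses a base for each incoming photon

# Matching base: Bob reads Alice's bit exactly; wrong base: a coin flip.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only the positions where the bases matched (about half).
sifted_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
                if ab == bb]
sifted_bob = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
              if ab == bb]
```

Roughly half the photons survive the sift, which is why the raw photon rate matters so much for the final key rate.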
That’s the easy part, from Sergienko’s point of view. The hard part: making this technique work over practical distances. That’s because, to retain the quantum properties that make them so useful for secure communication, photons have to be kept isolated from all external disturbances. Another challenge: the same “no-cloning” law that thwarted Eve prohibits the use of any amplifiers, standard in traditional telecommunications, on the optical lines that transmit the photons. “One single photon has to travel from point A to point B,” says Sergienko. It’s as if the code were written on eggshells. How can you send millions or billions of those eggshells, far and fast?
“It’s a dilemma,” says Sergienko. “The quantum realm gives you more opportunities, but to make these opportunities work for people, you have to solve the problem of how the quantum state will survive in the classical environment,” the messy reality in which it’s nearly impossible to avoid interacting with other fields and particles.
Today’s “best of the best” technology can create a few million quantum states per second, says Sergienko. But the farther you try to send them, the more of them will “crack” like broken eggshells—that is, get absorbed into the line and disappear—before they reach their destination. So while some physicists are chasing distance records, dispatching quantum states across hundreds of kilometers, Sergienko is more interested in finding the optimal balance between transmission distance and the rate at which new states can be created. Today, data rates of about 100 kHz are possible within a modest city-sized network. Not exactly telecom speed—typical home broadband connections run 10 or 100 times faster—but good enough to transmit the bits of a robust key that guarantees the highest level of secure communication.
In 2003 and 2004, Sergienko and Gregg Jaeger, an associate professor of natural sciences and mathematics in BU’s College of General Studies, led a BU team that partnered with researchers at Harvard University and BBN Technologies (now a part of Raytheon) to build just such a system. With support from the Defense Advanced Research Projects Agency (DARPA), the military’s advanced research arm, they used standard commercial fiber optic cables in the ground to send photons between three sites in the greater Boston area: one at BU, one at Harvard, and a third at BBN’s headquarters, near Fresh Pond in Cambridge. The system spanned about 18 miles end-to-end. “We showed that this secure communication can be established between three nodes through the metropolitan fiber, and can go 24/7,” says Sergienko. Even though the data rate was not high—just about 1,000 bits per second, slower than a dial-up modem—over time, each site would build up a long enough key to enable secure communication on demand. The system ran for three years, and was followed several years later by similar, independent demonstration networks in Europe, Japan, and China.
What happened next? That’s a government secret. But Sergienko is confident that secure quantum key distribution networks are live today somewhere in the United States. The likeliest spots: Washington, DC, where such a network could enable secure communication between government agencies, eliminating the need for all those trucks; and Wall Street, where it would guarantee absolute privacy for transactions between financial institutions.
Today, Sergienko is trying to narrow the gap between quantum and classical data rates. With fiber quality nearly as good as it can get, and the rate at which new quantum states can be created almost maxed out, Sergienko and his colleagues around the world are taking a new tack: encoding more bits of information in a single photon. While photon polarization can only represent zero or one, a different property of photons, called orbital angular momentum, can encode at least 10 different distinguishable states, and possibly more. Instead of simple binary bits, cryptographers would have a whole mini alphabet to work with.
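The payoff of the larger alphabet is easy to quantify: a photon with d distinguishable states carries log2(d) bits, so moving from 2 polarization states to roughly 10 orbital-angular-momentum states more than triples the information per photon. A back-of-the-envelope illustration, not a measured figure:

```python
import math

# Bits carried by one photon with d distinguishable states: log2(d).
polarization_bits = math.log2(2)     # binary polarization: exactly 1 bit
oam_bits = math.log2(10)             # ~3.32 bits with 10 OAM states
gain = oam_bits / polarization_bits  # more than 3x the bits per photon
```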
As for those armored trucks? Though they might still be standard for transporting secret keys to remote locations, Sergienko wouldn’t be surprised if they are no longer pulling up to the Pentagon. But the secrets of the unbreakable code are still just that: secret.
This research is a continuation of work done by Professor Sergienko in 2013.
Densmore’s Contributions Part of a $32 Million DARPA Contract for Cutting-Edge Synthetic Biology Effort
By Rebecca Jahnke (COM ’17)
A $32 million contract from the Defense Advanced Research Projects Agency (DARPA) was awarded to “The Foundry” (http://web.mit.edu/foundry/), a DNA design and manufacturing facility at the Broad Institute of MIT and Harvard, to support the engineering of novel biological systems. Boston University Computer Engineering Professor Douglas Densmore played a key role, automating the facility’s design process with software inspired by electrical and computer engineering and establishing the novel, large-scale, parallel design processes that landed the contract.
The Foundry focuses on designing, testing and fabricating large sequences of genetic information. The intent is to create DNA nucleotide arrangements that can be applied widely for medical, industrial and agricultural purposes.
Engineers at the Foundry work with chains containing millions of nucleotides, all of which are specified using only the letters A, T, G and C. Writing out such vast sequences by hand is impossible for any engineer, so the Foundry sought Densmore’s computer-aided design expertise to help automate these complex processes.
Densmore’s contributions will allow the Foundry to significantly increase its output of DNA designs beyond what would have been possible relying on conventional design techniques. The Foundry’s work will lead to greater advances faster – tackling issues like delivering nitrogenous fertilizer to cereal crops and converting compounds that naturally occur in human bacteria into therapeutic drugs.
Douglas Densmore is a Kern Faculty Fellow, Hariri Institute for Computing and Computational Science and Engineering Junior Faculty Fellow, and Professor in the Department of Electrical and Computer Engineering at Boston University. He also acts as the director of the Cross-disciplinary Integration of Design Automation Research (CIDAR) group at Boston University, where his team develops computational and experimental tools for synthetic biology. His research facilities include both a computational workspace in the Department of Electrical and Computer Engineering as well as experimental laboratory space in the Boston University Biological Design Center. Densmore is the President of the Bio-Design Automation Consortium, Nona Research Foundation, and Lattice Automation, Inc.
For more information, please see the Broad Institute of MIT press release.