Transforming Living Cells into Computers
By Sara Elizabeth Cody
Whether it’s artificial skin that mimics squid camouflage or an artificial leaf that produces solar energy, a common trend in engineering is to take a page out of biology to inspire design and function. However, an interdisciplinary team of BU researchers has flipped this idea, instead using computer engineering to inspire biology, in a study recently published in Science.
“When you think about it, cells are kind of computers themselves. They have to communicate with other cells and make decisions based on their environment,” says Associate Professor Douglas Densmore (ECE, BME), who oversaw the BU research team. “By turning them into circuits, we’ve figured out a way to make cells that respond the way we want them to respond. What we are looking at with this study is how to describe those circuits using a programming language and to transform that programming language into DNA that carries out that function.”
Using a programming language commonly used to design computer chips, ECE graduate student Prashant Vaidyanathan created design software that encodes logical operations and bio-sensors right into the DNA of Escherichia coli bacteria. Sensors can detect environmental conditions, while logic gates allow the circuits to make decisions based on this information. These engineered cells can then act as mini processing elements, enabling the large-scale production of bio-materials or helping detect hazardous conditions in the environment. Former postdoctoral researcher Bryan Der facilitated the partnership between BU and the Massachusetts Institute of Technology to pursue this research study.
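The division of labor between sensors and logic gates can be pictured in ordinary software terms. The sketch below is only an illustration of the concept; the sensor names, thresholds, and AND-gate choice are hypothetical and do not reflect the team’s actual DNA-level designs:

```python
# A minimal sketch of the sensor + logic-gate idea behind engineered cell
# circuits. Inputs, thresholds, and the gate are illustrative assumptions,
# not the BU/MIT software's actual output.

def sensor(level, threshold):
    """A bio-sensor abstracted as a comparator: active above a threshold."""
    return level > threshold

def circuit(input_a, input_b):
    """An AND gate: express the output gene only if both inputs are sensed."""
    return sensor(input_a, 0.5) and sensor(input_b, 0.5)

# Truth table for the engineered cell's decision logic
for a, b in [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]:
    print(a, b, circuit(a, b))
```

The design software’s job, in effect, is to compile a specification like this into a DNA sequence whose biochemistry realizes the same truth table.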
“Here at BU we used our strength in computer-aided design for biology to actually design the software and MIT produced the DNA and embedded it into the bacterial DNA,” says Densmore. “Our collaboration is a result of sharing the same vision of standardizing synthetic biology to make it more accessible and efficient.”
Historically, building logic circuits in cells was both time-consuming and unreliable, so fast, correct results are a game changer for research scientists, who get new DNA sequences to test as soon as they hit the “run” button. This novel approach of using a common programming language opens up the technology to anyone, giving them the ability to program a sequence and generate a strand of DNA immediately.
“It used to be that only people with knowledge of computers could build a website, but then resources like WordPress came along that gave people a simple interface to build professional-looking websites. The code was hidden in the back end, but it was still there, powering the site,” says Densmore. “That’s exactly what we are doing here with our software. The genetic code is still there, it is just hidden in the back end and what people see is this simplified tool that is easy, effective and produces immediate results that can be tested.”
According to Densmore, this study is an important first step that lays the foundation for future research on transforming cells into circuits, and the potential for impact is global, with applications in healthcare, ecology, agriculture and beyond. Possible applications range from bacteria that can be swallowed to aid in the digestion of lactose to bacteria that can live on plant roots and produce insecticide if they sense the plant is under attack.
“The possibilities are endless, and I am excited about it because this is the crucial first step to reach that point where we can do those amazing things,” says Densmore. “We aren’t at that level yet, but this is a stake in the ground that shows us we can do this.”
The BU/MIT collaboration will continue under the Living Computing Project, which was recently awarded a $10M grant from the National Science Foundation. Future studies will look to improve upon the circuits that were tested, add other computer elements, such as memory, to the circuits, and expand into other organisms such as yeast, which will pave the way for implanting the technology into more complex organisms like plant and animal cells.
Leading Engineers Visit BU as Part of the ECE Distinguished Lecture Series to Discuss Research with Students and Faculty
By Rebecca Jahnke, COM ’17
BU’s Electrical & Computer Engineering department draws renowned leaders of the field to present as part of the ECE Distinguished Lecture Series. The topics presented are always changing, but consistently span diverse research areas. The Fall 2015 lineup included academics Daniel Fleetwood, Kevin Skadron and Ralph Etienne-Cummings.
Despite Fleetwood, Skadron and Etienne-Cummings’ varying research focuses, the trio has much in common. All are highly decorated IEEE Fellows with many accolades to their names. They hold ten patents between them. Through the groundbreaking publications they’ve authored, the group has effectively written the science today’s students are learning. Work conducted at posts throughout the country – and for some, on sabbatical abroad – further reflects the breadth of their influence.
Fleetwood kicked off this season’s series with a lecture entitled “Moore’s Law and Radiation Effects on Microelectronics” in September. Fleetwood is the Chair of Vanderbilt University’s Department of Electrical Engineering & Computer Science as well as the university’s Olin H. Landreth Professor of Engineering. His lecture, examining the effects of Moore’s Law size and voltage scaling, followed his research in nanoscience and technology as well as risk and reliability. A Fellow of the American Physical Society and an IEEE Fellow, Fleetwood also received the IEEE Nuclear and Plasma Sciences Society’s Merit Award. Having authored over 380 publications, Fleetwood has received ten Outstanding Paper Awards, and his research has been cited upwards of 7,000 times.
The series continued with a lecture by Kevin Skadron, University of Virginia Department of Computer Science Chair and Harry Douglas Forsyth Professor. His October presentation, “Automata Processing: Massively-Parallel Acceleration for Approximate Pattern Matching,” provided an overview of the AP architecture and observations from accelerating its applications. Skadron describes his research as exploring processor design techniques for managing power, thermal and reliability constraints, all with a focus on manycore and heterogeneous architectures. He holds two patents and has authored over 100 peer-reviewed publications since his college summers spent interning at Microsoft and Intel.
Ralph Etienne-Cummings, Professor and Chair of Johns Hopkins University’s Department of Electrical and Computer Engineering, closed out this semester’s series in December. This final presentation – “I, Robot: Blurring the lines between Mind, Body and Robotics” – suggested new approaches to brain-machine interfaces (BMI). Etienne-Cummings’ research interests include systems and algorithms for biologically inspired and low-power processing, biomorphic robots, applied neuroscience, neural prosthetics and computer-integrated surgical systems and technologies. His high level of curiosity has been evident since he was a child and repaired his own shortwave radio to listen to a soccer match. Now the holder of seven patents, Etienne-Cummings is known to make time for diversity and mentoring initiatives intended to awaken a similar curiosity in others.
Computer engineer Densmore and team aim to advance synthetic biology
By Michael G Seele
The rapidly growing field of synthetic biology has made great strides in recent years as researchers have modified the genetic makeup of living organisms to get them to behave in different ways — flagging the presence of toxins in the environment, for example. Researchers have done this by breaking down biology into basic building blocks. However, using these building blocks has become increasingly difficult without a clear design methodology and supporting quantitative metrics researchers could use to make decisions.
Associate Professor Douglas Densmore (ECE, BME) would like to take the guesswork out of biological design and speed the development of synthetic biology in the process. Working under a new $10 million National Science Foundation “Expeditions in
Computing” grant, Densmore will lead the Living Computing Project, a comprehensive effort to quantify synthetic biology, using a computing engineering approach to create a toolbox of carefully measured and catalogued biological parts that can be used to engineer organisms with predictable results. These parts will allow the entire field to understand better what computing principles can be applied repeatedly and reliably to synthetic biology.
Densmore and assistant professors Ahmad Khalil (BME) and Wilson Wong (BME), and Research Assistant Professor Swapnil Bhatia (ECE) will take the lead on the project, partnering with colleagues at MIT and Lincoln Laboratory over the course of the five-year grant. The award marks the first time that explicitly exploring computing principles in multiple living organisms, and openly archiving the results, has been funded.
“This puts a stake in the ground to make synthetic biology more rigorous,” Densmore said. “We want to build a foundation that’s well understood, built to use software tools, and that can serve as an open-source starting place for many advanced applications.”
Synthetic biologists take snippets of DNA and combine them in novel ways to produce defined behavioral characteristics in organisms. For instance, Densmore envisions a day when one might engineer a cell to change state when it detects cancer. The cell could be introduced into a patient, retrieved after a time and read like the memory of a computer, enabling detection of disease much earlier and less invasively than is now possible. Engineering that cell could be far easier and faster if researchers had a detailed inventory of parts and corresponding software tools they could use to create it.
Densmore is a core member of — and the only computer engineer in — BU’s new Biological Design Center. He has long been applying the kinds of tools used in computer engineering to synthetic biology. His software aims to identify and characterize biological parts — segments of DNA — and assemble them into complex biological systems. The NSF Expeditions in Computing grant will allow for expansion of that effort, but there are significant challenges in applying computer engineering principles to natural systems.
“What is power consumption in biology?” Densmore cites as an example. “What are the metrics in biology that make sense, can be repeated, and are reliable? You can’t make decisions in engineering without metrics and quantifiable information.”
“Programming a flower to change color, a cell to repair damaged tissue, or a mosquito to defeat malaria, is likely to require a different computational model than programming an app for your laptop,” said Bhatia. “Discovering this new type of computational thinking in partnership with synthetic biologists is what I am most excited about.”
Densmore hopes this project will take synthetic biology from an artisanal endeavor to a true engineering discipline with a solid, quantified foundation.
“Computation is important for moving any field forward and that’s what we’re trying to do with synthetic biology,” Densmore said. “We’re trying to build a library based on computing principles for the whole community, an open-source repository of biological pieces that use those principles reliably, repeatedly, and with broad applicability.”
“The Expeditions in Computing program enables the computing research community to pursue complex problems by supporting large project teams over a longer period of time,” said Jim Kurose, NSF’s head for Computer and Information Science and Engineering. “This allows these researchers to pursue bold, ambitious research that moves the needle for not only computer science disciplines, but often many other disciplines as well.”
Giles has recently accepted key roles aimed at advancing the fields of astronomy and supercomputing, all while continuing his work as a STEM diversity advocate.
By Gabriella McNevin and Rebecca Jahnke (COM ‘17)
Roscoe Giles is a Professor of Electrical and Computer Engineering at Boston University (BU). Within the last few months, Giles has become involved with an $864 million cooperative agreement to manage the National Radio Astronomy Observatory (NRAO). He has also accepted an invitation to aid in the development of U.S. supercomputing policies.
In October 2015, Giles started a two-year term as Chair of the Associated Universities, Inc. (AUI) Board of Trustees. The following month, the NSF approved the largest cooperative agreement its astronomy division has ever granted: a 10-year, $864 million contract with AUI to manage the NRAO. This record-breaking contract will tie AUI leadership to the core goals of astronomical research embraced by the NRAO.
Also in October, Giles was invited to the White House’s National Strategic Computing Initiative (NSCI) Workshop. The NSCI was established by President Obama’s executive order to ensure the United States continues its role as a supercomputing pioneer in the coming decades. The workshop sought to jumpstart ideas for a cohesive, multi-agency strategy. While at the workshop, he and other industry, academic, and government leaders discussed the challenges and opportunities associated with the increase in computing demands and the heightened role of big data in the ever-evolving technological landscape.
Giles is no stranger to government policy. Having served as Chairman of the United States Department of Energy’s Advanced Scientific Computing Advisory Committee from 2008 to 2015, Giles directly influenced the management and direction of federal scientific computing programs.
Giles’ expansive research interests provide a broad foundation to draw upon. Giles started his education studying physics. He obtained his Bachelor of Arts degree with honors from the University of Chicago and received his Master of Science and Ph.D. degrees from Stanford University.
Giles shifted his focus to electrical and computer engineering upon joining Boston University in 1985. Giles is focused on advanced computer architectures, distributed and parallel computing and computation science.
On LinkedIn, Roscoe Giles describes himself simply as an optimist intent on pushing “the envelope of computing and science in the large.”
Giles is well acquainted with national initiatives to increase diversity in STEM fields. Giles is listed by the Career Communications Group as one of the “50 Most Important Blacks in Research Science,” and was the first African American to earn a theoretical physics Ph.D. from Stanford. Additionally, Giles was the first African American conference chairman of the Supercomputing Conference, which took place in Baltimore, Maryland, in 2002.
To that end, Giles has been lauded not just for his research, but also for his community outreach. Giles was a founder and executive director of the Institute of African American E-Culture, an organization that worked to open access to information technology for minorities and disadvantaged communities across the country. Giles won the Computing Research Association (CRA) A. Nico Habermann Award for his service as a faculty adviser and Minority Engineers Society mentor.
At the Boston University Department of Electrical and Computer Engineering, Giles has received recognition including Scholar-Teacher of the Year in 1992. In 1996, Giles won Boston University’s College of Engineering Award for Excellence in Teaching.
By Rebecca Jahnke (COM ’17)
ECE PhD student Onur Sahin won first prize this November at the Association for Computing Machinery’s (ACM) Special Interest Group on Design Automation (SIGDA) Student Research Competition. Sahin, who is advised by ECE Professor Ayse Coskun, won for his project on providing sustainable performance to mobile device users, titled “Pushing QoS-Awareness into Thermal Management for Sustainable User Experience in Mobile Devices.”
Sahin soared through the competition’s multiple rounds at the International Conference on Computer Aided Design (ICCAD) in Austin. Contestants had entered by submitting a write-up describing their research focus, the novel aspects of their approach and the impact their projects could have on society. Sahin was among the 20 entrants invited to the poster presentation at the ICCAD, and the five subsequently selected by industry and academic judges to proceed. Those five delivered 10-minute presentations before a judging panel, where they were assessed on their knowledge of their areas, the contributions of their research and the quality of their presentations. Judges named Sahin the winner following this round.
Sahin’s project responds to modern mobile devices that have significantly increased computational abilities but also dissipate significant amounts of power as heat. Unlike other computing devices, mobile devices’ limited battery life and small size constrain their cooling capabilities. This poses a problem for the many users who run computationally intensive applications – like gaming, browsing, media and data processing – for extended durations.
Currently, mobile devices employ a thermal throttling mechanism to slow the devices and reduce their temperatures. However, this reduces performance levels and degrades the user experience.
Sahin’s project addresses the drawbacks of current thermal throttling techniques to mitigate thermal limitations on smartphones. By instituting techniques that prevent an application from boosting performance beyond what is actually required to run that application, Sahin proposes that heating can be slowed. This will allow users to interact with their devices for longer at higher performance levels. Having experimented with real smartphones, Sahin and his team report that their technique can be easily integrated into current mobile devices.
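The core idea can be sketched as a simple frequency-selection rule. This is a toy illustration of the general QoS-aware principle, not Sahin’s actual algorithm; the frequency levels and frame-rate numbers are invented for the example:

```python
def choose_frequency(required_fps, freq_levels, fps_at_freq):
    """Pick the lowest clock frequency that still meets the app's QoS target,
    rather than boosting to the maximum and then hitting thermal throttling."""
    for f in sorted(freq_levels):
        if fps_at_freq[f] >= required_fps:
            return f
    return max(freq_levels)  # QoS target unreachable: fall back to max

# Hypothetical numbers: a game needs 30 fps; higher clocks only add heat.
levels = [600, 900, 1200, 1500]            # MHz
fps = {600: 18, 900: 31, 1200: 45, 1500: 60}
print(choose_frequency(30, levels, fps))   # → 900
```

Capping the clock at 900 MHz here delivers the same user experience as 1500 MHz while generating far less heat, which is the intuition behind slowing thermal buildup without degrading perceived performance.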
This competition is one of several student research competitions co-located annually with ACM-sponsored conferences. Each conference focuses on a different major area of computing. The competition is sponsored by Microsoft Research and allows undergraduate and graduate students across computing disciplines to gain visibility for their research projects and finesse their abilities to effectively communicate their ideas.
Sahin will join winners from all conferences to compete in the ACM Grand Final against researchers from all computing areas. From there, the top three contenders and their advisors will receive formal recognition at the ACM Awards Banquet, where the Turing Award – the highest distinction in computer science – is presented annually.
Further information regarding the competition and the winners is available at http://src.acm.org/winners.html.
MOC successfully rallies academia, government and industry in developing new cloud.
By Rebecca Jahnke (COM ’17)
The Massachusetts Open Cloud (MOC) project – led by ECE Professor Orran Krieger – just announced a set of core industry partners, spanning key hardware, software and services industry sectors. The MOC is an ambitious project that aims to create a public cloud, based on a revolutionary model for a multi-provider Open Cloud eXchange (OCX).
In existing public clouds, a single provider operates the entire cloud. In contrast, the OCX model underlying the MOC allows multiple entities to provide computing resources and services on a level playing field. Having multiple providers – all with their own specialties – participating in the same cloud will enable a broader range of users and applications to be supported.
The core corporate partners of the MOC – Brocade, Cisco, Intel, Lenovo, Red Hat and Two Sigma – have made financial commitments as well as in-kind commitments, ranging from computer infrastructure in support of MOC deployment and operation, to engineering expertise to support the development of OCX functionality. The companies have also pledged executive sponsors to keep company and project goals aligned and to support MOC’s development. These new partnerships underscore the strong and growing industry support for the project, which has already secured in excess of $14 million of funding – more than quadruple the $3 million in seed funding that the MOC received from the Mass Tech Collaborative in 2014.
Incubated at and seed-funded by the Hariri Institute for Computing at BU (as part of the Cloud Computing Initiative led by its Director, Orran Krieger), this complex project has benefitted from strong BU institutional and administrative support, including the offices of the Provost, Corporate Relations, General Counsel, and IS&T Research Computing. Anchored at BU, the project is a collaboration that also involves Harvard University, MIT, UMass, and Northeastern University, as well as the Massachusetts Green High-Performance Computing Center (MGHPCC). The project leverages and builds on current and prior research by a number of ECE and CS faculty members at BU including Jonathan Appavoo, Azer Bestavros, Ran Canetti, Ayse Coskun, and Orran Krieger.
ECE Assistant Prof is Rising Star in Machine Learning
By Michael S. Goldberg
To Brian Kulis, advances in machine learning and artificial intelligence bring with them the opportunity to mesh theory with real-world applications, like driverless cars and computers that can describe aloud the objects in front of them.
“You want computers to be able to recognize what they are seeing in images and video,” says Kulis, a College of Engineering assistant professor of electrical and computer engineering. “For instance, can it recognize all the objects in a picture? Or a more difficult problem would be, can it look at a video and describe in English what is happening in the video? That is a major application area for machine learning these days.”
Kulis’ expertise in machine learning, along with his research in computer vision systems and other applications, brought him to BU this fall and has earned him the University’s inaugural Peter J. Levine Career Development Professorship, which will be awarded annually to rising junior faculty in the electrical and computer engineering department. The professorship’s three-year stipend will support scholarly and laboratory work. It was established by a gift from Peter J. Levine (ENG’83), a partner at the Silicon Valley venture capital firm Andreessen Horowitz and a part-time faculty member at Stanford University’s Graduate School of Business.
Kulis is a rising star in the machine learning field and the Levine professorship speaks to BU’s recognition of his achievements thus far, says Kenneth R. Lutchen, dean of ENG, and is a commitment to helping Kulis build on his world-class research and teaching.
Lutchen adds that Kulis, who earlier this year also received a National Science Foundation Faculty Early Career Development (CAREER) Award for research into machine learning systems, will be a critical faculty member of ENG’s new master’s degree specialization in data analytics.
“We think it will be one of the most popular specializations we have, and it will be accessible not just to students in this department, but also to biomedical, mechanical engineering, and systems engineering students who will want to have this same specialization. Brian’s expertise is perfectly aligned with teaching this,” Lutchen says.
Also a College of Arts & Sciences assistant professor of computer science, Kulis earned a bachelor’s degree in computer science and mathematics at Cornell University and a doctorate in computer science at the University of Texas at Austin. He did postdoctoral work at the University of California, Berkeley, then spent three years on the faculty of Ohio State University before coming to BU.
Millions or billions of data points
Data science is about managing huge data sets—think millions or billions of data points, from an array of sources—and programming computers to analyze the data and make predictions based on identified patterns. Advances in storing and analyzing these growing collections of information have made Big Data a hot field in both academia and industry, with Harvard Business Review pronouncing data scientist “the sexiest job of the 21st century.”
Those advances include artificial intelligence and machine learning, and they are what enable Kulis to develop exciting connections between theory and action. “There is a nice combination between the mathematics and the theoretical aspects of machine learning. It’s a very applied field, trying to solve real problems,” he says. “That balance is pretty rare.”
He describes his specific area of research as scalable nonparametric machine learning. While a traditional statistical model for analyzing a large amount of data would establish a model for performing the analysis, Kulis pursues a different method. In his research, the data itself determines how simple or complicated the analysis should be.
An example of this approach, he says, is analyzing a large collection of documents for the content they contain. A parametric model would establish 10 clusters of documents to analyze, one each on a set topic. A nonparametric model would instead analyze all of the documents and determine how many topics should be included in the analysis. “You want the data itself to guide the discovery process, and so if there is a lot to say, then you want your algorithm to reveal that structure,” he says. “It’s a more flexible way to do analysis.”
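The contrast can be made concrete with a toy one-dimensional sketch, loosely inspired by nonparametric clustering rules of the kind Kulis studies. The distance threshold, the data, and the single-pass simplification are all illustrative assumptions, not his actual algorithm:

```python
# A sketch of letting the data choose the number of clusters: a point that
# lies farther than `lam` from every existing center seeds a new cluster.
# A parametric method would instead fix k (say, k = 10) in advance.

def nonparametric_clusters(points, lam):
    """Single greedy pass; returns the cluster centers the data 'demanded'."""
    centers = [points[0]]
    for p in points[1:]:
        if min(abs(p - c) for c in centers) > lam:
            centers.append(p)  # too far from everything seen: new cluster
    return centers

# Three well-separated groups on a line: the rule discovers three clusters
data = [0.0, 0.1, 0.2, 5.0, 5.1, 10.0, 10.2]
print(len(nonparametric_clusters(data, lam=1.0)))  # → 3
```

If the data instead contained five well-separated groups, the same code would return five centers without any change to its parameters, which is the “data guides the discovery process” flexibility described above.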
The field is ripe for approaches that allow researchers from different fields–biology and business, for example–to apply machine learning techniques to develop new ways of looking at the data they collect. Kulis says he is looking forward to working with faculty and students from different BU departments both in research and in his courses. “Machine learning brings together a lot of fields that for a long time have been fairly disjoined. When it comes to teaching, a lot of my excitement is in trying to bridge these different disciplines and to teach courses that bring together people from different areas,” he says.
The curriculum, Lutchen says, has relevance to the world at large: “As an engineering faculty, we want people to understand how these new tools and techniques can help society.”
Michael S. Goldberg can be reached at firstname.lastname@example.org.
Using the strange laws of quantum mechanics to encrypt the world’s most secret messages
By Kate Becker, originally published in BU Research
Just outside Washington, DC, a heavily armored truck, protected by armed guards, rumbles toward the Pentagon. Its cargo is critical to keeping the most sensitive government communications secret. But it’s not what you might expect. That precious cargo is nothing but numbers.
Though the details are a government secret, according to Alexander Sergienko, a Boston University College of Engineering professor of electrical and computer engineering, trucks like this are one likely way that the United States government might transport the numbers that are at the heart of the only unbreakable encryption technique in the world: the one-time pad. The one-time pad is a string of random numbers, also called a key, which a sender uses to encrypt her message.
But the one-time pad has one big weakness: the random numbers that are the key to coding and decoding it have to be physically transported from one place to another. Sending them over the internet, encrypted by traditional security measures, would be like locking the keys to Fort Knox inside a child’s piggy bank. If the numbers are intercepted, the code is worthless. So, how can you get random numbers from place to place with absolute security? The answer isn’t more armed guards and armored trucks, says Sergienko, who also has an appointment as a professor of physics in BU’s College of Arts & Sciences. It’s quantum mechanics, the bizarre set of rules that governs the subatomic world, where the everyday norms we take for granted—that an object should have a well-defined location and speed, for instance, and that it can only be in one place at a time—go out the window.
The one-time pad encryption method dates back to before World War II, and was used to secure diplomatic communiqués and wartime dispatches. The sender encrypts her message by taking each letter, or bit, of the original message and combining it mathematically with successive random numbers from the key, transforming it into a sequence of totally random numbers. (The longer the message, the longer the key must be: a message that’s 100 letters long requires a key of at least 100 digits.) The encrypted message is now absolutely secure: the sender can broadcast it over a radio or even scream it from the rooftops, if she wants. Only someone with an identical copy of the key can crack the code, by subtracting a matching set of numbers from the broadcast to unlock it. But the key is unbreakable only if it is used just once; if used a second time, code breakers can begin to reverse-engineer the random-number list. With every additional use, the code gets weaker and weaker, so the bank of random numbers must be constantly refilled to keep secure government communication going. That means more numbers, more armored trucks—and more effort and expense.
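The combining step described above is simple enough to show directly. The sketch below uses bytes and XOR (a common modern formulation; historical pads combined letters or digits by modular addition, but the principle is identical):

```python
import os

def otp(message: bytes, key: bytes) -> bytes:
    """Combine each byte of the message with the matching key byte via XOR.
    The key must be truly random, at least as long as the message, and
    used exactly once. XOR is self-inverse, so the same function decrypts."""
    assert len(key) >= len(message), "key must cover the whole message"
    return bytes(m ^ k for m, k in zip(message, key))

key = os.urandom(32)            # the "precious cargo": fresh random numbers
ciphertext = otp(b"ATTACK AT DAWN", key)
plaintext = otp(ciphertext, key)  # applying the same key recovers the message
print(plaintext)  # → b'ATTACK AT DAWN'
```

Note that the assertion enforces the length rule from the text (a 100-letter message needs at least a 100-digit key), and reusing `key` for a second message is exactly the mistake that lets code breakers begin reverse-engineering the pad.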
Sergienko is one of a group of physicists and computer scientists at BU and beyond working to solve this problem with an encryption technique called secure quantum key distribution. They are harnessing cutting-edge technology to implement basic protocols that are some 30 years old. Quantum key distribution exploits the strange laws of quantum mechanics to create a truly random key that is totally secure from eavesdroppers.
Here’s how it works. Each “bit” of the key is encoded in the polarization of a single photon—essentially, the direction in which the light particle is “waving.” It can be up, down, or anything in between. In this case, though, each photon is prepared in only one of two “bases”—horizontal/vertical, where horizontal might represent a one and vertical a zero; or tilted at an angle, with 45 degrees up representing one and 45 degrees down representing zero.
Sergienko maps out how it works using three characters well known to physics students: Alice, who’s sending the message; Bob, who is receiving it; and Eve, an eavesdropper out to covertly intercept it. To read out the state of each incoming photon, Bob has to pick the correct base. Alice can’t tell him the bases in advance, so he guesses randomly. Later, Alice reports the bases she used for each photon, and Bob throws away the readings for which he picked the wrong base. The result: Bob and Alice end up with identical, random strings of ones and zeros that they can use as a fresh key for their future communications.
If eavesdropper Eve tries to intercept photons traveling from Alice to Bob, Bob will notice a shortage of incoming photons. Eve could attempt to hide the theft by copying the polarization of each stolen photon and sending it on to Bob, but the laws of quantum mechanics, which make it impossible to perfectly “clone” the quantum state of a photon, get in her way, so she is bound to make mistakes that betray her presence. So, not only do Alice and Bob have truly random keys in hand, they also have the ultimate security against eavesdroppers: the laws of physics.
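The basis-sifting step Alice and Bob perform can be simulated classically. This is a toy model of the idealized protocol only: it assumes no photon loss, no Eve, and that a matching basis lets Bob read Alice’s bit perfectly:

```python
import random

def sift_key(n=1000, seed=0):
    """Simulate quantum-key-distribution sifting: Alice sends random bits in
    random bases, Bob measures in random bases, and afterward they keep only
    the positions where their basis choices happened to match."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]  # 0 = +, 1 = x
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    # In the ideal case, a matching basis means Bob's reading equals Alice's bit.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
            if a == b]

key = sift_key()
print(len(key))  # about half of the 1000 photons survive sifting
```

Since Bob guesses the right basis about half the time, roughly half the transmitted photons contribute key bits; the eavesdropping check (comparing a sacrificial subset of the sifted bits) is omitted here for brevity.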
That’s the easy part, from Sergienko’s point of view. The hard part: making this technique work over practical distances. That’s because, to retain the quantum properties that make them so useful for secure communication, photons have to be kept isolated from all external disturbances. Another challenge: the same “no-cloning” law that thwarted Eve prohibits the use of any amplifiers, standard in traditional telecommunications, on the optical lines that transmit the photons. “One single photon has to travel from point A to point B,” says Sergienko. It’s as if the code were written on eggshells. How can you send millions or billions of those eggshells, far and fast?
“It’s a dilemma,” says Sergienko. “The quantum realm gives you more opportunities, but to make these opportunities work for people, you have to solve the problem of how the quantum state will survive in the classical environment,” the messy reality in which it’s nearly impossible to avoid interacting with other fields and particles.
Today’s “best of the best” technology can create a few million quantum states per second, says Sergienko. But the farther you try to send them, the more of them will “crack” like broken eggshells—that is, get absorbed into the line and disappear—before they reach their destination. So while some physicists are chasing distance records, dispatching quantum states across hundreds of kilometers, Sergienko is more interested in finding the optimal balance between transmission distance and the rate at which new states can be created. Today, data rates of about 100 kHz are possible within a modest city-sized network. Not exactly telecom speed—typical home broadband connections run 10 or 100 times faster—but good enough to transmit the bits of a robust key that guarantees the highest level of secure communication.
In 2003 and 2004, Sergienko and Gregg Jaeger, an associate professor of natural sciences and mathematics in BU’s College of General Studies, led a BU team that partnered with researchers at Harvard University and BBN Technologies (now a part of Raytheon) to build just such a system. With support from the Defense Advanced Research Projects Agency (DARPA), the military’s advanced research arm, they used standard commercial fiber optic cables in the ground to send photons between three sites in the greater Boston area: one at BU, one at Harvard, and a third at BBN’s headquarters, near Fresh Pond in Cambridge. The system spanned about 18 miles end-to-end. “We showed that this secure communication can be established between three nodes through the metropolitan fiber, and can go 24/7,” says Sergienko. Even though the data rate was not high—just about 1,000 bits per second, slower than a dial-up modem—over time, each site would build up a long enough key to enable secure communication on demand. The system ran for three years, and was followed several years later by similar, independent demonstration networks in Europe, Japan, and China.
What happened next? That’s a government secret. But Sergienko is confident that secure quantum key distribution networks are live today somewhere in the United States. The likeliest spots: Washington, DC, where such a network could enable secure communication between government agencies, eliminating the need for all those trucks; and Wall Street, where it would guarantee absolute privacy for transactions between financial institutions.
Today, Sergienko is trying to narrow the gap between quantum and classical data rates. With fiber quality nearly as good as it can get, and the rate at which new quantum states can be created almost maxed out, Sergienko and his colleagues around the world are taking a new tack: encoding more bits of information in a single photon. While photon polarization can only represent zero or one, a different property of photons, called orbital angular momentum, can encode at least 10 different distinguishable states, and possibly more. Instead of simple binary bits, cryptographers would have a whole mini alphabet to work with.
As for those armored trucks? Though they might still be standard for transporting secret keys to remote locations, Sergienko wouldn’t be surprised if they are no longer pulling up to the Pentagon. But the secrets of the unbreakable code are still just that: secret.
This research is a continuation of work done by Professor Sergienko in 2013.
Densmore’s Contributions Part of $32 Million DARPA Contract for Cutting-Edge Synthetic Biology Effort
By Rebecca Jahnke (COM ’17)
The Defense Advanced Research Projects Agency (DARPA) has awarded a $32 million contract to “The Foundry” (http://web.mit.edu/foundry/), a DNA design and manufacturing facility at the Broad Institute of MIT, to support the engineering of novel biological systems. Boston University Computer Engineering Professor Douglas Densmore’s role in automating the facility’s design process with software inspired by electrical and computer engineering was key in establishing the novel, large-scale, parallel design processes that landed the contract.
The Foundry focuses on designing, testing and fabricating large sequences of genetic information. The intent is to create DNA nucleotide arrangements that can be applied widely for medical, industrial and agricultural purposes.
Engineers at the Foundry work with chains containing millions of nucleotides, all specified using only the letters A, T, G and C. Because writing out such vast sequences by hand would be impossible, the Foundry sought Densmore’s computer-aided design expertise to help automate the process.
Densmore’s contributions will allow the Foundry to significantly increase its output of DNA designs beyond what would have been possible relying on conventional design techniques. The Foundry’s work will lead to greater advances faster – tackling issues like delivering nitrogenous fertilizer to cereal crops and converting compounds that naturally occur in human bacteria into therapeutic drugs.
Douglas Densmore is a Kern Faculty Fellow, Hariri Institute for Computing and Computational Science and Engineering Junior Faculty Fellow, and Professor in the Department of Electrical and Computer Engineering at Boston University. He also acts as the director of the Cross-disciplinary Integration of Design Automation Research (CIDAR) group at Boston University, where his team develops computational and experimental tools for synthetic biology. His research facilities include both a computational workspace in the Department of Electrical and Computer Engineering as well as experimental laboratory space in the Boston University Biological Design Center. Densmore is the President of the Bio-Design Automation Consortium, Nona Research Foundation, and Lattice Automation, Inc.
For more information, please see the Broad Institute of MIT press release.
U.S. Department of Energy’s SunShot Initiative Awards $1.15 Million to Boston University in Partnership with Sandia National Laboratories
By Rebecca Jahnke (COM ’17) and Bhumika Salwan (Questrom ’16)
Boston University has been awarded $1.15 million from the U.S. Department of Energy SunShot Initiative to advance self-cleaning solar collector technology and bring the new application to solar fields across the country. With partner Sandia National Laboratories, BU aims to improve high efficiency operations of large solar plants in semi-arid and desert lands. Industrial partners of BU include Corning Inc., Eastman Kodak, Industrial Technology Research Institute (Taiwan) and Geodrill (Chile).
When solar mirrors are first placed in fields, they have very high efficiency rates. However, when dust accumulates on the surface of these solar collectors, their efficiency decreases – the dust obstructs sunlight, thus reducing the amount of energy a solar plant can produce and, in turn, the revenue the plant can generate.
Students and faculty from the Electrical and Computer Engineering Department and the Questrom School of Business are developing a transparent electrodynamic screen (EDS) film that can be retrofitted onto solar collectors to protect them from dust. Leading the team are Malay Mazumder, ECE research professor of electrical and computer engineering and of materials science and engineering; Mark Horenstein, ECE professor of electrical and computer engineering; and Nitin Joglekar, Questrom associate professor of operations and technology management. Four graduate and five undergraduate students are working on the project at BU.
The team’s developments are especially important given that, in the United States, solar plants are most commonly located in southwestern states with arid and semi-arid climates that have a high dust deposition rate and little rain. Until now, the most common solution has been to clean solar collectors by deluge spray, washing with water and detergent. Under this process, cleaning a 300 MW plant in the southwest would require more than one million gallons of water and cost upwards of $1 million every year in an area already subject to drought. By advancing solar mirrors’ self-cleaning abilities, solar plants could significantly lower their costs.
Voltage pulses applied to the EDS film would generate an electric field that charges dust particles on its surface; the electrodynamic traveling-wave motion created by the pulsed phase voltages would then sweep the particles off. The intent is for the EDS film to activate as frequently as needed without requiring water, allowing the solar devices to maintain maximum efficiency. By keeping collectors clean, heightening operational efficiency and conserving water, the EDS film would have the double effect of driving down the cost of solar electricity and conserving natural resources.
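Electrodynamic screens of this kind are commonly driven by several interleaved electrode phases. The sketch below is a rough illustration only, assuming a simple three-phase sinusoidal drive (the actual pulsed waveforms used by the BU team may differ); it shows how 120-degree phase offsets make the peak of the field pattern travel across the surface, carrying charged dust with it.

```python
import math

def electrode_voltage(phase_index: int, t: float,
                      amplitude: float = 1.0, freq_hz: float = 10.0) -> float:
    """Voltage on one of three interleaved electrode phases at time t.

    Each phase lags the previous one by 120 degrees, so the location of
    the voltage peak (and hence the strongest field) sweeps steadily
    across the electrode array: a traveling wave.
    """
    omega = 2 * math.pi * freq_hz
    return amplitude * math.sin(omega * t - phase_index * 2 * math.pi / 3)

# The three phase voltages always sum to zero; only their spatial
# pattern, which repeats one electrode over per third of a cycle, moves.
```

In a real EDS, electrode geometry, pulse shape, and drive frequency are tuned to the particle sizes being removed; this sketch captures only the phasing idea.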
The SunShot Initiative is a collaborative national effort that supports innovation by private companies, universities and national laboratories seeking to make solar energy fully cost-competitive with traditional energy sources before the end of the decade.