With the governor of Massachusetts pledging $3 million in state support, BU leaders Friday announced plans for development of a pathbreaking computing cloud that could spur economic growth and technology innovation.
Azer Bestavros, director of BU’s Rafik B. Hariri Institute for Computing and Computational Science & Engineering, says the University will spend the next three years developing software for the Massachusetts Open Cloud (MOC) in collaboration with the commonwealth, technology companies, and BU’s university partners at the Massachusetts Green High Performance Computing Center (MGHPCC) in Holyoke, where the announcement was made.
In cloud computing, users rent access to massive off-site computational power. Companies such as Amazon and Google offer clouds; like those, the MOC would be public, meaning anyone could purchase computing power. But unlike those closed clouds (each operated by a single provider), the MOC would be open to multiple vendors of software, hardware, and computer services, all of whom would have access to operational data about the MOC: what programs were running on it, as well as any performance or problem reports.
No such public, open cloud currently exists. “The MOC will be the first realization of this model,” says Orran Krieger, director of the Cloud Computing Initiative at the Hariri Institute, who will lead the MOC development. “If it’s successful, we expect other clouds to follow our model, fundamentally changing the nature of cloud computing.” Krieger is also a College of Arts & Sciences research professor of computer science.
The plan calls for hosting the MOC at the MGHPCC data center, where it would tap the computational power of BU and its MOC partners, who have jointly contributed $16 million to MGHPCC, leveraging the $3 million matching grant from the state. Besides the participating universities, MOC partners are tech firms Red Hat, Cisco, EMC, Juniper Networks, SGI, Mellanox, Plexxi, Riverbed, Enterprise DB, Cambridge Computer Services, and DataDirect Networks.
“Investing in innovative sectors that are becoming a prominent part of our economy is critical to meeting the demands of the 21st century,” says Greg Bialecki, Massachusetts housing and economic development secretary.
Bestavros, who is also a CAS computer science professor, compares the MOC to a shopping mall, with the MGHPCC serving as the physical mall, eliminating the expense of building a separate data center. And with the MGHPCC’s university partners doing research that could use the MOC, the Holyoke center is “a little petri dish in which the MOC could develop,” says center executive director John Goodhue.
The MOC’s corporate partners would be the equivalent of mall stores, selling their products and services at the Holyoke center. Banding together in a mall-like organization would allow them collectively to draw enough customer traffic to achieve economies of scale, Bestavros says. The vendors’ customers would range from academic researchers to corporations and others. For example, Harvard is one of the MGHPCC partners helping to develop the MOC; its Research Computing arm, the conduit to computing services for the Faculty of Arts and Sciences, is planning on using the MOC as “one part of our strategy to provide lasting computer resources to our faculty and researchers,” says James Cuff, Harvard’s assistant dean for research computing.
The MOC concept of a cloud marketplace grew out of BU research in 2009. In a recent paper, Bestavros and Krieger argue that closed clouds usually have a single provider, who “alone has access to the operational data.” For this and other reasons, they write, “in the long run, if only a handful of major providers continue to dominate the public cloud marketplace, then any innovation can only be realized through one of them.”
With a cloud designed like the MOC, their paper says, “many stakeholders, rather than just a single provider, participate in implementing and operating the cloud. This creates a multisided marketplace in which participants freely cooperate and compete with each other, and customers can choose among numerous competing services and solutions.”
Another advantage: an open cloud would be more secure than a closed one, Bestavros and Krieger say. It is “the best way to make sure that software is clean,” according to Bestavros, especially as American tech companies complain that federal computer snooping might scare off billions of dollars’ worth of cloud computing customers. With a public, open cloud like the MOC, “the National Security Agency cannot put backdoors in open-source code, because you can see what the software is doing,” he says.
The state money is a matching grant from the Massachusetts Technology Collaborative, and it will pay for developing software and equipment for the MOC. “The commonwealth’s participation allows BU to create a neutral ground that allows the industry and university partners to collaborate in an area where they also compete,” the MGHPCC’s Goodhue says. That is, the money will pay for development of software that all the MOC investors can use, so that “no partner will be advantaged more than any other.”
Patrick Larkin, director of the Innovation Institute at the Massachusetts Technology Collaborative, the source of the state grant, says the MOC will be “a virtual sandbox that will empower the commonwealth’s researchers, start-ups, industry, and the public sector to explore, develop, and release big data and cloud computing innovations.” Those innovations could spur discoveries spanning the state’s economy, he says, “such as transportation, health care, energy, finance, life sciences, and manufacturing.”
- Professor Roscoe Giles, executive steering committee member.
- Assistant Professor Douglas Densmore, Junior Faculty Fellow 2012-14
- Assistant Professor Ayse Coskun, Junior Faculty Fellow 2011-12
- Additionally, the 2014 Hariri Award for Innovative Computing Models, Algorithms, and Systems was awarded to an ECE graduate student, Eran Simhon, for his poster entitled “Advance Reservation Games.”
ENG team has a creative solution to a costly problem
By Leslie Friday (Video by Joe Chan), BU Today
The energy from the sun that hits the Earth in a single hour could power the planet for an entire year, according to the US Department of Energy (DOE). One of the best places to harness that free, abundant, and environmentally friendly energy is a desert, but deserts, it turns out, come with a nemesis for solar panels: sand. The particulate matter that constantly blows across deserts settles on solar panels, decreasing their efficiency by nearly 100 percent in the middle of a dust storm. The current solution is for solar field operators to spray the panels with desalinated, distilled water.
“That might not sound like a big deal, but if you have millions of square feet of solar panels out in a desert, it ends up being costly—especially if water is a scarce resource,” says John Noah Hudelson (ENG’14), one of several graduate students working to find a better solution with Malay Mazumder, a College of Engineering research professor of electrical and computer engineering and of materials science and engineering, and Mark Horenstein, an ENG professor of electrical and computer engineering. “We’re looking to use just a small amount of electricity to statically push the dust off the surface of the solar panel or the solar mirror.”
The BU team’s answer, called a transparent electrodynamic system (EDS), is a self-cleaning technology that can be embedded in the solar device or silkscreen-printed onto a transparent film adhered to the solar panel or mirror. The EDS exposes the dust particles to an electrostatic field, which causes them to levitate, dipping and rising in alternating waves (the way a beach ball bounces along the upturned hands of fans in a packed stadium) as the electric charge fluctuates.
The entire process takes seconds and uses a minuscule amount of power, generated by the solar device itself—about 1/100th of what it produces daily. In its final version, the EDS will be programmable or will automatically detect the presence of surface dust and switch on. “There’s nothing like this on the market,” Horenstein says.
The inspiration for the EDS came to Mazumder more than a decade ago from an unlikely source: human lungs. He remembers thinking that the organs, outfitted with self-cleaning hairs that sweep dust up and out of the respiratory system, were “ingenious defense mechanisms.” He thought he could mimic that tidy biological system and apply it to other mechanisms.
In 2003, NASA, whose scientists thought the technology could be used on future Mars missions to keep equipment free of cosmic dust, gave him a three-year, $750,000 grant. When that funding expired, a $50,000 Ignition Award from BU’s Office of Technology Development kept Mazumder’s research afloat while he searched for alternative funding. His big break came in 2010, when he gave a presentation on the EDS at an American Chemical Society conference in Boston. News of the technology spread through articles in such publications as the New York Times.
Mazumder received a call from David Powell, a research and development manager at Abengoa Solar, a global pioneer in the construction of CSP (concentrated solar power) and PV (photovoltaic) power plants. The company operates the Solana Generating Station in Gila Bend, Ariz., and the soon-to-open Mojave Solar Project near Barstow, Calif. Each has the capacity to produce 280 megawatts—or the ability to power more than 100,000 homes. With at least two plants in desert locations, Abengoa was keenly interested in the success of the EDS and eager to test Mazumder’s prototypes.
In 2012, Mazumder and Abengoa landed a two-year, $945,000 grant from the DOE Office of Energy Efficiency and Renewable Energy to further test and expand the capacity of the EDS. Horenstein and Nitin Joglekar, a School of Management associate professor of operations and technology management, are co–principal investigators of the grant, and Sandia National Laboratories in Albuquerque, N.M., signed on to help evaluate the prototype’s efficiency and develop larger-scale models. With a $40,000 grant from the Massachusetts Clean Energy Center, the team’s total funding rose to nearly $1 million.
For two months last year, Hudelson and doctoral candidate Jeremy Stark (ENG’14) tested nearly 20 EDS prototypes at the Abengoa and Sandia sites before rain and snow cut their work short. They found that the system performed as expected, removing at least 90 percent of dust particles from solar panel surfaces. Next, the BU team must figure out how to protect the EDS from Mother Nature and how to scale up to industrial-sized models.
Mazumder estimates that the United States would need to produce one terawatt (one trillion watts) of solar power to meet household and industry demand. That kind of output is a distant goal, but he sees great potential in getting started by building solar plants in the Southwest—specifically the Mojave Desert. The arid region has an elevation of nearly 5,000 feet, receives regular sun, and has fewer dust storms than other desert regions.
“The Mojave Desert and the Southwest, if fully utilized and assuming the existence of a reliable distribution system,” he says, “could provide most of the US demand with respect to our energy needs.”
Mazumder will submit a proposal soon to the DOE for renewed funding, but he must first identify a manufacturing partner willing to produce industry-scale panels equipped with EDS technology. Once that goal is reached, he thinks, the self-cleaning system could hit the market after two years.
“We must proceed fast,” he says. “The need is there.”
In pursuit of the Hariri Institute’s mission to catalyze and propel collaborative, interdisciplinary research through the use of computational and data-driven approaches, the Institute supports a portfolio of ambitious computational research projects, as well as forward-looking educational and outreach initiatives at Boston University.
In line with this mission, we are pleased to announce the Call for selecting and funding 2014 Institute portfolio projects. The process is designed to be fairly lightweight, imposing minimal overhead on proposing investigators, while ensuring that the process itself acts as a catalyst for the exchange and development of research ideas among Institute affiliates.
The process for exploring and developing projects to be sponsored by the Institute encourages principal investigators to involve the Institute in shaping and refining their research ideas, suggesting potential collaborations, identifying additional or alternative sources of funding, and finding other creative ways to help support the project.
Eligibility: Faculty affiliates of the Hariri Institute are eligible to submit proposals for support from the Institute for research and other activities by completing the Research Funding Application.
Process: For details, please check the project proposal development, submission, and evaluation process and complete the Research Funding Application.
Deadline: April 4, 2014 is the deadline for Summer/Fall start dates. There will be a November 2014 deadline for Spring 2015 start date projects.
For more information, please contact Linda Grosser, Director of Program & Project Development at the Hariri Institute, by email at email@example.com.
Many engineers have great ideas for products, but unfortunately, they don’t often have a background in business that will allow them to bring their designs to market.
To help with this problem, two Boston University research teams recently participated in the National Science Foundation (NSF) Innovation Corps (I-Corps), a program that encourages scientists and engineers to broaden their focus beyond lab work through entrepreneurship training.
“We had been trying to bring some of our ideas to a commercial state when we heard about the program,” said David Freedman, a BU research associate in the Department of Electrical & Computer Engineering. “It seemed like a great fit for us.”
Freedman and postdoctoral associate George Daaboul had been working closely with Professor Selim Ünlü’s (ECE, BME, MSE) research group, trying to determine how their technology, IRIS, which is used to detect viruses and pathogens, might be applied in doctors’ offices, hospitals, and emergency care centers. They soon decided that forming an I-Corps team would allow them to evaluate its commercial potential.
Teams receive $50K in grant money and consist of an Entrepreneurial Lead (Daaboul), a Principal Investigator (Freedman), and a business mentor. The researchers asked BU lecturer and entrepreneur Rana Gupta (SMG) to take on the latter role.
Also participating from BU were Assistant Professor Douglas Densmore (ECE) and Research Assistant Professor Swapnil Bhatia (ECE). They pitched Lattice Automation, a venture aimed at transitioning technology from the Cross-disciplinary Integration of Design Automation Research (CIDAR) group into commercial products. Ultimately, they hope to create software that will help synthetic biologists work more efficiently.
“Our technology is building upon state-of-the-art techniques in computer science, electrical engineering, and bioengineering,” explained Densmore.
Over eight weeks in the fall, participants attended workshops in Atlanta, Ga., met with researchers from the 21 teams, followed an online curriculum, and spoke with up to 100 potential customers of their technology – a process known as “customer discovery.”
Through this experience, Freedman and Daaboul quickly learned that introducing a new technology to customers might not be the right approach for their research.
“We decided instead to focus on the pains customers had with existing technologies and home in on how we could alleviate those,” said Freedman.
Added Daaboul: “Finding out what people really needed before developing a technology really allowed for a much different perspective than what I’m used to.”
Much of the knowledge gained through I-Corps will be used to advance science and engineering research, and some products tested during the workshops even showed immediate market potential by the conclusion of the curriculum.
“I would recommend this program to anyone working in science or industry,” said Freedman. “Not only did this change how we think about our research, we also learned how to better tell our narrative.”
-Rachel Harrington (firstname.lastname@example.org)
Imagine two hiring managers sizing up an applicant. The first gathers all the information she can before forming a first impression. The second collects the bare minimum but does so strategically, arriving at virtually the same impression with far less effort and in far less time.
It turns out that the latter approach can also produce reasonably accurate images of objects under low-light conditions using a remote sensing technology such as LIDAR, which bounces pulsed laser light off a targeted object to form an image. Rather than waiting to collect and compare hundreds of reflected photons to generate each pixel of the image, as is typically done, you can instead count the number of laser pulses it takes to detect the first photon at each pixel. The lower the number, the greater the intensity of the light reflected off the object’s surface, and thus the brighter the pixel.
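The pulse-counting idea can be sketched in a few lines. This is only a toy simulation, not the authors’ full algorithm (which also exploits photon timing and spatial structure); the scene reflectivities and pulse count here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "scene": per-pixel probability that a single laser pulse
# returns a detectable photon (proportional to surface reflectivity).
scene = np.array([[0.30, 0.05],
                  [0.02, 0.15]])

# Conventional approach: fire many pulses, count detections per pixel.
pulses = 500
counts = rng.binomial(pulses, scene)   # detections at each pixel
conventional = counts / pulses         # intensity estimate

# First-photon approach: record only the pulse index of the FIRST detection.
# That waiting time is geometric with mean 1/p, so 1/N estimates intensity.
first = rng.geometric(scene)           # pulses until the first photon arrives
first_photon = 1.0 / first             # lower pulse count -> brighter pixel

print(conventional)
print(first_photon)
```

The first-photon estimate is far noisier per pixel, which is why the actual method leans on a scene model to clean up the raw estimates, but it needs only one detected photon per pixel instead of hundreds.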
Assistant Professor Vivek Goyal (ECE), who joined the College of Engineering faculty in January, and who, along with former colleagues at MIT’s Research Laboratory of Electronics, demonstrated the concept in a recent issue of the journal Science, calls his method “first-photon imaging.”
“The project started out as a thought experiment,” said Goyal, whose research was funded by the Defense Advanced Research Projects Agency’s (DARPA) Information in a Photon Program, and the National Science Foundation. “We wondered what we could infer about a scene from detecting only one photon from each pixel location, and eventually realized that when the intensity of light is very low, the amount of time until you detect the photon gives you information about the intensity of the light at each pixel.”
First-photon imaging may ultimately improve night vision and low-light remote sensing technologies by extending the distance at which images may be taken. The new method may also dramatically increase the speed of biological imaging and the variety of samples — many of which degrade when subjected to higher-intensity lighting — that can be photographed.
To produce a high-quality image from the raw, single-photon-per-pixel data, Goyal’s method applies a computer model of surfaces and edges typically encountered in three-dimensional, real-world objects, correcting the intensity and depth of neighboring pixels as needed to fit the model; and filters out noise coming from ambient light sources.
While many researchers are pursuing new techniques to boost remote sensing and microscopy capabilities, most focus on building more effective detectors. Goyal is working to significantly enhance existing detectors by incorporating accurate physical models in signal processing, and to further explore the potential impact of first-photon imaging on remote sensing and microscopy.
After the Boston Marathon bombings last year, it took authorities just three days to sift through an abundance of footage and find their suspects – light speed compared to the weeks it took to find those responsible for the London bombings in 2005.
Still, can this happen faster? Professor Venkatesh Saligrama (ECE, SE) thinks so, and he’s working to make that vision a reality.
The Office of Naval Research awarded him $900K for his project, Video Search and Retrieval, which will focus on developing a visual search system. Think Google but for security videos.
“Our initial idea was to develop a system that could annotate web videos,” said Saligrama, who collaborated with Pierre-Marc Jodoin at the University of Sherbrooke on early stages of this research. “That project turned out to be extremely challenging so we started to focus on surveillance videos, where the footage is obtained in a controlled environment.”
Manually searching large archives of footage can be both time-consuming and monotonous. Saligrama and Ph.D. students Greg Castanon (ECE) and Yuting Chen (SE) are now working closely with the U.S. Naval Research Laboratory to help change this.
Chen said she is looking forward to working on this project with Saligrama, whom she first encountered while conducting her own research.
“I spent almost a year and a half working on an idea that uses correlated motion cues to calibrate camera networks,” she said. “When I came to BU Systems Engineering and browsed the research papers, I found the exact idea implemented by Venkatesh’s group. I was surprised and just a little bit bitter.”
From there, she knew that she wanted to study with Saligrama.
“He is an experienced researcher and just as passionate and curious as a young freshman,” she said. “I find that one sentence from him can help me through a problem that’s been troubling me for weeks.”
Chen, Castanon, and Saligrama hope that together, they can make searching through security footage more automated and responsive to users’ video queries.
“Currently, for many YouTube videos, there are textual meta-tags that are used in the search process,” Saligrama explained. “For surveillance videos, we do not often have this so our searches need to be based purely on visual features and patterns.”
One of the challenges in video search is that activity patterns can be highly inconsistent and can occur for unpredictable amounts of time.
“Unlike image search though, videos have some temporal patterns we can exploit,” said Saligrama.
In the future, Saligrama hopes that the research will not only improve security but improve medical database searches as well.
For more information about the project, visit our Research Spotlight page.
-Rachel Harrington (email@example.com)
New Laser Technique Boosts Accuracy of DNA Sequencing Method
Low-cost, ultra-fast DNA sequencing would revolutionize healthcare and biomedical research, sparking major advances in drug development, preventative medicine and personalized medicine. By gaining access to the entire sequence of your genome, a physician could determine the probability that you’ll develop a specific genetic disease or tolerate selected medications. In pursuit of that goal, Associate Professor Amit Meller (BME, MSE) has spent much of the past decade spearheading a method that uses solid state nanopores — two-to-five-nanometer-wide holes in silicon chips that read DNA strands as they pass through — to optically sequence the four nucleotides (A, C, G, T) encoding each DNA molecule.
Now Meller and a team of researchers at Boston University and the Technion-Israel Institute of Technology, including Professor Theodore Moustakas (ECE, MSE) and research assistants Nicolas Di Fiori (Physics, PhD ’13) and Allison Squires (BME, PhD ’14), have discovered a simple way to improve the sensitivity, accuracy, and speed of the method, making it an even more viable option for DNA sequencing or the characterization of small proteins.
In the November 3 online edition of Nature Nanotechnology, the team demonstrated that focusing a low-power, commercially available green laser on a nanopore increases the current near the walls of the pore, which is immersed in salt water. As the current increases, it sweeps the salt water along with it in the opposite direction of incoming samples. The onrushing water, in turn, acts as a brake, slowing down the passage of DNA through the pore. As a result, nanoscale sensors in the pore can get a higher-resolution read of each nucleotide as it crosses the pore, and identify small proteins in their native state that could not previously be detected.
“The light-induced phenomenon that we describe in this paper can be used to switch on and off the ‘brakes’ acting on individual biopolymers, such as DNA or proteins sliding through the nanopores, in real time,” Meller explained. “This critically enhances the sensing resolution of solid-state nanopores and can be easily integrated in future nanopore-based DNA sequencing and protein detection technologies.”
Slowing down DNA is essential to DNA or RNA sequencing with nanopores, so that nanoscale sensors, like sports referees, can make the right call on what’s passing through.
“The goal is to hold a base pair of DNA nucleotides in the nanopore’s sensing volume long enough to ‘call the base’ (i.e., determine if it’s an A, C, G or T),” said Squires, who fabricated nanopores and ran experiments in the study. “The signal needs to be sufficiently different for each base for sensors in the nanopore to make the call. If the sample proceeds through the sensing volume too quickly, it’s hard for the sensors to interpret the signal and make the right call.”
Other methods designed to slow down DNA in nanopores change the sensing properties of the pore, making it more difficult to ensure accuracy of detected base pairs. Shining laser light on the nanopore alters only the local surface charge, an effect that’s completely reversible within milliseconds by switching the laser off.
As an added bonus, the researchers found that the sudden increase in surface charge and resulting flow of water reliably unblocks clogged nanopores, which can take a long time to clean, significantly extending their lifetime.
Meller and his team characterized the amount of increase in current under varying illumination in many different-sized nanopores. They next aim to explore in greater detail the mechanism underlying the increase in surface current when the green laser is applied to a nanopore, information that could lead to even more sensitivity and accuracy in DNA sequencing.
The research is funded by a $4.2 million grant from the National Institutes of Health’s National Human Genome Research Institute under its “Revolutionary Sequencing Technology Development — $1,000 Genome” program, which seeks to reduce the cost of sequencing a human genome to $1,000.
Enhancing the functionality of cyber-physical systems — systems that integrate physical processes with networked computing — could significantly improve our quality of life, from reducing car collisions to upgrading robotic surgeries to mounting more effective search and rescue missions.
Recognizing Boston University as a key contributor to this effort, the National Science Foundation has awarded Professors Venkatesh Saligrama (ECE, SE) and David Castañón (ECE, SE), and Assistant Professor Mac Schwager (ME, SE), nearly $1M for their project, “CPS: Synergy: Data Driven Intelligent Controlled Sensing for Cyber Physical Systems.”
Drawing on earlier work by Saligrama and Castañón investigating machine learning under cost and budget constraints, the researchers will focus on improving sensors that collect data in transportation, security and manufacturing applications. A key challenge in such applications is to choose the most effective physical sensors from the vast amount available and develop systems that can efficiently process large quantities of collected data.
“Many of these systems are energy-hungry,” Saligrama explained. “The goal is to use such sensors only when they are needed by using feedback control of the sensing actions to obtain the best information possible given energy budget constraints.”
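A minimal sketch of the “use the expensive sensor only when needed” idea Saligrama describes: here a cheap, noisy sensor screens every event, and a costly, accurate sensor is fired only on the most ambiguous cases allowed by an energy budget. The two-sensor setup, noise levels, and escalation rule are all hypothetical illustrations, not the project’s actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical task: decide whether each event is "anomalous" (label 1).
n = 1000
labels = rng.integers(0, 2, n)
cheap = labels + rng.normal(0, 0.8, n)    # noisy, low-energy measurement
costly = labels + rng.normal(0, 0.1, n)   # accurate, high-energy measurement

budget = 0.2                              # costly sensor allowed on 20% of events
# Feedback rule: escalate to the costly sensor only when the cheap score
# is ambiguous, i.e., close to the decision boundary at 0.5.
ambiguity = np.abs(cheap - 0.5)
threshold = np.quantile(ambiguity, budget)  # the most ambiguous 20%
use_costly = ambiguity <= threshold

score = np.where(use_costly, costly, cheap)
pred = (score > 0.5).astype(int)
print("costly-sensor usage:", use_costly.mean())
print("accuracy:", (pred == labels).mean())
```

Spending the budget only where the cheap sensor is uncertain recovers most of the accuracy of always using the costly sensor at a fraction of its energy cost, which is the intuition behind feedback-controlled sensing.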
Castañón, who has developed some of the leading theories used in controlled sensing studies, sees the project as “an opportunity to extend that theory to big data environments with high-dimensional measurements.”
The team plans to validate its techniques through archaeological surveying, working with Associate Professor Chris Roosevelt (Archaeology). Determining where to deploy the sensors on a smaller scale — for example, finding where best to dig — could lead to far-reaching solutions for deep-sea exploration, firefighting and traffic monitoring.
-Rachel Harrington (firstname.lastname@example.org)
New Algorithms Could Cut Costs, Add Renewables
When power transmission lines reach their capacity in a particular region during high demand periods, controllers have little choice but to tap local power plants to keep the electricity flowing and prevent blackouts. This practice, which favors expensive, local generation sources such as coal and natural gas over cheaper, typically more remote, renewable sources such as wind farms and solar arrays, adds an estimated $5 billion to $10 billion per year to the cost of running the US power grid. As more and more renewable generation sources join the grid and increase transmission line congestion, that price is expected to rise substantially.
To mitigate this cost, College of Engineering researchers and collaborators at Tufts University and Northeastern University have a plan that could enable controllers to exploit cheaper, renewable generation sources when transmission lines become congested. Supported by a $1.2 million grant from the Department of Energy’s Advanced Research Projects Agency-Energy (ARPA-E) in 2012 and an additional $1 million as of September, the researchers are developing algorithms and software that can produce short-term changes in the power transmission network that redistribute power across the network and utilize renewable sources without overloading transmission lines.
They estimate that the algorithms they’re developing will save $3 billion to $7 billion annually and significantly improve the resilience of today’s power transmission network. Based on a fundamental law of physics dictating that electric current is distributed along the paths of least resistance, the algorithms are designed to discover, in real time, preferred reconfigurations of the transmission network.
“By removing a small number of critical transmission lines, you change the relative resistances across alternative network paths, and electric power redistributes itself, relieving the congestion,” said Professor Michael Caramanis (ME, SE), the project’s co-principal investigator along with Research Associate Professor Pablo Ruiz (ME), who is leading the research effort. “If you disconnect the right lines, you can relieve congestion, increase use of inexpensive power sources and lower congestion costs.”
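The effect Caramanis describes can be seen in a toy DC power-flow model, the standard linear approximation used in transmission studies. The four-bus network, susceptances, and injections below are invented for illustration; in this tiny example, disconnecting a tie line actually lowers the loading of the most heavily loaded line.

```python
import numpy as np

def dc_flows(lines, b, inj, slack=0):
    """DC power flow: build the nodal susceptance matrix, solve for bus
    angles with the slack bus fixed at 0, then compute line flows as
    flow(i, j) = b_ij * (theta_i - theta_j)."""
    n = inj.size
    B = np.zeros((n, n))
    for (i, j), bij in zip(lines, b):
        B[i, i] += bij; B[j, j] += bij
        B[i, j] -= bij; B[j, i] -= bij
    keep = [k for k in range(n) if k != slack]
    theta = np.zeros(n)
    theta[keep] = np.linalg.solve(B[np.ix_(keep, keep)], inj[keep])
    return np.array([bij * (theta[i] - theta[j])
                     for (i, j), bij in zip(lines, b)])

# Toy 4-bus grid: generator at bus 0, load at bus 3, a tie line (1, 2).
lines = [(0, 1), (1, 3), (0, 2), (2, 3), (1, 2)]
b = [2.0, 1.0, 1.0, 1.0, 1.0]           # per-unit susceptances (illustrative)
inj = np.array([1.0, 0.0, 0.0, -1.0])   # 1 pu injected at bus 0, withdrawn at 3

before = dc_flows(lines, b, inj)
# Disconnect the tie line (1, 2) and recompute: power redistributes.
after = dc_flows(lines[:4], b[:4], inj)
print(before)
print(after)
```

Here the heaviest line (0, 1) drops from about 0.62 pu to 0.57 pu when the tie line is removed: the grid still delivers the same 1 pu of power, but with less stress on its most loaded element, which is exactly the kind of relief the team’s algorithms search for, at vastly larger scale.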
Having already used their algorithms to reproduce real-life situations in collaboration with PJM, the transmission system and largest power market in the US, covering many eastern states, the researchers – with the recent addition of Professor Yannis Paschalidis (ECE, SE) – are now fine-tuning their software. Their immediate goal is to provide new ways of integrating wind generation at lower cost while strengthening the power transmission network. But achieving that goal entails wrestling with a lot of computational complexity. Out of tens of thousands of transmission lines, the software must select a few, perhaps four or five, whose connection or disconnection will minimize the “spilling,” or waste, of inexpensive wind generation that might occur during high-congestion periods.
“Based on our understanding of power markets, in which prices can vary every five minutes at each node of the network, we can infer which lines would be beneficial to disconnect and which not,” said Caramanis. “When we disconnect a line, we also know how it will change the power flow over every other line, and how much we will gain by relieving the transmission network capacity a little bit. The idea is to optimize the network to reduce costly congestion.”
Over the next two-and-a-half years, the team plans to continue refining its algorithms in collaboration with PJM, two software companies and an energy consulting firm. It will also design tests and procedures to ensure that the dynamic reconfiguration of the transmission network causes no disruption in the security and stability of the power system. If the software is adopted across PJM or other vast transmission networks, controllers seeking to relieve congestion will have the capability to connect and disconnect selected transmission lines every half hour or hour as needed, rather than once or twice a month, as they do now – or even automate the process.
Crane was recently named a recipient of a Clare Boothe Luce Scholarship, given for two academic years to advanced degree candidates. Each fellowship covers the cost of tuition, medical insurance, and mandatory fees, and provides a $20,000 stipend and a $4,000 allowance to cover educational and professional development expenses.
The Clare Boothe Luce Program (CBL), the largest source of private funding for women in science, mathematics, and engineering, aims to increase women’s participation in science and engineering at every level of higher education.
Given the recent honor, it’s hard to believe that Crane, who earned her master’s degree through the Late Entry Accelerated Program (LEAP), only began studying engineering three years ago after graduating with an English degree summa cum laude from Clark University.
“I was unsure how long it would take to fulfill the many course requirements, as I was coming in with virtually none of them completed,” said Crane. “I dove in headfirst though and often overloaded on courses to finish in a timely fashion.”
Crane said that earning her master’s in a short timeframe motivated her to apply for her doctorate at BU.
“I didn’t even apply anywhere else,” she said. “There is tremendous value in students having familiarity with the faculty and vice versa, and in having an established rapport with a doctoral advisor right at the outset of research. There is no other school in the world where I would have had that advantage.”
At BU, Crane has been working closely with her advisor, Professor Hamid Nawab (ECE), who nominated her for the award.
“Molly is precisely the type of person who would help to further shatter the glass ceiling in the male-dominated world of electrical engineering research and academia,” said Nawab. “I wouldn’t be surprised if she wound up becoming a tenured faculty member in a leading ECE department or an internationally renowned leader in her field.”
Crane said she was taken by surprise when she won the award, especially since she had a very non-traditional path into engineering.
“The foundation’s support has allowed me to move into a coveted realm in doctoral research, where the student is free to define the problem on which her research will focus without having to worry about focusing solely on a problem as defined in a grant,” said Crane.
Crane’s research at BU focuses on signal processing, though her work overlaps with other areas.
“We’re at the point now where artificial intelligence is really exploding, and fields like signal processing are interwoven in that explosion,” said Crane.
Crane said that she hopes her work will help improve the ability of artificial intelligence (AI) applications to work in the face of mutually interfering inputs.
Examples of such AI applications include Apple’s Siri or Google’s voice recognition. Both work if a user is speaking clearly into a microphone, but if there are signals like music or other voices superimposed on the input speech signal, the results are often inaccurate.
She hopes to find a way to extract the meaningful input even when interfering signals are in the way, and do so in a way that can be applied to multiple applications.
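As a toy illustration of pulling a target signal out of a mixture, consider two superimposed tones that happen to occupy separate frequency bands. Real speech and music overlap heavily in frequency, so this simple masking would not work there, and nothing below reflects Crane’s actual methods; the signals and band edges are invented purely to show the frequency-domain idea.

```python
import numpy as np

fs = 8000                                     # sample rate (Hz)
t = np.arange(fs) / fs                        # one second of samples
target = np.sin(2 * np.pi * 220 * t)          # stand-in for the meaningful input
interferer = 0.8 * np.sin(2 * np.pi * 1800 * t)  # stand-in for music/other voices
mixture = target + interferer

# Toy separation: if we know (or can estimate) which frequency band carries
# the target, zero out everything else in the spectrum.
spec = np.fft.rfft(mixture)
freqs = np.fft.rfftfreq(mixture.size, 1 / fs)
mask = (freqs > 100) & (freqs < 400)          # assumed target band
recovered = np.fft.irfft(spec * mask, n=mixture.size)

err = np.sqrt(np.mean((recovered - target) ** 2))
print("rms error vs. target:", err)
```

Because the two tones sit in disjoint bands, the mask recovers the target almost exactly; the hard research problem is doing something comparable when target and interference share the same bands.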
“I’m looking forward to the opportunity to do research on a problem that has far-reaching implications and the potential to contribute something meaningful to the signal processing community at large,” she said.
Crane has been thrilled with her BU experience, describing her professors as “accessible and brilliant.”
“I am happy to be at BU, to call Boston home, and am looking forward to the experiences ahead,” said Crane. “Honestly, I’ve never been happier.”
-Rachel Harrington (email@example.com)