ECE prof advised US government on developing exascale computing
By Rich Barlow, BU Today
Some time ago, Roscoe Giles gave a talk to BU computer scientists where he used his iPad 3—a handheld, aging technology, he noted, that nevertheless “had the same arithmetic power” as BU’s first supercomputer, a $2 million machine from the 1990s.
“What was this giant, aspirational thing in 1990 is like nothing now,” says Giles, a College of Engineering professor of electrical and computer engineering. “A Sony PlayStation 3 had the same power as the Cray supercomputers of 1985.”
The next step in this exponential expansion of computing is the exascale computer—for now, just a dream of computer scientists—that would run one billion billion (yes, two “billions,” or 10^18) calculations per second. It would enhance everything from analyzing climate change to developing better plane and car engines to strengthening national security, says Giles, a 30-year veteran of the BU faculty.
He stepped down last October as a member and chairman of the US Department of Energy’s Advanced Scientific Computing Advisory Committee (ASCAC), which is helping advance the executive order President Obama signed last year green-lighting development of exascale capability. ASCAC predicts that the United States will develop such a computer by 2023. When operational, the computer “is not going to be made up of a billion processors,” Giles says. “It’s going to be made up of fewer processors that are each more powerful and run at lower energy than most we have now.
“You probably won’t own an exascale computer—maybe in 40 years—but you’ll have devices that have exascale computing technology in them. Your smartphone will someday be a hundred times smarter than it is now, for not much more money.”
Giles discussed the brave new world of exascale with BU Today.
BU Today: What’s the technological challenge to developing exascale computing?
Giles: Up until about 2004, exponential growth in computing power meant that everything went faster. That stopped; computing didn’t get faster. Instead, you are getting computer chips with multiple processors on the chip. You can put more stuff on a chip; what hasn’t continued is the part that says you can make the chip go faster, which means the energy you need goes up faster.
Is it correct then that the main impediment is that exascale requires such immense energy?
That’s sort of true. It takes much too much power. But if you were willing to expend that power, it’s not clear that it makes sense to do it. What problems will I solve? If you took a ship filled with cell phones and said how much computing power is in that ship, it might be an exascale, but that doesn’t mean that they can work together to solve any particular problem.
So why are we pursuing exascale computing?
You may be able to solve new problems or old problems in new ways. For example, simulating how matter will behave inside a car engine to help engineer a better engine. You need to simulate millions or billions of molecules. Historically, people could never get close to the number of molecules in a real engine in a computer model. So what we do know tends to be based on artfully chosen, very tiny computational samples. Exascale computing enables larger simulations that can have greater fidelity to real systems.
On a larger scale, think about the simulation of climate and climate science. You have to represent individual patches of the Earth inside the computer. In the early days, you’d be dividing the world up into a large number of 100-kilometer squares. What is the temperature in that square? How much ice is there, and what is the reflectivity in that square? As computers got more powerful, they made the square smaller and smaller. The current number is, like, 20 kilometers. You get more understanding in the narrower regions, particularly things like the ocean flows, where at a greater level you can’t see what’s going on. That’ll give me a greater picture of the little eddies and flows. Exascale computing hopes to enable simulations at a one-kilometer scale.
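The payoff of finer grids comes down to simple arithmetic: halving the cell size quadruples the number of cells a model must track. A rough back-of-the-envelope sketch (the Earth surface figure is an approximation, and the cell counts are illustrative, not from the interview):

```python
# Rough count of surface cells a climate model must track at each
# grid resolution mentioned above. Shrinking cells from 100 km to
# 1 km multiplies the cell count by 10,000.

EARTH_SURFACE_KM2 = 510_000_000  # approximate surface area of Earth

for cell_km in (100, 20, 1):
    cells = EARTH_SURFACE_KM2 // (cell_km * cell_km)
    print(f"{cell_km:>3} km cells -> about {cells:,} grid cells")
```

And each cell must be updated at every simulated time step, which is why kilometer-scale climate simulation is pinned to exascale hardware.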
Exascale also helps engineering companies doing production. The classic example is numerical wind tunnels. In design of airplanes, the test was that you build some mock-up of a plane or section, you put it in a wind tunnel, you measure its properties, you may tweak the engineering. The capability that computing already has offered is to replace some of those physical tests with computer simulation. Your ideal would be sitting at a drawing board and saying, “I want to make a wing with this shape out of this material,” and then you press a button and the computer tells you how well that would perform.
It’s the same thing with looking at the airflow around trucks, based on how you design the body of the truck, to save fuel. They design trucks with 20 to 30 percent better gas efficiency through computer modeling with the highest computers we have now. What we’re looking for in exascale is to include more science and understanding of materials in design and production.
What about search engines like Google?
One of the hottest things going is data science. There are problems where exascale computing, married to the right level of data science support, will lead to breakthroughs to do lots of cross-correlation of data.
A Google data center uses a unit of computing that’s basically a tractor-trailer truckful of computers. The data center is a roomful of these containers. Each of those processors is doing a share of the searching, but it’s a challenge to get 10,000 of those things to be communicating rapidly, back and forth, to solve my problem. That’s the kind of problem where exascale computing directly affects the data science, to bring lots of computing power to bear on a single problem.
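The “share of the searching” model is easy to sketch in software. A minimal illustration (the corpus and query are invented; a real search engine shards data across thousands of machines, not a list in memory):

```python
# Each "processor" scans its own shard of the data, and the results
# are merged. Coordinating that split-and-merge across thousands of
# machines, quickly, is the hard part the interview describes.

documents = [
    "exascale computing roadmap",
    "climate simulation at one kilometer",
    "numerical wind tunnel design",
    "data center container architecture",
]

def search_shard(shard, term):
    """One worker's share of the search."""
    return [doc for doc in shard if term in doc]

def parallel_search(docs, term, shards=2):
    # Split the corpus into roughly equal shards, one per "processor".
    size = (len(docs) + shards - 1) // shards
    pieces = [docs[i:i + size] for i in range(0, len(docs), size)]
    hits = []
    for shard in pieces:  # in a real system these run concurrently
        hits.extend(search_shard(shard, term))
    return hits

print(parallel_search(documents, "simulation"))  # → ['climate simulation at one kilometer']
```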
Are governments and business the only entities likely to own an exascale computer?
I think that’s probably true. But you’d get a machine comparable to some of today’s midsize supercomputers that could be on your desk once exascale technology is around.
The other use is national security. The National Security Agency is really interested in this.
Because you could read more data from terrorists’ cell phones or computers?
Right. They want to be number one in the world at that. That actually parallels Google, in terms of it being a data-centric application, the ability to handle large amounts of data and make correlations.
Why does government need to be involved—why can’t Google or industry develop exascale on their own?
The economic incentives are not aligned to make it possible to develop the exascale machine. There’s not a big enough market. Companies would like better roads and faster railroads, but that doesn’t mean they’re going to create the railroad. They’ll contribute to help build the road, but by themselves, they won’t be able to do it.
What are other governments doing to develop exascale?
They intervene to make it happen, like China. There’s a European consortium that’s investing in exascale. That’s actually a software initiative.
I used to joke that we say, “You want to start a program that’s $200 million a year? Oh my God, where are we going to get the money?” My impression is in China, they say, “Oh, it only costs money? We’ll buy slightly less real estate in New York to pay for this.”
Is it conceivable that China or some other country might get exascale before us?
Oh, sure. It’s conceivable. I don’t know if it’s likely. We have the best ecosystem for scientific computing, meaning that not only do we have powerful computers, but we have people who develop the algorithms. We have the laboratories where the computers live.
Is there a debate in academia about the role of government in developing exascale?
There’s always debate. There’s a debate first about the role of universities versus the role of government labs. And there’s the overall question of how you spend government money.
Are there dangers from exascale computing?
A danger comes when one part of society has access to a technology that no one else has. We will have that technology distributed to more than just the government. We could not get industry to participate if the only end point was to make 10 machines for the government. Our committee had people from industry and from academia.
This goes with having stuff in data centers and cloud services. One vision is you deploy exascale technology and the market for it is going to be data centers. The way people benefit is from their access through the data centers. The iPad 200 will have stuff in it that is exascale.
Human society is incredibly adaptive. The internet itself—I was around before it existed at all. We adapted to that technology. That doesn’t mean the adaptation is not painful. The issues of privacy we’re addressing only arose after the technology came in. We’re going to figure it out; our institutions will adapt. Those of us who are older are horrified by what people put out on Facebook and that they take their cell phone to the bathroom with them and live-stream it. But it’s clear other people are not going to feel that way: “Everybody has naked pictures of themselves in mud on the internet from when they were in college. So what?”
This article originally appeared on BU Today.
Vivek Goyal Creates Images from Single Photons
By Sara Elizabeth Cody
When you take a photo on a cloudy day with your average digital camera, the sensor detects trillions of photons. Photons, the elementary particles of light, strike different parts of the sensor in different quantities to form an image, with the standard four-by-six-inch photo boasting 1,200-by-1,800 pixels. Anyone who has attempted to take a photo at night or at a concert knows how difficult it can be to render a clear image in low light. However, in a recent study published in Nature Communications, one BU researcher has figured out a way to render an image while also measuring distances to the scene using about one photon per pixel.
“It’s natural to think of light intensity as a continuous quantity, but when you get down to very small amounts of light, then the underlying quantum nature of light becomes significant,” says Associate Professor Vivek Goyal (ECE). “When you use the right kind of mathematical modeling for the detection of individual photons, you can make the leap to forming images of useful quality from extremely small amounts of detected light.”
Goyal’s study, “Photon-Efficient Imaging with a Single-Photon Camera,” was a collaboration with researchers at MIT and Politecnico di Milano. It combined new image formation algorithms with the use of a single-photon camera to produce images from about one photon per pixel. The single-photon avalanche diode (SPAD) camera consisted of an array of 1,024 light-detecting elements, allowing the camera to make multiple measurements simultaneously to enable quicker, more efficient data acquisition.
The experimental setup used infrared laser pulses to illuminate the scene the research team wanted to capture, along with an ordinary incandescent light bulb to reproduce the strong competing light source that could be present in a longer-range scenario. Both the uninformative background light and the laser light reflected back to the SPAD camera, which recorded the raw photon data with each pulse of the laser. A computer algorithm then analyzed the raw data to form an image of the scene, reconstructed from roughly one detected photon per pixel.
The method introduced by Goyal’s team comes in the wake of their earlier first-ever demonstration of combined reflectivity and depth imaging from a single photon per pixel. The earlier work used a single detector element with much finer time resolution. The current work demonstrates that creating an image with a single-photon detector can be done more efficiently.
“We are trying to make low-light imaging systems more practical, by combining SPAD camera hardware with novel statistical algorithms,” says Dongeek Shin, the lead author of the publication and a PhD student of Goyal at MIT. “Achieving this quality of imaging with very few detected photons while using a SPAD camera had never been done before, so it’s a new accomplishment in having both extreme photon efficiency and fast, parallel acquisition with an array.”
Though single-photon detection technology may not be common in consumer products any time soon, Goyal thinks this opens exciting possibilities in long-range remote sensing, particularly in mapping and military applications, as well as applications such as self-driving cars where speed of acquisition is critical. Goyal and his collaborators plan to continue to improve their methods, with a number of future studies in the works to address issues that came up during experimentation, such as reducing the amount of “noise,” or grainy visual distortion.
“Being able to handle more noise will help us increase range and allow us to work in daylight conditions,” says Goyal. “We are also looking at other kinds of imaging we can do with a small number of detected particles, like fluorescence imaging and various types of microscopy.”
By Sara Elizabeth Cody
A group of six BU students was the sole team from the U.S. to compete in the world’s largest supercomputing hackathon in Wuhan, China, in April.
“Supercomputing uses very powerful hardware to run large and complex programs,” explains Hannah Gibson (ECE’17), a member of the BU Green Team who competed at the Asia Supercomputing Community Student Supercomputer Challenge. “It’s used in CGI for movies and for weather modeling: huge programs that require a lot of power. In the competition, the goal is to get the best performance with consideration for power and speed with the setup and software you designed and built.”
The competition featured 16 teams selected from 146 applicants that hailed from around the globe, from China and Russia to Hungary and Colombia. Each team provided a wish list of hardware to the sponsoring company, Inspur, and had to prepare software in advance to bring with them to the competition. Teams had four days total for the competition, including time for setup and installation.
“It was awesome being in a different country and seeing how our team stacked up to teams from all around the world,” says Wasim Khan (ECE’17), a member of the BU Green Team. “It was interesting to compete against other teams who come from schools that have supercomputing as a major and to see that we, an extracurricular student-run group, gave them a run for their money.”
In computing, performance is often measured by floating-point operations per second, or flops. The higher the number of flops, the better the computer performance and, in competition, the higher the score. Teams were given six applications and tasked with rewriting portions of each program to run better on the target hardware, optimizing it for their architecture and completing real-world scientific workloads while staying within the competition’s cap of 3,000 watts of power.
Five of the applications ran on each team’s own hardware setup, or cluster, to measure the flops it generated. The sixth ran on the Tianhe-2, then the world’s fastest supercomputer. Scores were computed from the number of problem sets, or workloads, completed, weighted by accuracy, timing, and, where applicable, flops generated. Awards were given to top scorers, the “most innovative” entry, and the “best overall.” To support the ASC mission of promoting supercomputing outreach, teams were encouraged to tweet throughout the competition, and the team with the most retweets earned the “most popular” designation.
“This is an impressive and highly motivated group of students who had to specify and acquire equipment, optimize the configurations, tune, and in some cases refactor the applications, and ultimately qualify for these competitions entirely of their own volition,” says Professor Martin Herbordt (ECE), who is the faculty advisor for the group. “It goes without saying that students learn a lot in their classes, but this type of professional, real-world experience that is self-guided takes their learning to a whole other level.”
The BU Green Team represents BU’s High Performance Computing club (BUHPC), which is led by Winston Chen (CE’17) and Huy Lee (CS’16) and affiliated with BUILDs, the BU hackerspace that provides resources for students to undertake technology projects. Since returning from China, BUHPC has been fundraising to attend the ISC Student Cluster Competition in Germany in June. In addition to the competition itself, the event includes professional development workshops and networking opportunities for students interested in supercomputing.
By Sara Elizabeth Cody
On Thursday, April 14, Professor M. Selim Ünlü (ECE, BME, MSE), recipient of the 2016 Charles DeLisi Award and Distinguished Lecture, presented “Optical Interference: From Soap Bubbles to Digital Detection of Viral Pathogens” to a packed room of students, faculty and researchers.
The first named endowed lecture in the history of the College of Engineering, the Charles DeLisi Award and Distinguished Lecture recognizes faculty members with extraordinary records of well-cited scholarship, and outstanding alumni who have invented and mentored transformative technologies that impact our quality of life.
When Ünlü arrived at BU in 1992, he was inspired by the collegial interdisciplinary environment, which led him to apply his background in electrical engineering and electromagnetic waves to developing innovative methods for biological imaging and sensing. His presentation, peppered with video and audio messages from past students and mentors who have contributed to his work, chronicled his career path from graduate school to present day and centered on his current research in optical sensing and developing new bioimaging technologies that address the obstacles that currently plague the field of diagnostics.
“When you are trying to look at pathogens, the most distinguishing thing is to look at its genome, but obstacles like logistics and cost are prohibitive and drive scientists to find more compact and affordable ways that have the same functionality,” said Ünlü. “Single particle detection has been the physicist’s dream of addressing these issues, so that’s what we set out to explore.”
Synergy between Engineering and Medicine
In developing his optical detection technology, he drew inspiration from, of all places, a soap bubble: specifically, the patterns of color that develop on its surface as light reflects off it. According to Ünlü, the same interference phenomenon that gives soap bubbles their rainbow colors can also provide extremely high sensitivity, as illustrated by the recent detection of gravitational waves by optical interferometry.
“Most people don’t realize that just by calling out a certain color, you are making a measurement in the order of nanometers,” said Ünlü.
Ünlü extended this idea to develop his optical detection technology for single nanoscale particles, where the interference of light reflected from the sensor surface is modified by the presence of nanoparticles, producing a distinct signal that reveals the size of the particle that is otherwise not visible under a conventional microscope. Using this technology, Ünlü and his research team demonstrated label-free identification of some of the most deadly viruses in the world, including hemorrhagic viruses like Ebola, Lassa and Marburg, at a high sensitivity on par with state-of-the-art laboratory technologies. They have even been able to detect particles as small as individual protein and DNA molecules by labeling them with gold nanoparticles to provide sufficient visibility.
“Proteins are too small. We can’t see them directly so we decorate them with gold nanoparticles, which are not much bigger than the proteins themselves,” said Ünlü. “Decorating them with gold nanoparticles increases visibility of the molecules bound on the sensor surface, and we are able to count them in serum or whole blood.”
The resulting technological development in biomarker analysis that Ünlü has spearheaded is digital detection, an approach that counts single molecules, which provides resolution and sensitivity beyond the reach of ensemble measurements. Digital detection for medical diagnostics not only provides very high sensitivity, but also has the potential of making the most advanced molecular diagnostic tools broadly accessible at low cost.
Digital detection captures images of individual viruses in real time
“Optical interference is a very powerful sensing technique,” summed up Ünlü. “With this biological imaging technology, we can detect single particles if they are large enough on the nanoscale, such as viruses, and see them directly. If they are proteins or DNA molecules we have to label them with a small, metallic nanoparticle to see them.”
In terms of next steps, Ünlü and his team will continue to refine the technology for commercialization, including applying some of these findings to produce microarray chips that provide calibration and quality control in industry. His laboratory will continue to work on advancing the technology further and gaining a deeper understanding of the theoretical basis in order to enhance the methodology. In particular, they are looking into applying the technology to such areas as real-time DNA detection, rare mutations, and most recently a project to characterize viruses that target cancer cells.
To conclude his presentation, Ünlü expressed his appreciation of the support he received from the College to foster collaboration, and to his students, mentors and family who helped him along the way.
“I’m very thankful to Boston University for providing an incredibly rich environment for research because there are no barriers between disciplines,” said Ünlü. “Multidisciplinary innovation is the driving force of discovering new things and making society better, and ultimately that is my motivation.”
The DeLisi Lecture continues the College’s annual Distinguished Lecture Series, initiated in 2008, which has honored several senior faculty members. The previous recipients are Professors John Baillieul (ME, SE), Malvin Teich (ECE) (Emeritus), Irving Bigio (BME), Theodore Moustakas (ECE, MSE), H. Steven Colburn (BME), Thomas Bifano (ME, MSE), Christos Cassandras (ECE, SE) and Mark Grinstaff (BME, MSE, Chemistry, MED).
By Sara Elizabeth Cody
Five teams of ECE students competing in the fifth annual Intel-Cornell Cup have advanced to the final round in the competition. The Intel-Cornell Cup is a college-level design competition that aims to empower inventors of the newest innovative applications of embedded technology.
“This is a major national competition and personally I think our teams’ performances reflect highly on the College,” says Associate Professor of Practice Alan Pisano (ECE), who is one of the faculty advisors for the competition. “We have five very interesting projects in the finals, more than any other school, which seek to tackle nationally relevant issues that will benefit society.”
The competition, which alternates between live and online formats annually, is being held online this year. Initially, six teams from BU advanced to the semifinal round and competed against 31 other teams from around the country. Five teams from BU, all made up of senior design project teams, are competing against 24 other teams in the final round.
The BU finalist teams are:
- BreakerBot: An interdisciplinary team of ECE and ME students, sponsored by Consolidated Edison, building an autonomous robot to move 800-pound circuit breakers in the utility’s substations.
- Giro dicer: A team of ECE students building a drone to locate ice dams and apply melting chemicals to “break the dam.”
- Moove: Created by a team of ECE students (with one BME dual-degree student), this device is essentially a “Fitbit” for cows, networking them together and gathering data for analysis in the cloud.
- TED: A team of ECE students designing a translating teddy bear toy that helps young children learn different languages.
- E-Fire: An ECE team creating a device to measure high-energy electrons in space.
Projects will be completed by the end of March, fulfilling both a course requirement and a competition requirement with support from Pisano and the other ECE Senior Design Capstone supporting faculty members, Lecturer Osama Alshaykh and Senior Lecturer Babak Kia. The final judging takes place at the end of April. The competition is sponsored primarily by Intel and Cornell University.
Transforming Living Cells into Computers
By Sara Elizabeth Cody
Whether it’s artificial skin that mimics squid camouflage or an artificial leaf that produces solar energy, a common trend in engineering is to take a page out of biology to inspire design and function. However, an interdisciplinary team of BU researchers has flipped this idea, instead using computer engineering to inspire biology in a study recently published in Science.
“When you think about it, cells are kind of computers themselves. They have to communicate with other cells and make decisions based on their environment,” says Associate Professor Douglas Densmore (ECE, BME), who oversaw the BU research team. “By turning them into circuits, we’ve figured out a way to make cells that respond the way we want them to respond. What we are looking at with this study is how to describe those circuits using a programming language and to transform that programming language into DNA that carries out that function.”
Using a programming language commonly used to design computer chips, ECE graduate student Prashant Vaidyanathan created design software that encodes logical operations and bio-sensors right into the DNA of Escherichia coli bacteria. Sensors can detect environmental conditions while logic gates allow the circuits to make decisions based on this information. These engineered cells can then act as mini processing elements enabling the large scale production of bio-materials or helping detect hazardous conditions in the environment. Former postdoctoral researcher Bryan Der facilitated the partnership between BU and the Massachusetts Institute of Technology to pursue this research study.
“Here at BU we used our strength in computer-aided design for biology to actually design the software and MIT produced the DNA and embedded it into the bacterial DNA,” says Densmore. “Our collaboration is a result of sharing the same vision of standardizing synthetic biology to make it more accessible and efficient.”
Historically, building logic circuits in cells was both time-consuming and unreliable, so fast, correct results are a game changer for research scientists, who get new DNA sequences to test as soon as they hit the “run” button. This novel approach of using a common programming language opens up the technology to anyone, giving them the ability to program a sequence and generate a strand of DNA immediately.
“It used to be that only people with knowledge of computers could build a website, but then resources like WordPress came along that gave people a simple interface to build professional-looking websites. The code was hidden in the back end, but it was still there, powering the site,” says Densmore. “That’s exactly what we are doing here with our software. The genetic code is still there, it is just hidden in the back end and what people see is this simplified tool that is easy, effective and produces immediate results that can be tested.”
According to Densmore, this study is an important first step that lays the foundation for future research on transforming cells into circuits, and the potential for impact is global, with applications in healthcare, ecology, agriculture and beyond. Possible applications include bacteria that can be swallowed to aid in digestion of lactose to bacteria that can live on plant roots and produce insecticide if they sense the plant is under attack.
“The possibilities are endless, and I am excited about it because this is the crucial first step to reach that point where we can do those amazing things,” says Densmore. “We aren’t at that level yet, but this is a stake in the ground that shows us we can do this.”
The BU/MIT collaboration will continue under the Living Computing Project, which was recently awarded a $10 million grant from the National Science Foundation. Future studies will look to improve upon the circuits that were tested, add other computer elements like memory to the circuits, and expand into other organisms such as yeast, which will pave the way for implanting the technology into more complex organisms like plant and animal cells.
By Rich Barlow
The College of Engineering has earned its highest-ever ranking from US News & World Report, placing 35th among its peer American schools in the magazine’s latest rankings. It’s a two-slot advance from last year and a long jump from a decade ago, when the school placed 52nd, says Kenneth Lutchen, dean of ENG.
Additionally, ENG’s biomedical engineering instruction ranked ninth among such programs nationally. The ratings of 194 engineering schools considered peer assessments, student selectivity, student-faculty ratios, the number of doctoral degrees granted, and research funding, among other factors.
Lutchen attributes his school’s success to several strengths, starting with a commitment to interdisciplinary research across both the college and the University, “recruiting complementary faculty in areas such as photonics, information and cyber-physical systems, the intersection of engineering and biology, advanced materials, and nanotechnology.” That approach, he says, has garnered “tremendous extramural funding success among our faculty.”
Second, in recent years, ENG boosted research and educational partnerships with industry, using assessments of the school’s programs by these partners to improve them. Meanwhile, Lutchen says, the ENG faculty has matched prowess at securing funding with “scholarship in their field, and in how that scholarship eventually impacts societal challenges.”
Over the past decade, ENG’s rankings have marked “the largest single improvement of any engineering school in the country” among those that made the top 52 in 2006, he notes. Every one of its degree programs now scores in the top 20 in its discipline among private universities, he says, adding that that has real-world effects, helping “attract ever-higher quality in our faculty and our PhD students.”
A version of this story originally appeared on BU Today.
First BU Data Science Day Draws Cross-Disciplinary Crowd
By Sara Rimer, Photos by Dave Green
Azer Bestavros, founding director of the Rafik B. Hariri Institute for Computing and Computational Science & Engineering, was practically giddy. It was the first BU Data Science (BUDS) Day and the Photonics Center’s ninth-floor conference room, where the institute was hosting the event, was standing room only.
“I thought there might be 80, 90 registrants,” said Bestavros, a College of Arts & Sciences professor of computer science and head of BU’s Data Science Initiative, welcoming the participants with—what else—data. “They told me there were 262. I was shocked—really?”
Not only that, but the data science geeks—faculty and students from physics, mathematics and statistics, computer science, electrical and computer engineering, systems engineering, biostatistics—were there with people from the humanities and the social sciences as well as from CAS, the Questrom School of Business, Sargent College of Health & Rehabilitation Sciences, the College of Engineering, the College of Communication, the School of Social Work, the School of Public Health, the School of Law, and the School of Medicine.
Bestavros had the data. The registrants were from 66 different disciplines, departments, and offices across the University, including the libraries, Information Systems and Technology, and Career Services. It was the sort of diverse, cross-disciplinary crowd that Bestavros and event cochairs Dino P. Christenson, a CAS associate professor of political science and a former Hariri Institute Junior Fellow, and Prakash Ishwar, an ENG associate professor of electrical and computer engineering, had hoped to draw.
“Why are you here?” Bestavros asked his crowd. “Is it because of the whole ‘data science is the sexiest job’ thing? Maybe it’s about how you’re going to make a ton of money. Maybe it’s about the data that’s coming at us and we don’t know what to do with it; we’re drowning in it. Maybe a lot of you are here to figure out how you can float.”
Or maybe they had all come together, on a wintry Friday morning at the end of January 2016, because they knew “that data science has become the common language of all disciplines.” Data science breaks down the walls between disciplines, said Bestavros—“at least we can talk, at least we can all be in the same room.”
For the next seven to eight hours, faculty, students, and staff connected through data science, brainstorming about its possibilities, reporting on how it was transforming their work in an astonishing array of disciplines—physics, neuroscience, health analytics, cancer research, genomics, the social sciences, marketing, law, even art history. They raised big questions: Can a robot learn how to teach physics? How do you know you can trust the data from crowdsourcing? How can you bring all these different networks together with the right information to actually improve people’s lives?
A newcomer’s question: “What is data science?” Christenson explained: “Data science is a broad term—perhaps overly broad—used to characterize a number of different fields, including political science, that are interested in the systems and processes for extracting knowledge from data. It uses statistical and computational tools to collect, curate, store, analyze, model, and visualize various types of data.”
Addressing the audience before lunch, Gloria Waters, vice president and associate provost for research, commended Bestavros for the interdisciplinary community of scholars he has assembled at the Hariri Institute. She said that the day’s events—the talks, the poster sessions—demonstrated “the excellence, the depth of work” in data science at BU. She noted that data science is one of BU’s “research peaks,” an area that Waters, along with President Robert A. Brown and Provost Jean Morrison, is committed to investing in and excelling at.
“It’s absolutely clear we have world-class faculty in basic science—in math and statistics, computer science, electrical and computer engineering—and faculty who are doing amazing work in applications of data science,” Waters said. Recruiting additional top data science faculty is a primary goal of the Data Science Initiative that Bestavros is leading, she added.
At the event, 12 faculty panel speakers from multiple disciplines spoke for 10 minutes each about how their data-driven research related to one of three broad themes: vision; networks; and health, markets, and policy.
Kicking off the panel focused on vision and visual-data-driven research, Jodi Cranston, a CAS professor of Renaissance art, made the case for small data. “Most scholars in humanities fear big data because it involves technology,” she said. She gave a quick slideshow tour of her Mapping Titian project, an archive and teaching web application that documents the relationship between the artworks of the 16th-century Venetian Renaissance artist Titian and their changing locations and historical contexts (the project was funded, in part, by the Hariri Institute).
“You could think about how movements of artwork are affected by disease, natural disasters, population changes, economic crises, political events,” Cranston said. “Recognizing the potential wide applicability of small data in the humanities helps strengthen the human element underlying all humanities research.”
Advances in brain imaging have produced a treasure trove of data for neuroscientists. “I study the brain, and the brain is a great problem for big data because the brain has one billion neurons,” said Michael Hasselmo, a CAS professor of psychological and brain sciences and director of BU’s Center for Systems Neuroscience, beginning his vision talk. Hasselmo explained how he is studying the coding of space and time by neurons in rats as part of his work in understanding memory in humans.
“I’m an algorithms guy,” said another vision panel speaker, Brian Kulis, an ENG assistant professor of electrical and computer engineering, who works on machine learning and big data analysis. Kulis defined machine learning as “a set of tools used to make predictions from data.” These tools are useful in many areas, he said, from driverless cars to robotics.
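Kulis’s definition—“a set of tools used to make predictions from data”—can be illustrated with a toy example. The sketch below fits a line to a handful of invented data points by ordinary least squares and uses it to predict a new value; the data and the study-hours scenario are hypothetical, chosen only to show the predict-from-data pattern, not anything from Kulis’s research.

```python
# Toy machine learning: fit a line to data, then predict.
# All data here is invented for illustration.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error (ordinary least squares)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# "Training data" (hypothetical): hours studied vs. exam score
xs, ys = [1, 2, 3, 4], [52, 60, 68, 76]
m, b = fit_line(xs, ys)

# "Prediction": estimated score after 5 hours of study
print(round(m * 5 + b))  # prints 84
```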
Margrit Betke, a CAS professor of computer science, uses big data to help visually impaired people with things such as navigating busy intersections on foot, reading medication instruction labels, and setting the temperature control in their apartments. She explained how she and a team of students—with the aid of crowdsourcing—insert tags of text on images on a web page. A visually impaired person “takes a photo of their temperature control, uploads it to the internet, and then some friendly person in the world will type the answer back to them: ‘This is your temperature setting.’”
Betke ticked off a few of her other current collaborations: she and Stan Sclaroff, a CAS professor of computer science, are designing a machine-learning text recognition system. She is working on cell tracking with Joyce Wong, an ENG professor of biomedical engineering. She and a team of biologists are tracking and analyzing the behavior of bats in caves in Texas.
Collaboration was the mantra of the day. “We live and die by our collaborations,” said W. Evan Johnson, a MED associate professor of medicine and biostatistics, who underscored the role of team science in his lab’s work in tracking the evolution of cancer tumors and drug response in cancer cells. Some collaborations are more successful than others, he said.
Biostatisticians are after “the best method to do something,” he said. Biologists, on the other hand, “want to be the first person to discover something.” The two goals—best and first—don’t always converge. The key, he said, is to find collaborators who want to contribute their skills to a joint project and who understand what’s in it for everyone involved.
Johnson, whose research falls at the intersection of statistics, computing, biology, and medicine, said his two teenage sons deserved some of the credit for motivating his research. “They get a kick out of telling people, ‘My dad’s a doctor, but not the kind that helps people,’” he said to laughter from the audience. “I’ve made it my goal to do something that helps people,” he went on. “How can we use biological big data to inform and influence how patients are treated in the clinic?”
During a break, Bestavros noted the multitude of ways the speakers managed to collect the data for their research. Michael J. Meurer, a LAW professor, for example, purchases the data he uses to study patent trolls. For his research into the sharing economy, Georgios Zervas (GRS’11), a Questrom assistant professor of marketing, a computer scientist, and a Hariri Junior Faculty Fellow, analyzes publicly available data from sources such as Airbnb and federal, state, and municipal websites.
“It starts with getting the data, cleaning the data, scraping the data,” Bestavros said. “We have to worry about security and privacy, then we have to worry about doing the analytics. We mine it for information that advances our understanding, and we check if our findings make sense. Finally, we have to communicate this in very different ways.”
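The workflow Bestavros describes—get the data, clean it, analyze it, check that the findings make sense—can be sketched in a few lines. The data below is invented for illustration; real projects would add the scraping, security, and privacy steps he mentions.

```python
# A minimal sketch of the pipeline Bestavros describes:
# collect, clean, analyze, and sanity-check a small dataset.
from statistics import mean, stdev

# "Getting the data": raw entries, some unusable (hypothetical)
raw = ["42", "37", "", "51", "not sure", "45"]

# "Cleaning the data": keep only entries that parse as numbers
cleaned = [int(x) for x in raw if x.isdigit()]

# "Doing the analytics": basic descriptive statistics
summary = {
    "n": len(cleaned),
    "mean": mean(cleaned),
    "stdev": round(stdev(cleaned), 2),
}

# "Checking if findings make sense": values should fall in a plausible range
assert all(0 <= x <= 100 for x in cleaned)

# "Communicating": here, just a plain-text report
print(f"n={summary['n']}, mean={summary['mean']:.2f}")
```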
Speaking of communication, 26 students from colleges and schools across the University—public health, medicine, engineering, business, communication, arts and sciences—participated in the day’s poster session. Sahar Abi Hassan (GRS’19), a doctoral candidate in political science, presented her work on interest groups and the Supreme Court. Abi Hassan said she had been introduced to data science through her department’s required Quantitative Methods 1 course. “From there, I just became fascinated with data science and its great potential for social sciences,” she said. “Working with data allows me to find patterns in political and social phenomena that otherwise would be hidden.”
Commending Abi Hassan’s work, Bestavros said he hoped the day had demonstrated the importance of data science and education. “If you’re a student in political science or sociology or marketing or business or journalism and the whole area is now going to become data-driven, you need to learn at least the basics of data science,” he said. “It’s not something that only computer scientists need to learn. Data touches everything we do.”
A version of this article was originally published in BU Research.
Prominence of ECE Faculty Continues to Grow
By Gabriella McNevin
Boston University Department of Electrical and Computer Engineering Professor Mark Horenstein has been named an IEEE Fellow. He is being recognized for contributions to the modeling and measurement of electrostatics in industrial processes. His experimental and theoretical work has focused on some of the more complex electrostatic problems that relate to instrumentation and safety, as well as to an understanding of the fundamental theories behind many industrial processes. His work has spanned such broad subjects as the propagating brush discharge, electrostatic phenomena in MEMS devices, modeling of corona discharge, and the electrostatics of parachutes.
The IEEE grade of Fellow is conferred by the IEEE Board of Directors upon a person with an outstanding record of accomplishments in any of the IEEE fields of interest. The total number selected in any one year cannot exceed one-tenth of one percent of the total voting membership. IEEE Fellow is the highest grade of membership and is recognized by the technical community as a prestigious honor and an important career achievement.
Horenstein served as Editor-in-Chief of the Journal of Electrostatics for 14 years, until 2015, and he is an honorary life member of the Electrostatics Society of America (ESA). He was selected as the Bill Bright Memorial Lecturer for the Institute of Physics’ Electrostatics 2015 conference, where he discussed “The Contribution of Surface Potential to Diverse Problems in Electrostatics.” He was also named an International Fellow by the Electrostatics Working Group of the European Federation of Chemical Engineers at its Electrostatics 2013 conference, where he gave an invited lecture on “Future Trends in Industrial Electrostatics.” In 2012, he was named Outstanding Professor of the Year by the College of Engineering at Boston University. Horenstein is a named inventor on five patents. He received his Ph.D. in Electrical Engineering from MIT in 1978 and his M.S. in Electrical Engineering from the University of California, Berkeley, in 1975.
In addition to his expertise in electrostatics, Horenstein is known for his textbooks on microelectronics and engineering design. He currently works on technology for self-cleaning photovoltaic solar panels and concentrating solar mirrors, as well as on ultra-sensitive electrostatic field sensors.
The IEEE is the world’s leading professional association for advancing technology for humanity. Through its 400,000 members in 160 countries, the IEEE is a leading authority on a wide variety of areas ranging from aerospace systems, computers and telecommunications to biomedical engineering, electric power and consumer electronics.
Leading Engineers Visit BU as Part of the ECE Distinguished Lecture Series to Discuss Research with Students and Faculty
By Rebecca Jahnke, COM ’17
BU’s Electrical & Computer Engineering department draws renowned leaders of the field to present as part of the ECE Distinguished Lecture Series. The topics presented are always changing, but consistently span diverse research areas. The Fall 2015 lineup included academics Daniel Fleetwood, Kevin Skadron and Ralph Etienne-Cummings.
Despite Fleetwood, Skadron, and Etienne-Cummings’ varying research focuses, the trio has much in common. All are highly decorated IEEE Fellows with many accolades to their names, and they hold ten patents between them. Through the groundbreaking publications they’ve authored, the group has effectively written the science today’s students are learning. Work conducted at posts throughout the country – and for some, on sabbatical abroad – further reflects the breadth of their influence.
Fleetwood kicked off this season’s series with a September lecture entitled “Moore’s Law and Radiation Effects on Microelectronics.” Fleetwood is the Chair of Vanderbilt University’s Department of Electrical Engineering & Computer Science as well as the university’s Olin H. Landreth Professor of Engineering. His lecture, examining the effects of Moore’s Law size and voltage scaling, followed from his research in nanoscience and technology as well as risk and reliability. A Fellow of the American Physical Society and an IEEE Fellow, Fleetwood also received the IEEE Nuclear and Plasma Sciences Society’s Merit Award. He has authored over 380 publications, received ten Outstanding Paper Awards, and seen his research cited upwards of 7,000 times.
The series continued with a lecture by Kevin Skadron, University of Virginia Department of Computer Science Chair and Harry Douglas Forsyth Professor. His October presentation, “Automata Processing: Massively-Parallel Acceleration for Approximate Pattern Matching,” provided an overview of the AP architecture and observations from accelerating its applications. Skadron describes his research as exploring processor design techniques for managing power, thermal, and reliability constraints, with a focus on manycore and heterogeneous architectures. He holds two patents and has authored over 100 peer-reviewed publications since his college summers spent interning for Microsoft and Intel.
Ralph Etienne-Cummings, Professor and Chair of Johns Hopkins University’s Department of Electrical and Computer Engineering, closed out this semester’s series in December. This final presentation – “I, Robot: Blurring the lines between Mind, Body and Robotics” – suggested new approaches to brain-machine interfaces (BMI). Etienne-Cummings’ research interests include systems and algorithms for biologically inspired and low-power processing, biomorphic robots, applied neuroscience, neural prosthetics, and computer-integrated surgical systems and technologies. His high level of curiosity has been evident since he was a child and repaired his own shortwave radio to listen to a soccer match. Now the holder of seven patents, Etienne-Cummings is known to make time for diversity and mentoring initiatives intended to awaken a similar curiosity in others.