Protecting the Identity in Your Pocket
BU leads in developing cell phone security
The same day that Mark Crovella and his computer science colleagues kicked off a symposium last semester to discuss their $3 million, five-year smartphone security project funded by the National Science Foundation, the New York Times published a front-page article about tabloid reporters in London hacking into the mobile devices of English soccer stars and celebrities, most notably members of the British royal family, including Prince William and Prince Harry.
While the tabloid case, now unfolding in the court system, involves reporters hacking into voice mails with stolen PINs, it underscores how cell phones have not only centralized our personal information—storing our schedules, finances, and social networking tools—but also exposed that data to such threats as malicious applications, identity theft, and eavesdropping.
Last week, a German politician sued Deutsche Telekom (owner of T-Mobile) for the tracking information the cell phone company had compiled on him. He was astounded to learn that over a six-month period, Deutsche Telekom had recorded and saved his longitude and latitude coordinates more than 35,000 times. It basically knew where he was at all times.
Crovella, a College of Arts & Sciences computer science professor, and his colleagues are looking to do something about this potential compromise of our electronic identities as hardwired features on phones give way to open-source software programs and customized applications. Based at BU’s Center for Reliable Information Systems and Cyber Security (RISCS), the project, called Securing the Open Softphone, includes nine senior investigators from CAS, the College of Engineering, and Metropolitan College (Deutsche Telekom and Raytheon Company’s BBN Technologies are the project’s industrial partners).
BU Today spoke recently with Crovella, one of the project’s lead researchers.
BU Today: What is the difference between a “softphone” and a cell phone?
Crovella: The reason for coining a term like “softphone” is that the nature of the security problem on the phone has changed because the capabilities of the phone have changed. The underlying architecture on the cell phone operating system is very different from five years ago. When we’re talking about a smartphone, we’re basically talking about a softphone. The reason for making the distinction is to draw attention to the wide array of sensors that are available on the phone and the new architecture for the software. The architecture is now closer to what’s on a desktop PC.
Is surfing the web on a phone less safe than on a home computer?
There are just as many weaknesses in a softphone browser. But the effects of the exposure can be more serious. You’re more likely to have a collection of personal data, financial data, location data. Phones are increasingly being used as the medium for financial transactions. The next iPhone will have near-field communication chips, which are used for “cardless debit”—like a Visa card you wave at a terminal, or the key chain Mobil gas stations offer that you wave at the pump. Those capabilities will be enabled in the next generation of phones. A compromised phone can be used to make unauthorized phone calls. There was an Android app in the last month or two that was sending text messages to a very expensive destination. The app developer was getting a cut of the resulting revenue.
Where does the level of responsibility lie? How much is on the shoulders of the phone manufacturer, the carrier, the user?
That’s actually a struggle that’s playing out right now. I think people have felt instinctively that the responsibility ought to lie with the phone owner. Now I think people are starting to realize maybe we should relinquish some of our control to a central party that can at least provide some measure of security over the applications.
When the iPhone was announced, Apple had already worked out the details of a mechanism for third parties to create software and sell it, and that is the App Store that we know and love. The procedures that Apple put in place require very strong authentication. All developers have to sign their applications with a cryptographically secure certificate. Every application is tied to a developer, and as you know, all of the applications get reviewed by Apple. At the time, there was a hue and cry that this was a walled garden.
So when Google announced the Android app store, they specifically eschewed this Big-Brotherish control that Apple exerts, and they only require applications to be signed by the developer in a way that is not as secure. Most importantly, they don’t require a review of the source code or the application. But in just the last couple of weeks, we’ve seen some really nasty Android apps that were essentially only possible because Google adopted this hands-off strategy. Some bad guys found ways to take an existing application, modify it to send, for example, really expensive text messages, and then upload it back to the store. So you could be downloading your favorite Tetris application, and it looks for all the world like the application was released by the famous developer, but it has in fact been surreptitiously modified to do something very nasty.
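The core idea behind the app-store safeguards Crovella describes is that a signature binds a developer to one exact set of bytes, so any tampering is detectable. Real app signing uses public-key certificates and is far more involved; the sketch below only illustrates the principle with a plain SHA-256 digest, and all app names, byte strings, and the `looks_genuine` helper are hypothetical.

```python
import hashlib

def digest(app_bytes: bytes) -> str:
    """Return the SHA-256 digest of an app package's raw bytes."""
    return hashlib.sha256(app_bytes).hexdigest()

# The developer publishes the digest of the genuine package
# (in a real store, this is covered by a certificate-backed signature).
genuine_app = b"tetris-v1.0: game logic ..."
published_digest = digest(genuine_app)

# An attacker repackages the app with a premium-SMS payload.
tampered_app = genuine_app + b"\nsend_sms('+900EXPENSIVE', 'subscribe')"

def looks_genuine(app_bytes: bytes, trusted_digest: str) -> bool:
    """A store or phone can reject any package whose bytes changed."""
    return digest(app_bytes) == trusted_digest

print(looks_genuine(genuine_app, published_digest))   # True
print(looks_genuine(tampered_app, published_digest))  # False
```

Even a one-byte modification produces a completely different digest, which is why a store that actually verifies signatures against a reviewed original can block the repackaging attack described above.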
Should people feel worried that cell phone companies can track their every move if they want to?
The tracking itself is not a violation. Phone companies are allowed to collect data that helps them engineer their networks, and this data is necessary for deciding where to place cell towers, how to route calls, and so on. The problem occurs when companies use or sell location information for other purposes: marketing, mostly. There’s incredible pressure for companies to monetize their assets, and it’s becoming increasingly clear that data can be a game-changing asset. Phone companies have considered using their data for personalized and location-based advertising, and in some cases are even analyzing social networks based on call records. So people have good cause to be concerned.
BU is working on this project with Deutsche Telekom, recently forced to reveal it had closely tracked a German politician. Does that pose a conflict for your work?
DT is an immense company, and we work with folks in its research arm, while the story about the politician concerns the operational side. However, when data like this is released into the public domain, it actually aids research, because it provides realistic inputs for experiments and analyses.
What sort of work have you been doing since receiving the NSF grant?
The work is just getting started. There’s the research side and the outreach side. We’ve started a project to do a stronger form of authentication between phones. Say we meet in a bar and I want to send you my contact information. If I sent it through the internet, there’s all sorts of security issues associated with that. It would be nice if I could send this directly from my phone to your phone. We could do this in a way that had no exposure to eavesdropping over the internet. One of the advantages you have in this situation is that both phones are in the same environment, so they can sense the same environment. And a simple way to do that is to take the two phones and shake them at the same time while holding them together. So they’re both sensing acceleration and that can be measured, and that gives you the ability to create encryption keys in both phones, without any data passing between them. So nothing that can be eavesdropped on passed between the two phones, but both phones now possess a secret that allows them to communicate.
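The shake-to-pair scheme Crovella describes can be sketched as follows: each phone coarsely quantizes its accelerometer trace (so small sensor noise doesn't matter) and hashes it into a key; because both phones felt the same shake, they derive the same key without transmitting anything. This is a minimal illustration only — published protocols add error correction and key confirmation — and the sample values, quantization step, and function names are all invented for the example.

```python
import hashlib

def quantize(samples, step=0.5):
    """Coarsely bucket accelerometer readings so two phones seeing
    slightly noisy versions of the same shake agree on every value."""
    return bytes(round(s / step) % 256 for s in samples)

def derive_key(samples, salt=b"shake-pairing-v1"):
    """Hash the quantized motion trace into a 256-bit shared key."""
    return hashlib.sha256(salt + quantize(samples)).digest()

# Both phones record (nearly) the same acceleration while shaken together;
# the small differences are sensor noise.
phone_a = [1.02, -2.98, 4.01, 0.49, -1.51]
phone_b = [1.04, -2.97, 3.98, 0.51, -1.49]

key_a = derive_key(phone_a)
key_b = derive_key(phone_b)
print(key_a == key_b)  # True: a shared secret, with no bytes exchanged
```

Since nothing passes over the air, an eavesdropper has nothing to intercept; only a device physically experiencing the same motion can reproduce the key.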
What kind of phone do you use?
Do you think that’s the most secure?
Of the softphones? Yes. Of all phones? Not at all. It’s certainly less secure than a phone from the previous generation where the software was provided by the manufacturer and couldn’t be modified.
What kind of security measures have you taken?
Actually, the biggest change that I’ve made is to realize how vulnerable my data is if it doesn’t have a pass code on it. But a pass code only slows down dedicated hackers; it doesn’t prevent them. Studies have shown it’s not hard to disable a pass code if you have access to the phone. But it takes a certain period of time. The idea is that you slow the attacker down enough so that you’d have time to, let’s say, engage the remote wipe feature, if that’s what you decide to do. Apple has now added a remote erase feature, which was a big selling point for the BlackBerry for a long time. The basic idea was that your BlackBerry contained so much company information that the company needs the ability to erase it if you lose it.
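The "slow the attacker down" property of a passcode comes from key stretching: each guess is made deliberately expensive, so a small search space still takes meaningful time — time in which the owner can trigger a remote wipe. Phone vendors implement this in hardware and firmware; the sketch below only shows the principle using Python's standard `hashlib.pbkdf2_hmac`, and the passcode, salt, and iteration count are arbitrary assumptions.

```python
import hashlib
import time

def stretched_key(passcode: str, salt: bytes, iterations: int) -> bytes:
    """Derive a key from a passcode; each guess costs `iterations` hashes."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = b"per-device-salt"          # hypothetical device-unique salt
iterations = 200_000               # tuned so one guess takes a noticeable time

start = time.perf_counter()
key = stretched_key("4271", salt, iterations)  # hypothetical 4-digit passcode
per_guess = time.perf_counter() - start

# A 4-digit passcode has only 10,000 possibilities, so brute-forcing it
# costs roughly per_guess * 10_000 — the window for a remote wipe.
print(f"one guess: {per_guess:.3f}s, full search ~ {per_guess * 10_000 / 60:.1f} min")
```

The delay doesn't prevent the attack, exactly as Crovella notes; it just buys the owner time to act.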
Caleb Daniloff can be reached at firstname.lastname@example.org