BU Today

Opinion

POV: Apple vs FBI: Who Should Prevail?

In this particular case, law enforcement ought to be given assistance


With recent advances in technology, the power to obfuscate information, which only a short time ago was available just to nation states, is now found in the palm of any person’s hand. This power protects us and our information from both the criminals wishing to get at that information for their own gains and the unnecessary prying eyes of employers, merchants, carriers, and law enforcement agencies.

Everything works out until something bad happens. Then everyone asks: where were the police, why didn’t the FBI know, how come the CIA did not stop this? This is followed by law enforcement’s quest for information in trying to make sure the bad thing doesn’t happen again.

But reality is not so simple. With the new technology, getting to that information is not just a matter of investigation—the traditional gumshoe efforts—but now computer specialists, technology analysts, and a cadre of experts sleuth the electronic realms where people, both good and bad, have their information. Here they now run into encryption and access controls that stymie the best investigative efforts. So where do they turn? To those companies that design and maintain that infrastructure, with hopes that the same technology that locks law enforcement out can be used to give law enforcement what they seek. But, as just noted, this isn’t so simple.

It is not the technology part that is complex, but rather the question of whether those companies should aid law enforcement and national security. Arguments for providing aid stress that without it, investigative efforts will be for naught and more bad things will happen. Arguments against stress that without the encryption and access controls, we will live in a surveillance state and more bad things will happen.

Where is the truth?

Following the San Bernardino terrorist attack in December, law enforcement sought to make sure that the next bad thing did not happen, while trying to understand how this bad thing did happen. Through investigation, they uncovered the iPhone 5c of Syed Rizwan Farook, one of the terrorists in the attack. This is what they want to search, this is where information is kept, this is the unknown. Unfortunately, the device is locked, and as a default for the protection of the phone’s user, after a number of failed log-in attempts, the phone will wipe the information contained within it.

Because they believe the information on the device may be critical in furthering the investigation, and recognizing the risks in running up against the failed log-in attempt limit, the FBI reached out to Apple, the manufacturer of the iPhone, to see what might be done to disable the “self-destruct” mechanism. Apple felt it should not help, and the FBI and US Department of Justice sought the federal courts’ aid to procure Apple’s assistance.

The public is divided on this issue. Some feel that Apple should stand up for privacy rights and just say no. Others feel that law enforcement is in the dark with the new technology, and without the ability to have manufacturers provide access, crucial information won’t be available.

On one hand, Apple has already given assistance in the case, working with law enforcement to provide access to the iCloud backup of the device, and after it was discovered that the iPhone had not been backed up since two months prior to the attack, developing procedures to try to get the iPhone to automatically back up the current contents. Unfortunately, these attempts have been fruitless, partly because of mishandling of the device, and the information remains locked.

On the other hand, the fear with going further is that the current request to disable the automatic wiping, permitting law enforcement unlimited tries to break the passcode, goes beyond what normal law enforcement assistance would entail. It would open the door to permit any iPhone to be opened by law enforcement or governments. The government argues that its request is for assistance on this particular iPhone and is not designed to create unencumbered access to any device.

I think in this particular instance, law enforcement eventually will get the assistance that it requests.

Notwithstanding some privacy advocates’ fears, the case is providing a useful and strong precedent to prevent falling down the slippery slope.

First, the request is narrowly tailored to a particularly identified space to be searched, with a reasonable level of specificity as to what is being sought. While the Fourth Amendment prohibits unreasonable searches and seizures, it does not abolish them. Rather, when law enforcement wishes to conduct a search, it must demonstrate probable cause for why the search should be permitted, define where the search is to be conducted, and identify what is to be found. An impartial magistrate then decides whether or not law enforcement has met its burden. Here, law enforcement wants only access to a specific device and is seeking information relevant to terrorist activities, meeting all of these elements, and a court has agreed and ordered the assistance.

Second, the request is not asking Apple to disable the encryption or provide a back door to the algorithm. Rather, the request looks only to permit law enforcement to keep guessing at the passcode until it determines the correct one. This scenario differs from the partial analogies other commentators have offered, such as law enforcement breaking into a safe without needing the safe manufacturer’s help: here, not only is the device itself important, but so is the infrastructure that supports it, which is under Apple’s control. That infrastructure is a crucial part of the overall effort and cannot be separated, so Apple’s help is instrumental. To that end, one could argue that having Apple control the means of permitting the unlimited tries would be better for privacy than leaving law enforcement to commandeer the infrastructure and put its own code in place.

These details mean that in this single case, the assistance should be given to law enforcement. Of course, in our civil society, law enforcement should keep the focus on this need for this investigation and not attempt to broaden this to any other opportunity without similar due process.

There is no doubt that what can be done for this case can be done in any other situation. It is crucial that law enforcement continue to adhere to its current position that this request is a single request based upon real and suitable facts. It should unequivocally state that this is not a wedge in the door for full, unfettered access to any device, Apple or otherwise. It should stay within the defined realm of the Fourth Amendment, not try to demand the code or control over the infrastructure, and not ask for back doors. Otherwise, how can we build trust, and how can Apple, Google, or anyone else not question the sincerity of its intentions in the future?

Kenneth P. Mortensen, a School of Law lecturer in law, teaches Privacy Law. He was the Justice Department’s associate deputy attorney general for privacy and civil liberties under President George W. Bush. He can be reached at kmortens@bu.edu.

“POV” is an opinion page that provides timely commentaries from students, faculty, and staff on a variety of issues: on-campus, local, state, national, or international. Anyone interested in submitting a piece, which should be about 700 words long, should contact Rich Barlow at barlowr@bu.edu. BU Today reserves the right to reject or edit submissions. The views expressed are solely those of the author and are not intended to represent the views of Boston University.

13 Comments


  • Sillie Abbe on 02.29.2016 at 1:11 am

    iPhones and many other smart devices already have valid backdoors, namely a fingerprint scanner or a camera and software for capturing faces, irises, and other body features, which can be collected from unyielding, sleeping, unconscious, and dead people.

    If Apple wants to claim that it is conscious of privacy and security, it should tell consumers to turn off the biometric functions. If the authorities want to keep those backdoors open, they should tell consumers to keep them turned on at all times. And security-conscious consumers should certainly refrain from turning them on.

  • S on 02.29.2016 at 6:06 am

    I think it’s misguided to believe that this is an isolated incident. Should Apple give the FBI access to the iPhone, it will set a precedent and ultimately give the FBI access to any iPhone they recover. Recently it was discovered that the FBI subpoenaed researchers at Carnegie Mellon to get access to a method of cracking TOR. http://www.wired.com/2016/02/fbis-tor-hack-shows-risk-subpoenas-security-researchers/

    What would stop the FBI from subpoenaing Apple, getting the source code to the cracked OS and applying a gag order to Apple so no one ever knew?

    We have rights to freedom of speech, and to ensure that freedom of speech is upheld, it’s imperative that privacy be respected. If we don’t allow all citizens access to privacy, we’re on the short slope to a Chinese-style surveillance state (if we are not already there).

    On the technical side of things, “allowing the FBI to guess passcodes” is really just a ploy to make it sound like they’re sitting there typing in passwords. What they’re asking Apple for is quite different. They would use a computer that can guess billions of passcodes each second, and since the passcode space is only 4 digits, there are only 10^4 possible passcodes. That’s a fraction of a second at best. What they’re asking for is a backdoor into the OS. I really think that once this OS (signed by Apple) gets out there, there’s no stopping the FBI.
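    To make that concrete, here is a minimal sketch of how small a 4-digit search space is once retry limits and auto-wipe are out of the way. The `check` function is a hypothetical stand-in for the device’s passcode verifier, not Apple’s actual implementation:

```python
from itertools import product

def brute_force_4digit(check):
    """Try every 4-digit passcode from 0000 to 9999 until check() accepts one."""
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check(guess):
            return guess
    return None  # no passcode accepted

# Hypothetical stand-in for the device's verifier: accepts one secret code.
secret = "7391"
print(brute_force_4digit(lambda g: g == secret))  # prints 7391
```

    With only 10^4 candidates, even a single laptop exhausts the space almost instantly; the retry delays and wipe-after-ten-failures policy are the only real defenses.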

    The real technical flaw that Apple should have prevented is allowing an OS to be sideloaded without user approval. If they had designed the OS like that, this would be a moot issue. No one would ever be able to access the contents of that iPhone, and we’d all be more secure for it.

  • James Sandino on 02.29.2016 at 6:50 am

    Show us the San Bernardino surveillance video.

    Apple wants to verify the FBI’s story.

  • Snowden was right on 02.29.2016 at 7:25 am

    Governments have amply demonstrated that they cannot be depended upon to be protectors or reliable dispensers of justice; nor can governments be trusted to safeguard our information, as perpetual lapses in security have proven. Edward Snowden has revealed the lengths to which our supposed protectors will go to pursue agency-specific agendas, regardless of laws. We have more reason to trust Apple than the U.S. government. Apple cares more about our security, privacy, and safety than those who run the government. The director of the FBI, in an Orwellian bent, is trying to convince everyone that subverting personal protections is good for us all.

    Were Apple to enable breaking into personal devices, it would establish a precedent whereby every law enforcement agency in the country would thereafter demand repetition of the feat, and it would provide the means for them to do so independently. Engineering a “back door” into personal devices would be quickly exploited by the many hackers who have demonstrated technological proficiencies far beyond most government agencies, and the sale of stolen personal information has become an economy unto itself. Consider what it would mean for the safety of your family members if their photographs and personal information fell into the hands of violent criminals. These are the risks.

    Consider also that this is not the first time this battle has been fought: Blackberry went through much the same thing as governments demanded encryption keys so that they could snoop on private communications. This case is just the latest assault on our freedoms. And this is a case about more than domestic terrorism: it is about the far-reaching goals of terrorism. Terrorists know well that the greatest victory is achieved when you cause your foe to inflict lasting harm on themselves. It doesn’t matter if a shoe bomber or clothing bomber is inept, because the attempt will nevertheless cause perpetual disruption to airport security, forcing everyone to take off their shoes or submit to body searches. The terrorists are now gleefully awaiting that back door to be opened.

  • Isolated cases can still establish precedents on 02.29.2016 at 8:25 am

    There is no single, independent, isolated case in common law. If the government gets Apple to unlock the iPhone, the country will have to live with this decision. Privacy is a blurred line and by small steps like this, we are losing our freedoms.

    On one hand we have the government, which with good intentions is pursuing this investigation. There is no doubt that information contained in the iPhone in question would help the law enforcement. On the other hand, however, Apple stands against this breach of privacy. The company’s neck is on the line; its customers follow the situation very carefully. If it complied with the government, customers might even lose the confidence they place in their “favorite brand.” Apple has the necessary incentives to protect the information, whereas the government would like to subpoena all suspected devices. In my opinion, it’s apparent that by relying on Apple’s standpoint, we can protect our freedom.

    Even though the advocates of the government’s stance have presented reasonable arguments, it is still unclear how unlocking this particular phone would help to prevent future disasters. If this case is an isolated incident, as the article claims, the decision will not help much with protection; rather, it will establish a precedent. There is as yet no convincing argument for why breaching freedoms after the fact of the incident is so crucial to preventing future incidents. By such an action, the damage to privacy might be much greater than intended. The courts should not open the door to similar subpoenas requesting access to “suspected devices.”

    • Snowden was right on 02.29.2016 at 9:06 am

      Indeed. The government is obsessed with ever more surveillance, regardless of its value. The feds were well aware that the terrorists who attempted the 1993 takedown of a WTC tower regarded it as unfinished business and would somehow return to finish the job. Despite all the intelligence capabilities and opportunities that the feds had in the intervening eight years, they utterly failed to detect foreign agents taking large-aircraft flying lessons and then hijacking four commercial airliners. A further example of government haplessness: the fighter jets that were scrambled were sent out over the Atlantic rather than toward the hijacked aircraft, because the stale threat-response dictum still assumed a Russian attack.

    • Steven on 02.29.2016 at 10:19 am

      “There is no doubt that information contained in the iPhone in question would help the law enforcement.”

      I think there is serious doubt about the information on Farook’s work phone. He destroyed his personal phone but not his work phone. I seriously doubt (as do most people) he was conducting top secret terrorist business on his work phone.

  • Anthony Levatino on 02.29.2016 at 8:25 am

    I have yet to hear anyone suggest the one solution that allows law enforcement access to the information on this one phone and protects the privacy of all other users. Law enforcement should give the phone to Apple, which would then break the encryption at the company and hand the data over to law enforcement, without giving law enforcement a tool that could then be used to break into any phone.

  • Edward Snowden on 02.29.2016 at 10:41 am

    Take a look at the FAQ regarding the letter Apple posted on this topic http://www.apple.com/customer-letter/answers/.

    Look under the 6th heading:

    “One of the strongest suggestions we offered was that they pair the phone to a previously joined network, which would allow them to back up the phone and get the data they are now asking for. Unfortunately, we learned that while the attacker’s iPhone was in FBI custody the Apple ID password associated with the phone was changed. Changing this password meant the phone could no longer access iCloud services.”

    The FBI messed up big time, and now they’re whining about it. Dr. Mortensen, I believe how you portrayed this is inaccurate. The FBI is not asking simply for a way to enter passcodes repeatedly without the threat of locking the phone, they’re asking for Apple to create custom firmware to allow a computer to try thousands of combinations per second in attempts to unlock the phone. While the FBI may claim they’d only use this once, it could easily be reverse engineered if created, and with that, would allow the FBI and law enforcement to get into ANY Apple device they wish to, warrant or not.

  • A on 02.29.2016 at 12:10 pm

    If you forget your passcode, you can just plug your phone into your registered computer and it will unlock it. If they have access to it, can’t the FBI hack her computer (if she has one) and use that to unlock the phone? There are other options too. But what the FBI is asking is to have a backdoor to all iPhones. That is a huge difference! No person or entity should have that power, and I am happy Apple is standing up for the right to privacy.

    • NNO on 03.01.2016 at 2:05 pm

      Why should they? I mean, I partly agree with that, but still, what if someone has a nuke and might drop it on the US? Wouldn’t you want to stop that?

  • dbphillips on 03.01.2016 at 1:18 am

    The simple problem here that seems to escape far too many people is that the government is attempting to define how the data is recovered and delivered. That is what Apple has a responsibility to defend.

    If the FBI took a phone to Tim Cook with a decree that says “get us all the user data off this phone that was used in a crime,” I guarantee they’d have a USB drive in their hands within a week. Of course, Apple would have to get paid if it became a frequent occurrence and would have to be indemnified against the occasional data loss, but these are minor issues.
