
Why Apple and the FBI Won’t Compromise over iPhone Security

The legal feud between Apple and the FBI just went on the record with the House Judiciary Committee. The committee heard testimony from several witnesses on the issue, including FBI Director James Comey and Apple Senior Vice President and General Counsel Bruce Sewell.

The hearing convened to examine a debate that has caused tension between law enforcement and security advocates. A statement from House Judiciary Committee Chairman Bob Goodlatte (R-VA) and Ranking Member John Conyers (D-MI) summarized the issue:

“As technology companies have made great strides to enhance the security of Americans’ personal and private information, law enforcement agencies face new challenges when attempting to access encrypted information.”

The phrase “hard cases make bad law” was repeated by several members of the committee during the hearing. This legal maxim means that an extreme case makes a poor foundation for a general law meant to cover a wide range of less extreme scenarios.

The acts committed by Syed Rizwan Farook and Tashfeen Malik in San Bernardino were horrible, and the immediate reaction is to seek justice for the lives they took. But the broader implications of what the FBI is asking for, and what Apple is refusing to do, present an issue with no technological middle ground.

“I am extraordinarily sympathetic to Apple,” says Michael Schrage, a research fellow at the MIT Center for Digital Business. “I’m extraordinarily sympathetic to the FBI and Justice Department. I am even more sympathetic to the families of the people who were hurt and killed in that attack, that terrorist attack. But the reality is this is one of those circumstances where there is no good answer. And whatever answer is chosen is the wrong one.”

FBI Director Comey argued there has never been a closet that American law enforcement could not search when equipped with a warrant. He said encryption has created “evidence-free zones” where law enforcement simply cannot go.

“The logic of encryption will bring us, in the not too distant future, to a place where all our conversations, and all of our papers and effects are entirely private. That is where no one can listen to our conversations, read our texts, read our emails unless we say so,” he told the House Judiciary Committee.

But this isn’t entirely true: there’s a wealth of metadata and other information surrounding our phones that law enforcement is free to access (with a warrant). Law enforcement still has means to get information; what’s changing is the kind of evidence it can easily access.

“Farook and Malik do not appear to have been communicating with other terrorists,” Susan Landau, a professor at Worcester Polytechnic Institute, pointed out in her written testimony. “If they had been, the information about whom they are communicating with was available not only on their phones (personal or work), but also at the phone company and/or the ISP.”

The FBI wants Apple to take away the “guard dogs,” as Comey put it, so law enforcement can brute-force its way into the San Bernardino shooter’s iPhone. However, encryption security doesn’t lend itself to being broken just a little; once it is weakened, the whole system becomes susceptible to intrusion.

By creating this piece of software, which would allow law enforcement to bypass the security of this phone and then request the same for other phones, Apple would be building a general means of bypassing the iPhone’s lock screen. That software would be coveted by nefarious individuals all over the world, and the big question isn’t “what if it gets into the wrong hands,” it’s “when.”

“How long will that room really stay clean?” Andy Sellars, a lawyer specializing in technology issues at the Cyberlaw Clinic at Harvard Law School, told Technology Review. “The privacy benefit right now comes from the fact that nobody knows how to do this. Not Apple, not the FBI, and we think not the NSA, though maybe they do. As soon as Apple does this, there’s no way this wouldn’t get out, be stolen, be leaked. There is no way that would stay a secret.”

There’s a reason such heavy safeguards exist. Our phones contain a trove of information; never before has so much about one person been contained in one place. Landau pointed out that regular people rely on these secure systems. Default protections are what help preserve the privacy of the many. Criminals will always find ways to subvert the law, for example by using foreign apps or phones that fall outside US jurisdiction. Once they know the iPhone is no longer safe, they will move on and find other ways to hide their activities.

The truth is there may not be a technological middle ground that serves the law and protects the security of the consumer. “I can’t think of any [compromise],” Bruce Schneier, a cryptographer and security expert, told Technology Review. “Either Apple weakens security or they do not. There’s no weakening security halfway.”

***

Photo Credit: Drew Angerer/Getty Images

Natalie has been writing professionally for about 6 years. After graduating from Ithaca College with a degree in Feature Writing, she snagged a job at PCMag.com where she had the opportunity to review all the latest consumer gadgets. Since then she has become a writer for hire, freelancing for various websites. In her spare time, you may find her riding her motorcycle, reading YA novels, hiking, or playing video games. Follow her on Twitter: @nat_schumaker

