Apple CEO Tim Cook made headlines last week when he published an open letter opposing a recent court order obtained by the FBI.
The order required Apple to build custom firmware that would strip out the iPhone’s passcode-retry protections, allowing the FBI to brute-force its way into the encrypted iPhone used by one of the San Bernardino shooters.
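To see why those protections matter, here is a back-of-the-envelope sketch of what brute-forcing looks like once the retry limits, escalating delays, and auto-erase are gone. It assumes roughly 80 milliseconds per passcode attempt, the delay generally attributed to the iPhone’s hardware key-derivation step; treat the numbers as an illustration, not a forensic benchmark.

```python
# Back-of-the-envelope estimate of a worst-case brute-force once the
# firmware no longer enforces retry limits, escalating delays, or the
# erase-after-10-failures option. Assumption: roughly 80 ms per attempt,
# the time usually attributed to the hardware key-derivation step.

SECONDS_PER_ATTEMPT = 0.08  # assumed per-guess cost

def worst_case_seconds(keyspace: int) -> float:
    """Seconds needed to try every passcode in the keyspace."""
    return keyspace * SECONDS_PER_ATTEMPT

for digits in (4, 6):
    keyspace = 10 ** digits
    seconds = worst_case_seconds(keyspace)
    print(f"{digits}-digit PIN ({keyspace:,} combinations): "
          f"{seconds / 60:.0f} minutes, or {seconds / 3600:.1f} hours")

# 4-digit PIN (10,000 combinations): 13 minutes, or 0.2 hours
# 6-digit PIN (1,000,000 combinations): 1333 minutes, or 22.2 hours
```

In other words, the retry limits are the only thing standing between a short numeric passcode and a successful guess; remove them and a four-digit PIN falls in minutes.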
This matters for a litany of reasons, but none more than the fact that it asks Apple to deliberately undermine the security and privacy of its users. If Apple is forced to comply, it will set a precedent that will inevitably be used in future cases concerning the privacy of citizens around the world.
Why This Case Is So Important
We need to understand that this is more than just a fight over an iPhone; it’s a fight for the future of privacy. What’s really at stake here is whether the U.S. government can legally force a company to create software that deliberately undermines the privacy of its users.
Make no mistake: the FBI hopes to use this court order as a basis for other privacy-invading cases in the future. What it fails to acknowledge is that undermining encryption inevitably weakens national security too, because a backdoor that lets investigators in will just as readily let foreign governments and criminals in. That undercuts the whole argument that the order serves the public’s interest.
Here are just a few examples of what could happen if Apple is forced to comply:
- Apple builds a backdoor, and every lost or stolen iPhone immediately becomes far easier to hack. The potential dangers are enormous. Who’s to say only the government could use this so-called backdoor? Attackers could exploit it to intercept our calls, defraud us through our banking apps, steal our identities, or blackmail us.
- Other tech companies are forced to follow in Apple’s footsteps and create backdoors of their own. Once the iPhone has been legally sabotaged, the FBI will likely turn its eyes toward Android, then Facebook Messenger and WhatsApp. Soon every encrypted system you use is compromised, and the Orwellian future people have been warning us about arrives in full force.
- Our digital rights become more ambiguous, and the chance to stand up for our privacy slips away. Whistleblowers and privacy advocates receive less protection and face harsher persecution. Who’s to say people even have a right to privacy anymore?
Again, this is only a taste of what could happen. The real ramifications are too numerous to list.
Hasn’t Apple Complied with the FBI’s Demands in the Past? Why Stop Now?
Apple and other tech companies have come a long way since the PRISM days. In the past, Apple routinely complied with court orders to help law enforcement agencies pull data from locked devices. But the technology was different then.
On iPhones running iOS 7 or earlier, much of the data on the device wasn’t protected by encryption tied to the user’s passcode, so Apple’s own tools could extract it from a locked phone. In practical legal terms, that meant a law enforcement agency could simply serve Apple with a court order and receive the data from a seized device.
But the Snowden leaks made the sheer scope of citizen surveillance public, and in the fallout Apple and other companies made an effort to cater to a more security- and privacy-conscious audience.
From iOS 8 onward, the encryption key is derived on the phone from the user’s passcode and a secret unique to the device’s hardware, and it never leaves the device; only someone who knows the passcode can unlock it. So now when the FBI asks Apple to hand over sensitive user data, Apple can’t comply, because it simply doesn’t have the key.
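To make that concrete, here is a rough conceptual sketch, emphatically not Apple’s actual implementation, of what passcode-entangled encryption looks like: the data-protection key is computed on the phone from the user’s passcode combined with a secret fused into the device’s hardware, so the finished key never exists anywhere Apple can reach.

```python
# Conceptual sketch only; Apple's real scheme runs inside dedicated
# hardware with its own algorithms and parameters. The principle:
# the data-protection key is derived from the user's passcode plus a
# per-device secret, so it can only ever be computed on that phone.
import hashlib
import os

DEVICE_UID = os.urandom(32)  # stand-in for the secret fused into the chip

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Derive a 256-bit data-protection key from passcode + device secret."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UID,     # salt entangled with this specific device
        iterations,     # deliberately slow, to throttle guessing
        dklen=32,
    )

key = derive_key("1234")
print(key.hex())

# Without both the passcode and this particular device's secret, the key
# cannot be recomputed. There is no copy for Apple to hand over.
```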
Cases like this have been going on for years. What makes this one unique is that the FBI is justifying its demand by citing the All Writs Act, a vague and outdated law created 227 years ago. No, that wasn’t a typo.
Ironic how the FBI is trying to reshape the digital age with 200-year-old logic.
Should Companies Be Required to Build Backdoor Technology?
The FBI thinks so. So does Judge Sheri Pym. Luckily, technology and privacy advocates have been vocal with their counterarguments.
The #FBI’s insecurity mandate on #Apple will hurt ordinary citizens, not criminal masterminds. Here’s why: https://t.co/0IPGlsAeFv
— Edward Snowden (@Snowden) February 18, 2016
What’s to stop the FBI from ordering Google to create backdoors for Android? Or from ordering secure chat apps like WhatsApp and Facebook Messenger to expose their users’ information?
And if you think this is just a U.S. issue, think again. Other countries have long looked to the U.S. as a model for their own privacy laws. If Apple is forced to comply with this court order, other governments will surely take note.
Everyone Deserves Encryption
When it comes down to it, what’s really at stake here is how our legal system views privacy and how well it understands the way software and technology actually work. If Apple can be compelled to alter its code, then any other company can be, too.
Encryption matters. That’s why we will continue to support Apple in their fight to protect your privacy. We hope you will too.
Featured image: Viktor Hanacek / PicJumbo
New iPhone: Viktor Hanacek / PicJumbo