One developer's attempt to sell his employer's iPhone hacking tools shows how easily any hacking tool can be compromised.
The FBI is in a deadlock with Apple. The FBI wants to be able to access anyone's phone: a backdoor into iOS that it claims would be kept securely within the FBI. Apple believes this is dangerous. On one hand, we have the FBI, who sees the ability to break into these devices as a way to find evidence and put the bad guys behind bars, and as a way to stop those who have done harm, or are planning to harm others, from carrying out their plans. In that sense, they're right: being able to hack any phone would let them carry out investigations far more quickly. It might even save lives.
But there's a drawback. It's a solution that would fix one problem by creating thousands more, and in that sense the cost in human life would likely be greater than the lives it saves. It could, in effect, dismantle the very security we rely on for our banks, our governments, our militaries, and our spies. A backdoor would endanger those living under oppressive regimes and those who love people they are forbidden to love. It would increase phone theft, increase crime, and create a bigger headache for the FBI than the problem it solves. That's why the NSA actually backs Apple and the rest of the tech industry.
How does a single backdoor lead to a global collapse of security and privacy? Because leakers will always be able to profit from sharing these backdoors. If there's a golden rule of cybersecurity, it's this: if you make a backdoor, no matter how obscure it is, someone will find it and share it. That just happened with an iPhone hacking company in Israel.
What’s the Worst that Could Happen?
Steve Wozniak, affectionately known as "Woz" in the tech community, was one of the two Steves who founded Apple. More than once, he wrote a Mac virus, code that could spread itself across a network of Macs and execute a malicious payload. Each time, he deleted every bit of it, not even keeping a copy on his own computer. Why? He knew it would get out, even if it was only tucked away on his machine. If there's demand for code like that, someone will get to it.
This is why all of us in tech, from our leaders right down to code monkeys like me, are against the FBI. We know that if a backdoor is made, it will either leak or be independently found. That is inevitable. It's the tech equivalent of hiding a key under a rock in your garden and only telling the FBI it's there. If someone really wants to, they will find it, and they will get into your house. In Apple's case, that "house" is millions of iPhone, iPad, and even Mac users worldwide.
If the FBI wins the "right" to force Apple employees to give up their First Amendment rights and write code they do not want to write, they won't stop there. They'll have the legal precedent to tell Google to do the same for Android and Microsoft to do the same for Windows, and Linux distributors, with distributions like Ubuntu, Red Hat, and CentOS, will have to follow suit. Your only chance at privacy and security would be to write your own operating system. I learned a little about doing that in college, a long time ago. It's tedious and difficult.
Has this Ever Happened?
“Danielle,” you say, “You and the rest of your liberal techie elites are always talking about doom and gloom! Name one time this has happened!”
Well, I'll do you one better: here are two examples of a backdoor leaking just since this debate began. If we wanted to go further back, we could find hundreds more. In fact, many security breaches start with employees who find flaws and either leak or exploit them.
Microsoft’s Golden Key
In August of 2016, Microsoft's "Golden Key" was leaked. The Golden Key is an encryption key that unlocks the Windows bootloader; in layman's terms, it is a key that unlocks Windows. With it, a hacker could build a compromised version of Windows that quietly shares information with the hacker, bypassing the security measures put in place by Microsoft and by the user, and install it on a victim's machine. The victim would never know they had been compromised; their entire computer would be vulnerable, and every bit of information on it could be leaked. It's also bad business for Microsoft, since cracked versions of Windows could now be shared on torrent sites for people to install and use for free.
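To make the danger concrete, here's a toy sketch in Python. It is not Microsoft's actual Secure Boot code, and it uses a simple keyed hash purely for brevity where real bootloaders use public-key signatures, but the logic it illustrates is the same: the boot check only asks whether an image was signed with the vendor's secret, so whoever gets hold of that secret can make any image, including a compromised one, pass the exact same check.

```python
# Toy illustration of why a leaked "golden key" is catastrophic.
# NOT how Windows Secure Boot is actually implemented; a hypothetical sketch
# using a symmetric keyed hash (HMAC) in place of real public-key signatures.
import hashlib
import hmac
import secrets

VENDOR_KEY = secrets.token_bytes(32)  # stands in for the vendor's closely guarded signing secret


def sign_image(image: bytes, key: bytes) -> bytes:
    """Vendor side: produce a signature the firmware will trust."""
    return hmac.new(key, image, hashlib.sha256).digest()


def firmware_boot_check(image: bytes, signature: bytes, key: bytes) -> bool:
    """Firmware side: agree to boot the image only if the signature matches."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


official_windows = b"genuine bootloader code"
tampered_windows = b"bootloader code with spyware baked in"

# Normally, only the vendor can produce a signature the firmware accepts.
assert firmware_boot_check(
    official_windows, sign_image(official_windows, VENDOR_KEY), VENDOR_KEY
)

# Once the key leaks, an attacker signs a tampered image and it passes the
# very same check; the firmware has no way to tell the difference.
leaked_key = VENDOR_KEY
forged = sign_image(tampered_windows, leaked_key)
assert firmware_boot_check(tampered_windows, forged, VENDOR_KEY)
print("Tampered image boots as if it were genuine.")
```

Notice that the check itself never breaks; it keeps doing exactly what it was designed to do. The only thing protecting users is the secrecy of the key, and that secrecy is precisely what was lost.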
Microsoft's Golden Key was leaked by a developer, and in this case it was an accident; it's not as though someone tweeted it. The developer needed the key for a test environment and accidentally shipped a release that included it, and hackers were able to pull it from that release. That's how easily a single mistake can do the hackers' work for them and break the security of an entire operating system.
NSO Group
A lone NSO Group developer disabled the basic McAfee security software on his computer, then copied the company's hacking software onto an external drive. Next, he researched how to sell those hacks to other hackers on the dark web, asking for as much as $50 million in various cryptocurrencies. One of the people he approached reported him, and before a single sale could take place, the 38-year-old developer was arrested.
Had he been more resourceful, this would be a different story. If he had prepared the sale in advance and sold the software to multiple buyers at a lower price, he likely would have gotten away with stealing it. He could also have simply released it to stir up anarchy. Either way, hackers all over the world would have had access to the iPhone and other smartphones.
The FBI Would Make this Worse
Apple didn't assist NSO Group in the way the FBI wants Apple to assist the FBI. These are exploits built by hackers who found their own way into iOS and decided to make a "legitimate" business of selling those hacks to governments and police forces. There's nothing truly special here: anyone could eventually have figured out these hacks, and Apple wasn't even intentionally making holes in its security. If this can happen now, with Apple trying to keep iOS secure, imagine how easy it would be if Apple built a giant backdoor for the FBI. And if leaks are this common when only a small number of people have access to the hacks, imagine what would happen if every police force in the world had backdoors into your favorite operating systems.
Would it Happen to the FBI?
Beyond information security, if the FBI ever won the right to force Apple to make software, the government would have won the right to violate First Amendment rights. It would be the most extensive and dangerous attack against our basic human rights. So, to answer the question: yes, it's all but certain that the FBI could never keep its cracked version of iOS secure. But the complete breakdown of our device security isn't necessarily the worst thing that could come out of a ruling in favor of the FBI; we'd also sacrifice our freedom of speech. The FBI may just be looking to protect people, but its authoritarian views, closely aligned with the party in power, are far more dangerous in the long run than the things it's trying to protect us from.
Sources:
- Thomas Fox-Brewster, Forbes
- Ed Hardy, Cult of Mac
- Andrew O’Hara, AppleInsider
- Amar Shekhar, Fossbytes
- Chris Smith, BGR