The feds have once again demanded access to all encrypted data, whether through a front door or a back door. But what does this really mean? And how does it affect our data security?
My name is Ryan, and I am an agorist. Today, we are talking about encryption, security, and trust.
Every few years, we once again hear about the federal government discussing ways to get around encryption. Usually, thankfully, it doesn't go anywhere. The most spectacular failure came in the nineties, when the federal government tried to get tech companies to build special chips, the so-called Clipper chips, into their products, which would give the feds access to people's data. There was an outcry at the time, and it led the federal government to abandon the proposal. It also led people to open-source strong encryption. Folks realized that the best way to protect encryption technology was to release it into the wild so that it could be implemented anywhere and everywhere.
Last July, Attorney General William Barr once again revived this debate. He went after tech companies that use encryption in their products, claiming that by using unbreakable encryption (unbreakable for now, at least), they create "law-free zones." In his words, "...we must ensure that we retain society's ability to gain lawful access to data and communications."
This really blows my mind. "Society" doesn't have a right to my devices, to my data, to my life. Of course, the reason the feds say that they need access to encrypted data is to stop criminals. But, what is a criminal? Who defines what a criminal is? Obviously, it's the government! Do you see the problem here?
But, moving on, the tech press often talks of two ways that companies can give the feds access to our data. One is through back doors and the other is through front doors.
A back door generally means a way around encryption, like a secret API. Until recently, that was the kind of access the federal government was after. But back doors are inherently flawed, because a deliberate way around encryption is, by definition, a security vulnerability. It's only a matter of time until someone exploits it.
These days, I'll often hear that we should provide the government front door access instead. A front door does not break the encryption or provide a way around it; the encryption is just as strong as if the front door weren't there. What it generally means is that when a company encrypts your data, it does so using more than one encryption key. The government holds, or can be given, one of those keys, and if a case arises where it wants to get through someone's encryption, it can use that special key to decrypt the data and do what it wants with it. Proponents of front doors love this approach because it doesn't break the encryption or add any new exploitable vulnerabilities. Technically, the system is as secure as if the front door weren't even there.
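To make the front-door idea concrete, here is a toy sketch of key escrow: the data is encrypted once with a random data key, and that data key is then wrapped separately for the user and for an escrow holder. Everything here is an illustrative assumption on my part, and XOR against a hash-derived keystream stands in for a real cipher. This is not real cryptography, just a picture of the structure.

```python
# Toy "front door" key-escrow sketch. NOT real cryptography:
# XOR with a SHA-256-derived keystream stands in for a proper cipher.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with the keystream; the same call also decrypts."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# One data key encrypts the message; two wrapped copies of it exist.
data_key = secrets.token_bytes(32)
user_key = secrets.token_bytes(32)
escrow_key = secrets.token_bytes(32)   # held by a third party or the government

ciphertext = xor_encrypt(data_key, b"my private notes")
wrapped_for_user = xor_encrypt(user_key, data_key)
wrapped_for_escrow = xor_encrypt(escrow_key, data_key)

# The user decrypts normally...
recovered = xor_encrypt(xor_encrypt(user_key, wrapped_for_user), ciphertext)
assert recovered == b"my private notes"

# ...but so can whoever holds the escrow key, without "breaking" anything.
recovered2 = xor_encrypt(xor_encrypt(escrow_key, wrapped_for_escrow), ciphertext)
assert recovered2 == b"my private notes"
```

Notice that the math is untouched: the escrow holder never attacks the cipher. They simply hold a second key, which is exactly the problem I get into next.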
While that makes sense to statist tech commentators, they seem to miss the major problem with both front and back door approaches. Both introduce trust into what would otherwise be a trustless, algorithmic relationship.
Unlike humans, algorithms are programmed. They are predictable. You put in X and you get out Y. That is how encryption works. With encryption, you don't have to trust anyone. You have a key and an algorithm, and with those you can encrypt and decrypt your data. That is what makes cryptocurrencies so great. You don't have to trust anyone. You just hold your encryption keys and get work done.
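The "X in, Y out" point can be shown in a few lines. This is my own minimal sketch, again using a hash-derived XOR keystream as a stand-in for a real cipher: with only a key and an algorithm, encryption and decryption are fully deterministic and involve no third party at all.

```python
# Minimal illustration of a trustless, deterministic cipher.
# NOT real cryptography -- XOR against a SHA-256 keystream is a stand-in.
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data against a keystream derived from the key; symmetric."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = b"only I hold this key"
ciphertext = toy_cipher(key, b"hello")

# Deterministic: the same key and input always give the same output...
assert ciphertext == toy_cipher(key, b"hello")
# ...and the key alone is enough to get the plaintext back. No one else involved.
assert toy_cipher(key, ciphertext) == b"hello"
```

There is no one to trust here: the only inputs are your key and your data, and the only thing that can fail is your own key management.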
When anyone introduces an extra party into encryption, whether through front or back doors, they introduce trust into the system. If that is the case, your data security is no longer under your control. It doesn't matter how well you protect your keys. It doesn't matter how well you protect your data. There is always going to be someone out there who has access to your data and you have to trust that they will be as diligent with your data security as you are.
On a practical level alone, anyone with a brain in their head should know that the government is the very last group of people who you'd ever want to trust with your data security. It doesn't even matter if the government uses third parties to handle the back doors or front doors or whatever. You are still left hoping, trusting that these people will protect your data.
Security is only as good as its weakest link, and when you introduce ways around or through encryption, you introduce trust into an otherwise trustless system. Broken security is no security because it takes the power out of your hands and places it into the hands of others. I am not ok with that, and you shouldn't be either.
This is TechnoAgorist, episode 43.