Edward Snowden’s 2013 revelations of bulk surveillance turned up the heat under a cauldron that privacy and civil rights activists had spent years filling with examples of corporate surrender to government overreach.

The response was new interest and new players in the fight for ubiquitous encryption — not just to protect dissidents from abuses of human rights and law, but also to protect everyday citizens from the ongoing threats of online fraud and theft.

Encryption and privacy became mainstream points of competition rather than obscure issues few took interest in. Both Apple and Google enabled encryption by default, encryption that even they couldn’t bypass.

In his testimony before the Senate Judiciary Committee, FBI Director James Comey recognized that “companies have been responding to a market demand for products and services that protect the privacy and security of their customers, [which] has generated positive innovation that has been crucial to the digital economy.” Nevertheless, the Director cited ISIL terrorism and violence in the Middle East, calling on companies to build backdoors into their encryption so the government could access encrypted data with a court order or warrant.

New York Assembly Member Matthew Titone, who has received campaign contributions from law enforcement PACs, introduced a bill that would fine companies like Apple and Google for selling phones that can’t be decrypted under a government court order.

Of course, just as there are no guns that can only shoot bad guys, there is no such thing as a backdoor that can only be opened by the U.S. government. As tech journalist Walt Mossberg said:

“Once an encryption system is breached, a cascade of other actors, from malevolent hackers to foreign dictatorships like China and Russia will waltz through that backdoor, either by hacking or by enacting laws requiring that U.S. companies provide them the same access provided to American agencies.”

This isn’t the first time politicians, prompted by fear and state interests, have tried to legislate technology they don’t understand. Encryption isn’t a service or a product; it’s an intangible technology that can’t be regulated except by targeting the companies that use it.
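
To make that concrete, consider the one-time pad: provably unbreakable when the key is truly random, as long as the message, and never reused, and implementable in a few lines of Python. This is a toy sketch for illustration, not production cryptography:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte. With a truly random,
    # message-length, single-use key, this is a one-time pad.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"encryption is just math"
key = os.urandom(len(message))           # one random key byte per message byte
ciphertext = xor_cipher(message, key)
assert xor_cipher(ciphertext, key) == message  # XOR is its own inverse
```

A law can fine Apple or Google for shipping products built on this math; it cannot make the math stop working.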

Many predicted that onerous government regulation of companies that use encryption would backfire, and that ISIL would eventually create their own services and platforms for secure communication.

And they did.

Earlier this month, researchers found that ISIL had created Alrawi, their own Android app for secure communication and spreading propaganda. Perhaps knowing the app would be banned from the Google Play store, ISIL distributed its application package (APK) for manual installation.

TechCrunch’s Josh Constine jumped in:

“Alrawi can’t be downloaded from Google Play. Instead it must be installed from shady back alleys of the Internet.”

I take issue with Constine’s characterization of sideloading, the process of manually installing an application not available in Apple’s or Google’s mobile app stores, as ‘shady.’

Apple has faced intense criticism for its restrictions on which apps it allows into its iOS App Store, banning apps that provide useful tools for software developers, let users use their phone as a hotspot, or contain politically disagreeable content. Perhaps most notoriously, The Intercept journalist Josh Begley created an iOS app that collates news reports of every killing by a U.S. drone; Apple removed it at least seven times for being “objectionable.”

Like the spoken word, software code is speech. Software code is expression. A marketplace can impose reasonable restrictions, but there must still be freedom. Google and Apple are private corporations, though, free to enact any prohibitions they desire.

Sideloading gives users the freedom to create their own apps, or to obtain apps directly from developers, and to use the hardware they purchased the way they want. After all, a phone is just a handheld computer, and most computer programs aren’t officially distributed through your operating system’s app store.
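
Nor does sideloading have to be a blind act of trust. As a minimal sketch of one careful workflow (the checksum step and the file names here are illustrative assumptions, not a description of how any particular app is distributed), a user can confirm that a directly downloaded APK matches the digest its developer publishes before installing it:

```python
# Hypothetical integrity check before sideloading: compare the downloaded
# APK against the SHA-256 digest the developer publishes alongside it.
import hashlib
import sys

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Stream in chunks so large APKs don't need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    apk_path, published = sys.argv[1], sys.argv[2]  # e.g. app.apk <hex digest>
    if sha256_of(apk_path) == published.lower():
        print("Checksum matches: the APK wasn't altered in transit.")
    else:
        print("Checksum mismatch: do not install this APK.")
```

Android adds its own layer on top of this: the installer rejects any APK whose embedded signature fails to verify, whether it came from the Play store or not.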

Which is why it was so strange to see Constine, a technology journalist, go on to state:

“This raises the question of how far mobile platforms are willing to go to fight terrorism. Governments are pushing for backdoors through encryption, but perhaps there’s another way to keep people safe without violating privacy for everyone. Apple and Google could easily kick apps used to organize violence out of their official app stores. But would they be willing to build further barriers to usage directly into their mobile operating systems?”

The unqualified suggestion that enacting further restrictions on app distribution and digital rights could “keep people safe” from terrorism is repugnant.

Terrorism is not a card you can hold up to get out of tough conversations about freedom of expression and civil liberties, and the same goes for conversations about online platforms and the freedom to tinker with hardware you own.

So does ISIL get a green light here? Absolutely not. Government has the resources and power to target Alrawi’s servers directly, through legal process or, if appropriate, targeted attacks. We are at war, after all. But let us also recognize that the Paris attackers didn’t use encryption to plan their attacks, that lawmakers found no evidence the San Bernardino shooters used encryption, and that government often ignores threats even when intelligence is handed right to it.

At the end of the day, perfect security is a fantasy, but security in the moment is achievable through good and always-improving practices. There are open-source, peer-reviewed, security-audited messaging apps like Signal, which apply a host of sound practices to security and authentication before you even reach the questions of encryption and message delivery.
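
For a sense of what those building blocks look like, here is a minimal sketch in Python using the third-party `cryptography` library. To be clear, this is not Signal’s protocol, which layers prekeys, a double ratchet, and much more on top; it is just the kind of primitives such apps assemble, an X25519 key agreement feeding an authenticated cipher:

```python
# Requires the third-party "cryptography" package (pip install cryptography).
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair and shares only the public half.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Diffie-Hellman key agreement: both sides derive the same shared secret.
shared = alice.exchange(bob.public_key())
assert shared == bob.exchange(alice.public_key())

# Stretch the shared secret into a 32-byte symmetric key.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"demo handshake").derive(shared)

# Encrypt and authenticate a message; any tampering makes decryption fail.
aead = ChaCha20Poly1305(key)
nonce = os.urandom(12)  # must never repeat for the same key
ciphertext = aead.encrypt(nonce, b"the medium is the message", None)
assert aead.decrypt(nonce, ciphertext, None) == b"the medium is the message"
```

The hard part, and the reason peer review and audits matter, is everything around these calls: key verification, forward secrecy, metadata handling, and safe defaults.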

So, the silver lining here is that if ISIL is rolling their own encryption and communications systems, they’re doing it without that kind of scrutiny or support, and they’re probably not doing a very good job.

Perhaps Apple CEO Tim Cook said it best: “On your iPhone, there’s likely health information, there’s financial information. There are intimate conversations with your family, your co-workers. There’s probably business secrets and you should have the ability to protect. And the only way we know how to do that, is to encrypt it … I don’t believe that the trade-off here is privacy versus national security. I think that’s an overly simplistic view. We’re America. We should have both.”