You think you're preserving the privacy of your data by using cryptography?

Think again, or your data sovereignty will be at risk.

Cryptography can indeed keep your private data safe, but only if you guard the keys privately and safely.

Microsoft's practice of uploading your keys to its own servers is the opposite of that.

That malpractice just makes it easier for Microsoft to comply with orders from terrorist governments that don't respect human rights.

But it's actually much worse than that!


Microsoft Windows is a nonfree operating system.

This means it is a set of programs that controls your computer and (nearly) everything that runs on it (an operating system), and that it serves not you (nonfree), but rather its true master.

In the case of Microsoft Windows, the true master is Microsoft.

Microsoft built a universal backdoor into its nonfree operating system.

It also instructed its nonfree operating system to contact mothership regularly for new instructions.

Whenever Microsoft feels like it, it can force software changes onto your computer, and they may be customized specifically for you.

People are used to being asked whether to install updates now or later.

What this doesn't tell them is that they're asked only because Microsoft has told their computers to ask, and that other changes go through without any questions asked.


So far, we know that Microsoft will happily keep your keys handy, and will happily hand them over when ordered.

Now, say you were given the choice to not upload your keys to Microsoft's servers, and you chose not to do so. Are you safe?

Remember, the nonfree operating system installed on your computer will obey its true master, who's not you.

Would you expect Microsoft to offer any resistance if a court ordered it to silently push a key-extraction program onto your computer?

I wouldn't.

I expect your keys would then end up in Microsoft's servers anyway, and they'd then be handed over to the terrorist government that demanded them.

That sucks.

But it gets worse.


Microsoft already encourages, or even forces, you to upload your other data onto its network storage.

Let's say you have avoided that, and kept your data on your own computer.

Just like the terrorist government can tell Microsoft to extract the encryption keys from the computer you think of as yours, it can tell Microsoft to extract any piece of data stored on "your" computer.

I expect Microsoft will then quietly push instructions onto "your" computer to collect and upload the requested data, and hand it over to the terrorist government.

Note that it doesn't really matter whether you're suspected of doing something wrong.

As long as Microsoft wishes to do business under any terrorist government, it will comply with its orders.

And since, by assumption, it's a terrorist government, it follows that it doesn't respect your human rights, or anyone else's.

Whether or not your data is compromising, it may end up compromised by the true master of the nonfree operating system you use.

And yet it still gets much worse.


Microsoft is hardly the only company that makes nonfree operating systems.

Apple does.

Google does.

They run on the computing devices that most people carry in their pockets, and on many desktop and laptop computing devices too.

Their operating systems also contain universal backdoors for their true masters, and they also contact mothership regularly for updated instructions.

If you do cryptography on them, and rely on that for your safety, you should be very worried.

Not only because your keys are likely already compromised, but also because any data you enter onto that computer, or display on it, must necessarily have been handled by the computer in plaintext form.

This means the data may have been intercepted and leaked to the true master even if your keys somehow remain safe, and even if the data eventually got safely encrypted or erased.

So if you think you're safe because of the strong end-to-end cryptography that your favorite instant messaging app claims to implement, think again.

If you run it on a rotten foundation, or if you ever ran it on a rotten foundation, it has to be presumed compromised.


Unfortunately, a lot of the apps running on top of these rotten foundations are nonfree themselves.

This makes you vulnerable not only to the nonfree operating system's true masters, but also to the apps' true masters.

Though apps typically don't get to control as much of the computing device as operating systems do, whatever data they can access may also be presumed compromised.

Yet somehow it still manages to get worse.


Because you're not the only one to rely on the rotten foundation of nonfree operating systems and on nonfree apps for the safety of your data.

A lot of people out there have also been fooled into making that mistake.

Say you've decided to fix your computing practices and use software that serves you, and that respects your freedom and human rights, all the way down to the operating system.

If others you communicate with rely on rotten foundations, the data you share with them, or that they share with you, must also be presumed compromised.

Unless your communication peers also fix their computing practices.

I know it may seem hard to believe, but it still gets worse than that.


Our governments, even the non-terrorist ones, have also made such mistakes, and adopted computing practices that make our data and the computing they do on our behalf vulnerable to the whims of nonfree software suppliers.

This means that they, too, are vulnerable to universal backdoors and frequent callbacks to mothership built into operating systems and apps.

Microsoft has already shown itself willing to terminate public officers' access to data and computing services.

It's certainly not the only one.

Such compliance must be very hard to avoid when your software business operates under a hostile, terrorist, and corrupt government, and you insist on keeping control over your products even while your customers use them.

Governments and the people they serve shouldn't tolerate this sort of hostage situation.

Just like individual users, governments deserve to have control over the computing they do.

Their nations' sovereignty depends on that.

Surprisingly, it can get better.


There is a large body of reliable, freedom-respecting and thus possibly truly secure software developed collaboratively and internationally by individuals and businesses.

Individual users, businesses, and governments can install it on their computing devices to make those devices truly their own.

Maybe they will find the programs don't do exactly what they wish.

A Free Software package is unlike a nonfree program, which has to be taken as a finished, all-or-nothing deal.

Users, whether individuals, businesses or governments, are allowed to modify Free Software so that it does what they wish, individually or collectively.

They are even allowed to hire others to make such changes, and to decide whom to hire, whom to trust, whom to get support from.

It wouldn't be Free Software if they couldn't.

It's infinitely adaptable to serve your needs, even when that doesn't serve someone else's business model.

It won't call mothership unless you want it to, and if you do, you get to pick who plays mothership.

It won't take instructions from others, unless you want it to, and if you do, you decide when it should follow them.

It won't upload your data or your keys unless you want it to, and if you do, you decide when and where to upload them to.

It won't change on its own, or be pulled out from under you, to serve others' business needs or interests, unless you want that (and why would you?); and even if others make and release changes, you decide whether to adopt them for your own use.

It enables you to be in control.

Software is like that: if you don't control it, it means that others do, and they thus gain unjust power over you.

Freedom is what the Free in Free Software is about.

Your freedom.

So blong,