With Windows 10, Microsoft Blatantly Disregards User Choice and Privacy: A Deep Dive
Microsoft had an ambitious goal with the launch of Windows 10: a billion devices running the software by the end of 2018. In its quest to reach that goal, the company aggressively pushed Windows 10 on its users and went so far as to offer free upgrades for a whole year. However, the company’s strategy for user adoption has trampled on essential aspects of modern computing: user choice and privacy. We think that’s wrong.
You don’t need to search long to come across stories of people who are horrified and amazed at just how far Microsoft has gone in order to increase Windows 10’s install base. Sure, there is some misinformation and hyperbole, but there are also some real concerns that current and future users of Windows 10 should be aware of. As the company is currently rolling out its “Anniversary Update” to Windows 10, we think it’s an appropriate time to focus on and examine the company’s strategy behind deploying Windows 10.
Disregarding User Choice
The tactics Microsoft employed to get users of earlier versions of Windows to upgrade to Windows 10 went from annoying to downright malicious. Some highlights: Microsoft installed an app in users’ system trays advertising the free upgrade to Windows 10. The app couldn’t be easily hidden or removed, but some enterprising users figured out a way. Then, the company kept changing the app and bundling it into various security patches, creating a cat-and-mouse game to uninstall it.
Eventually, Microsoft started pushing Windows 10 via its Windows Update system. It started off by pre-selecting the download for users and downloading it onto their machines. Not satisfied, the company eventually made Windows 10 a recommended update, so users receiving critical security updates were also downloading an entirely new operating system onto their machines without their knowledge. Microsoft even rolled the Windows 10 ad into an Internet Explorer security patch. Suffice it to say, this is not the standard when it comes to security updates, and it isn’t how most users expect them to work. When installing security updates, users expect to patch their existing operating system, not to see an advertisement or discover that they have downloaded an entirely new operating system in the process.
In May 2016, in an action designed in a way we think was highly deceptive, Microsoft actually changed the expected behavior of a dialog window, a user interface element that’s been around and acted the same way since the birth of the modern desktop. Specifically, when prompted with a Windows 10 update, if the user chose to decline it by hitting the ‘X’ in the upper right-hand corner, Microsoft interpreted that as consent to download Windows 10.
Time after time, with each update, Microsoft chose to employ questionable tactics to cause users to download a piece of software that many didn’t want. What users actually wanted didn’t seem to matter. In an extreme case, members of a wildlife conservation group in the African jungle felt that the automatic download of Windows 10 on a limited bandwidth connection could have endangered their lives if a forced upgrade had begun during a mission.
Disregarding User Privacy
The trouble with Windows 10 doesn’t end with forcing users to download the operating system. By default, Windows 10 sends an unprecedented amount of usage data back to Microsoft, and the company claims most of it is to “personalize” the software by feeding it to the OS assistant called Cortana. Here’s a non-exhaustive list of data sent back: location data, text input, voice input, touch input, webpages you visit, and telemetry data regarding your general usage of your computer, including which programs you run and for how long.
While we understand that many users find features like Cortana useful, and that such features would be difficult (though not necessarily impossible) to implement in a way that doesn’t send data back to the cloud, the fact remains that many users would much prefer to opt out of these features in exchange for maintaining their privacy.
And while users can opt out of some of these settings, there is no guarantee that your computer will stop talking to Microsoft’s servers. A significant issue is the telemetry data the company receives. While Microsoft insists that it aggregates and anonymizes this data, it hasn’t explained just how it does so. Microsoft also won’t say how long this data is retained, instead providing only general timeframes. Worse yet, unless you’re an enterprise user, you have to share at least some of this telemetry data with Microsoft, and there’s no way to opt out of it.
Microsoft has tried to explain this lack of choice by saying that Windows Update won’t function properly on copies of the operating system with telemetry reporting turned to its lowest level. In other words, Microsoft is claiming that giving ordinary users more privacy by letting them turn telemetry reporting down to its lowest level would risk their security since they would no longer get security updates.1 (Notably, this is not something many articles about Windows 10 have touched on.)
But this is a false choice that is entirely of Microsoft’s own creation. There’s no good reason why the types of data Microsoft collects at each telemetry level couldn’t be adjusted so that even at the lowest level of telemetry collection, users could still benefit from Windows Update and secure their machines from vulnerabilities, without having to send back things like app usage data or unique IDs like an IMEI number.
And if this wasn’t bad enough, Microsoft’s questionable upgrade tactics of bundling Windows 10 into various levels of security updates have also managed to lower users’ trust in the necessity of security updates. Sadly, this has led some people to forego security updates entirely, meaning that there are users whose machines are at risk of being attacked.
There’s no doubt that Windows 10 has some great security improvements over previous versions of the operating system. But it’s a shame that Microsoft made users choose between having privacy and security.
The Way Forward
Microsoft should come clean with its user community. The company needs to acknowledge its missteps and offer real, meaningful opt-outs to the users who want them, preferably in a single unified screen. It also needs to be straightforward in separating security updates from operating system upgrades going forward, and not try to bypass user choice and privacy expectations.
Otherwise it will face backlash in the form of individual lawsuits, state attorney general investigations, and government investigations.
We at EFF have heard from many users who have asked us to take action, and we urge Microsoft to listen to these concerns and incorporate this feedback into the next release of its operating system. Otherwise, Microsoft may find that it has inadvertently discovered just how far it can push its users before they abandon a once-trusted company for a better, more privacy-protective solution.
- 1. Confusingly, Microsoft calls the lowest level of telemetry reporting (which is not available on Home or Professional editions of Windows 10) the “security” level—even though it prevents security patches from being delivered via Windows Update.
#Privacy #Security #Microsoft #Windows #Cybersecurity @Gadget Guru+ @LibertyPod+
last edited: Sun, 03 Jan 2016 10:27:02 -0600
Recently Bought a Windows Computer? Microsoft Probably Has Your Encryption Key
One of the excellent features of new Windows devices is that disk encryption is built in and turned on by default, protecting your data in case your device is lost or stolen. But what is less well known is that, if you are like most users and log in to Windows 10 using your Microsoft account, your computer has automatically uploaded a copy of your recovery key – which can be used to unlock your encrypted disk – to Microsoft’s servers, probably without your knowledge and without an option to opt out.
During the “crypto wars” of the nineties, the National Security Agency developed an encryption backdoor technology – endorsed and promoted by the Clinton administration – called the Clipper chip, which they hoped telecom companies would use to sell backdoored crypto phones. Essentially, every phone with a Clipper chip would come with an encryption key, but the government would also get a copy of that key – this is known as key escrow – with the promise to only use it in response to a valid warrant. But due to public outcry and the availability of encryption tools like PGP, which the government didn’t control, the Clipper chip program ceased to be relevant by 1996. (Today, most phone calls still aren’t encrypted. You can use the free, open source, backdoorless Signal app to make encrypted calls.)
The fact that new Windows devices require users to backup their recovery key on Microsoft’s servers is remarkably similar to a key escrow system, but with an important difference. Users can choose to delete recovery keys from their Microsoft accounts (you can skip to the bottom of this article to learn how) – something that people never had the option to do with the Clipper chip system. But they can only delete it after they’ve already uploaded it to the cloud.
“The gold standard in disk encryption is end-to-end encryption, where only you can unlock your disk. This is what most companies use, and it seems to work well,” says Matthew Green, professor of cryptography at Johns Hopkins University. “There are certainly cases where it’s helpful to have a backup of your key or password. In those cases you might opt in to have a company store that information. But handing your keys to a company like Microsoft fundamentally changes the security properties of a disk encryption system.”
As soon as your recovery key leaves your computer, you have no way of knowing its fate. A hacker could have already broken into your Microsoft account and made a copy of your recovery key before you had time to delete it. Or Microsoft itself could get hacked, or could have hired a rogue employee with access to user data. Or a law enforcement or spy agency could send Microsoft a request for all data in your account, which would legally compel it to hand over your recovery key – which it could do even if the first thing you do after setting up your computer is delete the key.
As Green puts it, “Your computer is now only as secure as that database of keys held by Microsoft, which means it may be vulnerable to hackers, foreign governments, and people who can extort Microsoft employees.”
Of course, keeping a backup of your recovery key in your Microsoft account is genuinely useful for probably the majority of Windows users, which is why Microsoft designed the encryption scheme, known as “device encryption,” this way. If something goes wrong and your encrypted Windows computer breaks, you’re going to need this recovery key to gain access to any of your files. Microsoft would rather give their customers crippled disk encryption than risk their data.
“When a device goes into recovery mode and the user doesn’t have access to the recovery key the data on the drive will become permanently inaccessible. Based on the possibility of this outcome and a broad survey of customer feedback we chose to automatically backup the user recovery key,” a Microsoft spokesperson told me. “The recovery key requires physical access to the user device and is not useful without it.”
After you finish setting up your Windows computer, you can log in to your Microsoft account and delete the recovery key. Is this secure enough? “If Microsoft doesn’t keep backups, maybe,” says Green. “But it’s hard to guarantee that. And for people who aren’t aware of the risk, opt-out seems risky.”
This policy is in stark contrast to that of Microsoft’s major competitor, Apple. New Macs also ship with built-in and default disk encryption: a technology known as FileVault. Like Microsoft, Apple lets you store a backup of your recovery key in your iCloud account. But in Apple’s case, it’s an option. When you set up a Mac for the first time, you can uncheck a box if you don’t want to send your key to Apple’s servers.
This policy is also in contrast to Microsoft’s premium disk encryption product called BitLocker, which isn’t the same thing as what Microsoft refers to as device encryption. When you turn on BitLocker you’re forced to make a backup of your recovery key, but you get three options: Save it in your Microsoft account, save it to a USB stick, or print it.
To fully understand the different disk encryption features that Windows offers, you need to know some Microsoft jargon. Windows comes in different editions: Home (the cheapest), Pro, and Enterprise (more expensive). Windows Home includes device encryption, which started to become available during Windows 8, and requires your computer to have a tamper-resistant chip that stores encryption keys, something all new PCs come with. Pro and Enterprise both include device encryption, and they also include BitLocker, which started to become available during Windows Vista, but only for the premium editions. Under the hood, device encryption and BitLocker are the same thing. The difference is there’s only one way to use device encryption, but BitLocker is configurable.
If you’re using a recent version of Windows, your computer has the encryption chip, and you have a Microsoft account, your disk will automatically get encrypted, and your recovery key will get sent to Microsoft. If you log in to Windows using your company’s or university’s Windows domain, then your recovery key will get sent to a server controlled by your company or university instead of Microsoft – but either way, you can’t prevent device encryption from uploading your recovery key. If you choose not to use a Microsoft or a domain account at all and instead create a “local only” account, then you don’t get disk encryption.
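If you want to check which of these states your own machine is in, a minimal sketch using the built-in manage-bde command-line tool, run from an administrator Command Prompt (the C: drive letter is an assumption):

```shell
:: Report encryption state for the system volume (assumed to be C:).
:: "Conversion Status" shows whether the disk is encrypted, and
:: "Key Protectors" lists which protectors (for example, a numerical
:: recovery password) are attached to it.
manage-bde -status C:
```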
BitLocker, on the other hand, gives you more control. When you turn on BitLocker you get the choice to store your recovery key locally, among other options. But if you buy a new Windows device, even if it supports BitLocker, you’ll be using device encryption when you first set it up, and you’ll automatically send your recovery key to Microsoft.
In short, there is no way to prevent a new Windows device from uploading your recovery key the first time you login to your Microsoft account, even if you have a Pro or Enterprise edition of Windows. And this is worse than just Microsoft choosing an insecure default option. Windows Home users don’t get the choice to not upload their recovery key at all. And while Windows Pro and Enterprise users do get the choice (because they can use BitLocker), they can’t exercise that choice until after they’ve already uploaded their recovery key to Microsoft’s servers.
How to delete your recovery key from your Microsoft account
Go to this website and log in to your Microsoft account – this will be the same username and password that you use to log in to your Windows device. Once you’re in, it will show you a list of recovery keys backed up to your account.
If any of your Windows devices are listed, this means that Microsoft, or anyone that manages to access data in your Microsoft account, is technically able to unlock your encrypted disk, without your consent, as long as they physically have your computer. You can go ahead and delete your recovery key on this page – but you may want to back it up locally first, for example by writing it down on a piece of paper that you keep somewhere safe.
If you don’t see any recovery keys, then you either don’t have an encrypted disk, or Microsoft doesn’t have a copy of your recovery key. This might be the case if you’re using BitLocker and didn’t upload your recovery key when you first turned it on.
When you delete your recovery key from your account on this website, Microsoft promises that it gets deleted immediately, and that copies stored on their backup drives get deleted shortly thereafter as well. “The recovery key password is deleted right away from the customer’s online profile. As the drives that are used for failover and backup are sync’d up with the latest data the keys are removed,” a Microsoft spokesperson assured me.
If you have sensitive data that’s stored on your laptop, in some cases it might be safer to completely stop using your old encryption key and generate a new one that you never send to Microsoft. This way you can be entirely sure that the copy that used to be on Microsoft’s server hasn’t already been compromised.
Generate a new encryption key without giving a copy to Microsoft
In order to generate a new disk encryption key, this time without giving a copy to Microsoft, you need to decrypt your whole hard disk and then re-encrypt it, but this time in such a way that you’ll actually get asked how you want to back up your recovery key.
This is only possible if you have Windows Pro or Enterprise. Unfortunately, the only thing you can do if you have the Home edition is upgrade to a more expensive edition or use non-Microsoft disk encryption software, such as BestCrypt, which you have to pay for. You may also be able to get open source encryption software like VeraCrypt working, but sadly the open source options for full disk encryption in Windows don’t currently work well with modern PC hardware (as touched on here).
Go to Start, type “bitlocker”, and click “Manage BitLocker” to open BitLocker Drive Encryption settings.
From here, click “Turn off BitLocker”. It will warn you that your disk will get decrypted and that it may take some time. Go ahead and continue. You can use your computer while it’s decrypting.
After your disk is finished decrypting, you need to turn BitLocker back on. Back in the BitLocker Drive Encryption settings, click “Turn on BitLocker”.
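If you prefer the command line, the same decrypt-and-re-encrypt cycle can be sketched with manage-bde from an administrator Command Prompt. This mirrors the GUI steps rather than replacing them, and it assumes your system volume is C::

```shell
:: Turn BitLocker off and start decrypting the volume
manage-bde -off C:

:: Re-run this until "Percentage Encrypted" reaches 0%
manage-bde -status C:

:: Turn BitLocker back on with a fresh key, adding a numerical
:: recovery password that is printed to the console -- write it
:: down rather than saving it to your Microsoft account
manage-bde -on C: -recoverypassword
```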
It will check to see if your computer supports BitLocker, and then it will ask you how you want to back up your recovery key. It sure would be nice if it asked you this when you first set up your computer.
If you choose to save it to a file, it will make you save it onto a disk that you’re not currently encrypting, such as a USB stick. Or you can choose to print it, and keep a hard copy. You must choose one of them to continue, but make sure you don’t choose “Save to your Microsoft account.”
On the next page it will ask you if you want to encrypt used disk space only (faster) or encrypt your entire disk including empty space (slower). If you want to be on the safe side, choose the latter. Then on the next page it will ask you if you wish to run the BitLocker system check, which you should probably do.
Finally, it will make you reboot your computer.
When you boot back up, your hard disk will encrypt in the background. At this point you can check your Microsoft account again to see if Windows uploaded your recovery key – it shouldn’t have.
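You can also double-check locally that the new key protectors are what you expect by listing them with manage-bde (again assuming the C: volume):

```shell
:: List all key protectors on the volume, including the numerical
:: recovery password -- this is the copy to write down and store
:: offline, somewhere safe
manage-bde -protectors -get C:
```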
Now just wait for your disk to finish encrypting. Congratulations: Your disk is encrypted and Microsoft no longer has the ability to unlock it.
The post Recently Bought a Windows Computer? Microsoft Probably Has Your Encryption Key appeared first on The Intercept.
#Microsoft #Windows #Key Escrow #Encryption #Clipper Chip #Security @Gadget Guru+