Microsoft’s storage of Windows encryption keys could expose users to hackers, gov’t – report
When it comes to computer security, encryption is key. Encrypted devices and data can only be unlocked with a key that, ideally, is accessible to the owner alone. But many Microsoft users aren't as protected as they think, and could be exposed to hackers and law enforcement.
Since Windows 8.1, “disk encryption” has been a built-in feature on smartphones and other Windows devices. Though it helps against common thieves, the compulsory feature sends a backup copy of the recovery key to Microsoft when a user logs in through a Microsoft account. Once that copy exists outside the owner's control, it becomes a point of vulnerability, inviting backdoor access by hackers or government agencies such as the Federal Bureau of Investigation.
There is no warning or opt-out option for Microsoft users, something eerily akin to the Clipper chip program pushed by the National Security Agency and the Clinton White House in the 1990s, according to The Intercept, which first reported on the Windows vulnerability.
The Clipper chip was an encryption technology the NSA developed for telecom companies. It provided for “key escrow,” giving the government and the corporations shared access to the keys protecting personal encrypted devices.
The NSA’s Clipper chip was defunct by 1996 thanks to advances in encryption technology. Today, options like PGP exist for encrypting messages, and free, open-source tools like Signal encrypt phone calls, putting them out of reach of surveillance.
After a Microsoft user logs in for the first time, which automatically sends a copy of their encryption key to the company, that copy can be deleted from the account. However, the tactic only helps if nefarious forces haven't already obtained the key following a login, which can be accomplished in less time than it takes to delete it.
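For readers who want to see what their own machine is doing before deleting Microsoft's copy, here is a minimal sketch, not drawn from The Intercept's report, that assumes a Windows PC with the built-in manage-bde command-line tool and an administrator prompt; the run helper is purely illustrative.

```python
# Minimal sketch: check whether the system drive is encrypted and list its
# key protectors. Assumes Windows with manage-bde available and an elevated
# (administrator) prompt.
import subprocess

def run(cmd):
    """Run a command and return its combined output as text."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout + result.stderr

if __name__ == "__main__":
    # Encryption status of the system volume: method, percentage encrypted, protection state.
    print(run(["manage-bde", "-status", "C:"]))
    # Key protectors for C:, including the numerical recovery password and its ID.
    print(run(["manage-bde", "-protectors", "-get", "C:"]))
```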
“The gold standard in disk encryption is end-to-end encryption, where only you can unlock your disk. This is what most companies use, and it seems to work well,” Johns Hopkins University cryptography professor Matthew Green told The Intercept. “There are certainly cases where it’s helpful to have a backup of your key or password. In those cases you might opt in to have a company store that information. But handing your keys to a company like Microsoft fundamentally changes the security properties of a disk encryption system.”
“Your computer is now only as secure as that database of keys held by Microsoft, which means it may be vulnerable to hackers, foreign governments, and people who can extort Microsoft employees,” Green added.
‘Going dark’: Smartphone encryption debate heats up with no sign of legislative solution https://t.co/bOaWc0fKgK pic.twitter.com/uOrcO2seJA
— RT America (@RT_America) November 28, 2015
The logic behind Microsoft’s decision to set up its encryption this way is simple: cover the most likely customer problem, which is losing access to the recovery key rather than being targeted by sophisticated attackers.
“When a device goes into recovery mode, and the user doesn’t have access to the recovery key, the data on the drive will become permanently inaccessible. Based on the possibility of this outcome and a broad survey of customer feedback we chose to automatically backup the user recovery key. The recovery key requires physical access to the user device and is not useful without it,” a Microsoft spokesperson told The Intercept.
The main competitor to Microsoft Windows is Apple’s OS X, and Apple leaves it up to the customer whether to store a copy of the key in its iCloud. There is no such choice for Windows users unless they run the more expensive Windows Pro or Windows Enterprise editions. Those editions include BitLocker, which gives users the choice of printing the recovery key or saving it to a USB stick rather than sending it to Microsoft.
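As a rough illustration of the save-it-yourself approach on those editions, the sketch below captures manage-bde's listing of key protectors for the system drive and writes it to a file, with E: standing in for a USB stick; the drive letter and file name are assumptions made for this example, not Microsoft's documented workflow, and it again requires an administrator prompt.

```python
# Minimal sketch of the "save to a USB stick" option: dump the key protector
# listing (which includes the numerical recovery password) to a removable drive.
# Assumes Windows with BitLocker's manage-bde tool and an elevated prompt.
import subprocess
from pathlib import Path

# Assumed location of a USB stick; change the drive letter to match your system.
USB_FILE = Path("E:/bitlocker-recovery-key.txt")

def recovery_protectors(volume: str = "C:") -> str:
    """Return manage-bde's listing of key protectors for the given volume."""
    result = subprocess.run(
        ["manage-bde", "-protectors", "-get", volume],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    USB_FILE.write_text(recovery_protectors("C:"))
    print(f"Recovery key details written to {USB_FILE}")
```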