Does the U.S. government have the right to examine any of our private information, no matter how it’s stored, with a properly executed warrant? That’s the crux of the latest battle between Apple and the FBI. As you can read elsewhere on Macworld, the United States District Court for the Central District of California has issued an order requiring Apple to build firmware to help decrypt an iPhone 5c used by one of the San Bernardino terrorists.
But I’d like to home in on a couple of things Tim Cook said in his public letter explaining Apple’s rejection of the court order, a position the company has also expressed in legal terms in a court filing:
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government. … The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.
Cook has scoped this tightly, talking only about the U.S. government. My colleague Rich Mogull wrote here on Macworld about the domestic civil-rights issues raised. But what goes unsaid can also be heard loudly: once this tool is created, any government in the world in which Apple does business or has employees could demand the same thing.
Apple doesn’t have special Jedi powers that let it wave its hands in front of officials in the countries where it operates and say, “This isn’t the key you want.” Once Apple succumbs to pressure in any country, not just America, to build in backdoors or to create special cracking firmware that uses its deep knowledge to bypass protections, everyone suffers, everywhere.
This isn’t an Apple issue as such. Google has fought a similar, though less fraught, battle over encryption. And it affects any company that makes any form of strong end-to-end or device-based encryption, including firms that create and run virtual private network (VPN) services, cloud-based backup systems, VoIP systems (like Skype), and the like.
I currently use both Backblaze and CrashPlan for backups and 1Password for password storage. I’ve specifically opted to use each of those services because I retain the password that locks the encryption key used. 1Password only recently started offering centralized business and family syncing, and uses a method that prevents it from ever requiring or storing passwords that would give it access to customers’ keys.
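The design described above is often called key wrapping: the client derives a key from the user’s password and uses it to encrypt (wrap) the actual data key, so the service only ever stores the wrapped blob and can’t decrypt anything itself. Here is a minimal sketch of that flow, assuming standard PBKDF2 key stretching; the function names are illustrative, and the XOR step stands in for a real key-wrap cipher (such as AES Key Wrap) to keep the example dependency-free. This is not 1Password’s or Backblaze’s actual implementation.

```python
import hashlib
import os

def derive_wrapping_key(password: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stretches the password into a 32-byte key.
    # The password itself never leaves the client.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

def wrap_key(data_key: bytes, password: str) -> tuple[bytes, bytes]:
    # XOR with the derived key is a placeholder for a proper key-wrap
    # cipher; it shows the data flow, not production cryptography.
    salt = os.urandom(16)
    kek = derive_wrapping_key(password, salt)
    wrapped = bytes(a ^ b for a, b in zip(data_key, kek))
    return salt, wrapped  # only these values ever reach the server

def unwrap_key(salt: bytes, wrapped: bytes, password: str) -> bytes:
    # Only someone who knows the password can recover the data key.
    kek = derive_wrapping_key(password, salt)
    return bytes(a ^ b for a, b in zip(wrapped, kek))
```

Because the server holds only the salt and the wrapped key, a subpoena for its stored data yields nothing usable without the customer’s password.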
The U.S. government, or any government, could try to compel each of these services, and many others, to build in backdoors or to engineer interception in Web apps or client software to capture passwords.
A door is a door
In presidential debates and campaign speeches (from both major American parties), we’ve heard a lot of incorrect information about encryption, just as we have from FBI Director James Comey. They posit that only good guys—duly authorized agents of the law—could use golden keys to gain access for limited, court-approved purposes. (Fortunately, a number of former national-security officials and current and former law-enforcement officials have a better take and are speaking out now.)
Source: PCWorld