- From: David Booth <david@dbooth.org>
- Date: Wed, 24 Feb 2016 11:19:25 -0500
- To: public-credentials@w3.org
On 02/24/2016 10:21 AM, Timothy Holborn wrote:
> Without considering the technical concept explicitly described as
> 'backdoor', is the following a true statement?
>
> "It would be great if we could make a backdoor that only the FBI could
> walk through," says Nate Cardozo, an attorney with the Electronic
> Frontier Foundation. "But that doesn't exist. And literally every single
> mathematician, cryptographer, and computer scientist who's looked at it
> has agreed."
>
> Source: http://www.wired.com/2016/02/apple-fbi-privacy-security/

Since I am not a security expert I won't comment on that question. But as a side note, it seems to me that Apple could make a simple change to iOS to make it *impossible* for them to do what the FBI is asking them to do, even if the court orders them to comply.

If I have understood correctly, the FBI wants Apple to push to the phone a new version of iOS that would disable the delete-all-data-after-10-failed-unlock-attempts feature, thereby enabling the FBI to use a brute force attack to unlock the phone. But if Apple updated iOS to require a phone to be *already* unlocked in order to install iOS updates, then it would be impossible for Apple to do that.

In fact, if Apple is currently able to disable the delete-all-data-after-10-failed-unlock-attempts feature by pushing an iOS update to a locked phone, then it seems to me that that is already a significant security hole, which really should be patched.

Do others agree, or have I misunderstood something?

David Booth
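For what it's worth, the two policies being discussed can be sketched in a few lines. This is a purely hypothetical illustration, not Apple's actual code: a `Phone` class (my own invention) that wipes after 10 failed unlock attempts, and that, per the proposed change, refuses any OS update unless it is already unlocked.

```python
class Phone:
    """Hypothetical sketch of the policies discussed in the post."""

    MAX_FAILED_ATTEMPTS = 10  # threshold mentioned in the post

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self.unlocked = False
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        """One unlock attempt; too many failures trigger the data wipe."""
        if self.wiped:
            return False
        if guess == self._passcode:
            self.unlocked = True
            self._failed_attempts = 0
            return True
        self._failed_attempts += 1
        if self._failed_attempts >= self.MAX_FAILED_ATTEMPTS:
            self.wiped = True  # delete-all-data policy kicks in
        return False

    def install_update(self, update_name: str) -> bool:
        # Proposed rule: refuse any OS update while locked, so a pushed
        # update cannot disable the wipe protection on a locked device.
        if not self.unlocked:
            return False
        return True
```

Under this rule, a locked phone rejects the hypothetical "disable-wipe" update outright, and a brute-force attacker trips the wipe after ten wrong guesses, which is the combination that would make the FBI's request technically unfulfillable.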
Received on Wednesday, 24 February 2016 16:19:57 UTC