【TIME 20160218】This Is the Real Reason Apple Is Fighting the FBI
【Original Title】This Is the Real Reason Apple Is Fighting the FBI
【Publication】TIME
【Author】Julian Sanchez
【Link】http://time.com/4229601/real-reason-apple-is-fighting-the-fbi/
Original text:
The first thing to understand about Apple’s latest fight with the FBI—over a court order to help unlock the deceased San Bernardino shooter’s phone—is that it has very little to do with the San Bernardino shooter’s phone.
It’s not even, really, the latest round of the Crypto Wars—the long-running debate about how law enforcement and intelligence agencies can adapt to the growing ubiquity of uncrackable encryption tools.
Rather, it’s a fight over the future of high-tech surveillance, the trust infrastructure undergirding the global software ecosystem, and how far technology companies and software developers can be conscripted as unwilling suppliers of hacking tools for governments. It’s also the public face of a conflict that will undoubtedly be continued in secret—and is likely already well underway.
First, the specifics of the case. The FBI wants Apple’s help unlocking the work iPhone used by Syed Farook, who authorities believe perpetrated last year’s mass killing at an office Christmas party before perishing in a shootout with police. They’ve already obtained plenty of information about Farook’s activities from Apple’s iCloud servers, where much of his data was backed up, and from other communications providers such as Facebook. It’s unclear whether they’ve been able to recover any data from two other mobile devices Farook physically destroyed before the attack, which seem most likely to have contained relevant information.
But the most recent data from Farook’s work-assigned iPhone 5c wasn’t backed up, and the device is locked with a simple numeric passcode that’s needed to decrypt the phone’s drive. Since they don’t have to contend with a longer, stronger alphanumeric passphrase, the FBI could easily “brute force” the passcode—churning through all the possible combinations—in a matter of hours, if only the phone weren’t configured to wipe its onboard encryption keys after too many wrong guesses, rendering its contents permanently inaccessible.
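The gap between a numeric passcode and an alphanumeric passphrase is easy to quantify. The sketch below is a back-of-the-envelope estimate, not anything from the article: the per-guess cost is an assumption (iOS imposes a hardware-enforced key-derivation delay on the order of 80 ms per attempt), and the function and constant names are illustrative.

```python
# Rough worst-case brute-force times under different passcode policies.
# GUESS_SECONDS is an assumed per-attempt cost, modeling the device's
# hardware key-derivation delay; real rates vary by device and method.

GUESS_SECONDS = 0.08  # assumed ~80 ms per passcode attempt

def worst_case_hours(alphabet_size: int, length: int) -> float:
    """Hours needed to try every possible passcode of the given length."""
    return (alphabet_size ** length) * GUESS_SECONDS / 3600

print(f"4-digit PIN:         {worst_case_hours(10, 4):14.2f} h")
print(f"6-digit PIN:         {worst_case_hours(10, 6):14.2f} h")
print(f"8-char alphanumeric: {worst_case_hours(62, 8):14.2f} h")
```

Under these assumptions a 6-digit PIN falls in roughly a day of guessing, consistent with the "matter of hours" in the text, while an 8-character mixed-case alphanumeric passphrase pushes the worst case into billions of hours, which is why the auto-wipe setting, not the passcode itself, is the FBI's real obstacle.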
So the bureau wants Apple to develop a customized version of their iOS operating system that permits an unlimited number of rapid guesses at the passcode—and sign it with the company’s secret developer key so that it will be recognized by the device as a legitimate software update.
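The trust decision at the heart of this request, namely that a device installs only code bearing the vendor's signature, can be sketched as follows. This is a simplified, hypothetical illustration: real iOS code signing uses asymmetric cryptography, so the device verifies against a public key and never holds the secret, whereas this sketch uses a standard-library HMAC with a shared secret to show the same accept-or-reject logic. All names here are invented for the example.

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor's signing key. In real code
# signing the device holds only a public key; an HMAC shared secret
# is used here purely to illustrate the trust decision.
SIGNING_KEY = b"vendor-secret-key"

def sign_update(firmware: bytes) -> bytes:
    """Vendor side: produce a signature over the update image."""
    return hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()

def device_accepts(firmware: bytes, signature: bytes) -> bool:
    """Device side: agree to install only if the signature verifies."""
    expected = hmac.new(SIGNING_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

update = b"legitimate OS image"
sig = sign_update(update)
assert device_accepts(update, sig)                 # signed update installs
assert not device_accepts(b"tampered image", sig)  # anything else is refused
```

The point of the sketch is that the signature check is the *only* gate: any code signed with the key, including a passcode-guessing build, is indistinguishable to the device from a legitimate update, which is exactly the property the court order asks Apple to exploit.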
Considered in isolation, the request seems fairly benign: If it were merely a question of whether to unlock a single device—even one unlikely to contain much essential evidence—there would probably be little enough harm in complying. The reason Apple CEO Tim Cook has pledged to fight a court’s order to assist the bureau is that he understands the danger of the underlying legal precedent the FBI is seeking to establish.
Four important pieces of context are necessary to see the trouble with the Apple order.
1. This offers the government a way to make tech companies help with investigations. Law enforcement and intelligence agencies have for years wanted Congress to update the Communications Assistance for Law Enforcement Act of 1992, which spells out the obligations of telephone companies and Internet providers to assist government investigations, to deal with the growing prevalence of encryption—perhaps by requiring companies to build government backdoors into secure devices and messaging apps. In the face of strong opposition from tech companies, security experts and civil liberties groups, Congress has thus far refused to do so.
By falling back on an unprecedentedly broad reading of the 1789 All Writs Act to compel Apple to produce hacking tools, the government is seeking an entry point from the courts it hasn’t been able to obtain legislatively. Moreover, saddling companies with an obligation to help break their own security after the fact will raise the cost of resisting efforts to mandate vulnerabilities baked in by design.
2. This public fight could affect secret orders from the government. Several provisions of the federal laws governing digital intelligence surveillance require companies to provide “technical assistance” to spy agencies. Everything we know suggests that government lawyers are likely to argue for an expansive reading of that obligation—and may already have done so. That fight, however, will unfold in secret, through classified arguments before the Foreign Intelligence Surveillance Court. The precedent set in the public fight may help determine how ambitious the government can be in seeking secret orders that would require companies to produce hacking or surveillance tools meant to compromise their devices and applications.
3. The consequences of a precedent permitting this sort of coding conscription are likely to be enormous in scope. This summer, Manhattan District Attorney Cyrus Vance wrote that his office alone had encountered 74 iPhones it had been unable to open over a six-month period. Once it has been established that Apple can be forced to build one skeleton key, the inevitable flood of similar requests—from governments at all levels, foreign and domestic—could effectively force Apple and its peers to develop internal departments dedicated to building spyware for governments, just as many already have full-time compliance teams dedicated to dealing with ordinary search warrants.
This would create an internal conflict of interest: The same company must work to both secure its products and to undermine that security—and the better it does at the first job, the larger the headaches it creates for itself in doing the second. It would also, as Apple’s Cook has argued, make it far more difficult to prevent those cracking tools from escaping into the wild or being replicated.
4. Most ominously, the effects of a win for the FBI in this case almost certainly won’t be limited to smartphones. Over the past year I worked with a group of experts at Harvard Law School on a report that predicted governments will respond to the challenges encryption poses by turning to the burgeoning “Internet of Things” to create a global network of surveillance devices. Armed with code blessed by the developer’s secret key, governments will be able to deliver spyware in the form of trusted updates to a host of sensor-enabled appliances. Don’t just think of the webcam and microphone on your laptop, but voice-control devices like Amazon’s Echo, smart televisions, network routers, wearable computing devices and even Hello Barbie.
The global market for both traditional computing devices and the new breed of networked appliances depends critically on an underlying ecosystem of trust—trust that critical security updates pushed out by developers and signed by their cryptographic keys will do what it says on the tin, functioning and interacting with other code in a predictable and uniform way. The developer keys that mark code as trusted are critical to that ecosystem, which will become ever more difficult to sustain if developers can be systematically forced to deploy those keys at the behest of governments. Users and consumers will reasonably be even more distrustful if the scope of governments’ ability to demand spyware disguised as authentic updates is determined, not by a clear framework, but a hodgepodge of public and secret court decisions.
These, then, are the high stakes of Apple’s resistance to the FBI’s order: not whether the federal government can read one dead terrorism suspect’s phone, but whether technology companies can be conscripted to undermine global trust in our computing devices. That’s a staggeringly high price to pay for any investigation.
Pure theater. It's so they can keep selling in China, given the new security law China just passed.