When the blockbuster twin security exploits known as Meltdown and Spectre appeared in early 2018, Mozilla was among the first to respond, retroactively changing several behaviors of Firefox to help prevent them.
Both attacks rely on high-speed timing measurements to extract sensitive information, so somewhat counterintuitively, the patches had to decrease the speed of seemingly mundane computations. The first change coarsened the resolution of the browser performance API, which had previously been able to measure the behavior of a page at a precision fine enough to be used in an attack; the second removed SharedArrayBuffer, a new kind of data structure atop which similar timers could be trivially rebuilt. Similar changes were soon implemented by Microsoft for the Internet Explorer and Edge browsers, and by WebKit, the engine used to build Safari, Mobile Safari, Android Browser, and the dedicated browsers embedded in many other devices. As of this writing, SharedArrayBuffer is disabled in all major browsers.
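The timer mitigation works by rounding timestamps down to a coarse granularity, so that two events separated by only a few microseconds become indistinguishable. A minimal sketch of the idea in TypeScript (the `coarsen` helper is illustrative, not the browsers' actual implementation; Mozilla's initial patch reduced `performance.now()` resolution to roughly 20 µs):

```typescript
// Sketch: coarsening a high-resolution timestamp, as browsers did for
// performance.now() after Spectre. All values are in milliseconds.
function coarsen(timestampMs: number, granularityMs: number): number {
  // Round down to the nearest multiple of the granularity, discarding
  // the fine-grained bits a timing attack would need.
  return Math.floor(timestampMs / granularityMs) * granularityMs;
}

// Two events 5 microseconds apart collapse to the same value
// at a 20-microsecond (0.02 ms) granularity.
const t1 = coarsen(1234.005, 0.02);
const t2 = coarsen(1234.010, 0.02);
console.log(t1 === t2); // true — the 5 µs gap is no longer observable
```

Because the attacker's measurement now only changes in 20 µs steps, cache hits and misses (which differ by tens of nanoseconds) can no longer be told apart from a single reading.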
Backpedaling on established features of the internet was necessary, but also strange and unexpected. The web is, among other things, a decentralized specification: It is an agreement about how to build things, and then also how to run the things that have been built. In order for a new feature to meaningfully exist on the web, developers and browsers and standards bodies must all first come to an understanding about how it will work. Once you add something to that agreement, you can’t remove it, because you have no idea what problems might arise, or even in which far-flung corners they may appear.
In contrast, technology systems and programming languages that operate in narrower contexts—on a specific server, for example, or inside a specific app—can successfully withstand more dramatic changes to their behaviors. Any upgrade-related malfunctions are localized, and accordingly easier to fix. There are no such guarantees on the distributed web, though, so its technologies have always evolved in ways that maintain backwards compatibility. This is why old web pages pretty much always work in newer browsers.
Spectre forced browsers to finally break the compatibility covenant of the web. It’s entirely likely that no meaningful projects relying on the removed features even exist, and even if they do, there may still be simple, safer workarounds. Nonetheless, such a prominent episode in which the internet retroactively broke its own code comes with a cost, at least ideologically. The web can’t quite be trusted as an infallible platform to the extent it had been.
There’s a common practice in software engineering called semantic versioning, whereby the officially published new releases of software tools and packages are given slightly complex version numbers—think the more nuanced “version 2.4.1” instead of moving straight to “version 3”—so that a bump to the leftmost, most significant number can indicate to both users and automated systems that something important has changed: the system might no longer work the same as it did previously. These are referred to as “breaking changes,” and they serve as safety checks, or at least warning flags.
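As a sketch, the semver convention can be reduced to a single comparison: an upgrade is breaking whenever the major (leftmost) number increases. This simplified model omits pre-release tags and build metadata, which the real specification also covers:

```typescript
// Simplified semantic-version parser for "MAJOR.MINOR.PATCH" strings.
// (Real semver also handles pre-release and build metadata, omitted here.)
function parseVersion(v: string): [number, number, number] {
  const [major, minor, patch] = v.split(".").map(Number);
  return [major, minor, patch];
}

// Under semver, a major-number bump signals a breaking change;
// minor and patch bumps promise backwards compatibility.
function isBreakingUpgrade(from: string, to: string): boolean {
  return parseVersion(to)[0] > parseVersion(from)[0];
}

console.log(isBreakingUpgrade("2.4.1", "2.5.0")); // false — new features, still compatible
console.log(isBreakingUpgrade("2.4.1", "3.0.0")); // true — compatibility promise withdrawn
```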
We don’t know how widespread the impacts were, but the patches made to web browsers to help protect against Meltdown and Spectre do meet the technical definition of breaking changes for the entire web. Of course, by now the web is far too old and chaotic to be subject to any versioning or planning whatsoever. That’s precisely why, until now, it had always opted to safely preserve backwards compatibility.
Both Meltdown and Spectre are caused by the widespread use of a technique called “speculative execution,” in which processors eagerly and proactively execute instructions before the program actually needs them. When the program does need the results, they are already waiting, which makes execution faster; but the primary discovery of Meltdown and Spectre was that this speculative work is insufficiently secured, and thus provides a way to leak sensitive information. Meltdown most notably affects Intel hardware, where speculative execution had previously been assumed to be safe, and attempts to disable it at the software level can carry a marked performance cost. This is not just about sluggish laptops—many cloud service providers charge clients varying rates that reflect the computational burden of the contract, so Meltdown and Spectre may show up as an increase in technical budgets, paid out as a literal dollar amount to services that now have to run more slowly as a result of the patch.
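The leak works because speculative execution leaves traces in the processor cache even when its architectural results are discarded: a speculatively executed, secret-dependent memory access warms one cache line, and an attacker who times accesses to each line can tell which one is fast. A deterministic toy model of that side channel (a simulation for illustration only, not a real exploit—real attacks time actual memory reads):

```typescript
// Toy model of the cache side channel behind Meltdown and Spectre.
const LINES = 256; // one simulated "cache line" per possible byte value
const cached = new Array<boolean>(LINES).fill(false);

// Victim: speculatively executed code touches a line indexed by the secret.
// The cache side effect survives even though the architectural result
// of the speculation is thrown away.
function victimTouch(secretByte: number): void {
  cached[secretByte] = true;
}

// Attacker: "times" a read of every line. A warm (cached) line reads
// fast; a cold line reads slow. The fast line's index is the secret.
function probe(): number {
  const FAST = 10, SLOW = 100; // illustrative latencies, in cycles
  for (let i = 0; i < LINES; i++) {
    const latency = cached[i] ? FAST : SLOW;
    if (latency === FAST) return i;
  }
  return -1; // nothing cached
}

victimTouch(42);
console.log(probe()); // 42 — recovered without ever reading it directly
```

This is also why the browser mitigations target timers: without a clock precise enough to distinguish the fast read from the slow ones, the probe step collapses.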
Find the full article on Wired.com