Tuesday marked the fourth anniversary of "Safer Internet Day," a 40-country effort to raise awareness about computer and Internet security. But the day probably didn't feel too safe for the dozens of unheralded technologists responsible for defending the World Wide Web against one of the most concerted attacks against the Internet's core since a similar assault in 2002.
Details about the sources, size and methods used in the attack are still trickling in, but like the celebration of Safer Internet Day, it's not clear that anyone using the Web at the time even took notice. That's largely a good thing, and I'll explain why later in this post.
At around 7 p.m. ET on Monday, three of the Internet's 13 "root servers" -- the computers that provide the primary roadmap for nearly all Internet communications -- came under heavy and sustained attack from a fairly massive, remote-controlled network of zombie computers. These are machines infected surreptitiously with programs that allow criminals to control them remotely. The zombies were programmed to try to overwhelm several of the root servers with massive amounts of traffic.
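For readers curious what that "roadmap" role actually looks like on the wire: every resolution of a name the resolver knows nothing about begins with a small UDP query to one of those 13 root servers. Below is a rough, illustrative sketch in Python of such a query packet, per the DNS message format in RFC 1035. The helper name and example domain are my own illustration, not anything taken from the attack itself.

```python
import struct

def build_dns_query(name: str, txid: int = 0x1234) -> bytes:
    """Build a minimal DNS query packet (RFC 1035) for an A record.

    This is the kind of packet a recursive resolver sends to a root
    server as the first step of resolving a name it knows nothing about.
    """
    # Header: ID, flags (standard query; recursion-desired bit off,
    # since root servers don't recurse), QDCOUNT=1, other counts 0.
    header = struct.pack(">HHHHHH", txid, 0x0000, 1, 0, 0, 0)
    # Question name: each label is length-prefixed, ending in a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.rstrip(".").split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

query = build_dns_query("example.com")
```

A root server's reply to a packet like this is not the final address but a referral down to (say) the ".com" name servers; flooding the roots with junk traffic is an attempt to knock out that very first step for everyone.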
Among the apparent targets was a root server controlled by the Department of Defense Network Information Center. There is also evidence to suggest the attackers targeted the servers responsible for managing the stability of the ".uk" and ".org" domains.
A number of technologists I spoke with who helped defend against the attack said it's too early to say definitively where the attack came from, but this perspective from an operator responsible for maintaining one of the root servers suggests that South Korea, China and the United States were the biggest sources of computers used in the attack. (The initial analysis suggests that 13 percent of the machines involved in the attack were located here in San Francisco, the site of the RSA Security Conference, from which I'm currently blogging.)
In the news coverage so far, theories about the motives behind the attack have varied widely, from speculation that it was just hacker mischief to notions that it was cooked up by curious criminals bent on testing their ability to extort the many wealthy and powerful interests that rely on a functioning Internet.
The truth is that no one but the attackers knows the true reason. Paul Levins, vice president of the Internet Corporation for Assigned Names and Numbers (ICANN) -- the entity charged with, among other tasks, coordinating responses among root server providers in such attacks -- said it would likely be at least a week before the more meaningful facts come out.
"This is a fact-based community, and we're waiting for the facts to come in after the analysis before we can make committed statements about what the origins were, and its intended targets," Levins said.
This attack highlights a couple of important but often overlooked points, one dark and troubling, and the other somewhat more hopeful. First, the tools and resources used by organized cyber criminals -- namely hacked personal computers that can be remotely controlled by attackers -- are so abundant that they've become virtually disposable. Experts estimate that at any given time there are tens of millions of hacked personal computers that are used in attacks or, more commonly, in sending spam and hosting phishing Web sites.
On the other hand, the fact that there is scant evidence that anyone surfing the Web at the time of the attack even noticed is a testament to the resilience of the global Internet infrastructure, as well as to the swift action on the part of the technologists and experts charged with maintaining the network most of us have come to take for granted.
Not that you can ever have enough security and capacity to handle these types of attacks. The various organizations that operate the 13 root servers are constantly upgrading bits and pieces of their systems to make them more robust and resilient, and one root-server operator -- Verisign Inc. -- is announcing Thursday that it plans to spend $100 million over the next three years to achieve a tenfold increase in its capacity to handle Internet traffic requests.