Tuesday, February 23, 2010

Cyber Shock, Virtualization, Cloud Computing & the State of the Web -- Not for the Faint of Heart ...


Partial map of the Internet based on the January 15, 2005 data found on opte.org. Each line is drawn between two nodes, representing two IP addresses. The length of each line is indicative of the delay between those two nodes. This graph represents less than 30% of the Class C networks reachable by the data collection program in early 2005. Lines are color-coded according to the top-level domain of the corresponding node, as follows:
* Dark blue: net, ca, us
* Green: com, org
* Red: mil, gov, edu
* Yellow: jp, cn, tw, au, de
* Magenta: uk, it, pl, fr
* Gold: br, kr, nl
* White: unknown

Matt Britt, Internet Map, 12-1-2005

Richard Power


Three stories have intrigued me over the past few days, and I will share them with you here. Woven together, they deliver a message that needs to be heard (over and over again, apparently). And that message? From the corridors of the highest power, through IT networks across the planet, down to each and every end user -- there is a tremendous amount of work as yet undone, and much of it is attitudinal and cultural.

Cyber Shock

First, there is the Cyber Shock story.

Event: A massive cyber attack has turned the cellphones and computers of tens of millions of Americans into weapons to shut down the Internet. A cascading series of events then knocks out power for most of the East Coast amid hurricanes and a heat wave.
Is the assault on cellphones an armed attack? In a crisis, what power does the government have to order phone and Internet carriers to allow monitoring of their networks? What level of privacy can Americans expect?
A war game, sponsored by a nonprofit group and attended by former top-ranking national security officials, laid bare Tuesday that the U.S. government lacks answers to such key questions.
Half an hour into an emergency meeting of a mock National Security Council, the attorney general declared: "We don't have the authority in this nation as a government to quarantine people's cellphones."
The White House cyber coordinator was "shocked" and asserted: "If we don't have the authority, the attorney general ought to find it."
Ellen Nakashima, War game reveals U.S. lacks cyber-crisis skills, Washington Post, 2-17-10

If you google Cyber Shock, you will find hundreds of news items and blog posts about it.

When I heard it, I once again started hearing the theme music from the hilarious film Groundhog Day, starring Bill Murray and co-written and directed by Harold Ramis.

Do you remember The Day After (1996)? Do you remember Eligible Receiver (1997)? There have been many other such cyber war games over the past 15 years. I have been involved in a few, and studied the results of many.

So when I read the WashPo's headline, "War game reveals U.S. lacks cyber-crisis skills," I tweeted, "Shouldn't this read STILL lacks?"

As a reality check, I asked a few friends and colleagues who have been at and beyond the front line for all the years of this long struggle. I received three responses. None thought it was anything new, of course. One dismissed it as a "bad joke." One said that, hopefully, it would "embarrass" some officials on high into "addressing the issue" at long last. One said that it never hurts "to try to get the attention of people who are just tuning in."

From my perspective, I encourage and commend such exercises, but I think everyone would learn more if we framed them in proper historical context. And I certainly would have hoped that, here at the start of the second decade of the 21st Century, we would be much farther along than discovering that we STILL LACK cyber crisis skills.

As quite often happens, the press (both mainstream and IT) missed the real story.

Head in the Clouds

Meanwhile, two new studies piqued my interest, and not surprisingly, they also reinforce the sense that we are spinning our wheels in the digital Thunderdome.

The first of them was conducted for Symantec by Applied Research. It queried "2,100 top IT and security managers in 27 countries."

Is moving to virtualization and cloud computing making network security easier or harder? The "2010 State of Enterprise Security Survey - Global Data" report shows that about one-third believe virtualization and cloud computing make security "harder," while one-third said it was "more or less the same," and the remainder said it was "easier." The telephone survey was done by Applied Research last month on behalf of Symantec, and it covered 120 questions about technology use -- organizations remain overwhelmingly Microsoft Windows-based -- and cyberattacks on organizations.
Ellen Messmer, Security of Virtualization, Cloud Computing Divides IT and Security, Network World, 2-22-10

Harder? About the same? Easier? Well, of course, the most insightful answer would be "All of the Above." (There are numerous security concerns, not the least of which is that we still do not know most of them.) But I doubt "All of the Above" was an option available to the respondents.

Certainly, it should be of concern to us all that most of these "top IT and security managers" are under the impression that virtualization and cloud computing have either made the challenge of cyber security easier to deal with or had no impact on it at all. This data point suggests that a majority of "top IT and security managers" don't really understand security in any depth.

Leading SaaS Vendor Offers Evidence of Enterprise Exposure

The second study is hot off the digital press.

The report is Zscaler Lab's "State of the Web - Q4 2009: A View of the Web from an End User's Perspective." Zscaler is a Security-as-a-Service (SaaS) vendor, so its "network of web gateways continually inspects traffic for millions of end users around the globe."

"With the emergence of Advanced Persistent Threats (APT) as evidenced with the Operation Aurora attacks, it is obvious that large enterprise are vulnerable to such targeted attacks that exploit employee behavior," the Zscaler study concludes; and furthermore, it also offers corroboration that "vulnerable desktops and browsers are extremely common in corporate environment making them easier targets than previously believed."

It is an interesting report, with lots of data points to ponder, but two in particular jumped out at me.

First, Internet Explorer continues to dominate among browsers, with over 70% of market share, while Mozilla Firefox, its nearest competitor, held only 15% of market share in December 2009. I was not surprised that IE still held sway, but I was surprised that the margin was so significant. What is even more striking to me is that 48% were still using 6.0, while only 46% had moved on to 7.0, and only 5% to 8.0. Yes, although 6.0 is still being updated by Microsoft, it lacks several important security features available in later releases. We are talking about IE here? How can this be so?

Another intriguing tidbit, concerning "Top Ten Malware IP Addresses," was that "more and more," Zscaler reports seeing "otherwise legitimate sites hosting malware without being aware of it."
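
To make the point concrete, here is a minimal Python sketch of the kind of check this finding invites -- resolving the domains you host or link to and comparing them against a list of known-bad addresses. It is my own illustration, not Zscaler's tooling; the IP addresses and domains are placeholders, not the report's actual Top Ten.

    import socket

    # Placeholder malware IP list (documentation-range addresses, not the
    # report's actual Top Ten).
    MALWARE_IPS = {"203.0.113.7", "198.51.100.23"}

    def flag_bad_hosts(domains):
        for domain in domains:
            try:
                ip = socket.gethostbyname(domain)
            except socket.gaierror:
                continue  # skip hosts that do not resolve
            if ip in MALWARE_IPS:
                print("%s resolves to listed malware IP %s" % (domain, ip))

    flag_bad_hosts(["example.com", "example.org"])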

I asked Mike Geide, a senior security researcher for Zscaler, for some insight into the study's findings.

CyBlog: On pg. 17 and pg. 18, you share some numbers on browsers that were eye-opening to me, specifically breakdowns of market share and IE versions. I expected to see IE still holding the lion's share, but I confess I was surprised to see it dominating by such a wide margin. Why? Is this a failure of corporate network managers to push upgrades out, or insist on them, or are the 6.x numbers inflated by home use (of course, either way it represents an unnecessary risk)?

Mike Geide: In short, yes, this is a failure of corporate network managers. This could be because corporate managers are unfamiliar with the security risks of IE 6 / benefits of IE 8. Because MS is still supporting / releasing patches for IE 6, managers may feel that they are sufficiently secure because their IE 6 is patched. Additionally, while IE 8 is pushed as a High-Priority update from Automatic Updates, it will not automatically install on machines. Users must opt-in to install IE 8, or organizations may deploy it via SMS / WSUS. (Ref: http://blogs.msdn.com/ie/archive/2009/04/10/prepare-for-automatic-update-distribution-of-ie8.aspx)
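
Mike's point about patched-but-ancient IE 6 installs is easy to verify on any given machine. Here is a minimal sketch, assuming Python 3's winreg module on a Windows host; IE 6 through 8 record their version string under the registry key used below.

    import winreg  # Windows-only standard library module

    def installed_ie_version():
        # IE 6, 7 and 8 record their version string in this key.
        key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                             r"SOFTWARE\Microsoft\Internet Explorer")
        try:
            version, _ = winreg.QueryValueEx(key, "Version")
        finally:
            winreg.CloseKey(key)
        return version  # e.g. "6.0.2900.5512" or "8.0.6001.18702"

    if installed_ie_version().startswith("6."):
        print("Still on IE 6 -- patched or not, schedule an upgrade.")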

CyBlog: What could corporate IT security people be doing to improve on these numbers?

Mike Geide: Corporate IT should have a centralized method for managing software and patch installation. It is not enough to just patch the underlying OS; web browsers and other client software (e.g., Adobe Acrobat, MS Office, etc.) need to be included in this management.
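
For what that centralized management implies in practice, here is a small illustrative sketch (mine, not Mike's) that inventories installed software from the standard Windows Uninstall registry hive -- the raw data a patch-tracking process for browsers and client applications would start from.

    import winreg  # Windows-only standard library module

    UNINSTALL = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall"

    def installed_software():
        # Walk the per-machine Uninstall hive and collect (name, version).
        root = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UNINSTALL)
        inventory = []
        for i in range(winreg.QueryInfoKey(root)[0]):  # subkey count
            sub = winreg.OpenKey(root, winreg.EnumKey(root, i))
            try:
                name, _ = winreg.QueryValueEx(sub, "DisplayName")
                version, _ = winreg.QueryValueEx(sub, "DisplayVersion")
                inventory.append((name, version))
            except OSError:
                pass  # entries without a display name are skipped
            finally:
                winreg.CloseKey(sub)
        return inventory

    for name, version in installed_software():
        print(name, version)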

CyBlog: Is there anything that Microsoft could be doing (or not doing) that could help these numbers?

Mike Geide: If Microsoft announced an end-of-life for IE 6, this may force organizations to conduct an audit of their systems and upgrade to IE 8. However, from MS's point of view, they may feel that there is still too much of a user base to end-of-life the product, and that they should continue to maintain/provide patches for any newly discovered vulnerabilities that would impact this user base. The responsibility is really on the corporations and end users to sufficiently patch and upgrade.

CyBlog: On pg. 22, you talk about Top 10 Malware IP Addresses, and note that "more and more legitimate sites are hosting malware without being aware of it." I see you have listed the IP addresses; could you share some more information here? Of the Top 10, how many are "legitimate sites," and could you characterize them in any way? Types of businesses, types of applications, types of traffic?

Mike Geide: For this report we did not specifically break out stats for "legitimate sites" versus "illegitimate sites" for malware -- perhaps we will do this for the next report. I would need to pull the logs to properly answer this question; however, based on Google results, the majority of the top 10 appeared to be illegitimate sites identified in other resources (e.g., ThreatExpert).

CyBlog: And how do you define an "illegitimate site" versus a "legitimate site"?

Mike Geide: I would define an illegitimate site as a site that is set up by the attacker explicitly for malicious purposes, whereas a malicious legitimate site provides, or had provided, benign content but has been compromised and is serving some malicious content (e.g., an embedded malicious iframe or JS drive-by download).
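
The embedded-iframe pattern Mike describes can be spotted, crudely, with a heuristic as simple as the following sketch -- my own illustration, and no substitute for a real scanner, but it shows what malicious content on a legitimate site often looks like in the page source.

    import re

    # Iframes sized to zero or styled invisible are a common injection tell.
    HIDDEN_IFRAME = re.compile(
        r'<iframe[^>]*(?:width\s*=\s*["\']?0|height\s*=\s*["\']?0'
        r'|visibility\s*:\s*hidden|display\s*:\s*none)[^>]*>',
        re.IGNORECASE)

    def suspicious_iframes(html):
        """Return iframe tags whose size or style suggests they are hidden."""
        return HIDDEN_IFRAME.findall(html)

    page = ('<html><body>Welcome!<iframe src="http://evil.example/x" '
            'width="0" height="0"></iframe></body></html>')
    for tag in suspicious_iframes(page):
        print("possible injected iframe:", tag)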

CyBlog: Is there any accountability for either "legitimate sites" or "illegitimate sites" that show up in such a ranking?

Mike Geide: Depends. Some hosting providers will terminate legitimate site accounts if they are repeatedly compromised, while other hosting companies care very little about malicious content being hosted. Depending on the severity of the incident and jurisdiction, law enforcement (LE) could enforce accountability -- but LE resources do not scale to the Internet. In short, the Internet doesn't have an overall policing body to ensure accountability. If a site continues to be an offending site, block it from being visited. If the site provides a legitimate service, it will either fix its security problem or go out of business.
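
The "block it from being visited" option can be as low-tech as sinkholing offending hostnames. A minimal sketch, with made-up domain names standing in for a real blocklist feed:

    # Made-up offending domains; a real list would come from a gateway log
    # or blocklist feed.
    OFFENDERS = ["malware-host.example", "compromised-site.example"]

    def hosts_entries(domains, sinkhole="127.0.0.1"):
        # One hosts-file line per domain, pointing it at the sinkhole address.
        return "\n".join("%s\t%s" % (sinkhole, d) for d in domains)

    print(hosts_entries(OFFENDERS))
    # Append the output to /etc/hosts (or to
    # %SystemRoot%\System32\drivers\etc\hosts) to keep clients from
    # resolving these sites.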

CyBlog: What are key measures that "legitimate sites" should be taking to avoid ending up as a part of the problem instead of being part of the solution?

Mike Geide: Legitimate sites are frequently leveraged to host malicious content through vulnerabilities in 3rd party web applications (e.g., Wordpress, Joomla, etc.). The most common vulnerabilities in these and other web applications are SQL Injection and Cross Site Scripting (XSS). Legitimate sites should only install 3rd party web applications that they absolutely need, and then follow the patch cycle of the application to ensure that known vulnerabilities are fixed.
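
To close, here is a short sketch of the two defenses implied by Mike's answer -- parameterized queries against SQL Injection, and output escaping against XSS. It is a generic illustration using Python's standard library, not code from any of the applications mentioned.

    import sqlite3
    from html import escape

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice')")

    user_input = "alice' OR '1'='1"

    # Vulnerable: string concatenation lets the input rewrite the query:
    #   conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

    # Safe: the driver binds the value, so quote characters stay data.
    rows = conn.execute("SELECT * FROM users WHERE name = ?",
                        (user_input,)).fetchall()
    print(rows)  # [] -- the injection attempt matches nothing

    # Safe output: escape user input before embedding it in HTML (anti-XSS).
    comment = '<script>alert("xss")</script>'
    print("<p>" + escape(comment) + "</p>")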