Friday, August 13, 2010

Voltaire Lives: A Report from USENIX Security Symposium 2010

Voltaire was one of several Enlightenment figures (along with Montesquieu, John Locke and Jean-Jacques Rousseau) whose works and ideas influenced important thinkers of both the American and French Revolutions. -- Wikipedia


Voltaire Lives: A Report from USENIX Security Symposium 2010

By Richard Power


The 19th USENIX Security Symposium, held in Washington, D.C., delivered a high-caliber program, engaging and commercial-free(er), just as I have come to expect from attending this event regularly over the years.

It is always refreshing to attend a conference whose primary function isn't simply to serve as an excuse to hold a trade show, the kind populated by sales personnel who don't understand the products they are selling, let alone the problems those products are supposed to address (and usually don't).

Here are some of my notes from this year's Symposium.

Proving Voltaire Right

"Common sense is not so common." -- Voltaire (1694‐1778)

It isn't difficult to prove Voltaire right, of course. But civilization is just a little bit better off whenever someone does. And as I am confident you have noticed, these days civilization needs all the help it can get.

In his keynote talk, "Proving Voltaire Right: Security Blunders Dumber Than Dog Snot," Roger G. Johnston of Argonne National Laboratory spoke about the work of the Lab's Vulnerability Assessment Team, "a multi-disciplinary team of physicists, engineers, hackers, and social scientists," and highlighted some fascinating evidence and powerful insights on blunders in numerous aspects of security, e.g., "while publishing guidelines encouraging Member States to conduct background checks on key personnel... The IAEA does no significant background checks on its own employees, including nuclear inspectors."

Johnston spoke of his team's research into GPS spoofing: "It's easy to do with widely available GPS satellite simulators," which can be "purchased, rented, or stolen" and are "not export controlled." "Many are surprisingly user friendly. Little expertise is needed (in electronics, computers, or GPS) to use them."

The easily realizable nature of GPS spoofing attacks, Johnston remarked, translates into a plethora of potential threats: "Crash national utility, financial, telecommunications and computer networks that rely on GPS for critical time synchronization. Steal cargo or nuclear material being tracked by GPS. Install false time stamps in security videos or financial transactions. Send emergency response vehicles to the wrong location after an attack. Interfere with military logistics (DoD uses civilian GPS for cargo). Interfere with battlefield soldiers using civilian GPS (against policy, but common practice anyway). Spoof GPS ankle bracelets used by courts and GPS data loggers used for counter-intelligence ..."

"The creativity of the adversary," Johnston added, "is the only limitation."

Another big security blunder Johnston focused on was "Thinking Engineers Understand Security." Engineers, he observed, "work in solution space, not problem space ... They make things work but aren't trained or mentally inclined to figure out how to make things break. They view Nature as the adversary, not the bad guys. They tend to think technologies fail randomly, not by deliberate, intelligent, malicious intent. They are not typically predisposed to think like bad guys."

Johnston also articulated some of the wrong assumptions that undermine much of the work done on vulnerability assessments (VAs), including thinking that "there are a small number of vulnerabilities, that most or all can be found & eliminated, that vulnerabilities are bad news, and that a VA should ideally find zero vulnerabilities," as well as working with "modular VAs or other artificial constraints, using only security experts, not thinking like the bad guys and thinking the good guys get to define the problem."

Johnston gave a compelling presentation, and as a keynote, it was a bold counterpoint to one of the most technically driven conference agendas in the field of cyber security.

Best Paper and Best Student Paper Awards

A record 207 papers were submitted. One was withdrawn by its authors, three were withdrawn for double submission, and one was withdrawn for "outright plagiarism," leaving 202 papers under consideration. There were two rounds of reviews: in the first round each paper received two reviews, and 41 papers were rejected, leaving 161 for the second round; in the second round each paper received three to five reviews, and 76 papers advanced. After two days of deliberation, 38 papers remained. Since only 30 could fit into the program, eight had to be rejected even though they were worthy of presentation.

The USENIX Security Symposium's Best Student Paper Award went to Robert N. M. Watson and Jonathan Anderson of the University of Cambridge, and Ben Laurie and Kris Kennaway of Google UK Ltd., for "Capsicum: Practical Capabilities for UNIX."

The authors describe Capsicum as "a lightweight operating system capability and sandbox framework" and as "a practical capabilities extension to the POSIX API, and a prototype based on FreeBSD, planned for inclusion in FreeBSD 9.0."

"Our goal has been to address the needs of application authors who are already experimenting with sandboxing, but find themselves building on sand when it comes to effective containment techniques. We have discussed our design choices, contrasting approaches from research capability systems, as well as commodity access control and sandboxing technologies, but ultimately leading to a new approach. Capsicum lends itself to adoption by blending immediate security improvements to current applications with the long-term prospects of a more capability-oriented future. We illustrate this through adaptations of widely-used applications, from the simple gzip to Google’s highly-complex Chromium web browser, showing how firm OS foundations make the job of application writers easier. Finally, security and performance analyses show that improved security is not without cost, but that the point we have selected on a spectrum of possible designs improves on the state of the art."

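To give a concrete feel for the model the paper describes, here is a minimal sketch of capability-mode sandboxing, assuming a FreeBSD system with Capsicum support; the file paths and error handling are illustrative only, and the header name varies by release (<sys/capability.h> in the FreeBSD 9 era, <sys/capsicum.h> later).

/* Minimal sketch of Capsicum capability mode (not the authors' code). */
#include <sys/capability.h>   /* <sys/capsicum.h> on later FreeBSD releases */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    /* Acquire the resources the program needs before sandboxing. */
    int fd = open("/etc/passwd", O_RDONLY);
    if (fd == -1) {
        perror("open");
        return 1;
    }

    /* Enter capability mode: global namespaces (file paths, PIDs, and so on)
     * become unreachable for the rest of the process's life. */
    if (cap_enter() == -1) {
        perror("cap_enter");
        return 1;
    }

    /* Descriptors already held keep working ... */
    char buf[64];
    ssize_t n = read(fd, buf, sizeof(buf));
    printf("read %zd bytes from the pre-opened descriptor\n", n);

    /* ... but reaching back into the global namespace fails (ECAPMODE). */
    if (open("/etc/group", O_RDONLY) == -1)
        perror("open after cap_enter (expected to fail)");

    close(fd);
    return 0;
}

The hybrid quality the authors emphasize is visible even in this toy: unmodified POSIX code keeps running inside the sandbox until it reaches for a global namespace, at which point the kernel, rather than the application, enforces containment.
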
The USENIX Security Symposium's Best Paper Award went to Sruthi Bandhakavi, Samuel T. King, P. Madhusudan and Marianne Winslett of the University of Illinois at Urbana-Champaign for "VEX: Vetting Browser Extensions for Security Vulnerabilities."

They describe VEX as "a framework for highlighting potential security vulnerabilities in browser extensions by applying static information-flow analysis to the JavaScript code used to implement extensions."

"Our main thesis is that most vulnerabilities in web extensions can be characterized as explicit flows, which in turn can be statically analyzed. VEX is a proof-of-concept tool for detecting potential security vulnerabilities in browser extensions using static analysis for explicit flows. VEX helps automate the difficult manual process of analyzing browser extensions by identifying and reasoning about subtle and potentially malicious flows. Experiments on thousands of extensions indicate that VEX is successful at identifying flows that indicate potential vulnerabilities. Using VEX, we identify three previously unknown security vulnerabilities and three previously known vulnerabilities, together with a variety of instances of unsafe programming practices."

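The paper's central distinction is worth unpacking: an explicit flow copies untrusted data directly toward a sensitive sink through assignments and arguments, which static analysis can follow, while an implicit flow influences the sink only through control flow. The sketch below is a hypothetical illustration in C rather than the JavaScript extension code VEX actually analyzes; untrusted_input() and privileged_sink() are made-up stand-ins for something like web-page content and an eval-style sink.

#include <stdio.h>
#include <string.h>

/* Made-up stand-ins for an untrusted source and a privileged sink. */
static const char *untrusted_input(void) { return "<script>alert(1)</script>"; }
static void privileged_sink(const char *s) { printf("sink received: %s\n", s); }

int main(void)
{
    const char *tainted = untrusted_input();

    /* Explicit flow: the tainted value travels directly to the sink
     * as an argument; a static explicit-flow analysis can track this. */
    privileged_sink(tainted);

    /* Implicit flow: the value reaching the sink depends on the tainted
     * data only through the branch condition, which an explicit-flow
     * analysis does not attempt to track. */
    char flag[4];
    if (strchr(tainted, '<') != NULL)
        strcpy(flag, "yes");
    else
        strcpy(flag, "no");
    privileged_sink(flag);

    return 0;
}
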
And Yes, A CyLab Researcher Made Some Headlines

CyLab and Carnegie Mellon Silicon Valley researcher Collin Jackson is doing important work in the browser security and privacy space, and some of it bubbled up at this year's Symposium.

Here is a brief excerpt from the Computerworld story, with a link to the full text:

Browsing in "private mode" isn't as private as users think, a researcher said today.
"There are some traces left behind [by all browsers] that could reveal some of the sites that you've been to," said Collin Jackson, an assistant research professor at the Silicon Valley campus of Carnegie Mellon University. Jackson, along with three colleagues from Stanford University, will present their findings later today at the Usenix Security Symposium in Washington, D.C.
Internet Explorer (IE), Firefox, Chrome and Safari offer private browsing intended to cloak a user from Web sites and erase all browsing evidence from the PC or Mac.
Gregg Keizer, "Browsers' private modes leak info, say researchers," Computerworld, August 10, 2010

See Also

Android Security, Naked Keystrokes, Selling Viagra, Crying Wolf & More! -- A Report from the 18th USENIX Security Symposium (Montreal, 2009)

And Other CyBlog Conference Coverage ...

BlackHat USA 2010: How to Turn ATMs Into Jackpotted Slots, Basejump the GSM, Lift Malware Developer's Fingerprints & Face Our Existential Dilemma

SOUPS 2010: Insight into Usable Privacy & Security Deepens at 6th Annual Symposium

Notes on TIW 2010: The Builders & Building Blocks of Trustworthy Infrastructure

CyLab Research has Powerful Impact on 2010 IEEE Security & Privacy Symposium

From Parking Meters to the Cloud, from SMS to Smart Grids ... Is Everything Broken? -- Report from Black Hat Briefings (Las Vegas 2009)

RSA 2010: Lost in the Cloud, & Shrouded in the Fog of War, How Far Into the Cyber Future Can You Peer? Can You See Even Beyond Your Next Step?

Reflections on SOUPS 2009: Between Worlds, Cultivating Superior Cleverness, Awaiting a Shift in Consciousness

RSA Conference 2009: Summary of Posts