Thursday, November 20, 2014

CyLab Chronicles: Researchers Share Compelling Work with Silicon Valley Thought Leaders, Technical Experts, Law Enforcement Officials and Media at Fall 2014 Executive Briefing

The Fall 2014 CyLab Executive Briefing, held at Carnegie Mellon University's Silicon Valley campus (NASA Research Park), brought together an invitation-only group of C-level executives, thought leaders, technical experts, venture capitalists, government investigators and one cyber journalist (with an exclusive) for an update from one of the world's premier cyber security and privacy research programs.

The event consisted of four compelling presentations from CyLab researchers (with Q and A) followed by open-ended dialogue over a delicious buffet luncheon.

The fifteen organizations represented ranged from high tech manufacturers and financial services institutions to social media giants and federal law enforcement agencies.

The presentations reflected the broad scope of CyLab research:

Osman Yagan on "Designing Secure and Reliable Wireless Sensor Networks"

Bruce De Bruhl for Patrick Tague on "That Moment When You Realize Your Network Has Become Self-Aware"

David Brumley on "Software Security"

Anupam Datta on "Privacy through Accountability"

CyLab's Executive Briefings are an outreach to leaders in business, government and the media who are tasked with cyber security and privacy responsibilities. For half a day, attendees get a glimpse into just a few of the benefits that come with CyLab partnership. These benefits range from online access to the CyLab Seminar Series and attendance at the annual CyLab Partners Conference to the opportunity to place their own researcher at a desk in CyLab for a month or a year, and the opportunity to help design a CyLab research project and be integrally involved in its ongoing progress.

Here is a recording of one of the four presentations from this CyLab Executive Briefing, Anupam Datta on "Privacy Through Accountability" --

Tuesday, October 28, 2014

CyLab Chronicles: 2014 CyLab Partners Conference Explores Latest Research Into Vital Cyber Security and Privacy Issues

Student Poster Session, 11th Annual CyLab Partners Conference (October 2014)
The eleventh annual CyLab Partners Conference was held at the main campus of Carnegie Mellon University in Pittsburgh, Pennsylvania, on October 7th and 8th, 2014.

For two days, representatives of a dozen partners were briefed on CyLab's latest research across a broad spectrum of vital issues in the fields of cyber security and privacy. With nineteen presentations, five shared meals and a student poster session, attendees had ample opportunities both to engage in meaningful dialogue with CyLab researchers and to network with each other.

Four new CyLab Partners joined us for this year's event: PNC Financial Services Group (PNC), TD Ameritrade, UPMC (University of Pittsburgh Medical Center) and the National Police Agency of Japan.

Partners Conference attendance is an exclusive benefit of formal partnership with CyLab, as is access to the archive of video recordings, presentations and student posters for this year and previous years.

CyLab Director Virgil Gligor welcomes attendees, 11th Annual CyLab Partners Conference (October 2014)
Each year, to promote the CyLab program and contribute to the public good, we post a few select conference session videos for free public access via the CyLab YouTube Channel and CyLab on iTunes U. This year's selections include four full faculty presentations and a fourteen-minute sampler with brief excerpts from several other sessions.
For more on the benefits of CyLab partnership, and other aspects of the Carnegie Mellon University CyLab program, visit CyLab Online.

Some Related Posts
Anupam Datta, 11th Annual CyLab Partners Conference (October 2014)
Student Poster Session, 11th Annual CyLab Partners Conference (October 2014)

Wednesday, August 27, 2014

CMU CyLab Researcher Wins USENIX Security 2014 Best Student Paper Award; Seven Other CMU Papers Delivered

As they have at other leading conferences in the vital fields of cyber security and privacy, Carnegie Mellon University (CMU) CyLab researchers distinguished themselves at USENIX Security 2014, the 23rd USENIX Security Symposium, held in San Diego, California, August 20-22, 2014.

Three hundred fifty papers were submitted to the USENIX program committee, and the ensuing process, which involved 1,340 reviews and 1,627 follow-up comments, resulted in sixty-seven papers being accepted for publication, including several from CMU CyLab researchers.

Most notably, CMU's Kyle Soska won one of two Best Student Paper Awards for Automatically Detecting Vulnerable Websites Before They Turn Malicious, co-authored with CyLab faculty member Nicolas Christin.

Additionally, CyLab faculty member David Brumley co-authored three of the published papers:

BYTEWEIGHT: Learning to Recognize Functions in Binary Code, with Tiffany Bao, Jonathan Burket, and Maverick Woo of Carnegie Mellon University and Rafael Turner, University of Chicago.

Blanket Execution: Dynamic Similarity Testing for Program Binaries and Components, with Manuel Egele, Maverick Woo and Peter Chapman.

Optimizing Seed Selection for Fuzzing, with Alexandre Rebert, Carnegie Mellon University and ForAllSecure; Sang Kil Cha and Thanassis Avgerinos of Carnegie Mellon University; Jonathan Foote and David Warren of Software Engineering Institute CERT; Gustavo Grieco of Centro Internacional Franco Argentino de Ciencias de la Información y de Sistemas (CIFASIS) and Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET).

Brumley also delivered a paper at one of the workshops that preceded the Symposium itself: PicoCTF: A Game-Based Computer Security Competition for High School Students, co-authored with Peter Chapman and Jonathan Burket, also of CMU.

CyLab Usable Privacy and Security (CUPS) Lab director Lorrie Cranor teamed up with Cormac Herley, Principal Researcher in the Machine Learning Department at Microsoft Research, and several colleagues, Saranga Komanduri and Richard Shay of CMU and Stuart Schechter of Microsoft Research, to co-author Telepathwords: Preventing Weak Passwords by Reading Users' Minds.

Two other CMU-authored papers were presented at USENIX Security 2014:

The Long "Taile" of Typosquatting Domain Names co-authored by Janos Szurdi, Carnegie Mellon University; Balazs Kocso and Gabor Cseh, Budapest University of Technology and Economics; Jonathan Spring, Carnegie Mellon University; Mark Felegyhazi, Budapest University of Technology and Economics; and Chris Kanich, University of Illinois at Chicago. 

Password Managers: Attacks and Defenses co-authored by David Silver, Suman Jana, and Dan Boneh, Stanford University; Eric Chen and Collin Jackson, Carnegie Mellon University.

Thursday, August 21, 2014

CMU CyLab PPP and CUPS teams win “Capture the Flag” and “Crack Me If You Can” contests at DEFCON 22

Members of CMU CyLab's Plaid Parliament of Pwning (PPP)
Carnegie Mellon University demonstrated its cyber prowess at DEFCON 22 by winning the “Capture the Flag” and “Crack Me If You Can” contests ...

Carnegie Mellon’s computer hacking team, the Plaid Parliament of Pwning (PPP), took first place for the second consecutive year in the Capture the Flag (CTF) contest. Globally, hundreds of teams battle throughout the year for one of 20 slots at DEFCON’s CTF competition, which has been called the “World Series of hacking.”

“Our team competed against universities and also against large defense contractors. This win is a huge accomplishment for our team,” says team advisor David Brumley, an associate professor of Electrical and Computer Engineering and the technical director of Carnegie Mellon CyLab.

The PPP team has qualified for DEFCON in each of the last three years, winning first place in both 2013 and 2014. The team is part of CyLab's Undergraduate Computer Security Research group and consists of 35 members from the College of Engineering and the School of Computer Science.

At DEFCON 22, the team was limited to eight members: George Hotz, Ryan Goulden, Tyler Nighswander, Brian Pak, Alex Reece, Max Serrano, Andrew Wesie, Ricky Zhou ...

A second team, this one from CyLab Usable Privacy and Security (CUPS) Lab, and simply named “cmu,” won the Street Division category in the “Crack Me If You Can” contest. In this two-day event sponsored by KoreLogic Security, teams exposed or “cracked” encrypted passwords.

"The students leveraged what they had learned from our research studies to develop their winning strategy," CUPS Director Lorrie Cranor says. "It is remarkable for a first-time team to win this competition." Cranor and fellow CyLab faculty members Lujo Bauer and Nicolas Christin, along with their team of students, are responsible for a growing body of work on passwords.

"Black Badge" bestowed upon CTF winners guarantees lifetime free entry to DEFCON
See Also

CyLab's David Brumley and His Student Hacker Team Featured on PBS NEWSHOUR and CNBC

Carnegie Mellon's Capture the Flag Team Excels in Hackjam Competition

CUPS Password Research Studies

Sunday, July 13, 2014

A Decade Into Its Vital Work, Another Savory SOUPS, A Report from the 10th Annual Symposium On Usable Privacy and Security

CMU CyLab's Dr. Lorrie Cranor, Founder of CUPS and SOUPS preps
for welcoming remarks at SOUPS 2014

The CyLab Usable Privacy and Security Laboratory (CUPS) 10th Annual Symposium on Usable Privacy and Security (SOUPS) was hosted by Facebook at its headquarters in Menlo Park, California (7/9/14 - 7/11/14). CUPS Director Lorrie Cranor welcomed the attendees, noting record-breaking numbers in both attendance and papers submitted. For three full days of proceedings, hundreds of researchers from business, academia and government communed amidst the proliferation of signage that has come to characterize the social media giant's corporate culture: e.g., "Ship Love," "Ruthless Prioritization," "Demand Success," and Nelson Mandela, arms outstretched, with the caption "Open the Doors." (Not so subliminal messaging.)
In perhaps the most poignant SOUPS keynote yet, Christopher Soghoian of the American Civil Liberties Union (ACLU) articulated the vital nature of research into usable privacy and security. Putting flesh and blood on these issues, Soghoian used examples from the shadow world of investigative reporters and whistle-blowers to highlight the need for privacy and security software that is not only robust but eminently usable. One great benefit of the revelations brought forth by Glenn Greenwald in the Edward Snowden affair, Soghoian opined, is that there has been increased crypto adoption by journalists.
But the heightened engagement has also brought long-standing problems into a harsh new light. For example, Soghoian told SOUPS attendees, many investigative journalists using PGP still do not realize subject lines are not encrypted. "The best our community has to offer sucks, the usability and the default values suck," Soghoian declared, "the software is not protecting journalists and human rights activists, and that's our fault as researchers."

As contributing market factors in why we still don't have usable encryption, Soghoian cited potential data loss ("telling your customer that they've just lost every photo of their children is a non-starter"), current business models and, of course, government pressure.

Facebook HQ Signage, 1 Hacker Way, Menlo Park
In other parts of his very substantive keynote, Soghoian touched on consumer issues related to the efficacy of privacy and security. He elucidated the differences in privacy and security between the iPhone and the Android: "The privacy and security differences ... are not advertised." He also shed light on a new aspect of the growing gap between rich and poor, "security by default for the rich," and "insecurity by default for the poor." "Those who are more affluent get the privacy benefits without shopping around," he explained, because the discounted, and mass-marketed versions of software often do not have the same full-featured privacy and security as the more expensive business or professional versions.

[NOTE: Full-length video of Soghoian's keynote is available via the CyLab YouTube Channel.]

Several awards were also announced during the opening sessions, including:

The 2014 IAPP SOUPS Privacy Award for the paper with the most practical application in the field of privacy went to Would a Privacy Fundamentalist Sell Their DNA for $1000...If Nothing Bad Happened as a Result? The Westin Categories, Behavioral Intentions, and Consequences authored by Allison Woodruff, Vasyl Pihur, Sunny Consolvo, and Lauren Schmidt of Google; and Laura Brandimarte and Alessandro Acquisti of Carnegie Mellon University.

The 2014 SOUPS Impact Award for a SOUPS paper "published between 2005 and 2009 that has had a significant impact on usable privacy and security research and practice" went to Usability of CAPTCHAs or Usability Issues in CAPTCHA Design authored in 2008 by Jeff Yan and Ahmad Salah El Ahmad of Newcastle University (UK).

Two Distinguished Papers awards were presented:

Understanding and Specifying Social Access Control Lists, authored by Mainack Mondal, Bimal Viswanath and Krishna P. Gummadi of the Max Planck Institute for Software Systems (MPI-SWS), and Yabing Liu and Alan Mislove of Northeastern University.

Crowdsourcing Attacks on Biometric Systems, authored by Saurabh Panjwani, an independent consultant, and Achintya Prakash of the University of Michigan.

Carnegie Mellon University (CMU), home to the CyLab Usable Privacy and Security (CUPS) Lab and the MSIT-Privacy Engineering master's program, was well represented in the proceedings.

In addition to the IAPP SOUPS Privacy Award winning "Would a Privacy Fundamentalist Sell Their DNA for $1000...If Nothing Bad Happened as a Result? The Westin Categories, Behavioral Intentions, and Consequences," co-authored with Google researchers, several other CMU papers were presented:

Parents’ and Teens’ Perspectives on Privacy In a Technology-Filled World, authored by Lorrie Faith Cranor, Adam L. Durity, Abigail Marsh, and Blase Ur, Carnegie Mellon University

Privacy Attitudes of Mechanical Turk Workers and the U.S. Public, authored by Ruogu Kang, Carnegie Mellon University, Stephanie Brown, Carnegie Mellon University and American University, Laura Dabbish and Sara Kiesler, Carnegie Mellon University

CMU researcher Ruogu Kang presenting
Privacy Attitudes of Mechanical Turk Workers and the U.S. Public
Harder to Ignore?, authored by Cristian Bravo-Lillo, Lorrie Cranor, Saranga Komanduri, and Manya Sleeper, Carnegie Mellon University, and Stuart Schechter, Microsoft Research

The Effect of Social Influence on Security Sensitivity, authored by Sauvik Das, Tiffany Hyun-Jin Kim, Laura A. Dabbish, and Jason I. Hong, Carnegie Mellon University

Modeling Users’ Mobile App Privacy Preferences: Restoring Usability in a Sea of Permission Settings, authored by Jialiu Lin, Bin Liu, Norman Sadeh, and Jason I. Hong, Carnegie Mellon University

The full proceedings of SOUPS 2014 are available via USENIX.

-- Richard Power

Check out CyLab CyBlog's Archive of SOUPS Coverage

A Distinguished Paper Award for CUPS, and Other News from the Ninth Annual SOUPS 2013

CyLab's SOUPS 2012 Continues Its Ongoing, Deepening Dialogue on What Works and What Doesn't

SOUPS 2011 Advances Vital Exploration of Usability and Its Role in Strengthening Privacy and Security  

SOUPS 2010: Insight into Usable Privacy & Security Deepens at 6th Annual Symposium

Reflections on SOUPS 2009: Between Worlds, Cultivating Superior Cleverness, Awaiting a Shift in Consciousness

Glimpses into the Fourth Annual Symposium on Usable Security and Privacy (SOUPS 2008)

Mike Farb of CyLab's SafeSlinger project presents during the 2014 EFF Crypto Usability Prize (EFF CUP)
Workshop on Day One of SOUPS 2014

Facebook HQ Signage, 1 Hacker Way, Menlo Park

Wednesday, June 25, 2014

CyLab Researchers In 5-Year NSF-Funded Project to Overcome Challenges in Synchronizing Time for Cyber-Physical Systems

A Sundial in Saint-Rémy-de-Provence, France (Source: Wikimedia)

NOTE: This CyLab Chronicles is cross-posted on both the Carnegie Mellon CyLab public site and the CyLab partners-only portal.

The National Science Foundation (NSF) recently announced "a five-year, $4 million award to tackle the challenge of synchronizing time in cyber-physical systems (CPS)--systems that integrate sensing, computation, control and networking into physical objects and infrastructure." According to the NSF, the award "will support a project called Roseline, which seeks to develop new clocking technologies, synchronization protocols, operating system methods, as well as control and sensing algorithms ... Examples of cyber-physical systems include autonomous cars, aircraft autopilot systems, tele-robotics devices and energy-efficient buildings, among many others." (NSF, 6-13-14)

Two CMU CyLab researchers, Raj Rajkumar and Anthony Rowe, are members of the Roseline project team. Mani Srivastava of UCLA and Rajesh Gupta of UC San Diego will serve as Principal Investigators. The team also includes Sudhakar Pamarti of UCLA, João Hespanha of UC Santa Barbara and Thomas Schmid of the University of Utah.

CyLab News caught up with both CyLab researchers to get their perspectives on the scope and significance of Project Roseline.

"We all know that time is a fundamental attribute," Rajkumar said. "Computers maintain time by using local clocks, and synchronize these values among themselves or with reliable clock sources on the internet. However, the accuracies of these synchronized clocks are very highly system- and network-dependent. This in turn causes a wide range of applications, from smart grid systems to robotic systems like autonomous vehicles, to be customized and tweaked. In other words, there do not yet exist core foundations to specify, implement and utilize a notion of time whose accuracy can be specified, controlled and achieved - we refer to this as the Quality of Time. This project will develop the foundations for managing the Quality of Time in computer systems."

"There is a notion of time that transcends all layers of modern computer systems," Rowe adds. "At the lowest-level you have clocks driving hardware. Above that you have operating systems and networking that use time to manage how and more importantly when resources should be consumed. At the very top you have applications ranging from GPS to banking transactions that rely on timing. Our goal is to develop mechanisms and interfaces for improving what we call Quality of Time (QoT) aware applications. Specifically at CyLab we will be working on operating system abstractions and approaches to network coordination that improve energy-efficiency and reliability of networked embedded systems. Our target applications range from secure smart-grid monitoring to robotic systems like autonomous vehicles."
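As a concrete illustration of the kind of network time synchronization whose accuracy is at stake, here is the classic NTP-style offset and round-trip-delay estimate computed from four timestamps. This sketch is purely illustrative and not drawn from the Roseline project; the numbers below are invented:

```python
# Classic NTP-style clock comparison: a client exchanges timestamps with
# a server and estimates how far its clock is from the server's.
# This is an illustrative sketch, not Roseline project code.

def ntp_offset_delay(t1, t2, t3, t4):
    """t1: client send, t2: server receive, t3: server send, t4: client receive.
    Returns (estimated server-vs-client clock offset, round-trip network delay),
    assuming roughly symmetric one-way delays."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Hypothetical exchange: server clock runs 100 ms ahead,
# one-way network delay is 10 ms in each direction.
offset, delay = ntp_offset_delay(t1=0.000, t2=0.110, t3=0.112, t4=0.022)
print(offset, delay)  # offset ≈ 0.1 s, round-trip delay ≈ 0.02 s
```

The symmetric-delay assumption is exactly the kind of hidden accuracy limit that a Quality of Time abstraction would expose to applications rather than paper over.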

Full Text of NSF Press Release

Monday, June 9, 2014

CyLab Chronicles: Anupam Datta on Impact and Implications of IEEE Award Winning Paper, "Bootstrapping Privacy Compliance in Big Data Systems"

NOTE: This CyLab Chronicles is cross-posted on both the Carnegie Mellon CyLab public site and the CyLab partners-only portal.

Anupam Datta is an Associate Professor at Carnegie Mellon University (CMU) and a leading researcher at CMU CyLab. CyLab Chronicles recently sat down with him to discuss his team's latest successes and the future direction of their vital research. Here is our conversation.

In the 21st Century, the privacy space has become a very challenging one - for government, for business and for each of us as individuals. This very challenging space is fraught with complex issues. Which particular issue does the work in "Bootstrapping Privacy Compliance in Big Data Systems" seek to address?

Anupam Datta: To allay privacy concerns, Web services companies, such as Facebook, Google and Microsoft, all make promises about how they will *use* personal information they gather. But ensuring that *millions of lines of code* in their systems *respect* these *privacy promises* is a challenging problem. Recent work with my PhD student in collaboration with Microsoft Research addresses this problem.  We present a workflow and tool chain to automate privacy policy compliance checking in big data systems. The tool chain has been applied to check compliance of over a million lines of code in the data analytics pipeline for Bing, Microsoft’s Web search engine. This is the first time automated privacy compliance analysis has been applied to the production code of an Internet-scale system. The paper, written jointly with my PhD student Shayak Sen and a team from Microsoft Research, was presented at the 2014 IEEE Symposium on Security and Privacy and recognized with a Best Student Paper Award.

Tell us something about the system you and your team designed. And briefly describe its components, Legalease and Grok.

Datta: Central to the design of the workflow and tool chain are (a) *Legalease* — a language that allows specification of privacy policies that impose restrictions on how user data flows through software systems; and (b) *Grok* — a data inventory that annotates big data software systems (written in the Map-Reduce programming model) with Legalease’s policy datatypes, thus enabling automated compliance checking. Let me elaborate.
Privacy policies are often crafted by legal teams, while the software that has to respect these policies is written by developers. An important challenge is thus to design privacy policy languages that are *usable* by legal privacy teams, yet have precise *operational meaning* (semantics) that software developers can use to restrict how their code operates on personal information of users. The design of Legalease was guided by these dual considerations. Legalease builds on prior work from my research group showing that privacy policies often involve *nested allow-deny information flow rules with exceptions* (see DeYoung et al. 2010, Garg et al. 2011 for the first complete logical specification and audit of the HIPAA Privacy Rule for healthcare privacy in the US). For example, a rule might say: "IP address will not be used for advertising, except it may be used for detecting abuse. In such cases it will not be combined with account information." Our hypothesis was that such rules match the *mental model* of legal privacy policy authors. The results of our *user study*, involving participants from the legal privacy team and privacy champions at Microsoft (who sit between the legal privacy team and software developers), provide evidence in support of our hypothesis: after a short tutorial, the participants were able to encode the entire Bing policy pertaining to how users' personal information will be used on the Bing servers (9 policy clauses) in about 15 minutes with high accuracy.
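The nested allow-deny rule in Datta's example can be made concrete with a small sketch. This is a toy illustration only, not Legalease itself; the datatype and purpose names are hypothetical:

```python
# Toy sketch (not the actual Legalease language) of a nested allow/deny
# information-flow rule with an exception, modeled on the example rule:
# "IP address will not be used for advertising, except it may be used for
# detecting abuse. In such cases it will not be combined with account
# information."  Datatype/purpose names here are invented for illustration.

def policy_allows(datatypes, purpose):
    """Return True if a data flow carrying `datatypes` for `purpose` complies."""
    # Inner exception: abuse detection may use the IP address,
    # but must not combine it with account information.
    if "IPAddress" in datatypes and purpose == "AbuseDetection":
        return "AccountInfo" not in datatypes
    # Outer deny rule: IP address may not be used for advertising.
    if "IPAddress" in datatypes and purpose == "Advertising":
        return False
    # Anything not restricted by the policy is allowed.
    return True

print(policy_allows({"IPAddress"}, "Advertising"))                   # False
print(policy_allows({"IPAddress"}, "AbuseDetection"))                # True
print(policy_allows({"IPAddress", "AccountInfo"}, "AbuseDetection")) # False
```

The nesting mirrors how the legal text reads: the innermost exception is checked first, so an abuse-detection flow is judged by its own stricter clause rather than by the outer advertising ban.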
Software systems that perform data analytics over personal information of users are often written without a technical connection to the privacy policies that they are meant to respect. Tens of millions of lines of such code are already in place in companies like Facebook, Google, and Microsoft. An important challenge is thus to *bootstrap* existing software for privacy compliance. Grok addresses this challenge. It annotates software written in programming languages that support the Map-Reduce programming model (e.g., Dremel, Hive, Scope) with Legalease’s policy datatypes. We focus on this class of languages because they are the languages of choice in industry for writing data analytics code. A simple way to conduct the bootstrapping process would be to ask developers to manually annotate all code with policy datatypes (e.g., labelling variables as IPAddress, programs as being for the purpose of Advertising etc.). However, this process is too labor-intensive to scale. Instead, we develop a set of techniques to automate the bootstrapping process. I should add that the development of Grok was led by Microsoft Research and was underway before our collaboration with them began.
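The bootstrapping idea, a few manual annotations propagated through a data-flow graph, can be sketched as follows. The graph, node names and labels here are invented for illustration; this is not Grok's actual algorithm, just a minimal label-propagation pass of the kind the approach describes:

```python
# Toy sketch (hypothetical names, not Grok itself): propagate policy
# datatype labels from a few manually annotated nodes through a data-flow
# graph, so a small number of annotations covers many downstream nodes.
from collections import defaultdict, deque

# node -> downstream nodes in the (invented) data-flow graph
edges = {
    "raw_logs": ["ip_column", "query_column"],
    "ip_column": ["geo_job", "abuse_job"],
    "geo_job": ["ads_job"],
}

# The only manual annotation: one column is labeled as an IP address.
seed_labels = {"ip_column": {"IPAddress"}}

labels = defaultdict(set)
for node, lbls in seed_labels.items():
    labels[node] |= lbls

# Breadth-first propagation: every node inherits the labels of its inputs.
queue = deque(seed_labels)
while queue:
    node = queue.popleft()
    for succ in edges.get(node, []):
        if not labels[node] <= labels[succ]:
            labels[succ] |= labels[node]
            queue.append(succ)

print(sorted(labels["ads_job"]))  # ['IPAddress'] — inherited transitively
```

A compliance checker could then flag `ads_job` (an advertising-purpose node carrying an IPAddress label) against a policy like the one above, without any developer having annotated that job directly.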

What do you see as the practical outcome of this research? Is it going to be applicable in the operational environment? What is it leading toward?

Datta: One practical outcome of this research is that it enables companies to check that their big data software systems are compliant with their privacy promises. Indeed the prototype system is already running on the production system for Bing, Microsoft's search engine. So, yes, it is applicable to current operational environments.
I view this result as a significant step forward in ensuring privacy compliance at scale. With the advent of technology that can help companies keep their privacy promises, users can reasonably expect stronger privacy promises from companies that operate over their personal information. Of course, much work remains to be done in expanding this kind of compliance technology to cover a broader set of privacy policies and in its adoption in a wide range of organizations.

What other privacy challenges are you looking to address in your ongoing research? What's next for your team?

Datta: We are deeply examining the Web advertising ecosystem, in particular, its *transparency* and its implications for *privacy* and *digital discrimination*. There are important computational questions lurking behind this ecosystem: Are web-advertising systems transparently explaining what information they collect and how they use that information to serve targeted ads? Are these systems compliant with promises they make that they won't use certain types of information (e.g., race, health information, sexual orientation) for advertising? Can we answer these questions from the "outside", i.e., without gaining access to the source code and data of web advertising systems (in contrast with our privacy compliance work with Microsoft Research)? How can we enhance transparency of the current Web advertising ecosystem? What does digital discrimination mean in computational terms and how can we design systems that avoid it?

As a researcher who has looked long and deeply into the privacy space, what do you think is most lacking in the approaches of government and business? What is most needed? What are people missing?

Datta: First, I believe that there is a pressing need for better *computational tools* that organizations can (and should) use to ensure that their software systems and people are compliant with their privacy promises. These tools have to be based on well-founded computing principles and at the same time have to be usable by the target audience, which may include lawyers and other groups who are not computer scientists.
Second, current policies about collection, use, and disclosure of personal information often make rather *weak promises*. This is, in part, a corollary of my first point: organizations do not want to make promises that they don't have the tools to help them be compliant with. But it is also driven by economic considerations, e.g., extensive use of users' personal information for advertising can result in higher click-through rates. While many companies make promises that they will not use certain types of information (e.g., race, health information, sexual orientation) for advertising, it is far from clear what these promises even mean and how a company can demonstrate they are compliant with these promises.
Third, we need better *transparency mechanisms* from organizations that explain what information they collect and how they use that information. Self-declared privacy policies and mechanisms like ad settings managers are steps in the right direction. However, the transparency mechanisms should themselves be auditable to ensure that they indeed transparently reflect the information handling practices of the organization.
These are all fascinating questions -- intellectually deep from a computer science standpoint and hugely relevant to the lives of hundreds of millions of people around the world. They keep me and my research group up at night!

Related Posts

IEEE Security and Privacy Symposium 2014: Another Challenging Year, Another Compelling IEEE SSP, and Another Significant Contribution from CMU CyLab

(CMU-CyLab-14-005) Temporal Mode-Checking for Runtime Monitoring of Privacy Policies

(CMU-CyLab-13-005) Purpose Restrictions on Information Use

Anupam Datta - Privacy through Accountability (2014 CyLab Seminar Series, YouTube video)

Anupam Datta - Privacy through Accountability (10th Annual CyLab Partners Conference, YouTube video)

CyLab Chronicles: Q&A with Anupam Datta (2009)

CyLab Chronicles: Q&A with Anupam Datta (2008)

Thursday, May 22, 2014

IEEE Security and Privacy Symposium 2014: Another Challenging Year, Another Compelling IEEE SSP, and Another Significant Contribution from CMU CyLab

Giovanni Domenico Tiepolo - Procession of the Trojan Horse in Troy (1773)
Another challenging year in cyber security and privacy means another compelling IEEE Security and Privacy Symposium, and another compelling IEEE Security and Privacy Symposium means another significant contribution from Carnegie Mellon University CyLab.

This year, three hundred thirty-three papers were submitted. After a rigorous review process (which included ninety-nine "intensive discussions," one thousand two hundred eighteen reviews and a rebuttal phase), forty-four papers were selected for publication as part of the Symposium.

Of these forty-four worthy contributions, four were singled out for IEEE Security and Privacy Symposium 2014 Best Paper Awards:

Best Paper
Secure Multiparty Computations on BitCoin by Marcin Andrychowicz, Stefan Dziembowski, Daniel Malinowski, and Łukasz Mazurek (University of Warsaw)

Best Practical Paper
Using Frankencerts for Automated Adversarial Testing of Certificate Validation in SSL/TLS Implementations by Chad Brubaker and Suman Jana (University of Texas at Austin), Baishakhi Ray (University Of California Davis), and Sarfraz Khurshid and Vitaly Shmatikov (University of Texas at Austin)

Best Student Papers

Framing Signals — A Return to Portable Shellcode by Erik Bosman and Herbert Bos (Vrije Universiteit Amsterdam)

Bootstrapping Privacy Compliance in Big Data Systems by Shayak Sen (Carnegie Mellon University), Saikat Guha (Microsoft Research, India), Anupam Datta (Carnegie Mellon University), Sriram Rajamani (Microsoft Research, India), Janice Tsai (Microsoft Research, Redmond), and Jeannette Wing (Microsoft Research)

CMU CyLab researcher Shayak Sen presented the award winning paper co-authored by members of the CyLab and Microsoft Research teams:

In this paper, we demonstrate a collection of techniques to transition to automated privacy compliance checking in big data systems. To this end we designed the LEGALEASE language, instantiated for stating privacy policies as a form of restrictions on information flows, and the GROK data inventory that maps low-level data types in code to high-level policy concepts. We show that LEGALEASE is usable by non-technical privacy champions through a user study. We show that LEGALEASE is expressive enough to capture real-world privacy policies with purpose, role, and storage restrictions with some limited temporal properties, in particular that of Bing and Google. To build the GROK data flow graph we leveraged past work in program analysis and data flow analysis. We demonstrate how to bootstrap labeling the graph with LEGALEASE policy datatypes at massive scale. We note that the structure of the graph allows a small number of annotations to cover a large fraction of the graph. We report on our experiences and learnings from operating the system for over a year in Bing. -- Shayak Sen (Carnegie Mellon University), Saikat Guha (Microsoft Research, India), Anupam Datta (Carnegie Mellon University), Sriram Rajamani (Microsoft Research, India), Janice Tsai (Microsoft Research, Redmond), and Jeannette Wing (Microsoft Research), Bootstrapping Privacy Compliance in Big Data Systems, IEEE Security and Privacy Symposium 2014, Best Student Paper (1 of 2)

But, of course, the Bootstrapping Privacy Compliance paper was not the only CyLab contribution to the Symposium program. CMU CyLab researcher Zongwei Zhou spoke on Dancing with Giants: Wimpy Kernels for On-Demand Isolated I/O, a paper co-authored with Miao Yu and Virgil Gligor:

Trustworthy applications are unlikely to survive in the marketplace without the ability to use a variety of basic services securely, such as on-demand isolated I/O channels to peripheral devices. This paper presents a security architecture based on a wimpy kernel that provides these services without bloating the underlying trusted computing base. It also presents a concrete implementation of the wimpy kernel for a major I/O subsystem, namely the USB subsystem, and a variety of device drivers. Experimental measurements show that the desired minimality and efficiency goals for the trusted base are achieved. -- Zongwei Zhou, Miao Yu, Virgil Gligor, Dancing with Giants: Wimpy Kernels for On-Demand Isolated I/O, IEEE Security and Privacy Symposium 2014

Other CMU papers selected and presented at IEEE SSP 2014 included:

All Your Screens Are Belong to Us: Attacks Exploiting the HTML5 Screen Sharing API by Yuan Tian, Lin-Shung Huang, Patrick Tague and others of CMU Silicon Valley and Facebook

Analyzing Forged SSL Certificates in the Wild by Lin-Shung Huang, Alex Rice, Erling Ellingsen, and Collin Jackson

Stopping a Rapid Tornado with a Puff by Jose Lopes and Nuno Neves (CMU Portugal)

CyLab's contribution to IEEE SSP 2014 also included several papers from CMU CyLab alumni.

There were three papers co-authored by CMU CyLab alumnus XiaoFeng Wang of Indiana University (Bloomington): Hunting the Red Fox Online: Understanding and Detection of Mass Redirect-Script Injections; Upgrading Your Android, Elevating My Malware: Privilege Escalation Through Mobile OS Updating; and Perils of Fragmentation: Security Hazards in Android Device Driven Customizations.

Also, CMU CyLab alumnus Bryan Parno of Microsoft Research and CMU CyLab alumna Elaine Shi of the University of Maryland (College Park) were among the co-authors of Permacoin: Repurposing Bitcoin Work for Data Preservation, and Shi co-authored a second paper, Automating Efficient RAM-Model Secure Computation.

CyLab's efforts were also apparent at the organizational level of IEEE SSP 2014:

Adrian Perrig of ETH Zürich, formerly CyLab's Research Director, now a CyLab Distinguished Fellow, served as one of the Symposium's three program chairs.

Three CyLab researchers served as session chairs: Lujo Bauer for Systems Security, Virgil Gligor (CyLab Director) for Attacks 3, and Anupam Datta for Secure Computation and Storage.

Also, CMU CyLab alum Bryan Parno served as a Session Chair for Privacy and Anonymity.

And, looking ahead to next year, Lujo Bauer will be one of the Symposium program chairs. 2015 will likely be another challenging year in cyber security and privacy, which will mean another compelling IEEE Security and Privacy Symposium, with another significant contribution from Carnegie Mellon University CyLab.

Related Posts

CyLab's Strong Presence Continues at Annual IEEE Symposium on Security and Privacy (2013)

CyLab Chronicles: CyLab's Strong Presence at IEEE Security and Privacy 2012 Packs A Wallop

A Report on 2012 IEEE Symposium on Privacy and Security

Microcosm & Macrocosm: Reflections on 2010 IEEE Symposium on Security & Privacy; Q & A on Cloud, Cyberwar & Internet Freedom w/ Dr. Peter Neumann

CyLab Research has Powerful Impact on 2010 IEEE Security & Privacy Symposium

Tuesday, April 15, 2014

CyLab Research Sheds Light on Heartbleed and Its Implications

Heartbleed is a significant event along the cyber security timeline. Its consequences will be with us all for quite a while. If you haven't already come to grips with this issue, you should do so urgently.

For some guidance go to

To verify if a particular server is vulnerable, go to:

For a command-line tool, go to:
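Short of probing the TLS heartbeat itself, one quick triage step is to compare a host's reported OpenSSL version string against the range the public advisory listed as vulnerable (1.0.1 through 1.0.1f; 1.0.1g is fixed). Below is a minimal sketch of that check, not a substitute for an actual scan: distributions that backport the fix, or builds compiled without heartbeat support, can report a "vulnerable" version while being safe.

```python
def heartbleed_vulnerable(version: str) -> bool:
    """Rough triage for CVE-2014-0160: OpenSSL 1.0.1 through 1.0.1f
    shipped the vulnerable heartbeat code; 1.0.1g and later are patched.
    Accepts either a bare version ("1.0.1f") or the full output of
    `openssl version` ("OpenSSL 1.0.1f 6 Jan 2014").
    Backported patches and no-heartbeat builds are NOT detected here."""
    parts = version.strip().split()
    if not parts:
        return False
    ver = parts[1] if parts[0] == "OpenSSL" and len(parts) > 1 else parts[0]
    if not ver.startswith("1.0.1"):
        return False          # other branches never had the bug
    suffix = ver[len("1.0.1"):]
    # vulnerable letter suffixes: "" (plain 1.0.1) and "a" through "f"
    return suffix == "" or ("a" <= suffix <= "f")

print(heartbleed_vulnerable("OpenSSL 1.0.1f 6 Jan 2014"))  # True
print(heartbleed_vulnerable("OpenSSL 1.0.1g 7 Apr 2014"))  # False
```

A version check like this only tells you whether a host *might* be exposed; confirming exposure still requires one of the scanners linked above.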

Here at CyLab, the story has provided us with an opportunity to reflect on some of our recent research and its relevance to the problem at hand:

Perspectives: If you scan a server and find that it isn't vulnerable, you would still need to know whether it had been vulnerable in the past, and/or whether it has been updated. One way to answer that question is to determine whether the private key has been changed. Connect to the server with a Firefox browser that has the Perspectives extension installed, and then inspect the key history: click the Perspectives icon on the left-hand side of the URL bar and select "View Notary Results." (Of course, if the key has not been changed, you're still none the wiser.) For more on Perspectives, visit the Perspectives Project page.

TrustVisor: In TrustVisor, we proposed keeping the OpenSSL private key inside a PAL, which would have defended against this vulnerability. See our paper on TrustVisor: Efficient TCB Reduction and Attestation, authored by Jonathan M. McCune (now with Google), Yanlin Li, Ning Qu, Zongwei Zhou, Anupam Datta, Virgil Gligor and Adrian Perrig.

Flicker: In Flicker, we proposed storing the SSH password database inside a PAL, which would also prevent password theft. See Flicker: An Execution Infrastructure for TCB Minimization, authored by three CyLab researchers, Jonathan M. McCune, Bryan Parno (now with Microsoft), and Adrian Perrig (now with ETH Zurich), along with Michael K. Reiter of the University of North Carolina (Chapel Hill) and Hiroshi Isozaki of Toshiba Corporation.

Perspectives, TrustVisor and Flicker all evolved out of CyLab's work on Trustworthy Computing Platforms and Devices. And this continues to be one of CyLab's major research thrusts.

Amit Vasudevan, a CyLab Research Systems Scientist, and Miao Yu, a CyLab grad student, took a few moments to sit down with CyBlog and share some insights on where we are and what's next.

According to Vasudevan, "the isolated execution environment (IEE) technologies and prototypes we have been developing (XMHF/TrustVisor, KISS, MiniBox, etc.) lay a solid foundation to protect against Heartbleed-like attacks."

"But going from our prototypes to the real world is a different kind of challenge. The software ecosystem out there today does not really consider security as a first-class citizen. Consequently, tweaking these components to adapt to our IEE design is non-trivial ... In the long term, developers of security-oriented/sensitive software would benefit from a simple and solid security framework that would allow them to leverage strong security properties, while letting them also implement the desired functionality. And our work with XMHF, TrustVisor, and other hypapps is the right step in this direction."

"This bug is still underestimated," warns Yu.

He cites three reasons for his concern:
"Currently, we are putting a lot of care into HTTPS websites. But other protocols, e.g., FTPS servers (used for file transfer), can also be impacted by this bug. 

"Not only servers, but also clients, e.g., smart phones and other devices, may suffer from this bug. And for certain devices, the problem can be even worse. For example, mobile phones have long patch cycles. For the Heartbleed bug, the first patch came out in 20 minutes and web servers began the repair within the first day. But Android phones only get scanners, e.g., Bluebox Heartbleed Scanner or Heartbleed Detector, to help users find out if their phone is vulnerable ... From our experience with past vulnerabilities, it would take tens of weeks until half of the mobile devices get patched. During this period, the devices are at risk. Other devices, which may use OpenSSL for establishing administration channels, also may suffer from long patch cycles. At CyLab, Zongwei Zhou, Miao Yu, Yoshiharu Imamoto, Amit Vasudevan, Virgil Gligor and I have developed an isolated execution environment for the ARM mobile platform. It is quite similar to TrustVisor, but focuses on mobile system security, so that, e.g., you could run a banking client (or some other sensitive application) in an isolated execution environment, and your code and data would still be secure in spite of this or other vulnerabilities present in Android.

"All three recent SSL bugs, i.e., iOS's goto fail bug, the GnuTLS bug and the Heartbleed bug, are implementation-related rather than design-related. The lesson is that design security doesn't mean implementation security. We do need runtime protection as a last line of defense."

-- Richard Power

Wednesday, March 12, 2014

New on @CyLab @YouTube Channel: Anupam Datta - Privacy through Accountability (Full-Length Seminar)

The CyLab Seminar Series is held on Mondays during the school year at CyLab HQ on the main campus of Carnegie Mellon University (Pittsburgh, PA). These weekly talks feature the latest research from CyLab faculty and visiting colleagues from other centers of academic research into cyber security and privacy. Occasional Business Risk Forum events introduce the insights of cybersecurity and privacy experts from the operational side of business and government.

Access to the webcasts of this dynamic Seminar Series is an exclusive benefit of corporate partnership with CyLab. But from time to time, to encourage further research into cybersecurity and privacy, and to contribute to the ongoing dialogue on these vital issues, we release select videos for free public viewing via the CyLab YouTube Channel.

Here is the abstract and embedded video for one such talk released via @CyLab @YouTube:

Privacy has become a significant concern in modern society as personal information about individuals is increasingly collected, used, and shared, often using digital technologies, by a wide range of organizations. To mitigate privacy concerns, organizations are required to respect privacy laws in regulated sectors (e.g., HIPAA in healthcare, GLBA in financial sector) and to adhere to self-declared privacy policies in self-regulated sectors (e.g., privacy policies of companies such as Google and Facebook in Web services). We investigate the possibility of formalizing and enforcing such practical privacy policies using computational techniques. We formalize privacy policies that prescribe and proscribe flows of personal information as well as those that place restrictions on the purposes for which a governed entity may use personal information. Recognizing that traditional preventive access control and information flow control mechanisms are inadequate for enforcing such privacy policies, we develop principled audit and accountability mechanisms with provable properties that seek to encourage policy-compliant behavior by detecting policy violations, assigning blame and punishing violators. We apply these techniques to several US privacy laws and organizational privacy policies, in particular, producing the first complete logical specification and audit of all disclosure-related clauses of the HIPAA Privacy Rule.

CyLab Seminar: Anupam Datta - Privacy through Accountability (Full-Length Seminar)

More information is available on the work of CyLab researcher Anupam Datta and on the CyLab corporate partnership program.

New on @CyLab @YouTube Channel: Osman Yagan - Designing Secure and Reliable Wireless Sensor Networks (Full-length CyLab Seminar)

The CyLab Seminar Series is held on Mondays during the school year at CyLab HQ on the main campus of Carnegie Mellon University (Pittsburgh, PA). These weekly talks feature the latest research from CyLab faculty and visiting colleagues from other centers of academic research into cyber security and privacy. Occasional Business Risk Forum events introduce the insights of cybersecurity and privacy experts from the operational side of business and government.

Access to the webcasts of this dynamic Seminar Series is an exclusive benefit of corporate partnership with CyLab. But from time to time, to encourage further research into cybersecurity and privacy, and to contribute to the ongoing dialogue on these vital issues, we release select videos for free public viewing via the CyLab YouTube Channel.

Here is the abstract and embedded video for one such talk released via @CyLab @YouTube.

Wireless sensor networks (WSNs) are distributed collections of small sensor nodes that gather security-sensitive data and control security-critical operations in a wide range of industrial, home and business applications. The current developments in sensor technology and the ever-increasing applications of WSNs point to a future where the reliability of these networks will be at the core of society's well-being, and any disruption in their services will be more costly than ever. There is thus a fundamental question as to how one can design wireless sensor networks that are both secure and reliable.
In this talk, we will present our approach that addresses this problem by considering WSNs that employ a randomized key predistribution scheme and deriving conditions to ensure the k-connectivity of the resulting network. Random key predistribution schemes are widely accepted solutions for securing WSN communications and the k-connectivity property ensures that the network is reliable in the sense that its connectivity will be preserved despite the failure of any k − 1 sensors or links.
CyLab Seminar: Osman Yagan - Designing Secure and Reliable Wireless Sensor Networks
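To make the reliability notion in the abstract concrete, here is a small simulation sketch in the spirit of the talk: sensors draw random key rings from a common pool (the random key predistribution idea the talk builds on), two sensors can communicate securely if their rings share a key, and k-connectivity is checked by brute force. The parameters are illustrative only, and brute force is feasible only for tiny networks; the research derives analytical conditions instead.

```python
import itertools
import random

def build_key_graph(n, pool_size, ring_size, seed=0):
    """Random key predistribution: each of n sensors gets ring_size keys
    drawn from a pool of pool_size; an edge exists between two sensors
    iff their key rings intersect (they share at least one key)."""
    rng = random.Random(seed)
    rings = [set(rng.sample(range(pool_size), ring_size)) for _ in range(n)]
    edges = {i: set() for i in range(n)}
    for i, j in itertools.combinations(range(n), 2):
        if rings[i] & rings[j]:
            edges[i].add(j)
            edges[j].add(i)
    return edges

def connected(edges, removed=frozenset()):
    """Is the graph connected after deleting the given nodes? (BFS/DFS)"""
    nodes = [v for v in edges if v not in removed]
    if not nodes:
        return True
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        v = stack.pop()
        for w in edges[v]:
            if w not in removed and w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(nodes)

def is_k_connected(edges, k):
    """Brute-force k-connectivity (tiny n only): the network survives the
    failure of any k-1 sensors -- the reliability property in the talk."""
    if len(edges) <= k:
        return False
    return all(connected(edges, frozenset(rm))
               for rm in itertools.combinations(edges, k - 1))

# With ring_size > pool_size / 2 every pair of rings must intersect, so
# the graph is complete and trivially 2-connected:
g = build_key_graph(n=6, pool_size=10, ring_size=6)
print(is_k_connected(g, 2))  # True
```

The interesting regime, which the talk analyzes, is when rings are small relative to the pool: then edges appear only probabilistically, and one asks how large the rings must be for the network to be k-connected with high probability.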

More information is available on the work of CyLab researcher Osman Yagan and on the CyLab corporate partnership program.

Friday, March 7, 2014

Descending Into the Maelstrom - Notes on RSA Conference 2014

Dan Geer: "We are all intelligence officers now." (RSA Conference 2014)


By Richard Power 
I confess that I was tempted to simply go to TrustyCon instead. But from a historical perspective, I could not resist the bitter ironies that RSA Conference 2014 offered up. After all, I had been there, two decades earlier, when Jim Bidzos, then President and CEO of RSA, led industry and public advocacy resistance to the adoption of the National Security Agency's Clipper Chip. In the mid-1990s, the RSA Conference was one of the principal venues for the marshaling of that resistance. So how could I not attend this year's RSA Conference, with Snowden-sourced revelations of RSA's "secret $10 million" deal to build an NSA "backdoor" into its BSafe product (Reuters, 12/20/13)? How could I not bear witness to how all of this would play out?

So I attended RSA Conference 2014, as I have since the beginning, as a member of the press. I simply had to follow up on this latest twist in the plot of our collective lives and careers in security and intelligence. Walking toward the escalators to descend into the maelstrom, I glanced up. An overarching banner read: "Share. Learn. Secure. Capitalizing on Collective Intelligence." Seriously. Such a strange thematic choice for this particular RSA Conference. Were they going for irony, or doubling down on the brouhaha, or was someone simply asleep at the wheel? "Capitalizing on Collective Intelligence." Wow. Don't misunderstand me. I am a strong champion for the role of intelligence in government and business, despite the tendency of governments to "sex up" intelligence to serve ideological agendas, and the tendency of business executives to avoid any real intelligence that might force them to confront some inconvenient truth. Nevertheless, at this particular moment in time, with the allegations concerning NSA's paid-for "backdoor" into RSA product still causing a profound disturbance in the force, "Capitalizing on Collective Intelligence" seemed, well, a strange spin. Was it ill-chosen, or plucky, or both?

Lucy in the Sky with Diamonds

As I sat in the front row of the auditorium for the first day's keynote session, I was still mulling over the organizers' choice of conference themes, when over the loudspeakers a secondary (and completely unrelated) theme was introduced: "Security the final frontier ... These are the voyages of the RSA Conference ..." Suddenly, William Shatner (yes, the original Captain James T. Kirk) burst into the hall and took the stage to sing the Beatles' "Lucy in the Sky with Diamonds," with lyrics re-written for the occasion: "Follow her down to a patch in the system ... where the people eat malicious code pies ... and the botnets grow so incredibly high ..." Seriously.

William Shatner: "Security gets us high ..." (RSA Conference 2014)
And then came the C-suite kabuki.

In his address, RSA Executive Chairman Art Coviello declared "our personal information" the "true currency of the digital age." (Given the backdrop of the Snowden revelations regarding RSA and the NSA, this comment struck me in the same way as the Conference theme of "Capitalizing on Collective Intelligence.") And he went on to proclaim, "We are at a crossroads ... I call on all the nations of the world to adopt these four principles:  renounce the use of cyber weapons and the use of Internet to wage war... cooperate internationally on the apprehension of cyber criminals ... respect intellectual property rights ... ensure privacy of all individuals ..." 

Nawaf Bitar, Senior VP and GM of the Security Business Unit at Juniper Networks, opened his presentation with the image of a Tibetan protester's self-immolation, and went on to invoke the spirit of Nelson Mandela. In the course of his mocking remarks about what he termed "#FirstWorldOutrage," Bitar took a swipe at TrustyCon: "not showing up at a conference is not outrage," and belittled the role of social media: "Bitar argued that 'liking a cause on Facebook' or 'retweeting a link' are not appropriate displays of outrage" (see Rachel King, Juniper Networks exec: 'First-world outrage' will not help cyber security, ZDNet, 2/25/14).

Although I am tempted to take some time to dissect these speeches, and explore some possible contradictions inherent in their premises, I will not.

I will simply say this -- there was no Andrei Sakharov in the house.

Crypto Panel Reality Check

As I am a well-seasoned veteran of many RSA Conferences, I know that if you are looking for a reality check in the throes of that C-level kabuki, you just have to hold out for the Cryptographers' Panel. And as in previous years, Ron Rivest, Whit Diffie and Adi Shamir delivered.

In agreeing with Rivest that the most disturbing aspect of the Snowden revelations was that the NSA would tamper with NIST security guidance to the U.S. government, Diffie added:

"I grew up in an era where, despite my conflicts with them, I believed that they were just 100% interested in the security of American communications. I thought that began to crumble when we saw key escrow, and I now sort of feel that they gave up doing that above board and turned aside to do it in other places, and that puts on us a tremendous additional burden of trying to vet things and make sure they haven't been tampered with."

The panel participants didn't just dwell on the Snowden revelations.

For example, Shamir highlighted some of his latest research: "I published a few weeks ago a paper with my colleagues ... that showed [a new] attack on RSA, where we can just listen to the sound made by a laptop ... from eight meters away we can eavesdrop on the acoustic sound made by the laptop and after less than an hour we can recover the full RSA key ..."  

Here are a few more bread-and-circuses vignettes from RSA Conference 2014.


On the second day of RSA 2014, I decided on two worthy sessions to tweet out via @CyLab. The first of these was "Lessons from BSIMM" with Cigital CTO Gary McGraw.

As the BSIMM project evolves, it is yielding more and more insight into how organizations are approaching software security in the real world. 

In this talk, McGraw focused on how to scale some of the best practices in the area of software security, and in particular on three touch-points:
  • Remedial Code Review
  • Remedial Architecture Analysis
  • Remedial Penetration Testing
According to McGraw, 50 of the 67 BSIMM participants use automated tools for remedial code review, 56 of 67 review security features, and 62 of 67 use external pen testers.

As he shared his real-world data on these three activities, McGraw also elucidated pitfalls, shared some perspective on trends and offered practical suggestions.

For example, concerning remedial code review, he remarked, "security people are good at finding problems, but they suck at fixing them,"  and "developers learn to game the results."

Ruminating on the second touch-point, remedial architecture analysis, McGraw optimistically opined, "We are in the Age of Bugs, next will be the Age of Flaws."

In regard to his third touch-point, remedial pen testing, McGraw exhorted attendees to "periodically pen test everything you can," and added "fix what you find."

Gary McGraw, Cigital - Scaling A Software Security Initiative: Lessons from the BSIMM (RSA Conference 2014)
Gumshoes: Security Journalists Speak Out

The other session that I tweeted out on Day 2 was "Gumshoes: Security Journalists Speak Out" with Brian Krebs (Krebs on Security), Nicole Perlroth (New York Times) and Kevin Poulsen (Wired).

The role of the investigative journalist is one of the most vital in all of the realm of cyber risk; it is also one of the most challenging. The work can be frustrating and dangerous too. 

Although there are many more reporters on this beat than there were two decades ago, there are still only a handful that stand out as consummate professionals.

Three of them participated in this panel discussion.

Here are a few of the many insights they shared.

Krebs: "A lot of stuff has happened to me that I don't talk about publicly. People have offered me money to do something, or not do something ..."

Poulsen: "We are all accustomed to being approached by people who aren't who they say they are."

Krebs: "In national security reporting, you only get certain stuff by being spun a few times ..."

Perlroth: "I get a lot of information that is over-hyped, every single day. And I think that is perhaps the biggest challenge in my job ..."

Krebs: "My biggest challenge is what not to write about ..."

Perlroth: "It is really interesting that Snowden released tens of thousands of these documents, as I was sifting through them this summer, I thought about that a lot ... There was a lot of stuff I didn't need to see, a lot more than what we needed ..."

Perlroth: "After spending the summer looking through the Snowden documents, I looked at my co-worker ... and said, Well, I think there is a consensus here, being a spy is one of the most [expletive] jobs ..."

Poulsen: "I would like to see a better explanation of what [RSA] thought they were doing [re: NSA]."
Nicole Perlroth, N.Y. Times (RSA Conference 2014)
Brian Krebs, Krebs on Security (RSA Conference 2014)
Key Trends in Security: The Venture Capitalists' View 

On Day 3, I attended "Key Trends in Security: The Venture Capitalists' View."

Moderated by Joseph Menn (@josephmenn) of Reuters, this panel featured David Cowan of Bessemer Venture Partners, Ray Rothrock of Venrock and Asheem Chandna of Greylock.

One takeaway from this session was Cowan's savvy perspective on mobile device security:

"There is a lot of money going into mobile computing that is not going to be well-spent, by either the investors or the customers ... It is mostly around wrapping apps, encrypting data on the mobile devices, compartmentalizing the phone, trying to figure out what's work and what's personal. There are various problems with these approaches. Not every product suffers all of these problems, but every product suffers from one or more of these problems. One problem is that they all assume that there are work apps and personal apps, and we all know we use our phones, SMS, e-mail and cameras for both work and for personal, so it's delusional to think you can compartmentalize them ... The app wrappers are kludgy ... You have problems because there are some that say they are going to encrypt the data created inside the app; well, that means, great, I can upload some things to my phone, but if I try to get to it from my browser I am going to get a lot of garbage because my browser doesn't know how to decrypt what my wrap app encrypted. And then there is a fundamental security flaw, which is that you can wrap data inside a phone, but at some point if it's going to be useful, you are going to have to unwrap it ... then when you unwrap the data for the user, I am going to collect it right then and there ... enterprises are grasping for solutions ... but these are not really solutions, this is grasping for straws ..."

Yes, but isn't that what much of the cyber security industry is about: "grasping for straws"? 

We Are All Intelligence Officers Now 

It wasn't until Day 4 that I heard someone speak truth to the power that we, as a civilization, have unwittingly unleashed upon ourselves.

Dan Geer is a force of nature in the field of cyber security and intelligence. He has been engaged at the highest levels of our discourse for as long as I have been involved, and I go back to the early 1990s. As the darkness deepens, his vision sharpens and expands.

Here are some excerpts from Dan's speech, "We Are All Intelligence Officers Now," followed by a link to the full text:

We are all data collectors, data keepers, data analysts. Some citizens do it explicitly; some citizens have it done for them by robots. To be clear, we are not just a society of informants, we are becoming an intelligence community of a second sort  ...

This is not a Chicken Little talk; it is an attempt to preserve if not make a choice while choice is still relevant ... 

Richard Clarke's novel _Breakpoint_ centered around the observation that with fast enough advances in genetic engineering not only will the elite think that they are better than the rest, they will be. [RC] I suggest that with fast enough advances in surveillance and the inferences to be drawn from surveillance, that a different elite will not just think that it knows better, it will know better. Those advances come both from Moore's and from Zuboff's laws, but more importantly they rest upon the extraordinarily popular delusion that you can have freedom, security, and convenience when, at best, you can have two out of three ... 

If knowledge is power, then increasing the store of knowledge must increase the store of power; increasing the rate of knowledge acquisition must increase the rate of power growth. All power tends to corrupt, and absolute power corrupts absolutely,[LA] so sending vast amounts of knowledge upstream will corrupt absolutely, regardless of whether the data sources are reimbursed with some pittance of convenience ... Very nearly everyone at this conference is explicitly and voluntarily part of the surveillance fabric because it comes with the tools you use, with what Steve Jobs would call your digital life. With enough instrumentation carried by those who opt in, the person who opts out hasn't really opted out. If what those of you who opt in get for your role in the surveillance fabric is "security," then you had better be damnably sure that when you say "security" that you all have close agreement on precisely what you mean by that term ...

It is said that the price of anything is the foregone alternative. The price of dependence is risk. The price of total dependence is total risk. Standing in his shuttered factory, made redundant by coolie labor in China, Tom McGregor said that "American consumers want to buy things at a price that is cheaper than they would be willing to be paid to make them." A century and a half before Tom, English polymath John Ruskin said that "There is nothing in the world that some man cannot make a little worse and sell a little cheaper, and he who considers price only is that man's lawful prey." Invoking Zittrain yet again, the user of free services is not the customer, he's the product. Let me then say that if you are going to be a data collector, if you are bound and determined to instrument your life and those about you, if you are going to "sell" data to get data, then I ask that you not work so cheaply that you collectively drive to zero the habitat, the lebensraum, of those of us who opt out. If you remain cheap, then I daresay that opting out will soon require bravery and not just the quiet tolerance to do without digital bread and circuses. 

To close with Thomas Jefferson: 'I predict future happiness for Americans, if they can prevent the government from wasting the labors of the people under the pretense of taking care of them.'   

Dan Geer, We Are All Intelligence Officers Now, RSA Conference 2014 (Full Text) 

End Game? 

Back in the mid-1990s, reporting from those early RSA Conferences, at the eye of the raging Clipper Chip storm, I was sympathetic to the concerns of national intelligence and law enforcement communities. I was awake to the very real threat from Al Qaeda (yes, very much pre-9/11). I was also deeply concerned about the abomination of child pornography (and the hell realms that provide its content). I understood the frustrations and genuine needs of government agents, friends and colleagues dedicated to fighting such evils. I did not want to deny them tools.

But, of course, I was also laboring under some false assumptions about where we were as a "civil society." It was inconceivable to me that the Bill of Rights would prove to be somehow less than inviolate or that any White House official would ever characterize the Geneva Accords as "obsolete" and "quaint." Likewise, I assumed that the Powell Doctrine would never be ignored, and that the findings of the Church Committee would never be forgotten.  

All these assumptions proved to be wrong. 

It would be easy, too easy, to argue that these false assumptions were swept away in the aftermath of the slaughter of the innocents on 9/11, and that our desperate efforts to respond to that atrocity put us in conflict with some of our own most cherished societal values. But I realize, now, that it would be misleading to attribute our current circumstances solely to the atrocity of 9/11. Much of what has happened to us would have happened anyway, one way or another; much of what has been lost in terms of the Bill of Rights would have been lost anyway, just not as rapidly. Perhaps with better governance in the months prior to 9/11 and in the years after, there would have been more time for reasoned debate. Perhaps better choices could have been made.

Perhaps. But only perhaps.

As Geer's speech elucidates, our current circumstances are to a great extent the consequence of Moore's Law (number of transistors on integrated circuits doubles approximately every two years) and Zuboff's Three Laws (everything that can be automated will be automated, everything that can be informated will be informated, every digital application that can be used for surveillance and control will be used for surveillance and control).

The process is seemingly inexorable. Is there time for the relevant choice that Geer alluded to in his remarks, and are there viable options from which to choose?

Either way, "we are all intelligence officers now."

-- Richard Power 

Jim Bidzos posthumously honoring F. Lynn McNulty with the RSA Lifetime Achievement Award. Lynn's wife Peggy accepted on behalf of the family. I had the great pleasure of knowing F. Lynn McNulty; I interviewed him on several occasions in the halcyon days, and sought his views on important issues. He was a man of integrity, and a true patriot. (RSA Conference 2014)

Related Posts

RSA 2012: Diffie, Rivest & Shamir Shine on Crypto Panel, Ranum, Too, re: Cyber War 

RSA 2011: One Flew Over the Kaku's Nest & Other Ruminations

RSA 2010: Lost in the Cloud, & Shrouded in the Fog of War, How Far Into the Cyber Future Can You Peer? Can You See Even Beyond Your Next Step? 

RSA 2010: Lifestyle Hacking - Notes on "Social Networks & Gen Y Meet Security & Privacy"

RSA 2010: Hacking the Smart Grid - Myths, Nightmares & Professionalism

RSA 2010: Merging Mind & Machine - Hacking the Neural Net

RSA Conference 2009: Summary of Posts 
