Thursday, November 20, 2014

CyLab Chronicles: Researchers Share Compelling Work with Silicon Valley Thought Leaders, Technical Experts, Law Enforcement Officials and Media at Fall 2014 Executive Briefing


The Fall 2014 CyLab Executive Briefing, held at Carnegie Mellon University's Silicon Valley campus (NASA Research Park), brought together an invitation-only group of C-level executives, thought leaders, technical experts, venture capitalists, government investigators and one cyber journalist (with an exclusive) for an update from one of the world's premier cyber security and privacy research programs.

The event consisted of four compelling presentations from CyLab researchers (with Q and A) followed by open-ended dialogue over a delicious buffet luncheon.

The fifteen organizations represented ranged from high tech manufacturers and financial services institutions to social media giants and federal law enforcement agencies.

The presentations reflected the broad scope of CyLab research:

Osman Yagan on "Designing Secure and Reliable Wireless Sensor Networks"

Bruce De Bruhl for Patrick Tague on "That Moment When You Realize Your Network Has Become Self-Aware"

David Brumley on "Software Security"

Anupam Datta on "Privacy through Accountability"

CyLab's Executive Briefings are an outreach to leaders in business, government and the media tasked with cyber security and privacy responsibilities. For half a day, attendees get a glimpse into just a few of the benefits that come with CyLab partnership. These benefits range from online access to the CyLab Seminar Series and attendance at the annual CyLab Partners Conference to the opportunity to place their own researcher at a desk in CyLab for a month or a year, and to help design a CyLab research project and be integrally involved in its ongoing progress.

Here is a recording of one of the four presentations from this CyLab Executive Briefing, Anupam Datta on "Privacy Through Accountability."





Tuesday, October 28, 2014

CyLab Chronicles: 2014 CyLab Partners Conference Explores Latest Research Into Vital Cyber Security and Privacy Issues



Student Poster Session, 11th Annual CyLab Partners Conference (October 2014)
The eleventh annual CyLab Partners Conference was held at the main campus of Carnegie Mellon University in Pittsburgh, Pennsylvania, on October 7th and 8th, 2014.

For two days, representatives of a dozen partners were briefed on CyLab's latest research across a broad spectrum of vital issues in the fields of cyber security and privacy. With nineteen presentations, five shared meals and a student poster session, attendees were provided with ample opportunities to both engage in meaningful dialogue with CyLab researchers and network with each other.

Four new CyLab Partners joined us for this year's event: PNC Financial Services Group (PNC), TD Ameritrade, UPMC (University of Pittsburgh Medical Center) and the National Police Agency of Japan.

Partners Conference attendance is an exclusive benefit of formal partnership with CyLab, as is access to the archive of video recordings, presentations and student posters for this year as well as previous years.

CyLab Director Virgil Gligor welcomes attendees, 11th Annual CyLab Partners Conference (October 2014)
Each year, to promote the CyLab program and contribute to the public good, we post a few select conference session videos for free, public access via the CyLab YouTube Channel and CyLab on iTunesU. This year's selections include four full faculty presentations and a fourteen minute sampler with brief excerpts from several other sessions:
For more on the benefits of CyLab partnership, and other aspects of the Carnegie Mellon University CyLab program, visit CyLab Online.


Wednesday, August 27, 2014

CMU CyLab Researcher Wins USENIX Security 2014 Best Student Paper Award; Seven Other CMU Papers Delivered



As at other leading conferences in the vital fields of cyber security and privacy, Carnegie Mellon University (CMU) CyLab researchers distinguished themselves at USENIX Security 2014, the 23rd USENIX Security Symposium, held in San Diego, California, August 20-22, 2014.

Three hundred fifty papers were submitted to the USENIX program committee, and the ensuing process, which involved 1,340 reviews and 1,627 follow-up comments, resulted in sixty-seven papers being accepted for publication, including several from CMU CyLab researchers.

Most notably, CMU's Kyle Soska won one of two Best Student Paper Awards for Automatically Detecting Vulnerable Websites Before They Turn Malicious, co-authored with CyLab faculty member Nicolas Christin.

Additionally, CyLab faculty member David Brumley co-authored three of the published papers:

BYTEWEIGHT: Learning to Recognize Functions in Binary Code, with Tiffany Bao, Jonathan Burket, and Maverick Woo of Carnegie Mellon University and Rafael Turner, University of Chicago.

Blanket Execution: Dynamic Similarity Testing for Program Binaries and Components, with Manuel Egele, Maverick Woo and Peter Chapman.

Optimizing Seed Selection for Fuzzing, with Alexandre Rebert, Carnegie Mellon University and ForAllSecure; Sang Kil Cha and Thanassis Avgerinos of Carnegie Mellon University; Jonathan Foote and David Warren of Software Engineering Institute CERT; Gustavo Grieco of Centro Internacional Franco Argentino de Ciencias de la Información y de Sistemas (CIFASIS) and Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET).

Brumley also delivered a paper at one of the workshops that preceded the main body of the Symposium itself, PicoCTF: A Game-Based Computer Security Competition for High School Students, co-authored with Peter Chapman and Jonathan Burket, also of CMU.

CyLab Usable Security and Privacy (CUPS) Lab director Lorrie Cranor teamed up with Cormac Herley, Principal Researcher in the Machine Learning Department at Microsoft Research, and several colleagues, Saranga Komanduri and Richard Shay of CMU and Stuart Schechter of Microsoft Research, to co-author Telepathwords: Preventing Weak Passwords by Reading Users' Minds.

Two other CMU-authored papers were presented at USENIX Security 2014:

The Long "Taile" of Typosquatting Domain Names co-authored by Janos Szurdi, Carnegie Mellon University; Balazs Kocso and Gabor Cseh, Budapest University of Technology and Economics; Jonathan Spring, Carnegie Mellon University; Mark Felegyhazi, Budapest University of Technology and Economics; and Chris Kanich, University of Illinois at Chicago. 

Password Managers: Attacks and Defenses co-authored by David Silver, Suman Jana, and Dan Boneh, Stanford University; Eric Chen and Collin Jackson, Carnegie Mellon University


Thursday, August 21, 2014

CMU CyLab PPP and CUPS teams win “Capture the Flag” and “Crack Me If You Can” contests at DEFCON 22

Members of CMU CyLab's Plaid Parliament of Pwning (PPP)
Carnegie Mellon University demonstrated its cyber prowess at DEFCON 22 by winning the “Capture the Flag” and “Crack Me If You Can” contests ...

Carnegie Mellon’s computer hacking team, the Plaid Parliament of Pwning (PPP), took first place for the second consecutive year in the Capture the Flag (CTF) contest. Globally, hundreds of teams battle throughout the year for one of 20 slots at DEFCON’s CTF competition, which has been called the “World Series of hacking.”

“Our team competed against universities and also against large defense contractors. This win is a huge accomplishment for our team,” says team advisor David Brumley, an associate professor of Electrical and Computer Engineering and the technical director of Carnegie Mellon CyLab.

The PPP team has qualified for DEFCON in each of the last three years, winning first place in both 2013 and 2014. The team is part of CyLab's Undergraduate Computer Security Research group and consists of 35 members from the College of Engineering and the School of Computer Science.

At DEFCON 22, the team was limited to eight members: George Hotz, Ryan Goulden, Tyler Nighswander, Brian Pak, Alex Reece, Max Serrano, Andrew Wesie, Ricky Zhou ...

A second team, this one from CyLab Usable Privacy and Security (CUPS) Lab, and simply named “cmu,” won the Street Division category in the “Crack Me If You Can” contest. In this two-day event sponsored by KoreLogic Security, teams exposed or “cracked” encrypted passwords.

"The students leveraged what they had learned from our research studies to develop their winning strategy," CUPS Director Lorrie Cranor says. "It is remarkable for a first-time team to win this competition." Cranor and fellow CyLab faculty members Lujo Bauer and Nicolas Christin, along with their team of students, are responsible for a growing body of work on passwords.


"Black Badge" bestowed upon CTF winners guarantees lifetime free entry to DEFCON
See Also


CyLab's David Brumley and His Student Hacker Team Featured on PBS NEWSHOUR and CNBC

Carnegie Mellon's Capture the Flag Team Excels in Hackjam Competition

CUPS Password Research Studies



Sunday, July 13, 2014

A Decade Into Its Vital Work, Another Savory SOUPS, A Report from the 10th Annual Symposium On Usable Privacy and Security



CMU CyLab's Dr. Lorrie Cranor, Founder of CUPS and SOUPS preps
for welcoming remarks at SOUPS 2014


The CyLab Usable Privacy and Security Laboratory (CUPS) 10th Annual Symposium on Usable Privacy and Security (SOUPS) was hosted by Facebook at its headquarters in Menlo Park, California (7/9/14 - 7/11/14). CUPS Director Lorrie Cranor welcomed attendees, noting record-breaking numbers in both attendance and papers submitted. For three full days of proceedings, hundreds of researchers from business, academia and government communed amidst the proliferation of signage that has come to characterize the social media giant's corporate culture: e.g., "Ship Love," "Ruthless Prioritization," "Demand Success," and Nelson Mandela, arms outstretched, with the caption, "Open the Doors." (Not so subliminal messaging.)
 
Perhaps more poignantly than any previous SOUPS keynote, Christopher Soghoian of American Civil Liberties Union (ACLU) articulated the vital nature of research into usable privacy and security. Putting flesh and blood on these issues, Soghoian used examples from the shadow world of investigative reporters and whistle-blowers to highlight the need for privacy and security software that is not only robust but eminently usable. One great benefit of the revelations brought forth by Glenn Greenwald in the Edward Snowden affair, Soghoian opined, is that there has been increased crypto adoption by journalists.
But the heightened engagement has also brought long-standing problems into a harsh new light. For example, Soghoian told SOUPS attendees, many investigative journalists using PGP still do not realize that subject lines are not encrypted. "The best our community has to offer sucks, the usability and the default values suck," Soghoian declared, "the software is not protecting journalists and human rights activists, and that's our fault as researchers."

As contributing market factors for why we still don't have usable encryption, Soghoian cited potential data loss ("telling your customer that they've just lost every photo of their children is a non-starter"), current business models, and of course, government pressure.

Facebook HQ Signage, 1 Hacker Way, Menlo Park
In other parts of his very substantive keynote, Soghoian touched on consumer issues related to the efficacy of privacy and security. He elucidated the differences in privacy and security between the iPhone and Android: "The privacy and security differences ... are not advertised." He also shed light on a new aspect of the growing gap between rich and poor: "security by default for the rich," and "insecurity by default for the poor." "Those who are more affluent get the privacy benefits without shopping around," he explained, because the discounted, mass-marketed versions of software often do not have the same full-featured privacy and security as the more expensive business or professional versions.

[NOTE: Full-length video of Soghoian's keynote is available via the CyLab YouTube Channel.]

Several awards were also announced during the opening sessions, including:

The 2014 IAPP SOUPS Privacy Award for the paper with the most practical application in the field of privacy went to Would a Privacy Fundamentalist Sell Their DNA for $1000...If Nothing Bad Happened as a Result? The Westin Categories, Behavioral Intentions, and Consequences authored by Allison Woodruff, Vasyl Pihur, Sunny Consolvo, and Lauren Schmidt of Google; and Laura Brandimarte and Alessandro Acquisti of Carnegie Mellon University.

The 2014 SOUPS Impact Award for a SOUPS paper "published between 2005 and 2009 that has had a significant impact on usable privacy and security research and practice" went to Usability of CAPTCHAs or Usability Issues in CAPTCHA Design authored in 2008 by Jeff Yan and Ahmad Salah El Ahmad of Newcastle University (UK).

Two Distinguished Papers awards were presented:

Understanding and Specifying Social Access Control Lists, authored by Mainack Mondal, Bimal Viswanath, and Krishna P. Gummadi of the Max Planck Institute for Software Systems (MPI-SWS), and Yabing Liu and Alan Mislove of Northeastern University.

Crowdsourcing Attacks on Biometric Systems, authored by Saurabh Panjwani, an independent consultant, and Achintya Prakash of the University of Michigan.
Carnegie Mellon University (CMU), home to the CyLab Usable Privacy and Security (CUPS) Lab and the MSIT-Privacy Engineering master's program, was well represented in the proceedings.

In addition to the IAPP SOUPS Privacy Award winning "Would a Privacy Fundamentalist Sell Their DNA for $1000...If Nothing Bad Happened as a Result? The Westin Categories, Behavioral Intentions, and Consequences," co-authored with Google researchers, several other CMU papers were presented:

Parents’ and Teens’ Perspectives on Privacy In a Technology-Filled World, authored by Lorrie Faith Cranor, Adam L. Durity, Abigail Marsh, and Blase Ur, Carnegie Mellon University

Privacy Attitudes of Mechanical Turk Workers and the U.S. Public, authored by Ruogu Kang, Carnegie Mellon University, Stephanie Brown, Carnegie Mellon University and American University, Laura Dabbish and Sara Kiesler, Carnegie Mellon University

CMU researcher Ruogu Kang presenting
Privacy Attitudes of Mechanical Turk Workers and the U.S. Public
Harder to Ignore? authored by Cristian Bravo-Lillo, Lorrie Cranor, and Saranga Komanduri, Carnegie Mellon University, Stuart Schechter, Microsoft Research, Manya Sleeper, Carnegie Mellon University

The Effect of Social Influence on Security Sensitivity, authored by Sauvik Das, Tiffany Hyun-Jin Kim, Laura A. Dabbish, and Jason I. Hong, Carnegie Mellon University

Modeling Users’ Mobile App Privacy Preferences: Restoring Usability in a Sea of Permission Settings, authored by Jialiu Lin, Bin Liu, Norman Sadeh, and Jason I. Hong, Carnegie Mellon University

The full proceedings of SOUPS 2014 are available via USENIX.

-- Richard Power

Check out CyLab CyBlog's Archive of SOUPS Coverage

A Distinguished Paper Award for CUPS, and Other News from the Ninth Annual SOUPS 2013

CyLab's SOUPS 2012 Continues Its Ongoing, Deepening Dialogue on What Works and What Doesn't

SOUPS 2011 Advances Vital Exploration of Usability and Its Role in Strengthening Privacy and Security  

SOUPS 2010: Insight into Usable Privacy & Security Deepens at 6th Annual Symposium

Reflections on SOUPS 2009: Between Worlds, Cultivating Superior Cleverness, Awaiting a Shift in Consciousness

Glimpses into the Fourth Annual Symposium on Usable Security and Privacy (SOUPS 2008)

Mike Farb of CyLab's SafeSlinger project presents during the 2014 EFF Crypto Usability Prize (EFF CUP)
Workshop on Day One of SOUPS 2014


Wednesday, June 25, 2014

CyLab Researchers In 5-Year NSF-Funded Project to Overcome Challenges in Synchronizing Time for Cyber-Physical Systems

A Sundial in Saint-Rémy-de-Provence, France (Source: Wikimedia)

NOTE: This CyLab Chronicles is cross-posted on both the Carnegie Mellon CyLab public site and the CyLab partners-only portal.

The National Science Foundation (NSF) recently announced "a five-year, $4 million award to tackle the challenge of synchronizing time in cyber-physical systems (CPS)--systems that integrate sensing, computation, control and networking into physical objects and infrastructure." According to the NSF, the award "will support a project called Roseline, which seeks to develop new clocking technologies, synchronization protocols, operating system methods, as well as control and sensing algorithms ... Examples of cyber-physical systems include autonomous cars, aircraft autopilot systems, tele-robotics devices and energy-efficient buildings, among many others." (NSF, 6-13-14)

Two CMU CyLab researchers, Raj Rajkumar and Anthony Rowe, are members of the Roseline project team. Mani Srivastava of UCLA and Rajesh Gupta of UC San Diego will serve as Principal Investigators. The team also includes Sudhakar Pamarti of UCLA, João Hespanha of UC Santa Barbara and Thomas Schmid of the University of Utah.

CyLab News caught up with both CyLab researchers to get their perspectives on the scope and significance of Project Roseline.

"We all know that time is a fundamental attribute," Rajkumar said. "Computers maintain time by using local clocks, and synchronize these values among themselves or with reliable clock sources on the internet. However, the accuracies of these synchronized clocks are very highly system- and network-dependent. This in turn causes a wide range of applications from smart grid systems to robotic systems like autonomous vehicles to be customized and tweaked. In other words, there do not yet exist core foundations to specify, implement and utilize a notion of time, whose accuracy can be specified, controlled and achieved - we refer to this as the Quality of Time. This project will develop the foundations for managing the Quality of Time in computer systems."
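To make the synchronization problem Rajkumar describes concrete, here is a minimal sketch of the classic NTP-style four-timestamp exchange a computer can use to estimate its clock's offset from a reference server. This is an illustration of the general technique only, not part of the Roseline project's design; the function name and example values are our own.

```python
# Minimal sketch of NTP-style clock offset estimation (illustrative only,
# not the Roseline implementation). A client exchanges timestamps with a
# reference server and estimates its offset and the round-trip delay.

def estimate_offset(t1, t2, t3, t4):
    """Classic four-timestamp exchange.

    t1: client send time (client clock)
    t2: server receive time (server clock)
    t3: server send time (server clock)
    t4: client receive time (client clock)
    Returns (offset, round_trip_delay), assuming a symmetric path delay --
    the key assumption whose violation limits accuracy in real networks.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: the client clock runs 0.5 s behind the server, with a
# 0.1 s one-way network delay in each direction.
off, d = estimate_offset(t1=10.0, t2=10.6, t3=10.7, t4=10.3)
```

The "Quality of Time" question is precisely how tightly the error in such an estimate can be specified and controlled across the hardware, OS and network layers.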

"There is a notion of time that transcends all layers of modern computer systems," Rowe adds. "At the lowest-level you have clocks driving hardware. Above that you have operating systems and networking that use time to manage how and more importantly when resources should be consumed. At the very top you have applications ranging from GPS to banking transactions that rely on timing. Our goal is to develop mechanisms and interfaces for improving what we call Quality of Time (QoT) aware applications. Specifically at CyLab we will be working on operating system abstractions and approaches to network coordination that improve energy-efficiency and reliability of networked embedded systems. Our target applications range from secure smart-grid monitoring to robotic systems like autonomous vehicles."

Full Text of NSF Press Release

Monday, June 9, 2014

CyLab Chronicles: Anupam Datta on Impact and Implications of IEEE Award Winning Paper, "Bootstrapping Privacy Compliance in Big Data Systems"



NOTE: This CyLab Chronicles is cross-posted on both the Carnegie Mellon CyLab public site and the CyLab partners-only portal.

Anupam Datta is an Associate Professor at Carnegie Mellon University (CMU) and a leading researcher at CMU CyLab. CyLab Chronicles recently sat down with him to discuss his team's latest successes and the future direction of their vital research. Here is our conversation.

In the 21st Century, the privacy space has become a very challenging one - for government, for business and for each of us as individuals. This very challenging space is fraught with complex issues. Which particular issue does the work in "Bootstrapping Privacy Compliance in Big Data Systems" seek to address?

Anupam Datta: To allay privacy concerns, Web services companies, such as Facebook, Google and Microsoft, all make promises about how they will *use* personal information they gather. But ensuring that *millions of lines of code* in their systems *respect* these *privacy promises* is a challenging problem. Recent work with my PhD student in collaboration with Microsoft Research addresses this problem.  We present a workflow and tool chain to automate privacy policy compliance checking in big data systems. The tool chain has been applied to check compliance of over a million lines of code in the data analytics pipeline for Bing, Microsoft’s Web search engine. This is the first time automated privacy compliance analysis has been applied to the production code of an Internet-scale system. The paper, written jointly with my PhD student Shayak Sen and a team from Microsoft Research, was presented at the 2014 IEEE Symposium on Security and Privacy and recognized with a Best Student Paper Award.

Tell us something about the system you and your team designed? And briefly describe its components, Legalease and Grok?

Datta: Central to the design of the workflow and tool chain are (a) *Legalease* — a language that allows specification of privacy policies that impose restrictions on how user data flows through software systems; and (b) *Grok* — a data inventory that annotates big data software systems (written in the Map-Reduce programming model) with Legalease’s policy datatypes, thus enabling automated compliance checking. Let me elaborate.
Privacy policies are often crafted by legal teams, while the software that has to respect these policies is written by developers. An important challenge is thus to design privacy policy languages that are *usable* by legal privacy teams, yet have precise *operational meaning* (semantics) that software developers can use to restrict how their code operates on personal information of users. The design of Legalease was guided by these dual considerations. Legalease builds on prior work from my research group that shows that privacy policies often involve *nested allow-deny information flow rules with exceptions* (see DeYoung et al. 2010, Garg et al. 2011 for the first complete logical specification and audit of the HIPAA Privacy Rule for healthcare privacy in the US). For example, a rule might say: "IP address will not be used for advertising, except it may be used for detecting abuse. In such cases it will not be combined with account information." Our hypothesis was that such rules match the *mental model* of legal privacy policy authors. The results of our *user study*, involving participants from the legal privacy team and privacy champions at Microsoft (who sit between the legal privacy team and software developers), provide evidence in support of our hypothesis: after a short tutorial, the participants were able to encode the entire Bing policy pertaining to how users' personal information will be used on the Bing servers (9 policy clauses) in about 15 minutes with high accuracy.
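The nested allow-deny structure Datta describes can be made concrete with a toy evaluator. This is our own illustrative sketch, not the actual Legalease language or its semantics; the function, datatype names and purpose strings are hypothetical.

```python
# Toy illustration (not actual Legalease) of a nested deny rule with an
# exception that itself carries a nested restriction, mirroring:
#   "IP address will not be used for advertising, except it may be used
#    for detecting abuse. In such cases it will not be combined with
#    account information."

def allowed(datatypes, purpose):
    """Return True if a data flow carrying the given set of policy
    datatypes, for the given purpose, is permitted under the toy rule."""
    if "IPAddress" in datatypes:
        if purpose == "Advertising":
            return False                      # outer DENY
        if purpose == "AbuseDetection":       # EXCEPT clause
            # nested DENY inside the exception
            return "AccountInfo" not in datatypes
    return True

assert allowed({"IPAddress"}, "AbuseDetection") is True
assert allowed({"IPAddress"}, "Advertising") is False
assert allowed({"IPAddress", "AccountInfo"}, "AbuseDetection") is False
```

The point of the nesting is that an exception is not a blanket allow: it can open a narrow door and immediately restrict what may pass through it, which is what makes the structure a good match for how legal authors actually write clauses.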
Software systems that perform data analytics over personal information of users are often written without a technical connection to the privacy policies that they are meant to respect. Tens of millions of lines of such code are already in place in companies like Facebook, Google, and Microsoft. An important challenge is thus to *bootstrap* existing software for privacy compliance. Grok addresses this challenge. It annotates software written in programming languages that support the Map-Reduce programming model (e.g., Dremel, Hive, Scope) with Legalease’s policy datatypes. We focus on this class of languages because they are the languages of choice in industry for writing data analytics code. A simple way to conduct the bootstrapping process would be to ask developers to manually annotate all code with policy datatypes (e.g., labelling variables as IPAddress, programs as being for the purpose of Advertising etc.). However, this process is too labor-intensive to scale. Instead, we develop a set of techniques to automate the bootstrapping process. I should add that the development of Grok was led by Microsoft Research and was underway before our collaboration with them began.
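One core idea behind automating the bootstrapping Datta describes is that once a few data sources are labeled with policy datatypes, labels can be propagated along the data-flow graph so downstream columns and jobs inherit them. The sketch below is our own rough illustration of that fixed-point propagation; Grok itself combines many more signals and operates at a far larger scale.

```python
# Rough sketch of label propagation over a data-flow graph (illustrative
# only; not the Grok implementation). Seed nodes carry policy datatypes
# such as IPAddress, and labels flow along edges to a fixed point.

def propagate_labels(edges, seed_labels):
    """edges: list of (src, dst) data-flow pairs.
    seed_labels: dict mapping node name -> set of policy datatypes.
    Returns a dict of labels after propagating to a fixed point."""
    labels = {node: set(tags) for node, tags in seed_labels.items()}
    changed = True
    while changed:
        changed = False
        for src, dst in edges:
            src_tags = labels.get(src, set())
            dst_tags = labels.setdefault(dst, set())
            if not src_tags <= dst_tags:   # anything new to propagate?
                dst_tags |= src_tags
                changed = True
    return labels

# Hypothetical pipeline: a raw log column feeds a job whose output
# feeds a report column.
flow = [("raw_log.ip", "job1.out"), ("job1.out", "report.col3")]
labels = propagate_labels(flow, {"raw_log.ip": {"IPAddress"}})
# "report.col3" now inherits the IPAddress datatype, so a checker can
# test it against rules like the advertising restriction above.
```

Automating this inheritance is what lets a compliance checker cover millions of lines of existing analytics code without asking developers to hand-annotate every variable.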

What do you see as the practical outcome of this research? Is it going to be applicable in the operational environment? What is it leading toward?

Datta: One practical outcome of this research is that it enables companies to check that their big data software systems are compliant with their privacy promises. Indeed the prototype system is already running on the production system for Bing, Microsoft's search engine. So, yes, it is applicable to current operational environments.
I view this result as a significant step forward in ensuring privacy compliance at scale. With the advent of technology that can help companies keep their privacy promises, users can reasonably expect stronger privacy promises from companies that operate over their personal information. Of course, much work remains to be done in expanding this kind of compliance technology to cover a broader set of privacy policies and in its adoption in a wide range of organizations.

What other privacy challenges are you looking to address in your ongoing research? What's next for your team?

Datta: We are deeply examining the Web advertising ecosystem, in particular, its *transparency* and its implications for *privacy* and *digital discrimination*. There are important computational questions lurking behind this ecosystem: Are web-advertising systems transparently explaining what information they collect and how they use that information to serve targeted ads? Are these systems compliant with promises they make that they won't use certain types of information (e.g., race, health information, sexual orientation) for advertising? Can we answer these questions from the "outside", i.e., without gaining access to the source code and data of web advertising systems (in contrast with our privacy compliance work with Microsoft Research)? How can we enhance transparency of the current Web advertising ecosystem? What does digital discrimination mean in computational terms and how can we design systems that avoid it?

As a researcher who has looked long and deeply into the privacy space, what do you think is most lacking in the approaches of government and business? What is most needed? What are people missing?

Datta: First, I believe that there is a pressing need for better *computational tools* that organizations can (and should) use to ensure that their software systems and people are compliant with their privacy promises. These tools have to be based on well-founded computing principles and at the same time have to be usable by the target audience, which may include lawyers and other groups who are not computer scientists.
Second, current policies about collection, use, and disclosure of personal information often make rather *weak promises*. This is, in part, a corollary of my first point: organizations do not want to make promises that they don't have the tools to help them be compliant with. But it is also driven by economic considerations, e.g., extensive use of users' personal information for advertising can result in higher click-through rates. While many companies make promises that they will not use certain types of information (e.g., race, health information, sexual orientation) for advertising, it is far from clear what these promises even mean and how a company can demonstrate they are compliant with these promises.
Third, we need better *transparency mechanisms* from organizations that explain what information they collect and how they use that information. Self-declared privacy policies and mechanisms like ad settings managers are steps in the right direction. However, the transparency mechanisms should themselves be auditable to ensure that they indeed transparently reflect the information handling practices of the organization.
These are all fascinating questions -- intellectually deep from a computer science standpoint and hugely relevant to the lives of hundreds of millions of people around the world. They keep me and my research group up at night!

Related Posts

IEEE Security and Privacy Symposium 2014: Another Challenging Year, Another Compelling IEEE SPS, and Another Significant Contribution from CMU CyLab 

(CMU-CyLab-14-005) Temporal Mode-Checking for Runtime Monitoring of Privacy Policies

(CMU-CyLab-13-005) Purpose Restrictions on Information Use

Anupam Datta - Privacy through Accountability (2014 CyLab Seminar Series, YouTube Video)

Anupam Datta - Privacy through Accountability (10th Annual CyLab Partners Conference, YouTube Video)

CyLab Chronicles: Q&A with Anupam Datta (2009)

CyLab Chronicles: Q&A with Anupam Datta (2008)