Tuesday, April 15, 2014

CyLab Research Sheds Light on Heartbleed and Its Implications


Heartbleed is a significant event along the cyber security timeline. Its consequences will be with us all for quite a while. If you haven't already come to grips with this issue, you should do so urgently.

For some guidance go to http://heartbleed.com/

To verify if a particular server is vulnerable, go to: http://filippo.io/Heartbleed/

For a command-line tool, go to: https://github.com/FiloSottile/Heartbleed

Here at CyLab, the story has provided us with an opportunity to reflect on some of our recent research and its relevance to the problem at hand. For example:

Perspectives: If you scan a server and find that it isn't vulnerable, you would still need to know whether it had been vulnerable in the past, and whether it has since been updated. One way to answer that question is to determine whether the private key has been changed. Connect to the server with a Firefox browser that has the Perspectives extension installed, then inspect the key history: click the Perspectives icon on the left-hand side of the URL bar and select "View Notary Results." (Of course, if the key has not been changed, you're still none the wiser.) For more on Perspectives, visit the Perspectives Project page.
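
The same idea can be approximated outside the browser. Below is a minimal Python sketch, not part of Perspectives itself, that records a server's certificate fingerprint so it can be compared on a later connection. Note that Perspectives notaries track the server's public key over time, so hashing the whole certificate is only a coarse proxy (a re-issued certificate for an unchanged key will also change the hash); the host name and the stored fingerprint are hypothetical placeholders.

import hashlib
import socket
import ssl

def cert_fingerprint(host, port=443):
    # Fetch the server's certificate (DER form) and hash it, roughly as a
    # notary observation would record what the server presented.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

if __name__ == "__main__":
    previously_seen = "0" * 64  # hypothetical fingerprint recorded before the patch
    current = cert_fingerprint("example.com")
    print("certificate changed since last observation:", current != previously_seen)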

TrustVisor: In TrustVisor, we proposed keeping the OpenSSL private key inside a PAL, which would have defended against this vulnerability. See our paper on TrustVisor: Efficient TCB Reduction and Attestation, authored by Jonathan M. McCune (now with Google), Yanlin Li, Ning Qu, Zongwei Zhou, Anupam Datta, Virgil Gligor and Adrian Perrig.
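
TrustVisor's isolation of a Piece of Application Logic (PAL) is enforced by a small hypervisor with hardware-rooted attestation; the sketch below is only a coarse process-isolation analogy in plain Python, meant to illustrate the design idea rather than the mechanism. The key material lives only in a separate, minimal key-holder process, and the front end can request operations over a narrow channel but can never read the key, so a buffer over-read in the front end (as in Heartbleed) exposes no key bytes. HMAC stands in for the RSA private-key operation purely for illustration, and the key string is hypothetical.

import hashlib
import hmac
from multiprocessing import Pipe, Process

def key_holder(conn):
    # The secret never leaves this process; only results cross the channel.
    secret = b"hypothetical-private-key-material"
    while True:
        msg = conn.recv()
        if msg is None:
            break
        conn.send(hmac.new(secret, msg, hashlib.sha256).hexdigest())

if __name__ == "__main__":
    server_end, holder_end = Pipe()
    holder = Process(target=key_holder, args=(holder_end,))
    holder.start()
    server_end.send(b"data the front end needs signed")  # narrow request ...
    print("signature:", server_end.recv())               # ... only the result comes back
    server_end.send(None)
    holder.join()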

Flicker: In Flicker, we proposed storing the SSH password database inside a PAL, which would likewise prevent password theft. See Flicker: An Execution Infrastructure for TCB Minimization, authored by CyLab researchers Jonathan M. McCune, Bryan Parno (now with Microsoft), and Adrian Perrig (now with ETH Zurich), along with Michael K. Reiter of the University of North Carolina at Chapel Hill and Hiroshi Isozaki of Toshiba Corporation.

Perspectives, TrustVisor and Flicker all evolved out of CyLab's work on Trustworthy Computing Platforms and Devices. And this continues to be one of CyLab's major research thrusts.

Amit Vasudevan, a CyLab Research Systems Scientist, and Miao Yu, a CyLab graduate student, took a few moments to sit down with CyBlog and share some insights on where we are and what's next.

According to Vasudevan, "the IEE technologies and prototypes we have been developing (XMHF - TrustVisor, KISS, Minibox, etc.) lay a solid foundation to protect against Heartbleed-like attacks."

"But going from our prototypes to the real-world is a different kind of challenge. The software ecosystem out there today does not really consider security as a first-class citizen. Consequently, tweaking these components to adapt to our IEE design is non-trivial ...  In the long term, developers of security-oriented/sensitive software would benefit from a simple and solid security framework that would allow them to leverage strong security properties, while letting them also implement the desired functionality. And our work with XMHF plus Trustvisor plus other hypapps (http://xmhf.org) is the right step in this direction."

"This bug is still underestimated," warns Yu.

He cites three reasons for his concern:
"Currently, we putting a lot of care into HTTPS websites. But other protocols, e.g., FTPS (used in file transfer) server, can also be impacted by this bug. 

"Not only servers, but also clients, e.g. smart phones and other devices, may suffer from this bug. And for certain devices, the problem can be even worse. For example, mobiles phones have long patch cycles. For the heartbleeding bug, the first patch of this bug came out in 20 minutes and web servers began the repair in the first day. But Android phones only get scanners, e.g., Bluebox Heartbleed Scanner or Heartbleed Detector to help users find out if their phone is vulnerable ... From our experience with past vulnerabilities, it would take tens of weeks until half of the mobile devices get patched. During this period, the devices are at risk. Other devices, which may use OpenSSL for establishing administration channels, also may suffer from long patch cycles. At CyLab, Zongwei Zhou, Miao Yu, Yoshiharu Imamoto, Amit Vasudevan, Virgil Gilgor and I have developed an isolated execution environment for the ARM mobile platform. It is quite similar to TrustVisor, but focuses on mobile system security, so that, e.g., you could run a banking client (or some other sensitive application) in an isolated execution environment, so that your code and data would still be secure in spite of this or other vulnerabilities present in Android.

"All three recent SSL bugs, i.e., IOS's goto fail bug, the GnuTLS bug and the Heartbleed bug are implementation-related rather than design related. The lesson is that design security doesn't mean implementation security. We do need runtime protection as a last line of defense."

-- Richard Power

Wednesday, March 12, 2014

New on @CyLab @YouTube Channel: Anupam Datta - Privacy through Accountability (Full-Length Seminar)


The CyLab Seminar Series is held on Mondays during the school year, at CyLab HQ on the main campus of Carnegie Mellon University (Pittsburgh, PA). These weekly talks feature the latest research from CyLab faculty and visiting colleagues from other centers of academic research into cyber security and privacy. Occasional Business Risk Forum events introduce the insights of cybersecurity and privacy experts from the operational side of business and government.

Access to the webcasts of this dynamic Seminar Series is an exclusive benefit of corporate partnership with CyLab. But from time to time, to encourage further research into cybersecurity and privacy, and to contribute to the ongoing dialogue on these vital issues, we release select videos for free public viewing via the CyLab YouTube Channel.

Here is the abstract and embedded video for one such talk released via @CyLab @YouTube:

Abstract 
Privacy has become a significant concern in modern society as personal information about individuals is increasingly collected, used, and shared, often using digital technologies, by a wide range of organizations. To mitigate privacy concerns, organizations are required to respect privacy laws in regulated sectors (e.g., HIPAA in healthcare, GLBA in financial sector) and to adhere to self-declared privacy policies in self-regulated sectors (e.g., privacy policies of companies such as Google and Facebook in Web services). We investigate the possibility of formalizing and enforcing such practical privacy policies using computational techniques. We formalize privacy policies that prescribe and proscribe flows of personal information as well as those that place restrictions on the purposes for which a governed entity may use personal information. Recognizing that traditional preventive access control and information flow control mechanisms are inadequate for enforcing such privacy policies, we develop principled audit and accountability mechanisms with provable properties that seek to encourage policy-compliant behavior by detecting policy violations, assigning blame and punishing violators. We apply these techniques to several US privacy laws and organizational privacy policies, in particular, producing the first complete logical specification and audit of all disclosure-related clauses of the HIPAA Privacy Rule.

CyLab Seminar: Anupam Datta - Privacy through Accountability (Full-Length Seminar)

For more information, see the work of CyLab researcher Anupam Datta and the CyLab corporate partnership program.

New on @CyLab @YouTube Channel: Osman Yagan - Designing Secure and Reliable Wireless Sensor Networks (Full-length CyLab Seminar)


The CyLab Seminar Series is held on Mondays during the school year, at CyLab HQ on the main campus of Carnegie Mellon University (Pittsburgh, PA). These weekly talks feature the latest research from CyLab faculty and visiting colleagues from other centers of academic research into cyber security and privacy. Occasional Business Risk Forum events introduce the insights of cybersecurity and privacy experts from the operational side of business and government.

Access to the webcasts of this dynamic Seminar Series is an exclusive benefit of corporate partnership with CyLab. But from time to time, to encourage further research into cybersecurity and privacy, and to contribute to the ongoing dialogue on these vital issues, we release select videos for free public viewing via the CyLab YouTube Channel.

Here is the abstract and embedded video for one such talk released via @CyLab @YouTube.

Abstract
Wireless sensor networks (WSNs) are distributed collections of small sensor nodes that gather security-sensitive data and control security-critical operations in a wide range of industrial, home and business applications. Current developments in sensor technology and the ever-increasing applications of WSNs point to a future where the reliability of these networks will be at the core of society's well-being, and any disruption in their services will be more costly than ever. There is thus a fundamental question as to how one can design wireless sensor networks that are both secure and reliable.
In this talk, we will present our approach to this problem, which considers WSNs that employ a randomized key predistribution scheme and derives conditions to ensure the k-connectivity of the resulting network. Random key predistribution schemes are widely accepted solutions for securing WSN communications, and the k-connectivity property ensures that the network is reliable in the sense that its connectivity will be preserved despite the failure of any k − 1 sensors or links.
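
To make the setting concrete, here is a minimal Python sketch of random key predistribution in the spirit of the schemes the talk builds on; the pool size, key-ring size, and node count are illustrative parameters, not values from the talk. Each node is preloaded with a random key ring drawn from a common pool, a secure link exists between two nodes whose rings intersect, and the minimum node degree is reported as a quick necessary (though not sufficient) check for k-connectivity.

import random
from itertools import combinations

def build_key_graph(num_nodes=200, pool_size=1000, ring_size=20, seed=1):
    rng = random.Random(seed)
    # Each sensor is preloaded with ring_size keys drawn from a pool of pool_size keys.
    rings = [set(rng.sample(range(pool_size), ring_size)) for _ in range(num_nodes)]
    # Two sensors can communicate securely iff their key rings share at least one key.
    adj = {i: set() for i in range(num_nodes)}
    for i, j in combinations(range(num_nodes), 2):
        if rings[i] & rings[j]:
            adj[i].add(j)
            adj[j].add(i)
    return adj

def is_connected(adj):
    # Breadth-first reachability from node 0.
    seen, frontier = {0}, [0]
    while frontier:
        u = frontier.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                frontier.append(v)
    return len(seen) == len(adj)

def min_degree(adj):
    # Minimum degree >= k is a necessary (not sufficient) condition for k-connectivity.
    return min(len(neigh) for neigh in adj.values())

if __name__ == "__main__":
    g = build_key_graph()
    print("connected:", is_connected(g), "minimum degree:", min_degree(g))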
 
CyLab Seminar: Osman Yagan - Designing Secure and Reliable Wireless Sensor Networks

For more information, see the work of CyLab researcher Osman Yagan and the CyLab corporate partnership program.

Friday, March 7, 2014

Descending Into the Maelstrom - Notes on RSA Conference 2014

Dan Geer: "We are all intelligence officers now." (RSA Conference 2014)

Descending Into the Maelstrom -- Notes on RSA Conference 2014

By Richard Power 
  
I confess that I was tempted to simply go to Trustycon instead. But from a historical perspective, I could not resist the bitter ironies that RSA Conference 2014 offered up. After all, I had been there, two decades earlier, when Jim Bidzos, then President and CEO of RSA, led industry and public advocacy resistance to the adoption of the National Security Agency's Clipper Chip. In the mid-1990s, the RSA Conference was one of the principal venues for the marshaling of that resistance. So how could I not attend this year's RSA Conference, with Snowden-sourced revelations of RSA's "secret $10 million" deal with the NSA to roll a "backdoor" into its Bsafe product (Reuters, 12/20/13)? How could I not bear witness to how all of this would play out?

So I attended RSA Conference 2014, as I have since the beginning, as a member of the press. I simply had to follow up on this latest twist in the plot of our collective lives and careers in security and intelligence. Walking toward the escalators to descend into the maelstrom, I glanced up. An overarching banner read: "Share. Learn. Secure. Capitalizing on Collective Intelligence." Seriously. Such a strange thematic choice for this particular RSA Conference. Were they going for irony, or doubling down on the brouhaha, or was someone simply asleep at the wheel? "Capitalizing on Collective Intelligence." Wow. Don't misunderstand me. I am a strong champion for the role of intelligence in government and business, despite the tendency of governments to "sex up" intelligence to serve ideological agendas, and the tendency of business executives to avoid any real intelligence that might force them to confront some inconvenient truth. Nevertheless, at this particular moment in time, with the allegations concerning the NSA's paid-for "backdoor" into RSA product still causing a profound disturbance in the force, "Capitalizing on Collective Intelligence" seemed, well, a strange spin. Was it ill-chosen, or plucky, or both?

Lucy in the Sky with Diamonds

As I sat in the front row of the auditorium for the first day's keynote session, I was still mulling over the organizers' choice of conference themes, when over the loudspeakers a secondary (and completely unrelated) theme was introduced: "Security, the final frontier ... These are the voyages of the RSA Conference ..." Suddenly, William Shatner (yes, the original Captain James T. Kirk) burst into the hall and took the stage to sing the Beatles' "Lucy in the Sky with Diamonds," with lyrics re-written for the occasion: "Follow her down to a patch in the system ... where the people eat malicious code pies ... and the botnets grow so incredibly high ..." Seriously.


William Shatner: "Security gets us high ..." (RSA Conference 2014)
And then came the C-suite kabuki.

In his address, RSA Executive Chairman Art Coviello declared "our personal information" the "true currency of the digital age." (Given the backdrop of the Snowden revelations regarding RSA and the NSA, this comment struck me in the same way as the Conference theme of "Capitalizing on Collective Intelligence.") And he went on to proclaim, "We are at a crossroads ... I call on all the nations of the world to adopt these four principles:  renounce the use of cyber weapons and the use of Internet to wage war... cooperate internationally on the apprehension of cyber criminals ... respect intellectual property rights ... ensure privacy of all individuals ..." 

Nawaf Bitar, Senior VP and GM of the Security Business Unit at Juniper Networks, opened his presentation with the image of a Tibetan protester's self-immolation, and went on to invoke the spirit of Nelson Mandela. In the course of his mocking remarks about what he termed "#FirstWorldOutrage," Bitar took a swipe at Trustycon ("not showing up at a conference is not outrage") and belittled the role of social media: "Bitar argued that 'liking a cause on Facebook' or 'retweeting a link' are not appropriate displays of outrage." (See Rachel King, Juniper Networks exec: 'First-world outrage' will not help cyber security, ZDNet, 2/25/14.)

Although I am tempted to take some time to dissect these speeches, and explore some possible contradictions inherent in their premises, I will not.

I will simply say this -- there was no Andrei Sakharov in the house.

Crypto Panel Reality Check

As I am a well-seasoned veteran of many RSA Conferences, I know that if you are looking for a reality check in the throes of that C-level kabuki, you just have to hold out for the Cryptographers' Panel. And as in previous years, Ron Rivest, Whit Diffie and Adi Shamir delivered.

In agreeing with Rivest that the most disturbing aspect of the Snowden revelations was that the NSA would tamper with NIST security guidance to the U.S. government, Diffie added:

"I grew-up in an era, where despite my conflicts with them, I believed that they were just  100% interested in the security of American communications, I thought that began to crumble when we saw key escrow, and I now sort of feel that they gave up doing that above board and turned aside to do it in other places, and that puts on us a tremendous additional burden of trying to vet things and make sure they haven't been tampered with."

The panel participants didn't just dwell on the Snowden revelations.

For example, Shamir highlighted some of his latest research: "I published a few weeks ago a paper with my colleagues ... that showed [a new] attack on RSA, where we can just listen to the sound made by a laptop ... from eight meters away we can eavesdrop on the acoustic sound made by the laptop and after less than an hour we can recover the full RSA key ..."  

Here are a few more bread-and-circuses vignettes from RSA Conference 2014.

BSIMMering

On the second day of RSA 2014, I decided on two worthy sessions to tweet out via @CyLab. The first of these was "Lessons from BSIMM" with Cigital CTO Gary McGraw.

As the BSIMM project evolves, it is yielding more and more insight into how organizations are approaching software security in the real world. 

In this talk, McGraw focused on how to scale some of the best practices in the area of software security, and in particular on three touchpoints:
  • Remedial Code Review
  • Remedial Architecture Analysis
  • Remedial Penetration Testing
According to McGraw, 50 of the 67 BSIMM participants have automated tools for remedial code review, 56 of 67 review security features, and 62 of 67 use external pen testers.

As he shared his real-world data on these three activities, McGraw also elucidated pitfalls, shared some perspective on trends and offered practical suggestions.

For example, concerning remedial code review, he remarked, "security people are good at finding problems, but they suck at fixing them,"  and "developers learn to game the results."

Ruminating on the second touch-point, remedial architecture analysis, McGraw optimistically opined, "We are in the Age of Bugs, next will be the Age of Flaws."

In regard to his third touch-point, remedial pen testing, McGraw exhorted attendees to "periodically pen test everything you can," and added "fix what you find."


Gary McGraw, Cigital - Scaling A Software Security Initiative: Lessons from the BSIMM (RSA Conference 2014)
Gumshoes: Security Investigative Journalists Speak Out

The other session that I tweeted out on Day 2 was "Gumshoes: Security Journalists Speak Out" with Brian Krebs (Krebs on Security), Nicole Perlroth (New York Times) and Kevin Poulsen (Wired).

The role of the investigative journalist is one of the most vital in all of the realm of cyber risk; it is also one of the most challenging. The work can be frustrating and dangerous too. 

Although there are many more reporters on this beat than there were two decades ago, there are still only a handful that stand out as consummate professionals.

Three of them participated in this panel discussion.

Here are a few of the many insights they shared.

Krebs: "A lot of stuff has happened to me that what I talk about publicly. People have offered me money to do something, or not do something ..."

Poulsen: "We are all accustomed to being approached by people who aren't who they say they are."

Krebs: "In national security reporting, you only get certain stuff by being spun a few times ..."

Perloth: "I get a lot of information that is over-hyped, every single day. And I think that is perhaps the biggest challenge in my job ..."

Krebs: "My biggest challenge is what not to write about ..."

Perlroth: "It is really interesting that Snowden released tens of thousands of these documents, as I was sifting through them this summer, I thought about that a lot ... There was a lot of stuff I didn't need to see, a lot more than what we needed ..."

Perlroth: "After spending summer looking through the Snowden documents, I looked at my  co-worker ... and said, Well, I think there is a consensus here, being a spy is one of the most [expletive] jobs ..."

Poulsen: "I would like to see a better explanation of what [RSA] thought they were doing [re: NSA]."
Nicole Perlroth, N.Y. Times (RSA Conference 2014)
Brian Krebs, Krebs on Security (RSA Conference 2014)
Key Trends in Security: The Venture Capitalists' View 

On Day 3, I attended "Key Trends in Security: The Venture Capitalists' View."

Moderated by Joseph Menn (@josephmenn) of Reuters, this panel featured David Cowan of Bessemer Venture Partners, Ray Rothrock of Venrock and Asheem Chandna of Greylock.

One takeaway from this session was Cowan's savvy perspective on mobile device security:

"There is a lot of money going into mobile computing that is not going to be well-spent, by either the investors or the customers ... It is mostly around wrapping apps, encrypting data on the mobile devices, compartmentalizing the phone, trying to figure out what's work and what's personal. There are various problems with these approaches. Not every product suffers all of these problems, but every product suffers from one or more of these problems. One problem is that they all assume that there are work apps and personal apps, and we all know we use our phones, SMS, e-mail and cameras for both work and for personal, so it's delusional to think you can compartmentalize them ... The app wrappers are kludgy ... You have problems because there are some that say they are going to encrypt the data created inside the app; well, that means, great, I can upload some things to my phone, but if I try to get to it from my browser I am going to get a lot of garbage because my browser doesn't know how to decrypt what my wrap app encrypted. And then there is a fundamental security flaw, which is that you can wrap data inside a phone, but at some point if it's going to be useful, you are going to have to unwrap it ... then when you unwrap the data for the user, I am going to collect it right then and there ... enterprises are grasping for solutions ... but these are not really solutions, this is grasping for straws ..."

Yes, but isn't that what much of the cyber security industry is about: "grasping for straws"? 

We Are All Intelligence Officers Now 

It wasn't until Day 4 that I heard someone speak truth to the power that we, as a civilization, have unwittingly unleashed upon ourselves.

Dan Geer is a force of nature in the field of cyber security and intelligence. He has been engaged at the highest levels of our discourse for as long as I have been involved, and I go back to the early 1990s. As the darkness deepens, his vision sharpens and expands.

Here are some excerpts from Dan's speech, "We Are All Intelligence Officers Now," followed by a link to the full text:

We are all data collectors, data keepers, data analysts. Some citizens do it explicitly; some citizens have it done for them by robots. To be clear, we are not just a society of informants, we are becoming an intelligence community of a second sort  ...

This is not a Chicken Little talk; it is an attempt to preserve if not make a choice while choice is still relevant ... 

Richard Clarke's novel _Breakpoint_ centered around the observation that with fast enough advances in genetic engineering not only will the elite think that they are better than the rest, they will be. [RC] I suggest that with fast enough advances in surveillance and the inferences to be drawn from surveillance, that a different elite will not just think that it knows better, it will know better. Those advances come both from Moore's and from Zuboff's laws, but more importantly they rest upon the extraordinarily popular delusion that you can have freedom, security, and convenience when, at best, you can have two out of three ... 

If knowledge is power, then increasing the store of knowledge must increase the store of power; increasing the rate of knowledge acquisition must increase the rate of power growth. All power tends to corrupt, and absolute power corrupts absolutely,[LA] so sending vast amounts of knowledge upstream will corrupt absolutely, regardless of whether the data sources are reimbursed with some pittance of convenience ... Very nearly everyone at this conference is explicitly and voluntarily part of the surveillance fabric because it comes with the tools you use, with what Steve Jobs would call your digital life. With enough instrumentation carried by those who opt in, the person who opts out hasn't really opted out. If what those of you who opt in get for your role in the surveillance fabric is "security," then you had better be damnably sure that when you say "security" that you all have close agreement on precisely what you mean by that term ...

It is said that the price of anything is the foregone alternative. The price of dependence is risk. The price of total dependence is total risk. Standing in his shuttered factory, made redundant by coolie labor in China, Tom McGregor said that "American consumers want to buy things at a price that is cheaper than they would be willing to be paid to make them." A century and a half before Tom, English polymath John Ruskin said that "There is nothing in the world that some man cannot make a little worse and sell a little cheaper, and he who considers price only is that man's lawful prey." Invoking Zittrain yet again, the user of free services is not the customer, he's the product. Let me then say that if you are going to be a data collector, if you are bound and determined to instrument your life and those about you, if you are going to "sell" data to get data, then I ask that you not work so cheaply that you collectively drive to zero the habitat, the lebensraum, of those of us who opt out. If you remain cheap, then I daresay that opting out will soon require bravery and not just the quiet tolerance to do without digital bread and circuses. 

To close with Thomas Jefferson: 'I predict future happiness for Americans, if they can prevent the government from wasting the labors of the people under the pretense of taking care of them.'   

Dan Geer, We Are All Intelligence Officers Now, RSA Conference 2014 (Full Text) 

End Game? 

Back in the mid-1990s, reporting from those early RSA Conferences, at the eye of the raging Clipper Chip storm, I was sympathetic to the concerns of national intelligence and law enforcement communities. I was awake to the very real threat from Al Qaeda (yes, very much pre-9/11). I was also deeply concerned about the abomination of child pornography (and the hell realms that provide its content). I understood the frustrations and genuine needs of government agents, friends and colleagues dedicated to fighting such evils. I did not want to deny them tools.

But, of course, I was also laboring under some false assumptions about where we were as a "civil society." It was inconceivable to me that the Bill of Rights would prove to be somehow less than inviolate or that any White House official would ever characterize the Geneva Accords as "obsolete" and "quaint." Likewise, I assumed that the Powell Doctrine would never be ignored, and that the findings of the Church Committee would never be forgotten.  

All these assumptions proved to be wrong. 

It would be easy, too easy, to argue that these false assumptions were swept away in the aftermath of the slaughter of the innocents on 9/11, and that our desperate efforts to respond to that atrocity put us in conflict with some of our own most cherished societal values. But I realize, now, that it would only be misleading to attribute our current circumstances solely to the atrocity of 9/11. Because much of what has happened to us would have happened anyway, one way or another; much of what has been lost in terms of the Bill of Rights would have been lost anyway, just not as rapidly. Perhaps with better governance in the months prior to 9/11 and in the years that followed, there would have been more time for reasoned debate. Perhaps better choices could have been made.

Perhaps. But only perhaps.

As Geer's speech elucidates, our current circumstances are to a great extent the consequence of Moore's Law (number of transistors on integrated circuits doubles approximately every two years) and Zuboff's Three Laws (everything that can be automated will be automated, everything that can be informated will be informated, every digital application that can be used for surveillance and control will be used for surveillance and control).

The process is seemingly inexorable. Is there time for the relevant choice that Geer alluded to in his remarks, and are there viable options from which to choose?

Either way, "we are all intelligence officers now."

-- Richard Power 


Jim Bidzos posthumously honoring F. Lynn McNulty with the RSA Lifetime Achievement Award. Lynn's wife Peggy accepted on behalf of the family. I had the great pleasure of knowing F. Lynn McNulty; I interviewed him on several occasions in the halcyon days, and sought his views on important issues. He was a man of integrity, and a true patriot. (RSA Conference 2014)

Related Posts

RSA 2012: Diffie, Rivest & Shamir Shine on Crypto Panel, Ranum, Too, re: Cyber War 

RSA 2011: One Flew Over the Kaku's Nest & Other Ruminations

RSA 2010: Lost in the Cloud, & Shrouded in the Fog of War, How Far Into the Cyber Future Can You Peer? Can You See Even Beyond Your Next Step? 

RSA 2010: Lifestyle Hacking - Notes on "Social Networks & Gen Y Meet Security & Privacy"

RSA 2010: Hacking the Smart Grid - Myths, Nightmares & Professionalism

RSA 2010: Merging Mind & Machine - Hacking the Neural Net

RSA Conference 2009: Summary of Posts 




Monday, February 3, 2014

White House Chief Privacy Officer Keynotes CMU Data Privacy Day (Video)

White House privacy official Nicole Wong with host CUPS Director Lorrie Cranor
on CMU Data Privacy Day, January 29th, 2014
Nicole Wong, a veteran of Google and Twitter who last year became the White House's Deputy U.S. Chief Technology Officer (CTO), was the keynote speaker for Carnegie Mellon University's Data Privacy Day program on Jan. 29. Data Privacy Day is a global event held each year to empower and educate people to protect their privacy, control their digital footprint, and make the protection of privacy and data a greater priority in their lives.

Sponsored by CyLab Usable Privacy and Security (CUPS) Laboratory, Wombat Security Technologies (a CMU CyLab spin-off) and others, the event was organized by the university's Master of Science in Information Technology — Privacy Engineering program, a first-of-its-kind program launched last year in response to growing demand from industry and government for trained privacy professionals.

Wong, who holds law and journalism degrees from the University of California, Berkeley, was deputy general counsel at Google for eight years, and then joined Twitter as its legal director before she was tapped to become Deputy U.S. CTO. As a lawyer in private practice prior to joining Google, Wong represented a number of media clients including the Los Angeles Times, Walt Disney Co., Microsoft, Yahoo! and Amazon.com.

In addition to Wong's keynote, CMU Privacy Day also included a privacy clinic, where privacy engineering students provided practical advice on how to choose privacy settings on Facebook, smartphones and other sites and devices, how to manage Web browser cookies and ways to block online tracking, as well as a Privacy Research Show & Tell session that featured CMU research on privacy. CUPS Director Lorrie Cranor and fellow CyLab researcher Norman Sadeh hosted the event.

A video recording of White House privacy official Nicole Wong's address is publicly available via the CMU CyLab YouTube Channel.

Full text of CMU press release

Tuesday, January 21, 2014

CyLab's David Brumley and His Student Hacker Team Featured on PBS NEWSHOUR and CNBC


In recognition of his pioneering work, Carnegie Mellon University CyLab researcher David Brumley has received both a Presidential Early Career Award for Scientists and Engineers (PECASE) and a Sloan Research Fellowship. And now the mainstream media has discovered Brumley and his Plaid Parliament of Pwning (PPP) team of CMU students.

On October 26, 2013 (with repeat airing on January 19, 2014), Brumley and the PPP were featured on PBS NEWSHOUR, and on November 8, 2013, they were featured on CNBC (watch embedded videos).

Program teaches cybersecurity students how to think like hackers (PBS NEWSHOUR)


Finding the Best Hacker in the Room (CNBC)



Friday, December 13, 2013

CyLab's Anthony Rowe and Fellow CMU Researcher Win U.S. DOE Grant for Sensor Networks and Energy Efficiency Research


With a three-year, $1.9 million grant from the Department of Energy, CMU CyLab's Anthony Rowe and fellow researcher Mario Berges are developing sensor networks and an open-source software platform to optimize energy use in buildings, which annually consume 39 percent of total U.S. energy production.

CyLab's Anthony Rowe
The project builds on results from a larger research effort to integrate all sensors on campus, called Sensor Andrew.

"We want to create a control platform for building energy management that will help monitor and understand energy usage patterns over time," said Rowe, an assistant research professor in the Department of Electrical and Computer Engineering as well as a CyLab researcher.

"User-friendly tools that quickly and economically analyze energy use will also help businesses and homeowners make better use of those technologies to save energy and lower their utility bills," Rowe said. "The big challenge is to eventually support zero net-energy buildings, buildings that over an entire climate cycle actually collect energy from sources like solar and wind power, and be efficient when those are not available."

Berges, an assistant professor in the Department of Civil and Environmental Engineering, also reports that these networks will give facility managers the ability to see if a building is wasting energy or needs replacement equipment.

Full Text of Press Release

Related Posts