Tuesday, June 22, 2010

White paper: Open Source Vulnerability Response

I've been working on a white paper on vulnerability response for open source projects. A few colleagues have been good enough to give me comments and I figure it's ready for public scrutiny. Any feedback welcome.

Intelligence Squared US podcast on CyberWar Threat

The Intelligence Squared US podcast has a good episode on the question "Has the cyber war threat been grossly exaggerated?"

The debaters were all top notch: Marc Rotenberg (Executive Director of the Electronic Privacy Information Center), Bruce Schneier (Chief Security Officer of British Telecom), Jonathan Zittrain (Harvard Professor of Law) and Mike McConnell (former U.S. Director of National Intelligence).

The podcast is well worth listening to for a good overview of the issues involved in this complicated subject.

A couple things jumped out at me...

Bruce made the good point that language matters. By calling it cyber "war" rather than cyber "crime," we've framed it as a military issue rather than a police issue. Cyber crime is unquestionably a problem today, but I would argue we still don't have an example of a true military use of cyber offense - as Bruce put it, the examples to date are "kids playing politics."

I think everyone agreed some sort of government help is needed to secure cyberspace, but there is clearly disagreement about which part of the government and in what form. Marc echoed concerns many share (myself included) about the NSA being put in charge. All other arguments aside, it's a conflict of interest for an organization interested in spying - and hence in exploiting weaknesses in cyberspace - to be put in charge of strengthening cyberspace (for an example, see the Clipper Chip).

Jonathan Zittrain also made a good point that if we want to do something, it's better to do it now, with relatively calmer heads, than immediately after a cyber September 11th equivalent.

Again, well worth the hour listening to.

Friday, June 18, 2010

My take on Google's wireless capture oops...

Regarding Google's wireless capture oops - in a nutshell, Google admitted that while gathering information about the geolocations of WiFi access points (to allow location detection by non-GPS devices), it accidentally captured small amounts of packet data from unencrypted wireless networks. Google's story is that a misconfiguration of third-party software they were using caused it to capture more than they needed.

I believe Google that the over-collection was unintentional. Given the fragmentary nature of the captured data, I doubt anything interesting was even captured, and I can't imagine any reasonable motivation for doing it deliberately.

Unless someone can prove actual damages, which I highly doubt, I think this is being blown way out of proportion and I hope any lawsuit is thrown out.

So what should be taken away from this?

A couple weeks ago I was talking to someone from one of the big accounting firms whose job it is to audit online advertising (among other things). That means they audit companies who sell online advertisements, accrediting that when someone buys an ad with certain parameters, they get what they pay for.

It seems reasonable that collection of public data should get the same scrutiny. When a company collects data about the public, it should have a policy covering what it collects, what it does with the data, who gets access, and how long it is stored - and it should audit throughout the process to make sure it is actually doing what the policy says. If Google had done this - defining what data they were collecting and auditing the collected data early in the process - they would have caught the over-collection and this would never have happened.
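The audit idea above can be sketched as a minimal check: declare a collection policy as a set of allowed fields, then flag any captured record containing fields outside that policy. The field names, policy, and sample records here are hypothetical, just to illustrate the kind of early audit that would have caught over-collection.

```python
# Minimal sketch of a data-collection audit: compare each captured
# record's fields against a declared collection policy and flag
# anything collected beyond what the policy allows.
# The policy and record fields below are hypothetical examples.

ALLOWED_FIELDS = {"bssid", "ssid", "signal_strength", "gps_lat", "gps_lon"}

def audit_records(records):
    """Return a list of (index, extra_fields) pairs for records
    containing fields outside the declared policy."""
    violations = []
    for i, record in enumerate(records):
        extra = set(record) - ALLOWED_FIELDS
        if extra:
            violations.append((i, extra))
    return violations

captured = [
    # Within policy: access-point identifiers and signal data only.
    {"bssid": "00:11:22:33:44:55", "ssid": "CoffeeShop", "signal_strength": -60},
    # Over-collection: packet payload the policy never authorized.
    {"bssid": "66:77:88:99:aa:bb", "ssid": "HomeNet", "payload": b"..."},
]

for index, extra in audit_records(captured):
    print(f"record {index} over-collected: {sorted(extra)}")
```

Run early and regularly against samples of the collected data, even a check this simple turns a silent policy violation into a visible failure.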

Next time it might be data that actually matters.

Edited to add: Interesting post by Lauren Weinstein on the matter.

Tuesday, June 8, 2010

InCommon Certificate Service

The InCommon Federation recently announced their offering of a new X.509 certificate service.

Personally, I think this is a great step in the right direction for the U.S. higher education and research community, which needs good identity services for collaboration. InCommon has served as a focal point for Shibboleth and SAML-based services, and I think it is a good organization to expand to take on other identity technologies such as X.509. My personal view is that we'll never have one identity technology and we'll always be dealing with a mix - SAML, OpenID and X.509 being the obvious big three (with Kerberos and SSH keys having their niches, and of course the ubiquitous usernames and passwords).

A good question is "what does this mean for the community served by the International Grid Trust Federation (IGTF)?" The IGTF currently serves as an accreditation body for the use of X.509 in most "Grid" projects. If the InCommon certificate service takes off (which I'm guessing it will), the pressure to use its certificates for those Grid projects will be strong, and the service could become the 800-pound gorilla in the room.

IGTF has done some good work defining standards for X.509 usage, and I think this is a good opportunity for them to collaborate with InCommon in bringing that work to this new service. As usual, the challenge will be getting busy people to pay attention and talk.

In the meantime, I expect translation services (which handle both technical and policy translation) to continue to play a significant role.