Many news services have reported on the release of the White House PCAST (President’s Council of Advisors on Science and Technology) report on “Big Data and Privacy,” including David Sanger and Steve Lohr of the New York Times and the AP wire feed. I have read PCAST reports before, for instance on the urgent crisis in STEM education among our youth. As always, the reports are a mixed bag, and in this case you can read the whole report yourself here.
Oddly, the report titles itself with Big Data, while the real focus is on behavioral analytics and human intelligence: all that information collected explicitly and implicitly regarding your every interaction with the Internet. The recommendations made in the report stem from the realization that such data, when combined with machine learning, yields totally unanticipated power. Anonymized data plus machine intelligence and massive statistics, for example, produces such accurate estimates of gender, age, sexual preference, you name it, that anonymity melts away in the bright lights of computation. Bias, once explicit and explicitly outlawed, becomes undefinable and vague, but built into learning systems that predict who will be able to pay their mortgage, or who deserves to receive car insurance. Our good legal intentions, designed for the slow, social pace of human decision-making, are basically irrelevant when data and computing combine, and this is the one point that the report most effectively argues.
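To make the “anonymity melts away” point concrete: the classic version is the linkage attack, where a dataset with names stripped is re-identified by joining it against a public, identified dataset on shared quasi-identifiers (ZIP code, birth date, gender). Here is a minimal sketch with entirely made-up records; the field names and data are hypothetical, chosen only to illustrate the join.

```python
# Hypothetical "anonymized" health records: names removed, so
# supposedly safe to release.
anonymized_records = [
    {"zip": "02138", "birth": "1945-07-22", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth": "1982-03-14", "sex": "M", "diagnosis": "asthma"},
]

# Hypothetical public, identified dataset (think: a voter roll).
public_voter_roll = [
    {"name": "J. Doe", "zip": "02138", "birth": "1945-07-22", "sex": "F"},
    {"name": "R. Roe", "zip": "02139", "birth": "1982-03-14", "sex": "M"},
]

def reidentify(anon, public):
    """Join the two datasets on the shared quasi-identifiers,
    attaching a name to each supposedly anonymous record."""
    key = lambda r: (r["zip"], r["birth"], r["sex"])
    lookup = {key(p): p["name"] for p in public}
    return [
        {"name": lookup[key(a)], "diagnosis": a["diagnosis"]}
        for a in anon
        if key(a) in lookup
    ]

for match in reidentify(anonymized_records, public_voter_roll):
    print(match["name"], "->", match["diagnosis"])
```

No machine learning is even needed for this toy case; the report’s worry is that statistical inference extends the same unmasking to records that do not match exactly.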
Now the mixed bag bit. The offered solutions? 1) Watch how companies use data, not just how they collect data. In other words, privacy is protected only if you control collection and also intended use. Well, yes. 2) Don’t mention specific technologies or you will be embarrassed because your legislation will quickly be out of date. 3) Let’s do research. 4) Maybe we should have privacy expert career paths for professionals? 5) Nobody does this right internationally, so let’s be #1.
As for (5), I am surprised to see that Europe receives exactly zero references in the entire report. Very surprising, given that the EU has specific and significant regulatory efforts in place around digital consumer privacy. As for (2), (3), and (4), I am reminded of rearranging deck chairs, unfortunately.
Privacy is obviously threatened today, and robotic sensing and actuation technologies (think drones, eyes, ears, lip-reading, NSA, business intelligence and shake vigorously) will level up this whole problem. We need to think a bit more aggressively here about new classes of human rights, new forms of community empowerment. At this point, shortening and simplifying the “Terms of Service” text of websites is spitting in the wind.