Julie Bort reports in SFGate/Business Insider that Apple has enabled iBeacon, which can provide high-resolution tracking of customers using their iOS 7 devices in Apple stores (and, soon, well beyond Apple's own stores). The outward value proposition is that your phone can now give you turn-by-turn directions as you wander the store. The full-blown value proposition is much richer, of course: it supplies the essential ingredient for what I term New Mediocracy, complete shopper behavioral information. Apple will be selling this equipment to businesses, who in turn can get excited about delivering that great Minority Report experience!
John Markoff’s recent article, Night Watchman with Wheels, describes the founding of Knightscope in Silicon Valley, a company started in reaction to the Sandy Hook massacre to create a sensor-laden robot that provides security in public and private spaces. The article is worth reading because it identifies several issues at the heart of how robots can change society: technology displacing human labor; crime prediction and behavioral mining; robot-cloud connections; pervasive surveillance.
For instance, by charging $6.25 an hour for the robot’s services, Knightscope is explicitly targeting a price point below California’s minimum wage of $8 an hour, and this piques the interest of those who consider the displacement economics of robots and lower-class employment (never mind the middle class). But I was also fascinated by an echo of a sentiment in Robot Futures, where I explain that robots will glue the digital and physical worlds together so seamlessly that, when you run into a robot on the sidewalk, it will have vast knowledge about you available thanks to its interconnections to the Cloud. In the article, Markoff explains that the company plans to create a big data portal for the robots in order to recognize faces, license plates and other physical embodiments with digital signatures. So, indeed, we are on a track to robots that know far more about us than we know about them. Welcome to a new version of inequity.
Well, the article really says it all. Bezos told 60 Minutes that the company is experimenting with flying drones to deliver packages in thirty minutes. Read about it in this Bloomberg News story: Amazon Testing Drones for Same-Day Delivery. Perhaps my Robot Smog date estimates weren’t too eager at all.
What if you could vote by making a quick facial expression? What if you voted every time you made a face, but you didn’t even know you were voting? Anne Eisenberg reports in today’s New York Times on algorithms that will turn the most transient facial expressions into emotion metrics: When Algorithms Grow Accustomed to Your Face.
Eisenberg uncovers a fascinating rhetorical approach used by companies such as Affectiva, which markets emotion-reading to companies including Coca-Cola and Mars: we aren’t recording video frames, remarks one co-founder of the company, just the emotional responses that are coded automatically from the video feed! So privacy in this parochial sense extends to imagery but not to emotional response. In New Mediocracy I argue that the ever-improving ability of computers to recognize human gestures and behavior will lead to ever more effective manipulation of consumers, and this power will only further entrench inequity, deriving data from the masses for the economic gain of the few. By considering emotional affect to be outside the bounds of privacy, companies like Affectiva simultaneously denature the personal significance of transient behavior while enabling its collection on such a massive scale that, taken as a whole, the body of knowledge has extreme marketing value.
The resulting marketing speak: “…we can provide just-in-time information that will help individuals, moment to moment throughout their lives” once we can parse each facial gesture. This raises the question: who will be doing this helping, and for what purpose? Pure altruism? Profit maximization? Knowledge mined broadly will translate into the power to change group behavior, and this can only concentrate power, wealth and the ability to take the pulse of the group.
All of these computer vision systems will be imperfect: they won’t detect my every emotional state correctly all of the time. But in the end that does not matter at all: they will collect information at such a large scale that, riddled though they may be with imperfection, the newly derived knowledge will be game-changing.
Carnegie Mellon just published an article on a project our CREATE Lab has been doing in Uganda. This is a good example of the form of technology intervention I advocate in the final chapter of Robot Futures – technology for the sake of community empowerment rather than corporate or government concentration of power. Enjoy.
The December 2013 issue of The Atlantic has excellent food for thought on the related spheres of capitalism and surveillance. Eleanor Smith introduces The Explorer, a camera-covered rubber ball that SWAT teams can use to peer inside dangerous spaces: just throw it in and watch it bounce around, creating both a panorama of the space and real-time video feeds in all directions. This reminds me a bit of The Circle, of course, and it has excellent, consequential uses in disaster recovery as well as plenty of nefarious uses, depending on the temperament of just whoever owns the bouncing ball in question. But this is, in any case, surveillance with a small ‘s.’ For Surveillance writ large, turn to Don Peck’s article in the same issue: They’re Watching You at Work. This article does an outstanding job of reminding us that data mining on human behavior provides new affordances that were previously unimaginable. Data collection, ingestion and actionable mining is not a simple incremental step forward; it can be a game-changer for just how we hire. Peck writes about video games designed to predict, with frightening accuracy, who will perform well if hired. Pity the statistical tail: the unusual, rarely-hired, awkward candidate whom someone takes a risk on hiring, and who turns out to be pure gold that just needed the right mentorship. I fear that, one day, college applicants will face this same deterministic game of digital competency: will it make the graduating class four years later depressingly homogeneous? Maybe not: the classes can be just exactly as diverse as we program them to be, if the data mines are to be trusted fully enough. Of course, at the end of the day, the problem with all these data-driven techniques is the wag-the-dog problem.
When Sandy Pentland’s social badge is worn by every employee so the employer can ascertain each employee’s interactions with everyone else (no kidding), just how many of the interactions in the workplace will be wholly inauthentic, driven only by the need to feed the “badge” machine and placate the employer’s surveillance program? Sure, we will redefine the workplace and the college through these techniques. But I guarantee the side effects will be richly unanticipated.
One final article well worth reading is Chrystia Freeland’s Is Capitalism in Trouble? Not to spoil the article, but the answer is ‘yes.’ Freeland’s thesis is that Western Capitalism is inherently unsustainable because it is polarizing society in ever more dramatic ways. Income inequality leads to bad places, from social upheaval to political upheaval, and so the arc bends toward a place that the wealthy themselves will, eventually, endeavor to avoid. Freeland reports on the B Corp community, which specifically commits executives to serving society in positive ways. Of course, today, this movement is tiny compared to the massive number of standard, shareholder-maximizing corporations; and the role of automation, as I argue in Robot Futures, cannot be overstated, since robotics will increase the ownership and wealth of the capital class at the expense of the former middle class. It’s good to see repeated discussion of these issues among journalists, CEOs and the like. Maybe, just maybe, there will eventually be the stomach for actually talking about the kind of structural solutions we will, inevitably, need to face.
William Kowinski at Humboldt County’s North Coast Journal just released a short, accurate book review of Robot Futures.
There is a pair of articles in the current Economist issue well worth reading: The People’s Panopticon and Every Step You Take. These are excellent primers on where personal surveillance technology may be taking us. The People’s Panopticon starts with the concept of life logging, then introduces more dystopian directions. Sure, small-form-factor surveillance has clear advantages in limiting liability for rental car agencies, police departments and many, many others. But, as always, there is a price that becomes increasingly hard to assess when you marry ubiquity with intelligent real-time analysis and permanent record-keeping. A camera on an elderly person can prompt conversations with loved ones: soon the conversations that result are not exactly between two parties but three: a son or daughter; an older father or mother with serious senility; and an AI system that is prompting the father or mother in just the right ways to create a reasonable-sounding conversation between the two humans in the loop. Identity starts to enter uncharted territory on this ride.
As I have argued, ubiquity also melts the off-line world into the ether of the on-line world. As the Economist writes: “Head-mounted screens would let people spend time on-line that would previously have been off-line.” One particularly unnerving patent application discussed in the article is straight from New Mediocracy: a camera that looks outward at the billboards the user sees, then looks at the emotional response on the user’s face. The magazine notes that today’s head-mounted cameras only look outward, so this is not really possible. Well, that’s not entirely true. Research projects are already looking both out and in simultaneously, for instance to view what a driver sees by studying, in wide angle, what is visible in front of the driver, and by looking back at the pupil to see just what the driver is focusing on. First-Person Vision is a good example of such a project developing real hardware at my own institution.
Both this article and its sister article, Every Step You Take, talk about face recognition technology as an interesting line in the sand. It is interesting that we are stuck on face recognition specifically, even though there are so very many aspects to privacy, from behavior to license plates to gestures (the list is never-ending), that we need broader thoughtfulness about whether and how we wish our physical world and on-line databases to stay somehow separable. At the end of the day, we as societies need to decide whether everything will eventually be searchable. Of course, along the way we will begin to notice just who profits from such searchability, and I strongly suspect that there will be little justice in that aspect of the equation.
I have written a good bit about chronic underemployment; and now, to show there is such a thing as too much of a robot thing, thanks to Jason Campbell I bring you the latest BBC story on robotic shepherds: Robot used to round up cows is a hit with farmers. Rover is a four-wheeled all-terrain robot that looks like it belongs in a Mars yard, except that, in the BBC shots, it is surrounded by dairy cattle. The research team was studying robot-cow interactions, and was excited to find that the very presence of the cow does not send the robot stampeding away. Maybe I wrote that backwards.
The story imagines a future in which robots monitor cows that are about to give birth, and perhaps even inspect the farm for various problems, such as electric fence failures and other issues. It is true that massive milk herds are serviced by stunningly expensive rotary dairy systems that subtract all humanity out of the milking process for the sake of throughput and efficiency. However, I have personally spent a fair bit of time on a dairy farm, and given just how hostile an environment it can be, with mud and other wet spots plentifully distributed, with hay everywhere, tight quarters and all weather conditions, I think I can safely say that the day when robots displace further jobs on these farms is remote.
Some robots aren’t really about to threaten the fabric of society just yet; this is one of them.
I wrote a review of Jaron Lanier’s book, Who Owns the Future?, earlier this year. In it I described one of Lanier’s ideas: 3D printing of clothes at home, recycling yesterday’s wear for today’s fashion. There are all sorts of problems with this vision in terms of energy and waste, I argued. So I have to report on Eliza Strickland’s article in IEEE Spectrum this month on the company Shapeways. Strickland’s article features a picture of a 3D-printed gown produced by the company for a burlesque performer, Dita Von Teese. Yes, it’s plastic, but it really does look like (very porous) clothing.
But the real take-away from this article is that Shapeways may be on to a very smart niche in the world of 3D printing. They run their manufacturing services in New York, and the business model is that you send them your 3D designs, they jigsaw-puzzle your needs together with many other customers’ needs, just as a circuit-board manufacturer might do, and then they produce your design using high-end 3D printers that you would never have at home. This intermediate stage between mass manufacturing and home printing may just be a reasonable middle ground, where people exercise creative expression without the logistical tail and cost of actually manufacturing in their own home, from ordering raw materials to keeping the equipment running. Of course Shapeways adds more monetization paths by creating a community marketplace where you can sell your designs to others, and this leads to a sort of physical-world app store.