There is a pair of articles in the current Economist issue well worth reading: The People’s Panopticon and Every Step You Take. These are excellent primers on where personal surveillance technology may be taking us. Panopticon starts with the concept of life logging, then introduces more dystopian directions. Sure, small-form-factor surveillance has clear advantages, limiting liability for the likes of rental car agencies, police departments and many, many others. But, as always, there is a price that becomes increasingly hard to assess when you marry ubiquity with intelligent real-time analysis and permanent record-keeping. A camera on an elderly person can prompt conversations with loved ones; soon those conversations are not really between two parties but three: a son or daughter; an older father or mother with serious dementia; and an AI system prompting the parent in just the right ways to create a reasonable-sounding conversation between the two humans in the loop. Identity enters uncharted territory on this ride.
As I have argued, ubiquity also melts the off-line world into the ether of the on-line world. As the Economist writes: “Head-mounted screens would let people spend time on-line that would previously have been off-line.” One particularly unnerving patent application discussed in the article is straight from New Mediocracy: a camera that looks outward at the billboards a user sees, then reads the emotional response on the user’s face. The magazine notes that today’s head-mounted cameras only look outward, so this is not really possible. Well…that’s not entirely true. Research projects are already looking both out and in simultaneously, for instance to determine what a driver sees by studying, in wide angle, what is visible in front of the driver, and by looking back at the pupil to see just what the driver is focusing on. First-person Vision is a good example of such a project developing real hardware at my own institution.
Both this article and its sister piece, Every Step You Take, treat face recognition technology as an interesting line in the sand. It is curious that we are stuck on face recognition specifically, when there are so many aspects to privacy, from behavior to license plates to gestures (the list is never-ending), that demand broader thoughtfulness about whether and how we wish our physical world and on-line databases to stay somehow separable. At the end of the day, we as societies need to decide whether everything will eventually be searchable or not. Of course, along the way we will begin to notice just who profits from such searchability, and I strongly suspect there will be little justice in that aspect of the equation.
I have written a good bit about chronic underemployment; and now, to show there is such a thing as too much of a robot thing, thanks to Jason Campbell, I bring you the latest BBC story on robotic shepherds: Robot used to round up cows is a hit with farmers. Rover is a four-wheeled all-terrain robot that looks like it belongs in a Mars yard, except for the fact that, in the BBC shots, it is surrounded by dairy cattle. The research team was studying robot-cow interactions, and was excited to find that the very presence of the cow does not send the robot stampeding away. Maybe I wrote that backwards.
The story imagines a future in which robots monitor cows that are about to give birth, and perhaps even inspect the farm for various problems, such as electric fence faults and other issues. It is true that massive milk herds are serviced by stunningly expensive roundhouse dairy systems that subtract all humanity out of the milking process for the sake of throughput and efficiency. However, I have personally spent a fair bit of time on a dairy farm, and I must say that, given just how hostile an environment it can be, with mud and other wet spots plentifully distributed, hay everywhere, tight quarters and all weather conditions, I think I can safely say that the day when robots displace further jobs on these farms is remote.
Some robots aren’t really about to threaten the fabric of society just yet; this is one of them.
I wrote a review of Jaron Lanier’s book, Who Owns the Future?, earlier this year. In it I described one of Lanier’s ideas: 3D printing of clothes at home, recycling yesterday’s wear into today’s fashion. There are all sorts of problems with this vision in terms of energy and waste, I argued. So I have to report on Eliza Strickland’s article on the company Shapeways in IEEE Spectrum this month. Strickland’s article features a picture of a 3D-printed gown produced by the company for the burlesque performer Dita Von Teese. Yes, it’s plastic, but it really does look like (very porous) clothing.
But the real take-away from this article is that Shapeways may be on to a very smart niche in the world of 3D printing. They have their manufacturing services in New York, and the business model is that you send them your 3D designs, they jigsaw-puzzle your needs together with many other customers’ needs, much as a circuit-board house combines many customers’ designs into one run, and then they produce it all using high-end 3D printers that you would never have at home. So this intermediate stage between mass manufacturing and home printing may just be a reasonable middle ground, where people exercise creative expression without the logistical tail and cost of actually doing manufacturing in their own home, from ordering raw materials to keeping the equipment running. Of course Shapeways adds more monetization paths by creating a community marketplace where you can sell your designs to others, and this leads to a sort of physical-world app store.
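The jigsaw-puzzling at the heart of this business model is, at bottom, a packing problem: many customers’ parts get consolidated into shared build runs so expensive printer time is never wasted. Here is a minimal, hedged sketch of that idea in Python; the greedy first-fit strategy, the tray capacity and the one-dimensional “footprint” numbers are all my own illustrative simplifications, not anything Shapeways has published.

```python
# Sketch: first-fit batching of customer parts into fixed-size build
# trays, the flavor of consolidation a shared 3D-printing service does.
# Part "footprints" are simplified to one dimension (area in cm^2);
# real nesting is a much harder 2D/3D packing problem.

def batch_orders(parts, tray_area=400.0):
    """Greedy first-fit: place each part in the first tray with room."""
    trays = []  # each tray is a list of (name, area) tuples
    for name, area in sorted(parts, key=lambda p: -p[1]):  # big parts first
        for tray in trays:
            if sum(a for _, a in tray) + area <= tray_area:
                tray.append((name, area))
                break
        else:
            trays.append([(name, area)])  # open a new tray: a new print run
    return trays

orders = [("gown panel", 180.0), ("phone case", 60.0),
          ("pendant", 15.0), ("bracket", 250.0), ("gear", 90.0)]
runs = batch_orders(orders)
print(len(runs), "print runs for", len(orders), "parts")  # → 2 print runs for 5 parts
```

Five separate home print jobs collapse into two shared machine runs, which is exactly the economics that make the intermediate niche plausible.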
Samuel Gibbs at The Guardian reports on the creation and demonstration of a metal 3D-printed pistol, made by Solid Concepts, that is capable of firing 50 rounds and doing so with far greater accuracy than the plastic guns reported earlier. What I find interesting about this article is that, as the ecology of possible manufacturing technologies blossoms, our ability to distinguish between expensive, conventional manufacturing that is generally unavailable at home and maker-movement-style home hacking will blur in confusing ways. In this case, the “3D printing” is laser sintering, and the machine is (at least for now) too expensive for any of our basements. But as the arc of manufacturing bends toward cheaper, smaller and simpler to use, we can start to see anything built as a harbinger of what our neighbors may build tomorrow, or maybe next year. When I look at the box of metal parts from which this gun is assembled, what I am struck by is not the maker quality of the process, but rather the degree to which the parts look just like what conventional assembly lines would be producing. A genuine question worth reflection is just how different forms of manufacturing will shake out in the near future: what will be built quick and dirty, perhaps disposable, and what will be made with expensive equipment that individuals will not own or rent? And what is the strange interstitial space we have not yet explored?
Nicholas Carr has an in-depth piece, The Great Forgetting, in this November’s Atlantic. This article dives into the question of just what happens when we put ever-greater trust in automation that may be safer than we are, but not totally, completely, one hundred percent safe. The article starts with stories about airplane autopilots, and these are indeed excellent examples of what I call Adjustable Autonomy in Robot Futures. The interesting thesis is that increased reliance on automation has the unfortunate side effect of leading to disastrous human error when humans do, in fact, need to take over. Whether from lack of practice or a paucity of situation awareness, the human’s relationship to control changes when we are not really in charge, but just the understudy who might be called up at a moment’s notice. At one point Carr notes that one posited solution is to irregularly force the human to do the work. Imagine the balanced self-driving car: every few days, it throws up its robo hands and announces: now, you drive!
A very useful trope that is an important contextual consideration is what Carr defines as the substitution myth. The basic idea is that we caricature automation as perfectly replacing some human activity with machine dexterity, when in fact the human is not simply freed by the automation. Our work doesn’t disappear; it is replaced by new work, because the nature of our relationship to the newly “autonomous” system changes. In other words, we live in a complex system. Automation doesn’t replace humans; it simply redefines our responsibilities and accountability in complex and hard-to-predict ways.
Of course, as Carr points out, one theoretical solution to the autopilot (read: automation in general) problem is, if people make errors when automation sometimes turns control over to people, just make the machine be in charge absolutely all the time. I become interested in just what happens to society when we hew towards this vision but, along the way, discover that 100% is a bit harder to reach than we first thought. What interesting dystopian situations face us then?
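That “irregular handover” remedy is simple enough to caricature in a few lines of code. The sketch below, entirely my own illustration with an invented handover rate, shows the basic adjustable-autonomy loop: the machine drives almost always, but with small random probability the human is forced back behind the wheel to stay in practice.

```python
# Caricature of the irregular-handover idea: an adjustable-autonomy
# loop that, with small random probability, hands control back to the
# human so their skills don't atrophy. The 5% handover rate is an
# invented, illustrative parameter, not anything from Carr's article.
import random

def choose_controller(rng, handover_prob=0.05):
    """Return who drives this trip: usually the machine, sometimes you."""
    return "human" if rng.random() < handover_prob else "machine"

rng = random.Random(0)  # seeded so the sketch is reproducible
trips = [choose_controller(rng) for _ in range(1000)]
print("human-driven trips:", trips.count("human"), "of", len(trips))
```

The design question hiding in that one parameter is exactly Carr’s: how much deliberate inefficiency are we willing to schedule into an automated system purely to keep its human understudies competent?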
Thanks to Jonathan Rotner for pointing me toward H. James Wilson’s Wall Street Journal article on analytics in the real world, Wearable Gadgets Transform How Companies Do Business. Wilson’s piece is a fascinating example of mediocracy: the kind of data collection we have come to expect, and companies have come to exploit, on the Internet is making its gradual debut in the physical world beyond the net. Wilson features the Hitachi Business Microscope, which hangs on a lanyard around the neck just like a conference badge, evaluating whom you talk to, how often you gesture and essentially how you surf through and interact with the real world, just as cookies might uncover such behavior on the Internet. Network analysis yields fruit concerning whom you interact with in your corporation, and whom you seem to ignore. Do you interact more excitedly with some colleagues and become terse with other team members? This is reminiscent of the data collection that would make the Circle, the company of Dave Eggers’s novel, snap up the new device in the interests of behavioral transparency. Wilson also mentions smart glasses, Google Glass-like devices that become your (third?) eyes, ensuring you pick up the right package and don’t fail to look at the right spot when doing your job. The data mining that all this could afford boggles the mind, and of course it has the benefit and danger of identifying and eliminating outlier behavior, making our society perhaps more boring and certainly more predictable.
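The network analysis such a badge enables is not exotic; a few lines of tallying over a contact log already answer “whom do you talk to, and whom do you ignore?” The sketch below is my own illustration: the names, the log and the thresholds are invented, and a real system would of course mine far richer signals.

```python
# Hedged sketch of badge-style interaction analysis: from a log of
# face-to-face proximity events, tally contact frequency for one
# wearer and flag team members who never appear at all. All data
# here is invented for illustration.
from collections import Counter

contact_log = [  # (wearer, colleague) pairs from badge proximity events
    ("ana", "ben"), ("ana", "ben"), ("ana", "carol"),
    ("ana", "ben"), ("ana", "dev"),
]
team = {"ben", "carol", "dev", "erin"}

counts = Counter(colleague for _, colleague in contact_log)
ignored = sorted(team - set(counts))  # team members never contacted

print("most contacted:", counts.most_common(1)[0])  # → ('ben', 3)
print("never contacted:", ignored)                  # → ['erin']
```

Note where the answer flows: if this tally came back only to the wearer, it could prompt reflection; piped to an employer, the very same few lines become a surveillance instrument.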
One of the themes I bring up repeatedly in Robot Futures regards power relationships. These devices could be wondrous in prompting mindful reflection, if the data were to come right back to the wearer. But when the data flows to the employer, or some outside social analytics consultancy, then the word of the day is likely to be disempowerment. How will we use these new technologies to improve feelings of self-worth rather than gnaw away at personal autonomy, uniqueness and personal control? There’s the challenge.
Cora Currier’s article from the Drones and Aerial Robotics Conference is a fabulous collection of rhetorical samples from all sides of the great drone debate. Some of my favorites: ‘drones are not responsible;’ ‘they just follow orders, they do things autonomously, they die’ (that one is really great. they follow orders and they do things autonomously? awesome); oh, and ‘they can help accompany a family on vacation in Hawaii.’ Really? We really will have the family drone packed up in our luggage so that when we are walking on a Hawaiian beach it’s reassuringly there, buzzing above our heads?
‘It is transformative technology, but not the way people think.’ So there is more nuance to this drone concept than we write about and talk about? Interesting. One philosopher says drones represent a new ontology of social being, and that we will get over the concern in the future just as our forebears got over their anxiety regarding elevators. Elevators?
I am really impressed by the diversity and absurdity of commentary in the article. Reading the article is truly educational if only to demonstrate that this space is utterly uncrystallized. Even our experts are groping, trying to understand how to talk about drones and trying to understand just what autonomous flying robots really represent. This is a rich time for serious discourse, and I do believe accountability, robot smog and power relationships will see profound conceptual shifts as we struggle to understand and live with our impending drone future. Truly, these are not elevators, guns or cars. Those comparisons fall flat in the face of what we will all see in the near future.