The Washington Post’s Craig Whitlock published an article yesterday, Crashes Mount as Military Flies More Drones in the U.S., that is fascinating as a read and also a useful catalyst for looking at how experts respond to the trope of domestic drones crashing across our country. What is interesting here is not so much the specific accident rate of drones- I suspect that, as a value, it will evolve until it is not that different from piloted aircraft- but the fact that they do, indeed, crash. And that, therefore, as we massively increase the number of flying aircraft thanks to the economies of scale applied to drones, we will also massively increase air-ground accidents.
My very favorite passage explaining this forthcoming robot smog follows:
Navy officials said the drone came no closer than 40 miles to the Capitol. Jamie Cosgrove, a Navy spokeswoman, said a software anomaly prevented the drone from flying its preprogrammed route in the event of a lost satellite link. The Navy denied a request from The Post for its investigative report on the incident.
I love the use of the phrase “software anomaly.” The correct term in engineering circles is: bug. Their software is buggy. As all software is; and this is the crux of the matter: when we distance the human from the control loop and replace human judgement with software, we ought to remember that all software- every last bit of it- has bugs. Forever. So our robot smog will not simply consist of well-functioning, autonomous flying craft. It will consist of flying, autonomous and buggy craft.
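To see how easily a “software anomaly” hides in exactly this kind of logic, here is a toy sketch of a lost-link failsafe like the one the Navy describes- a drone that should switch to a preprogrammed route when the satellite link goes quiet. Every name and threshold below is invented for illustration; this is not the Navy’s software, just a minimal example of the pattern:

```python
# Hypothetical lost-link failsafe sketch. All names and the timeout
# value are invented for illustration; nothing here reflects any
# real drone's flight software.

from dataclasses import dataclass

LOST_LINK_TIMEOUT_S = 5.0  # assumed threshold, purely illustrative


@dataclass
class DroneState:
    seconds_since_heartbeat: float  # time since last satellite message
    mode: str = "MANUAL"


def update_mode(state: DroneState) -> str:
    """Switch to the preprogrammed lost-link route when the
    satellite link has been silent for too long."""
    # Note the boundary: is it '>' or '>='? Does the heartbeat clock
    # reset correctly on a partial message? These tiny decisions are
    # precisely where a "software anomaly" lives.
    if state.seconds_since_heartbeat > LOST_LINK_TIMEOUT_S:
        state.mode = "LOST_LINK_ROUTE"
    else:
        state.mode = "MANUAL"
    return state.mode
```

A few dozen lines like these sit between a wandering drone and its preprogrammed route home; the Post’s incident suggests at least one such branch did not do what its authors believed it did.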
Enjoy your breakfast.