The Economist's January 10th 2015 issue has an article you can read online: Meet the market shapers: A new breed of high-tech economist is helping firms crack new markets. This is a fascinating read because it describes how real-time behavioral analysis of who you are as you shop can make your experience on a website very different from your friends'. Perhaps you linger, click slowly, scroll a bit. Automatic algorithms will use your lurking behavior, even your mouse-click speeds, to estimate ever more accurately whether you're a serious shopper or just a window-shopper. Then, depending on how you are categorized, you may get very efficient information to help you complete a purchase quickly, without the risk of being distracted or annoyed by ads. Or, if you are merely browsing, you may be assaulted by ads, because you're not going to buy anything right away anyway, and the system may as well experiment on you and try to "convert" you.
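As a rough illustration (and nothing more), that kind of real-time categorization might look like the toy sketch below. The signal names and thresholds here are invented for the example, not taken from the article; a real system would learn them from logged session data rather than hand-code them.

```python
# Toy sketch of real-time shopper classification from session signals.
# Feature names and thresholds are invented for illustration only.

def classify_shopper(dwell_seconds, clicks_per_minute, scroll_depth):
    """Score a session and bucket the visitor as 'buyer' or 'browser'.

    Fast, purposeful clicking and deep scrolling suggest purchase
    intent; slow lingering suggests window shopping.
    """
    score = 0.0
    if clicks_per_minute > 10:   # decisive, rapid clicking
        score += 0.4
    if dwell_seconds < 120:      # not lingering on the page
        score += 0.3
    if scroll_depth > 0.8:       # read the product details to the end
        score += 0.3
    return "buyer" if score >= 0.6 else "browser"

# The site then branches on the label: "buyer" sessions get the
# streamlined, ad-free checkout; "browser" sessions get the ads.
print(classify_shopper(90, 14, 0.9))    # a decisive session
print(classify_shopper(300, 2, 0.3))    # a lingering session
```

The point is not the particular rules but the branch at the end: one label buys you efficiency, the other buys you experiments.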
The article describes how machine learning lets such automatic policies optimize over time, experimenting on each of us with a variety of techniques and tweaks until finding strategies that tend to maximize profit across a variety of shopper archetypes. In the end, all these techniques are about shifting the power dynamic in favor of the Machines. They will collect information too vast for mere humans to analyze; they will sift through it, actively experiment on us, and sort us into separate, distinct classes. Then they will decide how to treat each of us. Will this sometimes be pathological, with machine learning nudging computers into categorizations and policies that are classist, racist, or wealth-biased? Sure! There's nothing stopping that. We have no first law of behavioral analytics for robots, no edict that equity must govern how people are treated by real-time, decision-making machines. Will we need such rules? Yes, doubtless, we eventually will. Otherwise machines will create echo chambers; machines will refine and optimize our behavior; and machines will turn us into automata, ever so gently.
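The experiment-and-optimize loop described above is, in spirit, a multi-armed bandit: keep trying treatments on visitors, keep score, and drift toward whatever converts best. Here is a minimal epsilon-greedy sketch of that loop; the treatment names and their hidden conversion rates are invented for the example, not drawn from the article.

```python
import random

def run_bandit(true_rates, rounds=20000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: mostly exploit the best-known treatment,
    occasionally experiment on a visitor with a random one."""
    rng = random.Random(seed)
    arms = list(true_rates)
    pulls = {a: 0 for a in arms}   # times each treatment was shown
    wins = {a: 0 for a in arms}    # conversions observed per treatment

    def observed_rate(a):
        return wins[a] / pulls[a] if pulls[a] else 0.0

    for _ in range(rounds):
        if rng.random() < epsilon:
            arm = rng.choice(arms)              # explore: experiment on this visitor
        else:
            arm = max(arms, key=observed_rate)  # exploit: best-known treatment
        pulls[arm] += 1
        if rng.random() < true_rates[arm]:      # did the visitor "convert"?
            wins[arm] += 1
    return max(arms, key=observed_rate)

# Hypothetical treatments with hidden conversion rates the system must discover.
best = run_bandit({"plain_page": 0.02, "banner_ads": 0.03, "retarget": 0.05})
print(best)
```

Notice what the loop never asks: whether the winning treatment is fair to the people it sorts. It optimizes profit, and only profit, which is exactly the gap a first law of behavioral analytics would have to fill.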