Tactical, Operational & Strategic Analysis of Markets, Competitors & Industries
Slaves to the algorithm: Computers could take some tough choices ou..., by Steven Poole. Aeon Magazine, 13 May 2013.
When Garry Kasparov lost his second match against the IBM supercomputer Deep Blue in 1997, people predicted that computers would eventually destroy chess, both as a contest and as a spectator sport. Chess might be very complicated but it is still mathematically finite. Computers that are fed the right rules can, in principle, calculate ideal chess variations perfectly, whereas humans make mistakes. Today, anyone with a laptop can run commercial chess software that will reliably defeat all but a few hundred humans on the planet. Isn’t the spectacle of puny humans playing error-strewn chess games just a nostalgic throwback?
Such a dismissive attitude would be in tune with the spirit of the times. Our age elevates the precision-tooled power of the algorithm over flawed human judgment. From web search to marketing and stock-trading, and even education and policing, the power of computers that crunch data according to complex sets of if-then rules is promised to make our lives better in every way. Automated retailers will tell you which book you want to read next; dating websites will compute your perfect life-partner; self-driving cars will reduce accidents; crime will be predicted and prevented algorithmically. If only we minimise the input of messy human minds, we can all have better decisions made for us. So runs the hard sell of our current algorithm fetish.
But in chess, at least, the algorithm has not displaced human judgment. The imperfectly human players who contested the last round of the Candidates’ Tournament — in a thrilling finish that, thanks to unusual tiebreak rules, confirmed the 22-year-old Norwegian Magnus Carlsen as the winner, ahead of former world champion Vladimir Kramnik — were watched by an online audience of 100,000 people. In fact, the host of the streamed coverage, the chatty and personable international master Lawrence Trent, pointedly refused to use a computer engine (which he called ‘the beast’) for his own analyses and predictions. The idea, he explained, is to try to figure things out for yourself. During a break in the commentary room on the day I was there, Trent was eating crisps and still eagerly discussing variations with his plummily amusing co-presenter, Nigel Short (who himself had contested the World Championship against Kasparov in 1993). ‘He’ll find Qf4; it’s not difficult to find,’ Short assured Trent. ‘Ng8, then it’s…’ ‘It’s game over.’ ‘Game over!’
Chess is an Olympian battle of wits. As with any sport, the interest lies in watching profoundly talented humans operating at the limits of their capability. There does exist a cyborg version of the game, dubbed ‘advanced chess’, in which humans are allowed to use computers while playing. But it is profoundly boring to watch, like a contest over who can use spreadsheet software more effectively, and hasn’t caught on. The ‘beast’ can be a useful helpmeet — Veselin Topalov, a previous challenger for Viswanathan Anand’s world title, used a 10,000-CPU monster in his preparation for that match, which he still lost — but it’s never going to be the main event.
This is a lesson that the algorithm-boosters in the wider culture have yet to learn. And outside the Platonically pure cosmos of chess, when we seek to hand over our decision-making to automatic routines in areas that have concrete social and political consequences, the results might be troubling indeed.
This article is a decent summary of current "big data" skepticism.
Daniel Suarez (aka author Leinad Zeraus)
“Daemon: Bot-mediated Reality”
The Long Now Foundation
This talk was given at Cowell Theatre in Fort Mason Center in San Francisco, California on Friday August 8, 2008
The viral success story of the year is a techno-thriller called Daemon. Software developer Suarez printed the book himself after being turned down by mainstream publishers. Blog raves, Amazon raves, and a brief item in Wired magazine deservedly turned the book into a runaway hit.
In this presentation, his first on the subject, Suarez spells out the ideas behind Daemon and its forthcoming sequel, Freedom™: "'Bots' are simple software programs designed to automate tasks -- such as finding, retrieving, or acting upon information. Bots set loose on the Internet have been the catalyst behind many revolutionary Web 2.0 technologies. However, the unintended consequences of activating millions of bots in our networks -- bots that wield increasing influence over the activities and opportunities of human beings -- may have serious consequences for society."
I can't recommend Daniel Suarez's novel Daemon and its sequel Freedom™ enough -- great reads. This talk gets into the underlying concepts the books are built on.