Will machines outsmart man?

Scientists believe the point of 'Singularity' – where artificial intelligence surpasses that of humans – is closer than we thought

Ray Kurzweil at a conference, as a hologram. Photograph: Ed Murray/Corbis

They are looking for the hockey stick. Hockey sticks are the shape technology startups hope their sales graphs will assume: a modestly ascending blade, followed by a sudden turn to a near-vertical long handle. Those who assembled in San Jose in late October for the Singularity Summit are awaiting the point where machine intelligence surpasses that of humans and takes off near-vertically into recursive self-improvement.
The key, said Ray Kurzweil, inventor of the first reading machine and author of 2005's The Singularity Is Near, is exponential growth in computational power: "the law of accelerating returns". In his favourite example, at the human genome project's initial speed, sequencing the genome should have taken thousands of years, not the 15 scheduled. Seven years in, the genome was just 1% sequenced; exponential acceleration had the project finished on schedule, because 1% is only seven doublings short of 100%. By analogy, enough doublings in processing power will close today's vast gap between machine and human intelligence.
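A minimal sketch of that doubling arithmetic (the strictly annual doubling rate and the year-seven starting point are idealised assumptions drawn from the example above, not data from the project):

```python
# Kurzweil's genome arithmetic as a loop: if sequencing coverage doubles
# every year (an assumed, idealised rate), a project that is 1% done at
# year 7 reaches 100% around year 14 -- close to the 15-year schedule.
coverage = 0.01  # 1% of the genome sequenced after seven years
year = 7
while coverage < 1.0:
    coverage *= 2
    year += 1
    print(f"Year {year}: ~{min(coverage, 1.0):.0%} sequenced")
```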
This may be true. Or it may be an unfalsifiable matter of faith, which is why the singularity is sometimes satirically called "the Rapture for nerds" – and why progress towards it is hard to assess. Justin Rattner, chief technology officer of Intel, addressed a key question at the summit: can Moore's law, which has the number of transistors packed on to a chip doubling every 18 months, stay in line with Kurzweil's graphs? Moore's law's end has been predicted many times but, Rattner said, although particular chip technologies have reached their limits, a new paradigm has always continued the pace.
"In some sense - silicon gate CMOS - Moore's law ended last year," Rattner said. "One of the founding laws of accelerating returns ended. But there are a lot of smart people at Intel and they were able to reinvent the CMOS transistor using new materials." Intel is now looking beyond 2020 at photonics and quantum effects such as spin. "The arc of Moore's law brings the singularity ever closer."
Judgment day
Belief in an approaching singularity is not solely American. Peter Cochrane, the former head of BT's research labs, says that for machines to outsmart humans it "depends on almost one factor alone: the number of networked sensors. Intelligence is more to do with sensory ability than memory and computing power." The internet, he adds, overtook the capacity of a single human brain in 2006. "I reckon we're looking at the 2020 timeframe for a significant machine intelligence to emerge," he says. "By 2030 it really should be game over."
Predictions like these flew at the summit. Imagine when a human-scale brain costs $1: you could have a pocket full of them. The web will wake up, like Gaia. Nova Spivack, founder of EarthWeb and, more recently, Radar Networks (creator of Twine.com), quoted Freeman Dyson: "God is what mind becomes when it has passed beyond the scale of our comprehension."
Listening, you'd never guess that artificial intelligence has been about 20 years away for a long time now. John McCarthy, one of AI's fathers, thought, when he convened the first conference on the subject in 1956, that they'd be able to wrap the whole thing up in six months. McCarthy calls the singularity, bluntly, "nonsense".
Even so, there are many current technologies, such as speech recognition, machine translation and IBM's grandmaster-beating chess computer Deep Blue, that would have seemed like AI at the field's beginning. "It's incredible how intelligent a human being in front of a connected computer is," observed the CNBC reporter Bob Pisani, marvelling at how clever Google makes him sound to viewers phoning in. Such advances are a reminder that even the wildest ideas may be worth attempting for the valuable discoveries made along the way.


http://www.guardian.co.uk/technology...ai-engineering