Friday, February 25, 2011
Ray Kurzweil and the 'singularity' hoax.
For the uninitiated, Ray Kurzweil is a fellow famous for making very bold technology predictions. He began in the 80s (not without criticism), and his revised predictions were recently picked up for Time magazine's cover article. There is also a film on his ideas titled "Transcendent Man".
When not popping supplements with the enthusiasm of Jack presented with magic beans (and with similar effectiveness), Mr Kurzweil spends his time musing about the power of computers versus that of the human brain, and has decided that computers will have the power of a human brain by 2020. A decision that still sells books. He has come to this conclusion only after making large and unexplained assumptions about the power of the human brain, correlating that with the advancing rate of computing power, and assuming the curve will continue uninterrupted.
Ray is a futurist who makes grand statements based on his assumptions, and they have been effective in generating publicity. He is, however, different from futurists such as Arthur C. Clarke, who predicted wide-scale revolutionary changes such as the Internet with stunning accuracy many decades in advance, and who wrote riveting stories that encapsulated his deep understanding of humanity and technology. Ray is to Mr Clarke as Deepak Chopra is to Brian Greene and the world of physics.
He is predicting that computers will be as powerful as brains by, or in, 2020. And that by 2045 machines will be so powerful our entire civilization will be fundamentally different, even to the point where man and machine are indistinguishable.
His major failure is in understanding how much we don't know about the human brain, and how little progress there has really been in computing. His extrapolations from ridiculously low-dimensional data sets into tediously basic "hockey-stick" (exponential) trends therefore give rise to predictions that are more fantasy than science fiction.
In the 64-year history of the transistor, the density at which transistors can be produced has been advancing exponentially, but you cannot read too much into that alone, for a very good reason.
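To get a feel for how dramatic 64 years of exponential growth is, here's a rough sketch. The two-year doubling period is an assumption for illustration (one common reading of Moore's law), not a measured figure:

```python
# Rough sketch of compounding under Moore's law, assuming a doubling
# every ~2 years (an illustrative assumption, not a measured rate).

years = 64
doubling_period = 2  # years per doubling (assumed)

doublings = years / doubling_period
growth_factor = 2 ** int(doublings)

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,}x growth")
# 32 doublings -> roughly 4,294,967,296x growth
```

Extrapolating a curve like that forward is exactly the move the rest of this post takes issue with: the compounding is real right up until it hits a physical limit.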
64 years isn't a lot of time. In 1908 the Model T Ford had a top speed of 72km/h; 64 years later, in 1972, the Porsche 917 had a top speed of 390km/h, more than a fivefold increase. 39 years on, however, our fastest cars manage less than 5% better (Bugatti Veyron: 407km/h). Of course safety, fuel consumption, and features improved, but top speed hit a limit (the energy needed to overcome the density of air versus the benefit of doing so). So what's the limit for transistors? According to Intel it will come around 2018, with the 16nm process and 5nm gates; at that point quantum effects ruin the party for everybody. That isn't to say there aren't solutions, of course, but the solutions may require completely new technologies that may not be subject to Moore's law at all.
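As a quick sanity check of the car figures (all speeds are the ones quoted above):

```python
# Verifying the top-speed comparison: huge gains for 64 years,
# then almost nothing for the next 39.

def pct_improvement(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

model_t, porsche_917, veyron = 72, 390, 407  # km/h

print(f"1908 -> 1972: {pct_improvement(model_t, porsche_917):.0f}% faster "
      f"({porsche_917 / model_t:.1f}x)")
print(f"1972 -> 2011: {pct_improvement(porsche_917, veyron):.1f}% faster")
# 1908 -> 1972: 442% faster (5.4x)
# 1972 -> 2011: 4.4% faster
```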
There is, of course, a lot more to information processing than the transistor: networking, storage, and power consumption, to name some other big ones. Work is going on now to build an exaflop computer, the fastest supercomputer ever devised. It is supposed to arrive in 2019, but most of the required technologies don't exist yet; it will also use as much power as a small town and cost hundreds of millions of dollars, and by any reasonable estimate it still won't match a human brain.
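To put the "power of a small town" claim in rough perspective, here's a back-of-the-envelope sketch. The 20MW draw is an assumed figure (a commonly discussed exascale design target of the era, not from the text), and the household draw and electricity price are round-number assumptions:

```python
# Back-of-the-envelope sketch of exascale power consumption.
# All three inputs are assumptions chosen for illustration.

power_mw = 20            # assumed machine power draw, megawatts
avg_household_kw = 1.2   # assumed average continuous household draw, kilowatts
price_per_kwh = 0.10     # assumed electricity price, USD

households = power_mw * 1000 / avg_household_kw
annual_cost = power_mw * 1000 * 24 * 365 * price_per_kwh

print(f"~{households:,.0f} households' worth of power")
print(f"~${annual_cost / 1e6:.0f}M per year in electricity alone")
```

Under those assumptions the machine draws power equivalent to well over ten thousand homes, before a single dollar of the hardware itself.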
This is far removed from Ray's 1999 prediction that a $1,000 computer would match the human brain by 2019.
In 1999, in the book "The Age of Spiritual Machines", Ray predicted that "The majority of text is generated with speech recognition software", and that "Translating telephones (where each caller is speaking a different language) are commonplace". This has not happened; the best we have is thousands of computers pre-calculating translations based on specific algorithms, and the results are poor compared to a human. IBM has a single system that can barely respond accurately to our language (only a small subset of what the brain does), and it still takes up a room.
The failing here has not been technology; it has been in understanding the complexity of the problem.
It's pointless to attempt predictions such as these when nobody actually knows how the brain really works; we therefore have no accurate models, let alone the processing power to test them.
Additionally, our computing hardware is so fundamentally far removed from how a brain is built that it's an apples-to-rocket-ships comparison.
Neurons and synapses do not work like transistors, so the short-term density increases of the latter are likely irrelevant to the former. Experts in the fields his opinions overlap (though his credentials do not) disagree with him on how close we are to his predictions, because they have a reasonably good idea of what they are yet to learn. For example, we've recently discovered evidence that neurons not only communicate via axons and synapses (the 'wires') but may also be listening in on their peers through a weak extracellular electric effect ('wireless').
And, I think most importantly, it's likely that the brain is more complex than we are even predicting right now. To elaborate: I take exception to plotting 64 years of transistor advances in FLOPS against 550 million years of evolution and concluding we're definitely on the right track.
In that same time span as the history of the transistor we've gone from thinking cells were basic and dull to finding hundreds of molecular machines within them, performing a range of complex tasks rivaling any high-tech factory; we've come to understand that quantum wave functions potentially exist in certain biological systems; and we've found that bacteria can communicate with their own, and other, bacterial species through molecular "languages".
It's quite amazing that in the 64 years since the invention of the transistor we've not fundamentally changed computing at all. It's become more dense and power-efficient, but not different: it's exactly the same system of pushing electrons around.
However, in the same time we've discovered things about the brain and other biological systems that have blown us away. Think about this: in 1996 Robert Birge (Syracuse University) estimated that the memory capacity of the brain was most likely around 3 terabytes. Some then bumped that up to 6TB, then 10TB around 2006, then 1,000TB a year or two later; Scientific American suggested 2.5 petabytes in early 2010. We've also got estimates saying that if storage takes place at the molecular level we're into the 3.6 × 10^19 byte range: 31 exabytes.
Our best guesses about the brain's capacity have jumped roughly 12,000-fold in the space of a decade, and we still don't know how the thing works. That's a greater rate of change than transistor density by a large margin.
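The arithmetic behind that jump works out like this (using binary units, i.e. 1PB = 2^50 bytes, which is the convention that matches the "31 exabytes" figure above):

```python
# Checking the brain storage-estimate jump: 2.5 petabytes
# (Scientific American, 2010) versus the molecular-level estimate.

PB = 2 ** 50
EB = 2 ** 60

molecular_estimate = 3.6e19   # bytes, the molecular-level figure
sciam_estimate = 2.5 * PB     # Scientific American's 2.5 PB

print(f"{molecular_estimate / EB:.0f} exabytes")
print(f"{molecular_estimate / sciam_estimate:,.0f}-fold jump over 2.5 PB")
# 31 exabytes
# 12,790-fold jump over 2.5 PB
```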
There was a good exchange that sums this up quite well, in an interview between Stephen Colbert and the neuroscientist David Eagleman:
David: "In a square millimeter of brain matter there are more connections than there are stars in the Milky Way."
I must also point out another recent advance: engineering E. coli to store (and encrypt) data at a density of 900TB per gram of bacteria. It's not even optimized for storage, whereas the human brain clearly is.
Ray needs to look at the advancements in our understanding of biological systems much more than at sophomoric extrapolations from Moore's law (which isn't a physical law, of course, but an observation confined to a fledgling industry).