Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the end of time, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned at Sotheby's in 1936 to two private buyers, but were not available for public research until the 1990s.

Tuesday, October 5, 2010

Historical First: Atom Photographed, Quantum Computers Steps Away

A picture of a single rubidium-85 atom (2010). Image: M. Andersen/T. Grünzweig/A. Hilliard/M. McGovern/U. of Otago.

There is a story out at Physorg (here; hat tip: Lee Hamilton's blog) about scientists at the University of Otago who have managed to hold a rubidium-85 atom in a laser beam and photograph it through a microscope.  Scientists, led by Mikkel Andersen of the Department of Physics, laboured for three years to take this photo.  Andersen has given an interview to NPR, which you can listen to here.

By proving they can trap individual atoms, the Otago team is steps away from creating quantum computing circuits of communicating atoms, which could perform complex calculations simultaneously at the atomic level.  Andersen: "What we have done moves the frontier of what scientists can do and gives us deterministic control of the smallest building blocks in our world."  Earlier developments in this field had already prompted physicist Michio Kaku to predict that Silicon Valley will become a rust belt within the next twenty years (in a 2008 interview here).  Kaku uses Moore's Law to predict how quickly computing will evolve in the next generation.  He estimates that by about 2030 we could build machines that run at the speed of human thought, that is, 500 trillion bytes per second.  But right now our most advanced computers have the intelligence of a "retarded cockroach."
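Kaku's 2030 figure can be sanity-checked with a back-of-the-envelope Moore's Law extrapolation.  The sketch below is purely illustrative and rests on my own assumptions, not the article's: a baseline of roughly 10^11 operations per second for high-end 2010 hardware, and a doubling period of 18 months.

```python
# Naive Moore's Law extrapolation (illustrative only).
# Assumptions (mine, not from the article or Kaku's interview):
#   - baseline: ~1e11 operations/second for a high-end chip in 2010
#   - compute doubles every 18 months (1.5 years)

def projected_ops_per_second(year, base_year=2010, base_ops=1e11,
                             doubling_period_years=1.5):
    """Project raw compute by simple exponential doubling."""
    doublings = (year - base_year) / doubling_period_years
    return base_ops * 2 ** doublings

for year in (2010, 2020, 2030):
    print(f"{year}: ~{projected_ops_per_second(year):.1e} ops/sec")
```

Under these assumptions the 2030 projection lands around 10^15 operations per second, which is at least the same order of magnitude as the 500-trillion-per-second figure quoted above; the point is only that naive doubling gets you into that ballpark, not that either number is a reliable forecast.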

A couple of years ago, I read a University of Michigan report (by Dr. James Duderstadt, Emeritus Professor) on projected developments in engineering and computing design.  Duderstadt hints at what these thinking computers could look like (p. 15):
"Over ... two decades, we will evolve from 'giga' technology (in terms of computer operations per second, storage, or data transmission rates) to 'tera' to 'peta' and perhaps even 'exa' technology (one billion billion ...). To illustrate with an extreme example, if information technology continues to evolve at its present rate, by the year 2020, the thousand-dollar notebook computer will have a data-processing speed and memory capacity roughly comparable to the human brain (Kurzweil, 1999). Furthermore, it will be so tiny as to be almost invisible, and it will communicate with billions of other computers through wireless technology."
Imagine these things, the size of bedbugs, flying around, communicating with each other, and as capable as humans in terms of their ability to process information.  Sounds like we will need to stock up on fly swatters.  Duderstadt discusses the Technological Singularity predicted by Ray Kurzweil and others (p. 22):
"John von Neumann once speculated, 'The ever accelerating progress of technology and changes in the mode of human life gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.' At such a technological singularity, the paradigms shift, the old models must be discarded, and a new reality appears, perhaps beyond our comprehension. Some futurists such as Ray Kurzweil and Vernor Vinge have even argued that as early as this century we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence. For example, as digital technology continues to increase in power a thousand-fold each decade, at some point computers (or large computer networks) might 'awaken' with superhuman intelligence. Or biological science may provide the means to improve natural human intellect (Kurzweil, 2005)."
I hope 2030 won't be like one of those frog-in-a-pot-of-slowly-heating-water scenarios, where we won't notice what deep trouble we're in because we'll have gotten used to things (relatively) slowly.  We also have to consider that everyone born in 1990 or later basically has no conscious memory of the pre-personal computer era.  By 2030, the oldest of this group of Millennials will be forty years old.  They will reach mature adulthood just as this predicted Technological Singularity hits.  I hope they'll know how to deal with it, because some of it won't be pretty.

In the meantime, here is an eye-opener for the week if you haven't seen it already.  In 2004-2005, a group of researchers in Dallas, Texas created an android of Philip K. Dick that could hold basic conversations.  A short-lived blog about the project is here.  The main site is here.  There are photos of the android here.

This is what happens if your sci-fi stories about androids become too popular. The PKD android. Image: PKD Android Project.

The android has appeared at several venues, such as the NextFest exhibition and the San Diego Comic Convention.  Wiki: "The android of Philip K. Dick was included on a discussion panel in a San Diego Comic Con presentation about the [2006] film adaptation of the novel, A Scanner Darkly."  Is it my imagination, or does this look like a bad idea?  Imagine what will happen after the Singularity hits, and the people with an inclination to design these things really get going.  I still remember one of Dick's stories, written in an apocalyptic future after a big war caused by androids.  Humans have regressed to a simple, agrarian society, and a boy finds an android head, still sentient after many years, in a forbidden, bombed-out zone.  It convinces him to partly repair it.  As I recall, when the boy mentions that it is a robot, the machine replies, "I am not a robot. I am an android."  That line, with the innocent human child unknowingly helping the malevolent last android power up again, was as chilling as any line I've seen in Dick's writing.
