Sunday, April 10, 2005

Off-site Essay: The Technological Singularity

Vernor Vinge is a computer scientist and a science fiction author of some renown. Given this, it shouldn't be a surprise that he spends a lot of his time thinking about the future.

One of the things that Vinge noticed while he was doing this was that the pace of change appeared to be increasing at an exponential rate. He found that the horizon of the future, meaning that point in time beyond which we couldn't reasonably extrapolate, was drawing nearer and nearer to the present. This led him to suspect that, in the relatively near future, the pace of change would be so rapid that the future would, in a sense, collide with the present.

He called this point in time The Singularity and identified it with the development of greater-than-human intelligence (>H for short). He reasoned that once you had an intelligence that was >H, this intelligence would, in turn, be able to produce an even greater intelligence: >(>H). This would be followed by an expanding series of intelligences, with the endpoint being an intelligence (or intelligences) so far beyond the scope of human understanding as to be literally incomprehensible. This would also, incidentally, represent the point where human history becomes superfluous (and if that sounds a bit ominous, maybe it should).
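The runaway quality of that argument comes from a simple bit of arithmetic: if each generation of intelligence can build its successor faster than the last, the steps shrink and the whole cascade finishes in finite time. Here's a toy sketch of that reasoning (my illustration, not Vinge's own math), assuming each generation is a fixed factor smarter and works proportionally faster:

```python
# Toy model of recursive self-improvement (an illustration, not Vinge's math).
# Assumption: each generation builds a successor a fixed factor smarter,
# and a smarter designer finishes its step proportionally faster.

def time_to_singularity(factor=2.0, first_step=1.0, steps=50):
    """Sum the (geometrically shrinking) time each generation needs."""
    total, step = 0.0, first_step
    for _ in range(steps):
        total += step
        step /= factor  # a 2x-smarter designer finishes in half the time
    return total

print(time_to_singularity())  # converges toward 2.0: a finite "singularity" date
```

The point of the sketch is just that the sum converges: under these (very generous) assumptions, the cascade doesn't merely speed up, it completes by a definite date. Of course, if each step got *harder* rather than easier, the series could diverge and there'd be no singularity at all, which is roughly where the skeptics' objection lands.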

A pretty wild idea, no? A lot of people think so. Many think it's too wild: the product of a hyperactive imagination. I have, personally, gone back and forth. My current opinion is that an eventual singularity is plausible, and perhaps even likely (though not necessarily inevitable), but that Vinge's timeline is exceptionally optimistic (supposing, of course, that we want a singularity). I sincerely doubt that we'll see the advent of >H in my lifetime.

You, however, can be the judge. Today's off-site essay is a link to his original paper on the subject, which was presented at a NASA symposium back in 1993.
