"Vernor Vinge - Technological Singularity" - читать интересную книгу автора (Vinge Vernor)


It's fair to call this event a singularity ("the Singularity" for the
purposes of this piece). It is a point where our old models must be
discarded and a new reality rules, a point that will loom vaster and vaster
over human affairs until the notion becomes a commonplace. Yet when it
finally happens, it may still be a great surprise and a greater unknown.
In the 1950s very few saw it. Stan Ulam [1] paraphrased John von Neumann as
saying:

One conversation centered on the ever-accelerating progress of
technology and changes in the mode of human life, which gives the
appearance of approaching some essential singularity in the history of the
race beyond which human affairs, as we know them, could not continue.

Von Neumann even uses the term singularity, though it appears he is
thinking of normal progress, not the creation of superhuman intellect.
(For me, the superhumanity is the essence of the Singularity. Without that
we would get a glut of technical riches, never properly absorbed.)
The 1960s saw recognition of some of the implications of superhuman
intelligence. I. J. Good wrote:

Let an ultraintelligent machine be defined as a machine that can far
surpass all the intellectual activities of any man however clever. Since
the design of machines is one of these intellectual activities, an
ultraintelligent machine could design even better machines; there would
then unquestionably be an "intelligence explosion," and the intelligence of
man would be left far behind. Thus the first ultraintelligent machine is
the last invention that man need ever make, provided that the machine is
docile enough to tell us how to keep it under control. . . . It is more
probable than not that, within the twentieth century, an ultraintelligent
machine will be built and that it will be the last invention that man need
make.

Good has captured the essence of the runaway, but he does not pursue
its most disturbing consequences. Any intelligent machine of the sort he
describes would not be humankind's "tool" -- any more than humans are the
tools of rabbits, robins, or chimpanzees.

Through the sixties, seventies, and eighties, recognition of the
cataclysm spread. Perhaps it was the science-fiction writers who felt the
first concrete impact. After all, the "hard" science-fiction writers are
the ones who try to write specific stories about all that technology may do
for us. More and more, these writers felt an opaque wall across the
future. Once, they could put such fantasies millions of years in the
future. Now they saw that their most diligent extrapolations resulted in
the unknowable . . . soon. Once, galactic empires might have seemed a
Posthuman domain. Now, sadly, even interplanetary ones are.

What about the coming decades, as we slide toward the edge? How will
the approach of the Singularity spread across the human world view? For a