NOTES AND REFERENCES
- References for Chapter 1
- References for Chapter 2
- References for Chapter 3
- References for Chapter 4
- References for Chapter 5
- References for Chapter 6
- References for Chapter 7
- References for Chapter 8
- References for Chapter 9
- References for Chapter 10
- References for Chapter 11
- References for Chapter 12
- References for Chapter 13
- References for Chapter 14
- References for Chapter 15
- References for Afterword, 1985
- References for Afterword, 1990
... Engines of Construction* ... The
ideas in this chapter rest on technical arguments presented in my paper
"Molecular Engineering: An Approach
to the Development of General Capabilities for Molecular Manipulation"
(Proceedings of the National Academy of Sciences (USA), Vol.
78, pp. 5275-78, 1981), which presents a case for the feasibility of designing
protein molecules and developing general-purpose systems for directing molecular
assembly.
... "Protein engineering* ... represents..."
See "Protein Engineering," by Kevin Ulmer (Science,
Vol. 219, pp. 666-71, Feb. 11, 1983). Dr. Ulmer is now the director of the
Center for Advanced Research in Biotechnology.
... One dictionary* ... The
American Heritage Dictionary of the English Language, edited
by William Morris (Boston: Houghton Mifflin,
1978).
... modern gene synthesis machines* ...
See "Gene Machines: The Second Wave," by Jonathan B. Tucker (High
Technology, pp. 50-59, March 1984).
... other proteins* serve basic mechanical functions
... See Chapter 27 of Biochemistry, by Albert L. Lehninger
(New York: Worth Publishers, 1975). This standard textbook is an excellent
source of information on the molecular machinery of life. For a discussion
of the bacterial flagellar motor, see "Ion Transport and the Rotation
of Bacterial Flagella," by P. Lauger (Nature,
Vol. 268, pp. 360-62, July 28, 1977).
... self-assembling structures* ...
For a description of molecular self-assembly, including that of the T4
phage and the ribosome, see Chapter 36 of Lehninger's Biochemistry
(referenced above).
... Designing with Protein* ... Nature
has demonstrated a wide range of protein machines, but this will not limit
us to designing with protein. For examples of fairly complex non-protein
structures, see "Supramolecular Chemistry: Receptors, Catalysts, and
Carriers," by Jean-Marie
Lehn (Science,
Vol. 227, pp. 849-56, February 22, 1985), which also speaks of designing
"components, circuitry, and systems for signal and information treatment
at the molecular level."
... any protein they can design* ...
Modern techniques can synthesize any desired DNA sequence, which can be
used to direct ribosomes to make any desired amino acid sequence. Adding
prosthetic groups is another matter, however.
... These tasks may sound similar* ...
For a comparison of the task of predicting natural protein structures with
that of designing predictable structures, see "Molecular Engineering,"
referenced at the beginning of this section.
... in the journal Nature*
... See "Molecular Technology: Designing Proteins and Peptides,"
by Carl Pabo (Nature,
Vol. 301, p. 200, Jan. 20, 1983).
... short chains of a few dozen pieces* ...
See "Design, Synthesis, and Characterization of a 34-Residue Polypeptide
That Interacts with Nucleic Acids," by B. Gutte et al. (Nature,
Vol. 281, pp. 650-55, Oct. 25, 1979).
... They have designed from scratch* a protein
... For a reference to this and a general discussion of protein
engineering, see Kevin Ulmer's paper (referenced near the
beginning of this section).
... changing their behaviors* in predictable
ways ... See "A Large Increase in Enzyme-Substrate Affinity
by Protein Engineering," by Anthony J. Wilkinson et al. (Nature,
Vol. 307, pp. 187-88, Jan. 12, 1984). Genetic engineering techniques have
also been used to make an enzyme more stable, with no loss of activity.
See "Disulphide Bond Engineered into T4 Lysozyme: Stabilization
of the Protein Toward Thermal Inactivation," by L. Jeanne Perry and
Ronald Wetzel of Genentech, Inc. (Science,
Vol. 226, pp. 555-57, November 2, 1984).
... according to biologist Garrett Hardin* ...
in Nature and Man's Fate (New York: New American Library, 1959),
p. 283.
... in the journal Science*
... See "Biological Frontiers," by Frederick J. Blattner
(Science, Vol.
222, pp. 719-20, Nov. 18, 1983).
... in Applied Biochemistry and Biotechnology*
... See "Enzyme Engineering," by William H. Rastetter
(Applied Biochemistry and Biotechnology, Vol. 8, pp. 423-36,
1983). This review article describes several successful efforts to change
the substrate specificity of enzymes.
... two international workshops* on molecular
electronic devices ... For the proceedings of the first, see Molecular
Electronic Devices, edited by Forrest L. Carter (New York: Marcel
Dekker, 1982). The proceedings of the second appear in Molecular Electronic
Devices II, also edited by Forrest L. Carter (New York: Marcel Dekker,
1986). For a summary article, see "Molecular Level Fabrication Techniques
and Molecular Electronic Devices," by Forrest L. Carter (Journal
of Vacuum Science and Technology, B1(4), pp. 953-68, Oct.-Dec.
1983).
... recommended support for basic research* ...
See The Institute
(a publication of the IEEE), January
1984, p. 1.
... VLSI Research Inc* ... Reported
in Microelectronic Manufacturing and Testing, Sept. 1984, p.
49.
... a single chemical bond* ... The
strength of a single bond between two carbon atoms is about six nano-newtons,
enough to support the weight of about 30,000 trillion carbon atoms. See
Strong Solids, by A. Kelly, p. 12 (Oxford: Clarendon Press,
1973).
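The figure in this note can be checked directly; the carbon atomic mass and standard gravity used below are textbook values, not figures from the note:

```python
# Rough check of the claim that a ~6 nN carbon-carbon bond can support
# the weight of about 30,000 trillion carbon atoms.
bond_strength_N = 6e-9             # about six nanonewtons, from the note
carbon_mass_kg = 12 * 1.66e-27     # 12 atomic mass units
g = 9.8                            # m/s^2, standard gravity
atoms_supported = bond_strength_N / (carbon_mass_kg * g)
print(f"atoms supported: {atoms_supported:.1e}")  # about 3e16, i.e. 30,000 trillion
```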
... diamond fiber* ... Diamond is also
over ten times stiffer than aluminum. See Strong Solids (referenced
above), Appendix A, Table 2.
... chemists ... coax reacting molecules ...
See "Sculpting Horizons in Organic Chemistry," by Barry M. Trost
(Science, Vol.
227, pp. 908-16, February 22, 1985), which also mentions organic electrical
conductors and the promise of molecular switches for molecular electronics.
... will do all that proteins can do, and more*
... Chemists are already developing catalysts that improve on enzymes;
see "Catalysts That Break Nature's Monopoly," by Thomas H. Maugh
II (Science,
Vol. 221, pp. 351-54, July 22, 1983). For more on non-protein molecular
tools, see "Artificial Enzymes," by Ronald Breslow (Science,
Vol. 218, pp. 532-37, November 5, 1982).
... assemblers* ... See the first
reference in this section. A device reported in 1982, called the scanning
tunneling microscope, can position a sharp needle near a surface with an
accuracy of a fraction of an atomic diameter. Besides demonstrating the
feasibility of such positioning, it may be able to replace molecular machinery
in positioning molecular tools. See "Scanning Tunneling Microscopy,"
by G. Binnig and H. Rohrer (Physica 127B, pp. 37-45, 1985).
... almost any reasonable arrangement* ...
Assemblers will be able to create otherwise improbable arrangements of reactant
molecules (overcoming entropy-of-activation factors), and will be able to
direct the action of highly reactive chemical species. This will allow the
use in controlled synthesis of reactions that would otherwise proceed only
at a negligible rate or with an excessive number and rate of side reactions.
Further, assemblers will be able to apply mechanical forces of bond-breaking
magnitude to provide activation energy for reactions, and they will be able
to employ molecular-scale conductors linked to voltage sources to manipulate
electric fields in a direct and novel fashion. While photochemical techniques
will not be as useful (because typical photon wavelengths are large on a
molecular scale), similar results may sometimes be achieved by transfer
of electronic excitation from molecule to molecule in a controlled, localized
way.
Though assemblers will be powerful (and could even be directed to expand
their own toolkits by assembling new tools), they will not be able to build
everything that could exist. For example, a delicate structure might be
designed that, like a stone arch, would self-destruct unless all its pieces
were already in place. If there were no room in the design for the placement
and removal of a scaffolding, then the structure might be impossible to
build. Few structures of practical interest seem likely to exhibit such
a problem, however. (In fact, the reversibility of the laws governing molecular
motion implies that all destructible objects are, in principle,
constructible; but if the destruction mechanisms all involve an
explosive collapse, then attempts at construction via the reverse mechanism
may have a negligible chance of success, owing to considerations involving
the uncertainty of the trajectories of the incoming parts and the low entropy
of the target state.)
... the DNA-copying machinery* in some cells
... See "Comparative Rates of Spontaneous Mutation,"
by John W. Drake (Nature,
Vol. 221, p. 1132, March 22, 1969). For a general discussion of this machinery,
see Chapter 32 of Lehninger's Biochemistry (referenced
above).
... repairing and replacing* radiation-damaged
parts ... The bacterium Micrococcus radiodurans has vigorous
molecular repair mechanisms that enable it to survive the equivalent of
more than a million years' worth of normal terrestrial background radiation
delivered in a single dose. (See "Inhibition of Repair DNA Synthesis
in M. radiodurans after Irradiation with Gamma-rays," by Shigeru
Kitayama and Akira Matsuyama, in Agricultural and Biological Chemistry,
Vol. 43, pp. 229-305, 1979.) This is about one thousand times the lethal
radiation dose for humans, and enough to make Teflon weak and brittle.
... life has never abandoned* ... Living
organisms have built cell structures and simple molecular devices from lipids
and sugars (and have built shells from silica and lime) but the lack of
programmable assembly systems for these materials has kept life from exploiting
them to form the main parts of complex molecular machines. RNA, like protein,
has a structure directly determined by DNA, and it sometimes serves protein-like
functions. See "First True RNA Catalyst Found" (Science,
Vol. 223, p. 266, Jan. 20, 1984).
... R. B. Merrifield ... used chemical techniques*
... See Lehninger's Biochemistry, p. 119 (referenced
above).
... during the mid-1800s, Charles Babbage* ...
See Chapter 2 of Bit by Bit: An Illustrated History of Computers,
by Stan Augarten (New York: Ticknor & Fields, 1984).
... a billion bytes ... in a box a micron wide*
... If two different side groups on a polyethylene-like polymer
are used to represent the ones and zeros of binary code, then the polymer
can serve as a data storage tape. If one were to use, say, fluorine and
hydrogen as the two side groups, and to allow considerable room for tape
reading, writing, and handling mechanisms, then a half cubic micron would
store about a billion bytes. Access times can be kept in the microsecond
range because the tapes can be made very short. A mechanical random-access
memory scheme allows storage of only about 10 million bytes in the same
volume, though this can probably be bettered. For a more detailed discussion,
see "Molecular Machinery and Molecular Electronic Devices," by
K. Eric Drexler, in Molecular Electronic Devices II, edited
by Forrest L. Carter (New York: Marcel Dekker, 1986).
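The density estimate in this note can be checked with a rough calculation; the monomer dimensions below are illustrative assumptions (roughly those of polyethylene), not figures from the note:

```python
# Rough check: one bit per side group on a polyethylene-like chain,
# with hydrogen and fluorine side groups encoding the zeros and ones.
bit_length_nm = 0.13             # assumed backbone advance per side group
chain_cross_section_nm2 = 0.18   # assumed cross-sectional area of a packed chain

bits = 8e9                       # one billion bytes
raw_tape_volume_nm3 = bits * bit_length_nm * chain_cross_section_nm2

half_cubic_micron_nm3 = 0.5 * 1000**3
fraction = raw_tape_volume_nm3 / half_cubic_micron_nm3
print(f"tape fills {fraction:.0%} of the half cubic micron")
# the remainder is left for the reading, writing, and handling mechanisms
```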
... mechanical signals* ... These could
be sent by pushing and pulling atom-wide rods of carbyne, a form of carbon
in which the atoms are linked in a straight line by alternating single and
triple bonds. See "Molecular Machinery and Molecular Electronic Devices,"
referenced in the above note.
... a scheme proposed* by ... Richard
Feynman ... See his article "Quantum Mechanical Computers"
(Optics News, Vol. 11, pp. 11-20, Feb. 1985). Feynman
concludes that "the laws of physics present no barrier to reducing
the size of computers until bits are the size of atoms, and quantum behavior
holds dominant sway."
... a disassembler* ... There
will be limits to disassemblers as well: For example, one could presumably
design a sensitive structure that would fall apart (or explode) when tampered
with, preventing controlled disassembly.
... "Think of the design process..."
See The Sciences of the Artificial (Second Edition) by Herbert
A. Simon (Cambridge, Mass: MIT
Press, 1981). This book explores a range of issues related to engineering,
problem-solving, economics, and artificial intelligence.
... Both strand and copy ... Because
of the rules for nucleotide pairing, the copies actually resemble photographic
negatives, and only a copy of a copy matches the original itself.
... Biochemist Sol Spiegelman ... A
discussion of his work in this area appears in "The Origin of Genetic
Information," by Manfred Eigen et al. (Scientific American,
Vol. 244, pp. 88-117, April 1981).
... Oxford zoologist Richard
Dawkins ... discusses replicators in The Selfish Gene
(New York: Oxford University Press, 1976). This readable book offers an
excellent introduction to modern concepts of evolution, focusing on germ-line
replicators as the units that undergo variation and selection in evolution.
... As Richard
Dawkins points out ... in The Selfish Gene (see
above).
... Darwin's detested book ... The
Origin of Species, by Charles R. Darwin (London: John Murray,
1859).
... the ... ideas of evolution were known before
Darwin ... See p. 59 of The Constitution of Liberty,
by Friedrich A. Hayek (Chicago: University of Chicago Press, 1960) for a
discussion of the earlier work on linguistic, institutional, and even biological
evolution, which apparently developed "the conceptual apparatus that
Darwin employed." See also p. 23 of Law, Legislation and Liberty,
Vol. 1, Rules and Order (Chicago: University of Chicago Press,
1973). Elsewhere, these books discuss the concept of liberty under law and
the crucial distinction between a law and a command. These will be important
to matters discussed in Chapters 11 and 12.
... As Richard
Dawkins puts it ... in The Selfish Gene (see above).
... in The Next Whole Earth Catalog
... Edited by Stewart Brand (Sausalito, California: POINT; distributed
by Random House, New York, 1980).
... Peters and Waterman ... See In
Search of Excellence: Lessons from America's Best-Run Corporations,
by Thomas J. Peters and Robert H. Waterman, Jr. (New York: Warner Books,
1982).
... as Alfred North Whitehead stated ...
in Science and the Modern World (New York: Macmillan Company,
1925).
... only study, imagination, and thought ...
Converting these into good computer graphics and video will help a lot,
though.
... Richard
Dawkins calls ... "Meme" is a meme that was launched
in the last chapter of The Selfish Gene (see above).
... selfish motives can encourage cooperation
... In The Evolution of Cooperation (New York: Basic
Books, 1984) political scientist Robert Axelrod uses a multisided computer
game and historical examples to explore the conditions required for cooperation
to evolve among selfish entities. Being nice, retaliatory, and
forgiving is important to evolving stable cooperation. Chapter 7 of this
valuable book discusses "How to Promote Cooperation."
... In The Extended Phenotype ...
by Richard
Dawkins (San Francisco: W. H. Freeman, 1982).
... This meme package infected the Xhosa people
... See "The Self-Destruction of the Xosas," in Crowds and Power,
by Elias Canetti (New York: Continuum, 1973), p. 193.
... "The critical attitude..."
From Conjectures and Refutations: The Growth of Scientific Knowledge,
by Sir Karl Popper (New York: Basic
Books, 1962).
... Richard
Feynman ... gave a talk ... "There's
Plenty of Room at the Bottom," reprinted in Miniaturization,
edited by H. D. Gilbert (New York: Reinhold, 1961).
... Bertrand Russell observed ... Quoted
by Karl Popper in Objective Knowledge: An Evolutionary Approach
(Oxford: Clarendon Press, 1972).
... to seem true or ... to be
true ... Ideas that have evolved to seem true (at least
to uncritical minds) can in fact be quite false. An excellent work that
compares naive human judgment to judgment aided by scientific and statistical
techniques is Human Inference, a book by Richard Nisbett and
Lee Ross in the Century Psychology Series (Englewood Cliffs, New Jersey:
Prentice-Hall, 1980). It shows that, just as we suffer from optical illusions
and blind spots, so we suffer from cognitive illusions and blind spots.
Other experiments show that untutored people share systematic misunderstandings
of such elementary facts as the direction a ball will move when whirled
in a circle and then released; learned medieval philosophers (who neglected
to test their ideas against reality) evolved whole systems of "science"
based on identical misunderstandings. See "Intuitive Physics,"
by Michael McCloskey (Scientific American, Vol. 248, pp. 122-30,
Apr. 1983).
... survivors ... huddle so close together ...
Strictly speaking, this applies only to survivors that are themselves uniform,
general theories. The theory that all rocks will fall straight up next Wednesday
has not been disproved (and would have practical consequences), but the
special reference to Wednesday makes it nonuniform.
... as Karl Popper points out ... See
his Logic of Scientific Discovery, pp. 124 and 419 (New York:
Harper & Row, 1965). See also Objective Knowledge, p. 15.
... As ... Ralph E. Gomory says ...
in "Technology Development" (Science,
Vol. 220, pp. 576-80, May 6, 1983).
... their interplay of function and motion ...
These were clear only at low speeds; while Leonardo surely had some intuitive
sense of dynamics, a description of dynamics adequate to predict the behavior
of fast-moving, high-acceleration machine parts did not arrive until Newton.
... designs even in the absence of the tools
... Familiarity with the steady progress in chip fabrication technology
has led some companies to design microprocessors whose manufacture required
techniques not available at the time of their design.
... computer-aided design of molecular systems
... To do this well will require the simulation of molecular systems.
A discussion of one system for molecular simulation appears in Robert Bruccoleri's
doctoral thesis, "Macromolecular Mechanics and Protein Folding"
(Harvard University, May 1984). For the results of a simulation, see "Dynamics
and Conformational Energetics of a Peptide Hormone: Vasopressin," by
A. T. Hagler et al. (Science,
Vol. 227, pp. 1309-15, Mar. 15, 1985). These references both describe classical
simulations, which describe how molecules move in response to forces; such
simulations will be adequate for most parts of a typical molecular machine.
Other work requires more fundamental (and more costly) quantum mechanical
simulations, which describe the distribution of electrons in molecules.
These calculations will be required to describe the forming and breaking
of bonds by assembler tools. For a discussion of molecular simulations that
include quantum mechanical calculations of bond formation, see "Theoretical
Chemistry Comes Alive: Full Partner with Experiment," by William A.
Goddard III (Science,
Vol. 227, pp. 912-23, Feb. 22, 1985). See also Lecture Notes in Chemistry,
19, Computational Aspects for Large Chemical Systems, by Enrico
Clementi (New York: Springer-Verlag, 1980). Finally, for a discussion of
present design tools, see "Designing Molecules by Computer," by
Jonathan B. Tucker (High Technology, pp. 52-59, Jan. 1984).
Parallel processing computers will greatly aid computational chemistry and
computer-aided design.
... design ahead ... Early design-ahead
efforts seem likely to aim at defining a workable assembler system; it need
not be ideal, so long as it has fairly broad capabilities. Once the capabilities
of this standard assembler design are fairly well specified - even before
the design is complete - it will become possible (1) to begin developing
a library of nanomachine designs suited to construction by this standard
assembler (or by assemblers that the standard assembler could construct),
and (2) to prepare a corresponding library of procedures for the assembly
of these designs. Then when the first crude assembler is developed, it can
be used (perhaps through an intermediate stage of tool building) to build
a standard assembler. This in turn could be used to build anything in the
design library.
Early assemblers will greatly extend our ability to make things. With even
limited design ahead, the advent of assemblers will result almost immediately
in substantial jumps in the quality of hardware. Since assemblers will be
built by assemblers, some form of self-replicating system will be an immediate
natural consequence of design ahead and the assembler breakthrough. Accordingly,
the advent of assemblers may make possible not only a jump in hardware quality,
but the almost immediate mass production of that hardware in unprecedented
quantities (see Chapter 4). For better or for worse, this will make possible
an unusually abrupt change in technology, in economics, and in world affairs.
... If every tool, when ordered ...
From Scientific Quotations: The Harvest of a Quiet Eye, selected
by A. L. Mackay, edited by M. Ebison (New York: Crane, Russak, 1977).
... a NASA scientist ... Former NASA
administrator Robert Frosch said much the same thing at the IEEE Centennial
Technical Convocation (see The
Institute, p. 6, Dec. 1984).
... replicators, such as viruses, bacteria ...
In an evolutionary sense, an animal's genes are replicators, but the animal
itself is not; only changes to genes, not changes to an animal's body, are
replicated in later generations. This distinction between genetic replicators
and the systems they shape is essential to understanding evolution, but
use of the term "replicator" to refer to the whole system is more
convenient when discussing replicating systems as productive assets.
... Fujitsu Fanuc ... See "Production:
A Dynamic Challenge," by M. E. Merchant (IEEE
Spectrum, pp. 36-39, May 1983). This issue of the IEEE
Spectrum contains an extensive discussion of computer-based automation.
... will instead resemble factories ...
Cell-style organization nonetheless has advantages. For example, despite
various active-transport mechanisms, cells typically transport molecular
components by diffusion rather than by conveyors. This effectively connects
every machine to every other (in the same membrane compartment) in a robust
fashion; conveyors, in contrast, can break down, requiring repair or replacement.
But it may be that properly implemented conveyor-based transportation has
strong advantages and yet did not evolve. Conveyor-based systems would be
harder to evolve because they require a new molecular machine to have a
suitable location, orientation, and interface to the conveyor before it
can function. If it failed to meet any of these requirements it would be
useless, and selective pressures would generally eliminate it before a useful
variant had a chance to appear. For a new molecular machine to function
in a diffusion-based system, though, it need only be present. If it does
something useful, selection will favor it immediately.
... A fast enzyme ... See Albert L.
Lehninger's Biochemistry, p. 208 (in Chapter 1 references).
Further, each molecule of the enzyme catalase can break down 40 million
hydrogen peroxide molecules per second; see Enzyme Structure and Mechanism,
by Alan Fersht, p. 132 (San Francisco: W. H. Freeman & Co., 1977). In typical
enzymatic reactions, molecules must wander into position with respect to
the enzyme's "tools," then wait for random thermal vibrations
to cause a reaction, and then wander out of the way again. These steps take
up almost all of the enzyme's time; the time required to form or break a
bond is vastly smaller. Because the electrons of a bond are over a thousand
times lighter and more mobile than the nuclei that define atomic positions,
the slower motion of whole atoms sets the pace. The speed of typical atoms
under thermal agitation at ordinary temperatures is over 100 meters per
second, and the distance an atom must move to form or break a bond is typically
about a ten billionth of a meter, so the time required is about one trillionth
of a second. See Chapter 12 of Molecular Thermodynamics, by
John H. Knox (New York: Wiley-Interscience,
1971).
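The timescale quoted at the end of this note follows from a one-line estimate using the figures it gives:

```python
# Time for an atom moving at thermal speed to traverse a bond-scale distance.
atom_speed_m_per_s = 100.0   # "over 100 meters per second", from the note
bond_distance_m = 1e-10      # "about a ten billionth of a meter", from the note
t = bond_distance_m / atom_speed_m_per_s
print(f"time to form or break a bond: {t:.0e} s")  # about a trillionth of a second
```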
... about fifty million times more rapidly ...
This scaling relationship may be verified by observing (1) that mechanical
disturbances travel at the speed of sound (arriving in half the time if
they travel half as far) and (2) that, for a constant stress in the arm
material, reducing the arm length (and hence the mass per unit cross section)
by one half doubles the acceleration at the tip while halving the distance
the tip travels, which allows the tip to move back and forth in half the
time (since the time required for a motion is the square root of a quantity
proportional to the distance traveled divided by the acceleration).
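The scaling argument in this note can be sketched as a short calculation; the one-meter and twenty-nanometer arm lengths are assumptions chosen only to illustrate a fifty-million-fold reduction in scale:

```python
import math

# At constant stress in the arm material, tip acceleration scales as 1/L
# and travel distance as L, so cycle time ~ sqrt(distance / acceleration) ~ L.
def cycle_time(arm_length, k=1.0):
    distance = arm_length            # distance traveled scales with arm length
    acceleration = 1.0 / arm_length  # constant stress: acceleration ~ 1/L
    return k * math.sqrt(distance / acceleration)

# halving the arm length halves the back-and-forth time, as the note argues
assert abs(cycle_time(0.5) / cycle_time(1.0) - 0.5) < 1e-12

# scaling an arm down fifty-million-fold speeds it up fifty-million-fold
speedup = cycle_time(1.0) / cycle_time(1.0 / 50e6)
print(f"speedup: {speedup:.1e}x")
```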
... a copy ... of the tape ... Depending
on the cleverness (or lack of cleverness) of the coding scheme, the tape
might have more mass than the rest of the system put together. But since
tape duplication is a simple, specialized function, it need not be performed
by the assembler itself.
... only a minute fraction misplaced ...
due to rare fluctuations in thermal noise, and to radiation damage during
the assembly process. High-reliability assemblers will include a quality-control
system to identify unwanted variations in structure. This system could consist
of a sensor arm used to probe the surface of the workpiece to identify the
unplanned bumps or hollows that would mark a recent mistake. Omissions (typically
shown by hollows) could be corrected by adding the omitted atoms. Misplaced
groups (typically shown by bumps) could be corrected by fitting the assembler
arm with tools to remove the misplaced atoms. Alternatively, a small workpiece
could simply be completed and tested. Mistakes could then be discarded before
they had a chance to be incorporated into a larger, more valuable system.
These quality-control steps will slow the assembly process somewhat.
... working, like muscle ... See the
notes for Chapter 6.
... The world stands ... Quoted from
Business Week, March
8, 1982.
... As Daniel Dennett ... points out ...
See "Why the Law of Effect Will Not Go Away," in Daniel C. Dennett,
Brainstorms: Philosophical Essays on Mind and Psychology (Cambridge,
Mass: MIT Press, 1981). This
book explores a range of interesting issues, including evolution and artificial
intelligence.
... Marvin Minsky ... views the mind ...
See his book The Society of Mind (New York: Simon
& Schuster, to be published in 1986). I have had an opportunity to review
much of this work in manuscript form; it offers valuable insights about
thought, language, memory, developmental psychology, and consciousness -
and about how these relate to one another and to artificial intelligence.
... Any system or device ... American
Heritage Dictionary, edited by William Morris (Boston: Houghton
Mifflin Company, 1978).
... Babbage had built ... See Bit
by Bit, in Chapter 1 references.
... the Handbook of Artificial Intelligence
... edited by Avron Barr and Edward A. Feigenbaum (Los Altos, Calif:
W. Kaufmann, 1982).
... As Douglas Hofstadter urges ...
See "The Turing Test: A Coffeehouse Conversation" in The
Mind's I, composed and arranged by Douglas R. Hofstadter and Daniel
C. Dennett (New York: Basic
Books, 1981).
... We can surely make machines ...
As software engineer Mark Miller puts it, "Why should people be able
to make intelligence in the bedroom, but not in the laboratory?"
... "I believe that by the end of the century..."
From "Computing Machinery and Intelligence," by Alan M. Turing
(Mind, Vol. 59, No. 236, 1950); excerpted in The Mind's
I (referenced above).
... a system could show both kinds ...
Social and technical capabilities might stem from a common basis, or from
linked subsystems; the boundaries can easily blur. Still, specific AI systems
could be clearly deserving of one name or the other. Efforts to make technical
AI systems as useful as possible will inevitably involve efforts to make
them understand human speech and desires.
... social AI ... Advanced
social AI systems present obvious dangers. A system able to pass the Turing
test would have to be able to plan and set goals as a human would - that
is, it would have to be able to plot and scheme, perhaps to persuade people
to give it yet more information and ability. Intelligent people have done
great harm through words alone, and a Turing-test passer would of necessity
be designed to understand and deceive people (and it would not necessarily
have to be imbued with rigid ethical standards, though it might be). Chapter
11 discusses the problem of how to live with advanced AI systems, and how
to build AI systems worthy of trust.
... "May not machines carry out ..."
See the earlier reference to Turing's paper, "Computing Machinery and
Intelligence."
... Developed by Professor Douglas Lenat ...
and described by him in a series of articles on "The Nature of Heuristics"
(Artificial Intelligence, Vol. 19, pp. 189-249, 1982; Vol.
21, pp. 31-59 and 61-98, 1983; Vol. 23, pp. 269-93, 1984).
... Traveller
TCS ... See preceding reference, Vol. 21,
pp. 73-83.
... EURISKO has shortcomings ... Lenat
considers the most serious to be EURISKO's limited ability to evolve new
representations for new information.
... In October of 1981 ... In the fall of 1984
... See "The 'Star Wars' Defense Won't Compute," by Jonathan
Jacky (The Atlantic,
Vol. 255, pp. 18-30, June 1985).
... in the IEEE
Spectrum ... See "Designing the Next Generation,"
by Paul Wallich (IEEE Spectrum,
pp. 73-77, November 1983).
... fresh insights into human psychology ...
Hubert Dreyfus, in his well-known book What Computers Can't Do: The
Limits of Artificial Intelligence (New York: Harper & Row, 1979),
presents a loosely reasoned philosophical argument that digital computers
can never be programmed to perform the full range of human intellectual
activities. Even if one were to accept his arguments, this would not affect
the main conclusions I draw regarding the future of AI: the automation of
engineering design is not subject to his arguments because it does not require
what he considers genuine intelligence; duplicating the human mind by means
of neural simulation avoids (and undermines) his philosophical arguments
by dealing with mental processes at a level where those arguments do not
apply.
... virus-sized molecular machines ...
See Chapter 7.
... build analogous devices ... These
devices might be electromechanical, and will probably be controlled by microprocessors;
they will not be as simple as transistors. Fast neural simulation of the
sort I describe will be possible even if each simulated synapse must have
its properties controlled by a device as complex as a microprocessor.
... experimental electronic switches ...
which switch in slightly over 12 picoseconds are described in "The
HEMT: A Superfast Transistor," by Hadis Morkoc and Paul M. Solomon
(IEEE Spectrum,
pp. 28-35, Feb. 1984).
... Professor Robert Jastrow ... in
his book The Enchanted Loom: The Mind in the Universe (New
York: Simon & Schuster, 1981).
... will fit in less than a cubic centimeter
... The brain consists chiefly of wirelike structures (the axons
and dendrites) and switchlike structures (the synapses). This is an oversimplification,
however, because at least some wirelike structures can have their resistance
modulated on a short time scale (as discussed in "A Theoretical Analysis
of Electrical Properties of Spines," by C. Koch and T. Poggio, MIT
AI Lab Memo No. 713, April 1983). Further, synapses behave less like switches
than like modifiable switching circuits; they can be modulated on a short
time scale and entirely rebuilt on a longer time scale (see "Cell Biology
of Synaptic Plasticity," by Carl W. Cotman and Manuel Nieto-Sampedro,
Science, Vol.
225, pp. 1287-94, Sept. 21, 1984).
The brain can apparently be modeled by a system of nanoelectronic components
modulated and rebuilt by nanomachinery directed by mechanical nanocomputers.
Assume that one nanocomputer is allotted to regulate each of the quadrillion
or so "synapses" in the model brain, and that each also regulates
corresponding sections of "axon" and "dendrite." Since
the volume of each nanocomputer (if equivalent to a modern microprocessor)
will be about 0.0003 cubic micron (See "Molecular Machinery and Molecular
Electronic Devices," referenced in Chapter 1), these devices will occupy
a total of about 0.3 cubic centimeter. Dividing another 0.3 cubic centimeter
equally between fast random-access memory and fairly fast tape memory would
give each processor a total of about 3.7K bytes of RAM and 275K bytes of
tape. (This sets no limit to program complexity, since several processors
could share a larger program memory.) This amount of information seems far
more than enough to provide an adequate model of the functional state of
a synapse. Molecular machines (able to modulate nanoelectronic components)
and assembler systems (able to rebuild them) would occupy comparatively
little room. Interchange of information among the computers using carbyne
rods could provide for the simulation of slower, chemical signaling in the
brain.
Of the nanoelectronic components, wires will occupy the most volume. Typical
dendrites are over a micron in diameter, and serve primarily as conductors.
The diameter of thin wires could be less than a hundredth of a micron, determined
by the thickness of the insulation required to limit electron tunneling
(about three nanometers at most). Their conductivity can easily exceed that
of a dendrite. Since the volume of the entire brain is about equal to that
of a ten-centimeter box, wires a hundred times thinner (one ten-thousandth
the cross section) will occupy at most 0.01 of a cubic centimeter (allowing
for their being shorter as well). Electromechanical switches modulated by
molecular machinery can apparently be scaled down by about the same factor,
compared to synapses.
Thus, nanoelectronic circuits that simulate the electrochemical behavior
of the brain can apparently fit in a bit more than 0.01 cubic centimeter.
A generous allowance of volume for nanocomputers to simulate the slower
functions of the brain totals 0.6 cubic centimeter, as calculated above.
A cubic centimeter thus seems ample.
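The volume estimates in this note reduce to a few multiplications. A quick sketch confirms the arithmetic; all inputs are the figures quoted above (the tenfold length allowance for wires is my reading of "allowing for their being shorter as well"):

```python
# Arithmetic check of the volume estimates in the note above.
N_SYNAPSES = 1e15                  # one nanocomputer per "synapse"
NANOCOMPUTER_UM3 = 0.0003          # cubic microns per nanocomputer
UM3_PER_CM3 = 1e12                 # 1 cm^3 = 10^12 cubic microns

computer_volume_cm3 = N_SYNAPSES * NANOCOMPUTER_UM3 / UM3_PER_CM3
memory_volume_cm3 = 0.3            # another 0.3 cm^3 split between RAM and tape

BRAIN_CM3 = 10.0**3                # the brain: roughly a ten-centimeter box
CROSS_SECTION_RATIO = 1e-4         # wires a hundred times thinner than dendrites
LENGTH_ALLOWANCE = 0.1             # assumed factor for the wires being shorter
wire_volume_cm3 = BRAIN_CM3 * CROSS_SECTION_RATIO * LENGTH_ALLOWANCE

total_cm3 = computer_volume_cm3 + memory_volume_cm3 + wire_volume_cm3
print(computer_volume_cm3, wire_volume_cm3, total_cm3)
# roughly 0.3, 0.01, and 0.61 cubic centimeters: a cubic centimeter is ample
```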
... dissipating a millionfold more heat ...
This may be a pessimistic assumption, however. For example, consider axons
and dendrites as electrical systems transmitting signals. All else being
equal, millionfold faster operation requires millionfold greater currents
to reach a given voltage threshold. Resistive heating varies as the current
squared, divided by the conductivity. But copper has about forty
million times the conductivity of neurons (see "A Theoretical Analysis
of Electrical Properties of Spines," referenced above), reducing resistive
heating to less than the level assumed (even in a device like that described
in the text, which is somewhat more compact than a brain). For another example,
consider the energy dissipated in the triggering of a synapse: devices requiring
less energy per triggering would result in a power dissipation less than
that assumed in the text. There seems no reason to believe that neurons
are near the limits of energy efficiency in information processing; for
a discussion of where those limits lie, see "Thermodynamics of Computation
- A Review," by C. H. Bennett (International Journal of Theoretical
Physics, Vol. 21, pp. 219-53, 1982). This reference states that neurons
dissipate an energy of over one billion electron-volts per discharge. Calculations
indicate that electrostatically activated mechanical relays can switch on
or off in less than a nanosecond, while operating at less than 0.1 volt
(like neurons) and consuming less than a hundred electron-volts per operation.
(There is no reason to believe that mechanical relays will make the best
switches, but their performance is easy to calculate.) Interconnect capacitances
can also be far lower than those in the brain.
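The scaling step in this argument is easy to verify: square the millionfold current increase, then divide by the conductivity ratio quoted in the note.

```python
# Net change in resistive heating for millionfold-faster signaling
# carried in copper rather than neural conductor (figures from the text).
SPEEDUP = 1e6                # millionfold faster operation, millionfold currents
CONDUCTIVITY_RATIO = 4e7     # copper vs. neuron, roughly forty million

heating_factor = SPEEDUP**2 / CONDUCTIVITY_RATIO
print(heating_factor)        # ~2.5e4, well under the millionfold assumed
```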
... pipe ... bolted to its top ... This
is a somewhat silly image, since assemblers can make connectors that work
better than bolts and cooling systems that work better than flowing water.
But to attempt to discuss systems based entirely on advanced assembler-built
hardware would at best drag in details of secondary importance, and would
at worst sound like a bogus prediction of what will be built, rather
than a sound projection of what could be built. Accordingly, I
will often describe assembler-built systems in contexts that nanotechnology
would in fact render obsolete.
... As John McCarthy ... points out ...
See Machines Who Think, by Pamela McCorduck, p. 344 (San Francisco:
W. H. Freeman & Company, 1979). This book is a readable and entertaining
overview of artificial intelligence from the perspective of the people and
history of the field.
... As Marvin Minsky has said ... in
U.S. News & World Report, p. 65, November 2, 1981.
... engineers know of alternatives ...
For orbit-to-orbit transportation, one attractive alternative is using rockets
burning fuel produced in space from space resources.
... result will be the "lightsail"
... For further discussion, see "Sailing on Sunlight May Give
Space Travel a Second Wind" (Smithsonian, pp. 52-61, Feb.
1982), "High Performance Solar Sails and Related Reflecting Devices,"
AIAA Paper 79-1418, in Space Manufacturing III, edited by Jerry
Grey and Christine Krop (New York: American Institute of Aeronautics and
Astronautics, 1979), and MIT Space Systems Laboratory Report 5-79, by K.
Eric Drexler. The World Space Foundation (P.O. Box Y, South Pasadena, Calif.
91030) is a nonprofit, membership-oriented organization that is building
an experimental solar sail and supporting the search for accessible asteroids.
... The asteroids ... are flying mountains of
resources ... For a discussion of asteroidal resources, see "Asteroid
Surface Materials: Mineralogical Characterizations from Reflectance Spectra,"
by Michael J. Gaffey and Thomas B. McCord (Space Science Reviews,
No. 21, p. 555, 1978) and "Finding 'Paydirt' on the Moon
and Asteroids," by Robert L. Staehle (Astronautics and Aeronautics,
pp. 44-49, November 1983).
... as permanent as a hydroelectric dam ...
Erosion by micrometeoroids is a minor problem, and damage by large meteoroids
is extremely rare.
... Gerard O'Neill ... See his book
The High Frontier: Human Colonies in Space (New York: William
Morrow, 1976). The Space Studies Institute (285 Rosedale Road, P.O. Box
82, Princeton, N.J. 08540) is a nonprofit, membership-oriented organization
aimed at advancing the economic development and settlement of space, working
chiefly through research projects. The L5 Society (1060 East Elm, Tucson,
Ariz. 85719) is a nonprofit, membership-oriented organization aimed at advancing
the economic development and settlement of space, working chiefly through
public education and political action.
... Using this energy to power assemblers ...
How much electric power can a given mass of solar collector supply? Since
electric energy is readily convertible to chemical energy, this will indicate
how rapidly a solar collector of a given mass can supply enough energy to
construct an equal mass of something else. Experimental amorphous-silicon
solar cells convert sunlight to electricity with about 10 percent efficiency
in an active layer about a micron thick, yielding about 60 kilowatts of
power per kilogram of active mass. Assembler-built solar cells will apparently
be able to do much better, and need not have heavy substrates or heavy,
low-voltage electrical connections. Sixty kilowatts of power supplies enough
energy in a few minutes to break and rearrange all the chemical bonds in
a kilogram of typical material. Thus a spacecraft with a small fraction
of its mass invested in solar collectors will be able to entirely rework
its own structure in an hour or so. More important, though, this calculation
indicates that solar-powered replicators will be able to gather enough power
to support several doublings per hour.
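The 60-kilowatts-per-kilogram figure follows from the solar constant, the quoted 10 percent efficiency, and the density of silicon; a sketch using standard values for the constants:

```python
# Specific power of a micron-thick amorphous-silicon solar cell
# (active mass only; substrate and wiring excluded, as in the text).
SOLAR_CONSTANT = 1360.0      # W/m^2 at Earth's distance from the Sun
EFFICIENCY = 0.10            # the 10 percent quoted in the note
THICKNESS_M = 1e-6           # one-micron active layer
DENSITY = 2330.0             # kg/m^3, silicon (approximate)

power_per_m2 = SOLAR_CONSTANT * EFFICIENCY    # electrical watts per square meter
mass_per_m2 = THICKNESS_M * DENSITY           # kilograms per square meter
specific_power = power_per_m2 / mass_per_m2   # watts per kilogram of active mass
print(specific_power / 1000)                  # ~58 kW/kg, matching "about 60"
```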
... The middle layer of the suit material ...
To have the specified strength, only about one percent of the material's
cross-sectional area must consist of diamond fibers (hollow telescoping
rods, in one implementation) that run in a load-bearing direction. There
exists a regular, three-dimensional woven pattern (with fibers running in
seven different directions to support all possible types of load, including
shear) in which packed cylindrical fibers fill about 45 percent of the total
volume. In any given direction, only some of the fibers can bear a substantial
load, and using hollow, telescoping fibers (and then extending them by a
factor of two in length) makes the weave less dense. These factors consume
most of the margin between 45 percent and one percent, leaving the material
only as strong as a typical steel.
For the suit to change shape while holding a constant internal volume at
a constant pressure, and do so efficiently, the mechanical energy absorbed
by stretching material in one place must be recovered and used to do mechanical
work in contracting material in another place - say, on the other side of
a bending elbow joint. One way to accomplish this is by means of electrostatic
motors, reversible as generators, linked to a common electric power system.
Scaling laws favor electrostatic over electromagnetic motors at small sizes.
A design exercise (with applications not limited to hypothetical space suits)
resulted in a device about 50 nanometers in diameter that works on the principle
of a "pelletron"-style Van de Graaff generator, using electron
tunneling across small gaps to charge pellets and using a rotor in place
of a pellet chain. (The device also resembles a bucket-style water wheel.)
DC operation would be at 10 volts, and the efficiency of power conversion
(both to and from mechanical power) seems likely to prove excellent, limited
chiefly by frictional losses. The power-conversion density (for a rotor
rim speed of one meter per second and pellets charged by a single electron)
is about three trillion watts per cubic meter. This seems more than adequate.
As for frictional losses in general, rotary bearings with strengths of over
6 nano-newtons can be made from carbon bonds - see Strong Solids,
by A. Kelly (Oxford: Clarendon Press, 1973) - and bearings using a pair
of triple-bonded carbon atoms should allow almost perfectly unhindered rotation.
Roller bearings based on atomically perfect hollow cylinders with bumps
rolling gear-fashion on atomically perfect races have at least two significant
energy-dissipation modes, one resulting from phonon (sound) radiation through
the slight bumpiness of the rolling motion, the other resulting from scattering
of existing phonons by the moving contact point. Estimates of both forms
of friction (for rollers at least a few nanometers in diameter moving at
modest speeds) suggest that they will dissipate very little power, by conventional
standards.
Electrostatic motors and roller bearings can be combined to make telescoping
jackscrews on a submicron scale. These can in turn be used as fibers in
a material able to behave in the manner described in the text.
... now transmits only a tenth of the force ...
An exception to this is a force that causes overall acceleration: for example,
equilibrium demands that the forces on the soles of the feet of a person
standing in an accelerating rocket provide support, and the suit must transmit
them without amplification or diminution. Handling this smoothly may be
left as an exercise for future control-system designers and nanocomputer
programmers.
... the suit will keep you comfortable ...
Disassemblers, assemblers, power, and cooling - together, these suffice
to recycle all the materials a person needs and to maintain a comfortable
environment. Power and cooling are crucial.
As for power, a typical person consumes less than 100 watts, on the average;
the solar power falling on a surface the size of a sheet of typing paper
(at Earth's distance from the Sun) is almost as great. If the suit is covered
with a film that acts as a high-efficiency solar cell, the sunlight striking
it should provide enough power. Where this is inadequate, a solar-cell parasol
could be used to gather more power.
As for cooling, all power absorbed must eventually be disposed of as waste
heat - in a vacuum, by means of thermal radiation. At body temperature,
a surface can radiate over 500 watts per square meter. With efficient solar
cells and suitable design (and keeping in mind the possibility of cooling
fins and refrigeration cycles), cooling should be no problem in a wide range
of environments. The suit's material can, of course, contain channels for
the flow of coolant to keep the wearer's skin at a preferred temperature.
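The radiation figure is a direct Stefan-Boltzmann calculation, assuming an ideal black surface at body temperature:

```python
# Thermal radiation from a surface at body temperature.
SIGMA = 5.67e-8              # Stefan-Boltzmann constant, W/(m^2 K^4)
T_BODY = 310.0               # kelvin, about 37 degrees C

radiated = SIGMA * T_BODY**4
print(radiated)              # ~524 W/m^2: "over 500 watts per square meter"
```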
... a range of devices greater than ... yet built
... A pinhead holds about a cubic millimeter of material (depending
on the pin, of course). This is enough room to encode an amount of text
greater than that in a trillion books (large libraries hold only
millions). Even allowing for a picture's being worth a thousand words, this
is presumably enough room to store plans for a wide enough range of devices.
... in a morning ... The engineering
AI systems described in Chapter 5, being a million times faster than human
engineers, could perform several centuries' worth of design work in a morning.
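Taking a morning as roughly four hours (my assumption, not the text's), the speedup converts as follows:

```python
# A morning of work for a millionfold-faster designer, in human-years.
MORNING_HOURS = 4.0          # assumed length of "a morning"
SPEEDUP = 1e6                # from Chapter 5
HOURS_PER_YEAR = 24 * 365.25

equivalent_years = MORNING_HOURS * SPEEDUP / HOURS_PER_YEAR
print(equivalent_years)      # ~456 years: several centuries' worth of design
```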
... replicating assemblers that work in space
... Assemblers in a vacuum can provide any desired environment
at a chemical reaction site by positioning the proper set of molecular tools.
With proper design and active repair-and-replacement mechanisms, exposure
to the natural radiation of space will be no problem.
... move it off Earth entirely ... But
what about polluting space? Debris in Earth orbit is a significant hazard
and needs to be controlled, but many environmental problems on Earth cannot
occur in space: space lacks air to pollute, groundwater to contaminate,
or a biosphere to damage. Space is already flooded with natural radiation.
As life moves into space, it will be protected from the raw space environment.
Further, space is big: the volume of the inner solar system alone is many
trillions of times that of Earth's air and oceans. If technology on Earth
has been like a bull in a china shop, then technology in space will be like
a bull in an open field.
... As Konstantin Tsiolkovsky wrote ...
Quoted in The High Frontier: Human Colonies in Space, by Gerard
K. O'Neill (New York: William Morrow, 1976).
... drive a beam far beyond our solar system
... This concept was first presented by Robert L. Forward in 1962.
... Freeman Dyson ... suggests ... He
discussed this in a talk at an informal session of the May 15, 1980, "Discussion
Meeting on Gossamer Spacecraft," held at the Jet Propulsion Laboratory
in Pasadena, California.
... Robert Forward ... suggests ...
See his article "Roundtrip Interstellar Travel Using Laser-Pushed Lightsails,"
(Journal of Spacecraft and Rockets, Vol. 21, pp. 187-95, Jan.-Feb.
1984). Forward notes the problem of making a beam-reversal sail light enough,
yet of sufficient optical quality (diffraction limited) to do its job. An
actively controlled structure based on thin metal films positioned by nanometer-scale
actuators and computers seems a workable approach to solving this problem.
But nanotechnology will allow a different approach to accelerating lightsails
and stopping their cargo. Replicating assemblers will make it easy to build
large lasers, lenses, and sails. Sails can be made of a crystalline dielectric,
such as aluminum oxide, having extremely high strength and low optical absorptivity.
Such sails could endure intense laser light, scattering it and accelerating
at many gees, approaching the speed of light in a fraction of a year. This
will allow sails to reach their destinations in near-minimal time. (For
a discussion of the multi-gee acceleration of dielectric objects, see "Applications
of Laser Radiation Pressure," by A. Ashkin [Science,
Vol. 210, pp. 1081-88, Dec. 5, 1980].)
In flight, computer-driven assembler systems aboard the sail (powered by
yet more laser light from the point of departure) could rebuild the sail
into a long, thin traveling-wave accelerator. This can then be used to electrically
accelerate a hollow shell of strong material several microns in radius and
containing about a cubic micron of cargo; such a shell can be given a high
positive charge-to-mass ratio. Calculations indicate that an accelerator
1,000 kilometers long (there's room enough, in space) will be more than
adequate to accelerate the shell and cargo to over 90 percent of the speed
of light. A mass of one gram per meter for the accelerator (yielding a one-ton
system) seems more than adequate. As the accelerator plunges through the
target star system, it fires backward at a speed chosen to leave
the cargo almost at rest. (For a discussion of the electrostatic acceleration
of small particles, see "Impact Fusion and the Field Emission Projectile,"
by E. R. Harrison [Nature,
Vol. 291, pp. 472-73, June 11, 1981].)
The residual velocity of the projectile can be directed to make it strike
the atmosphere of a Mars- or Venus-like planet (selected beforehand by means
of a large space-based telescope). A thin shell of the sort described will
radiate atmospheric entry heat rapidly enough to remain cool. The cargo,
consisting of an assembler and nanocomputer system, can then use the light
of the local sun and local carbon, hydrogen, nitrogen, and oxygen (likely
to be found in any planetary atmosphere) to replicate and to build larger
structures.
An early project would be construction of a receiver for further instructions
from home, including plans for complex devices. These can include rockets
able to get off the planet (used as a target chiefly for its atmospheric
cushion) to reach a better location for construction work. The resulting
system of replicating assemblers could build virtually anything, including
intelligent systems for exploration. To solve the lightsail-stopping problem
for the massive passenger vehicles that might follow, the system could build
an array of braking lasers as large as the launching lasers back home. Their
construction could be completed in a matter of weeks following delivery
of the cubic-micron "seed." This system illustrates one way to
spread human civilization to the stars at only slightly less than the speed
of light.
... space near Earth holds ... Two days'
travel at one gee acceleration can carry a person from Earth to any point
on a disk having over 20 million times the area of Earth - and this calculation
allows for a hole in the middle of the disk with a radius a hundred times
the Earth-Moon distance. Even so, the outer edge of the disk reaches only
one twentieth of the way to the Sun.
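The area comparison can be checked nonrelativistically (speeds stay well below that of light): accelerate at one gee for a day, decelerate for a day, and compare the resulting annulus to Earth's surface area. The Earth-Moon distance and Earth's area are standard values, not from the text.

```python
import math

G = 9.8                      # m/s^2, one gee
HALF_TRIP = 86400.0          # seconds: accelerate one day, decelerate one day
EARTH_MOON = 3.84e8          # m, mean Earth-Moon distance
EARTH_AREA = 5.1e14          # m^2, surface area of Earth

radius = G * HALF_TRIP**2    # total distance with turnover: g * (T/2)^2
hole_radius = 100 * EARTH_MOON
disk_area = math.pi * (radius**2 - hole_radius**2)
print(disk_area / EARTH_AREA)   # ~2.4e7: over 20 million Earth areas
```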
... enough energy in ten minutes ...
Assuming conversion of solar to kinetic energy with roughly 10 percent efficiency,
which should be achievable in any of several ways.
... Dr. Seymour Cohen ... argues ...
See his article "Comparative Biochemistry and Drug Design for Infectious
Disease" (Science,
Vol. 205, pp. 964-71, Sept. 7, 1979).
... Researchers at Upjohn Company ...
See "A Conformationally Constrained Vasopressin Analog with Antidiuretic
Antagonistic Activity," by Gerald Skala et al. (Science,
Vol. 226, pp. 443-45, Oct. 26, 1984).
... a dictionary definition of holism ...
The
American Heritage Dictionary of the English Language, edited
by William Morris (Boston: Houghton Mifflin
Company, 1978).
... aided by sophisticated technical AI systems
... These will be used both to help design molecular instruments
and to direct their use. Using devices able to go to specified locations,
grab molecules, and analyze them, the study of cell structures will become
fairly easy to automate.
... separated molecules can be put back together
... Repair machines could use devices resembling the robots now
used in industrial assembly work. But reassembling cellular structures will
not require machines so precise (that is, so precise for their size).
Many structures in cells will self-assemble if their components are merely
confined together with freedom to bump around; they need not be manipulated
in a complex and precise fashion. Cells already contain all the tools needed
to assemble cell structures, and none is as complex as an industrial robot.
... the T4 phage ... self-assembles
... See pp. 1022-23 of Biochemistry, by Albert L.
Lehninger (New York: Worth Publishers, 1975).
... lipofuscin ... fills over ten percent ...
Lipofuscin contents vary with cell type, but some brain cells (in old animals)
contain an average of about 17 percent; typical lipofuscin granules are
one to three microns across. See "Lipofuscin Pigment Accumulation as
a Function of Age and Distribution in Rodent Brain," by William Reichel
et al. (Journal of Gerontology, Vol. 23, pp. 71-81, 1968).
See also "Lipoprotein Pigments - Their Relationship to Aging in the
Human Nervous System," by D. M. A. Mann and P. O. Yates (Brain,
Vol. 97, pp. 481-88, 1974).
... about one in a million million million ...
The implied relationship is not exact but shows the right trend: for example,
the second number should be 2.33 uncorrected errors in a million million,
and the third should be 4.44 in a million million million (according to
some fairly complex calculations based on a slightly more complex correction
algorithm).
... compare DNA molecules ... make corrected
copies ... Immune cells that produce different antibodies have
different genes, edited during development. Repairing these genes will require
special rules (but the demonstrated feasibility of growing an immune system
shows that the right patterns of information can be generated).
... will identify molecules in a similar way
... Note that any molecule damaged enough to have an abnormal effect
on the molecular machinery of the cell will by the same token be damaged
enough to have a distinctive effect on molecular sensors.
... a complex and capable repair system ...
For a monograph that discusses this topic in more detail, including calculations
of volumes, speeds, powers, and computational loads, see "Cell Repair
Systems," by K. Eric Drexler (available through The
Foresight Institute, Palo Alto, Calif.).
... will be in communication ... For
example, by means of hollow fibers a nanometer or two in diameter, each
carrying a carbyne signaling rod of the sort used inside mechanical nanocomputers.
Signal repeaters can be used where needed.
... to map damaged cellular structures ...
This need not require solving any very difficult pattern recognition problems,
save in cases where the cell structure is grossly disrupted. Each cell structure
contains standard types of molecules in a pattern that varies within stereotyped
limits, and a simple algorithm can identify even substantially damaged proteins.
Identification of the standard molecules in a structure determines its type;
mapping it then becomes a matter of filling in known sorts of details.
... in a single calendar year ... Molecular
experiments can be done about a millionfold faster than macroscopic experiments,
since an assembler arm can perform actions at a million times the rate of
a human arm (see Chapter 4). Thus, molecular machines and fast AI systems
are well matched in speed.
... extended ... lifespan ... by 25 to 45 percent
... using 2-MEA, BHT, and ethoxyquine; results depended on the
strain of mouse, the diet, and the chemical employed. See "Free Radical
Theory of Aging," by D. Harman (Triangle, Vol. 12, No.
4, pp. 153-58, 1973).
... Eastman Kodak ... according to the
Press-Telegram, Long Beach, Calif., April 26, 1985.
... rely on new science ... Cell repair
will also rely on new science, but in a different way. As discussed in Chapter
3, it makes sense to predict what we will learn about, but
not what we will learn. To extend life by means of cell repair
machines will require that we learn about cell structures before
we repair them, but what we learn will not affect the feasibility
of those repairs. To extend life by conventional means, in contrast, will
depend on how well the molecular machinery of the body can repair itself
when properly treated. We will learn more about this, but what we learn
could prove discouraging.
... durability has costs ... See "Evolution
of Aging," a review article by T. B. L. Kirkwood (Nature,
Vol. 270, pp. 301-4, 1977).
... As Sir Peter Medawar points out ...
in The Uniqueness of the Individual (London: Methuen, 1957).
See also the discussion in The Selfish Gene, pp. 42-45 (in
Chapter 2 references).
... Experiments by Dr. Leonard Hayflick ...
See the reference above, which includes an alternative (but broadly similar)
explanation for Hayflick's result.
... A mechanism of this sort ... For
a reference to a statement of this theory (by D. Dykhuizen in 1974) together
with a criticism and a rebuttal, see the letters by Robin Holliday, and
by John Cairns and Jonathan Logan, under "Cancer and Cell Senescence"
(Nature, Vol. 306, p.
742, December 29, 1983).
... could harm older animals by stopping ...
division ... These animals could still have a high cancer rate
because of a high incidence of broken clocks.
... cleaning machines to remove these poisons
... One system's meat really is another's poison; cars "eat"
a toxic petroleum product. Even among organisms, some bacteria thrive on
a combination of methanol (wood alcohol) and carbon monoxide (see "Single-Carbon
Chemistry of Acetogenic and Methanogenic Bacteria," by J. G. Zeikus
et al., Science,
Vol. 227, pp. 1167-71, March 8, 1985), while others have been bred that
can live on either trichlorophenol or the herbicide 2,4,5-T. They can even
defluorinate pentafluorophenol. (See "Microbial Degradation of Halogenated
Compounds," D. Ghosal et al., Science,
Vol. 228, pp. 135-42, April 12, 1985.)
... cheap enough to eliminate the need for fossil
fuels ... through the use of fuels made by means of solar energy.
... able to extract carbon dioxide from the air
... In terms of sheer tonnage, carbon dioxide is perhaps our biggest
pollution problem. Yet, surprisingly, a simple calculation shows that the
sunlight striking Earth in a day contains enough energy to split all the
carbon dioxide in the atmosphere into carbon and oxygen (efficiency considerations
aside). Even allowing for various practical and aesthetic limitations, we
will have ample energy to complete this greatest of cleanups in the span
of a single decade.
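The "simple calculation" can be sketched as follows; the atmospheric CO2 inventory and the formation enthalpy are rough standard values I am supplying, and the two quantities come out within a factor of two, i.e., the same order of magnitude:

```python
import math

# Daily sunlight intercepted by Earth vs. the energy to split all
# atmospheric CO2 into carbon and oxygen (efficiency considerations aside).
SOLAR_CONSTANT = 1360.0      # W/m^2
EARTH_RADIUS = 6.371e6       # m
DAY = 86400.0                # s

sunlight_per_day = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2 * DAY  # joules

CO2_MASS = 2.6e15            # kg of atmospheric CO2 (mid-1980s, ~345 ppm; assumed)
CO2_MOLAR_MASS = 0.044       # kg/mol
SPLIT_ENERGY = 3.94e5        # J/mol to reverse the formation of CO2

split_total = CO2_MASS / CO2_MOLAR_MASS * SPLIT_ENERGY               # joules
ratio = sunlight_per_day / split_total
print(ratio)                 # ~0.6: within a factor of two of the claim
```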
... Alan Wilson ... and his co-workers ...
See "Gene Samples from an Extinct Animal Cloned," by J. A. Miller
(Science News, Vol. 125, p. 356, June 9, 1984).
... O my friend ... The Iliad,
by Homer (about the eighth century B.C.), as quoted by Eric Hoffer in The
True Believer (New York: Harper & Brothers, 1951). (Sarpedon is indeed
killed in the battle.)
... Gilgamesh, King of Uruk ... From
The Epic of Gilgamesh, translated by N. K. Sandars (Middlesex:
Penguin Books, 1972).
... To Jacques Dubourg ... In Mr.
Franklin, A Selection from His Personal Letters, by L. W. Labaree
and W. J. Bell, Jr. (New Haven: Yale University Press, 1956), pp. 27-29.
... a new heart, fresh kidneys, or younger skin
... With organs and tissues grown from the recipient's own cells,
there will be no problem of rejection.
... The changes ... are far from subtle ...
Experiments show that variations in experience rapidly produce visible variations
in the shape of dendritic spines (small synapse-bearing protrusions on dendrites).
See "A Theoretical Analysis of Electrical Properties of Spines,"
by C. Koch and T. Poggio (MIT AI Lab Memo No. 713, April 1983).
In "Cell Biology of Synaptic Plasticity" (Science,
Vol. 225, pp. 1287-94, Sept. 21, 1984), Carl W. Cotman and Manuel Nieto-Sampedro
write that "The nervous system is specialized to mediate the adaptive
response of the organism... To this end the nervous system is uniquely modifiable,
or plastic. Neuronal plasticity is largely the capability of synapses to
modify their function, to be replaced, and to increase or decrease in number
when required." Further, "because the neocortex is believed to
be one of the sites of learning and memory, most of the studies of the synaptic
effect of natural stimuli have concentrated on this area." Increases
in dendritic branching in the neocortex "are caused by age (experience)
in both rodents and humans. Smaller but reproducible increases are observed
after learning of particular tasks ..." These changes in cell structure
can occur "within hours."
For a discussion of short- and long-term memory, and of how the first may
be converted into the second, see "The Biochemistry of Memory: A New
and Specific Hypothesis," by Gary Lynch and Michel Baudry (Science,
Vol. 224, pp. 1057-63, June 8, 1984).
At present, every viable theory of long-term memory involves changes in
the structure and protein content of neurons. There is a persistent popular
idea that memory might somehow be stored (exclusively?) "in RNA molecules,"
a rumor seemingly fostered by an analogy with DNA, the "memory"
responsible for heredity. This idea stems from old experiments suggesting
that learned behaviors could be transferred to uneducated flatworms by injecting
them with RNA extracted from educated worms. Unfortunately for this theory,
the same results were obtained using RNA from entirely uneducated yeast
cells. See Biology Today, by David Kirk, p. 616 (New York:
Random House, 1975).
Another persistent popular idea is that memory might be stored in the form
of reverberating patterns of electrical activity, a rumor seemingly fostered
by an analogy with the dynamic random-access memories of modern computers.
This analogy, however, is inappropriate for several reasons: (1) Computer
memories, unlike brains, are designed to be erased and reused repeatedly.
(2) The patterns in a computer's "long-term memory" - its magnetic
disk, for example - are in fact more durable than dynamic RAM. (3) Silicon
chips are designed for structural stability, while the brain is designed
for dynamic structural change. In light of the modern evidence for long-term
memory storage in long-lasting brain structures, it is not surprising that
"total cessation of the electrical activity of the brain does not generally
delete memories, although it may selectively affect the most recently stored
ones" (A. J. Dunn). The electrical reverberation theory was proposed
by R. Lorente de No (Journal of Neurophysiology, Vol. 1, p.
207) in 1938. Modern evidence fails to support such theories of ephemeral
memory.
... "striking morphological changes"
... For technical reasons, this study was performed in mollusks,
but neurobiology has proved surprisingly uniform. See "Molecular Biology
of Learning: Modulation of Transmitter Release," by Eric R. Kandel
and James H. Schwartz (Science,
Vol. 218, pp. 433-43, Oct. 29, 1982), which reports work by C. Bailey and
M. Chen.
... until after vital functions have ceased ...
The time between expiration and dissolution defines the window for successful
biostasis, but this time is uncertain. As medical experience shows, it is
possible to destroy the brain (causing irreversible dissolution of mind
and memory) even while a patient breathes. In contrast, patients have been
successfully revived after a significant period of so-called "clinical
death." With cell repair machines, the basic requirement is that brain
cells remain structurally intact; so long as they are alive, they
are presumably intact, so viability provides a conservative indicator.
There is a common myth that the brain "cannot survive" for more
than a few minutes without oxygen. Even if this were true regarding survival
of the (spontaneous) ability to resume function, the survival of
characteristic cell structure would still be another matter. And
indeed, cell structures in the brains of expired dogs, even when kept at
room temperature, show only moderate changes after six hours, and many cell
structures remain visible for a day or more; see "Studies on the Epithalamus,"
by Duane E. Haines and Thomas W. Jenkins (Journal of Comparative Neurology,
Vol. 132, pp. 405-17, Mar. 1968).
But in fact, the potential for spontaneous brain function can survive for
longer than this myth (and the medical definition of "brain death")
would suggest. A variety of experiments employing drugs and surgery show
this: Adult monkeys have completely recovered after a sixteen-minute cutoff
of circulation to the brain (a condition, called "ischemia," which
clearly blocks oxygen supply as well); see "Thiopental Therapy After
16 Minutes of Global Brain Ischemia in Monkeys," by A. L. Bleyaert et al.
(Critical Care Medicine, Vol. 4, pp. 130-31, Mar./Apr. 1976).
Monkey and cat brains have survived for an hour at body temperature without
circulation, then recovered electrical function; see "Reversibility
of Ischemic Brain Damage," by K.-A. Hossmann and Paul Kleihues (Archives
of Neurology, Vol. 29, pp. 375-84, Dec. 1973). Dr. Hossmann concludes
that "any nerve cell in the brain can survive" for an hour without blood
(after the heart stops pumping, for example). The problem is not that nerve
cells die when circulation stops, but that secondary problems (such as a
slight swelling of the brain within its tight-fitting bone case) can prevent
circulation from resuming. When chilled to near freezing, dog brains have
recovered electrical activity after four hours without circulation (and
have recovered substantial metabolic activity even after fifteen days);
see "Prolonged Whole-Brain Refrigeration with Electrical and Metabolic
Recovery," by Robert J. White et al. (Nature,
Vol. 209, pp. 1320-22, Mar. 26, 1966).
Brain cells that retain the capability for spontaneous revival
at the time when they undergo biostasis should prove easy to repair. Since
success chiefly requires that characteristic cell structures remain intact,
the time window for beginning biostasis procedures is probably at least
several hours after expiration, and possibly longer. Cooperative hospitals
can and have made the time much shorter.
... fixation procedures preserve cells ...
For high-voltage electron micrographs showing molecular-scale detail in
cells preserved by glutaraldehyde fixation, see "The Ground Substance
of the Living Cell," by Keith R. Porter and Jonathan B. Tucker (Scientific
American, Vol. 244, pp. 56-68, Mar. 1981). Fixation alone does not
seem sufficient; long-term stabilization of structure seems to demand freezing
or vitrification, either alone or in addition to fixation. Cooling in nitrogen
- to minus 196 degrees C - can preserve tissue structures for many thousands
of years.
... solidification without freezing ...
See "Vitrification as an Approach to Cryopreservation," by G.
M. Fahy et al. (Cryobiology, Vol. 21, pp. 407-26, 1984).
... Mouse embryos ... See "Ice-free
Cryopreservation of Mouse Embryos at -196 degrees C by Vitrification,"
by W. F. Rall and G. M. Fahy (Nature,
Vol. 313, pp. 573-75, Feb. 14, 1985).
... Robert Ettinger ... published a book ...
The Prospect of Immortality (New York: Doubleday, 1964; a preliminary
version was privately published in 1962).
... many human cells revive spontaneously
... It is well known that human sperm cells and early embryos survive
freezing and storage; in both cases, successes have been reported in the
mass media. Less spectacular successes with other cell types (frozen and
thawed blood is used for transfusions) are numerous. It is also interesting
to note that, after treatment with glycerol and freezing to minus 20 degrees
C, cat brains can recover spontaneous electrical activity after over 200
days of storage; see "Viability of Long Term Frozen Cat Brain In
Vitro," by I. Suda, K. Kito, and C. Adachi (Nature,
Vol. 212, pp. 268-70, Oct. 15, 1966).
... researching ways to freeze and thaw viable
organs ... A group at the Cryobiology Laboratory of The American
Red Cross (9312 Old Georgetown Road, Bethesda, Md. 20814) is pursuing the
preservation of whole human organs to allow the establishment of banks of
organs for transplantation; see "Vitrification as an Approach to Cryopreservation,"
referenced above.
... cell repair ... has been a consistent theme
... As I found when the evident feasibility of cell repair finally
led me to examine the cryonics literature. Robert Ettinger's original book,
for example (referenced above), speaks of the eventual development of "huge
surgeon machines" able to repair tissues "cell by cell, or even
molecule by molecule in critical areas." In 1969 Jerome B. White gave
a paper on "Viral Induced Repair of Damaged Neurons with Preservation
of Long Term Information Content," proposing that means might be found
to direct repair using artificial viruses; see the abstract quoted in Man
into Superman, by Robert C. W. Ettinger (New York: St. Martin's Press,
1972, p. 298). In "The Anabolocyte: A Biological Approach to Repairing
Cryo-injury" (Life Extension Magazine, pp. 80-83, July/August
1977), Michael Darwin proposed that it might be possible to use genetic
engineering to make highly modified white blood cells able to take apart
and reconstruct damaged cells. In "How Will They Bring Us Back, 200
Years From Now?" (The Immortalist, Vol. 12, pp. 5-10,
Mar. 1981), Thomas Donaldson proposed that systems of molecular machines
(with devices as small as viruses and aggregates of devices as large as
buildings, if need be) could perform any needed repairs on frozen tissues.
The idea of cell repair systems has thus been around for many years. The
concepts of the assembler and the nanocomputer have now made it possible
to see clearly how such devices can be built and controlled, and that they
can in fact fit within cells.
... the animals fail to revive ... Hamsters,
however, have been cooled to a temperature which froze over half the water
content in their bodies (and brains), and have then revived with complete
recovery; see Biological Effects of Freezing and Supercooling,
by Audrey U. Smith (Baltimore: Williams & Wilkins, 1961).
... As Robert Prehoda stated ... in
Designing the Future: The Role of Technological Forecasting
(Philadelphia: Chilton Book Co., 1967).
... discouraged the use of a workable biostasis
technique ... Other factors have also been discouraging - chiefly
cost and ignorance. For a patient to pay for a biostasis procedure and to
establish a fund that provides for indefinite storage in liquid nitrogen
now costs $35,000 or more, depending on the biostasis procedure chosen.
This cost is typically covered by purchasing a suitable life insurance policy.
Facing this cost and having no clear picture of how freezing damage can
be repaired, only a few patients out of millions have so far chosen this
course. The small demand, in turn, has prevented economies of scale from
lowering the cost of the service. But this may be about to change. Cryonics
groups report a recent increase in biostasis contracts, apparently stemming
from knowledge of advances in molecular biology and in the understanding
of future cell repair capabilities.
Three U.S. groups presently offer biostasis services. In order of their
apparent size and quality, they are:
- The Alcor Life Extension Foundation,
4030 North Palm No. 304, Fullerton, Calif. 92635, (714) 738-5569. (Alcor
also has a branch and facilities in southern Florida.)
- Trans Time,
Inc., 1507 63rd Street, Emeryville, Calif. 94707, (415) 655-9734.
- The Cryonics
Institute, 24041 Stratford, Oak Park, Mich. 48237, (313) 967-3115.
For practical reasons based on experience, they require that legal and financial
arrangements be completed in advance.
... this preserves neural structures ...
The growth of ice crystals can displace cell structures by a few millionths
of a meter, but it does not obliterate them, nor does it seem likely to
cause any significant confusion regarding where they were before being displaced.
Once frozen, they move no further. Repairs can commence before thawing lets
them move again.
... clearing ... the major blood vessels ...
Current biostasis procedures involve washing out most of a patient's blood;
the nanomachines recover any remaining blood cells as they clear the circulatory
system.
... throughout the normally active tissues ...
This excludes, for example, the cornea, but other means can be used to gain
access to the interior of such tissues, or they can simply be replaced.
... that enter cells and remove the glassy protectant
... Molecules of protectant are bound to one another by bonds so
weak that they break at room temperature from thermal vibrations. Even at
low temperatures, protectant-removal machines will have no trouble pulling
these molecules loose from surfaces.
... a temporary molecular scaffolding ...
This could be built of nanometer-thick rods, designed to snap together.
Molecules could be fastened to the scaffolding with devices resembling double-ended
alligator clips.
... the machines label them ... Labels
can be made from small segments of coded polymer tape. A segment a few nanometers
long can specify a location anywhere within a cubic micron to one-nanometer
precision.
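The arithmetic behind that claim can be checked directly. A cubic micron at one-nanometer resolution contains a billion distinct cells, which takes only about thirty bits to name; the tape-length figures below rest on assumed (illustrative, not sourced) densities of two bits per monomer and roughly 0.4 nanometers of tape per monomer:

```python
import math

# A cubic micron spans 1,000 nm per side, so a 1-nm-resolution
# address needs 1,000 distinct values on each of three axes.
positions = 1000 ** 3                          # distinct 1-nm cells
bits_needed = math.ceil(math.log2(positions))  # bits to name one cell

# Assumed tape parameters (illustrative): 2 bits per monomer,
# ~0.4 nm of tape length per monomer.
bits_per_monomer = 2
nm_per_monomer = 0.4
tape_length_nm = (bits_needed / bits_per_monomer) * nm_per_monomer

print(bits_needed)     # 30
print(tape_length_nm)  # 6.0 - indeed "a few nanometers"
```

On these assumptions the label comes to about six nanometers of tape, consistent with the note's "few nanometers."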
... report ... to a larger computer within the
cell ... In fact, a bundle of nanometer-diameter signal-transmission
fibers the diameter of a finger (with slender branches throughout the patient's
capillaries) can in less than a week transmit a complete molecular description
of all a patient's cells to a set of external computers. Though apparently
unnecessary, the use of external computers would remove most of the significant
volume, speed, and power-dissipation constraints on the amount of computation
available to plan repair procedures.
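A rough order-of-magnitude check makes the week-long transmission figure plausible. Every number below is an assumption chosen for illustration (bundle size, per-fiber signaling rate, bits per atom), not a figure from the text:

```python
# Rough feasibility check; all parameters are illustrative assumptions.
fibers = (1e7) ** 2              # ~1 cm (1e7 nm) square bundle of 1-nm fibers
bits_per_fiber_per_s = 1e9       # assumed per-fiber signaling rate
seconds_per_week = 7 * 24 * 3600

bits_per_week = fibers * bits_per_fiber_per_s * seconds_per_week

atoms_in_body = 1e27             # order of magnitude for a human body
bits_per_atom = 100              # assumed size of a per-atom record
bits_needed = atoms_in_body * bits_per_atom

# The two quantities land within an order of magnitude of each other,
# so a week-scale readout is at least dimensionally reasonable.
print(bits_per_week / bits_needed)
```

With these assumptions the bundle moves roughly 10^28 to 10^29 bits in a week, the same order as a hundred-bits-per-atom description of a body's roughly 10^27 atoms.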
... identifies cell structures from molecular
patterns ... Cells have stereotyped structures, each built from
standard kinds of molecules connected in standard ways in accordance with
standard genetic programs. This will greatly simplify the identification
problem.
... Richard
Feynman saw ... He pointed out the possibility of making devices
with wires as little as ten or a hundred atoms wide; see "There's
Plenty of Room at the Bottom," in Miniaturization,
edited by H. D. Gilbert (New York: Reinhold, 1961), pp. 282-96.
... Robert T. Jones wrote ... in "The
Idea of Progress" (Astronautics and Aeronautics, p. 60,
May 1981).
... Dr. Lewis Thomas wrote ... in "Basic
Medical Research: A Long-Term Investment" (Technology Review,
pp. 46-47, May/June 1981).
... Joseph Lister published ... See
Volume V, "Fine Chemicals" in A History of Technology,
edited by C. J. Singer and others (Oxford: Clarendon Press, 1958).
... Sir Humphry Davy wrote ... See A
History of Technology, referenced above.
... the limiting speed is nothing so crude or
so breakable ... The principle of relativity of motion means that
"moving" objects may be considered to be at rest - meaning that
a spaceship pilot trying to approach the speed of light wouldn't even know
in what direction to accelerate. Further, simple Minkowski diagrams show
that the geometry of space-time makes traveling faster than light equivalent
to traveling backward in time - and where do you point a rocket to move
in that direction?
... Arthur
C. Clarke wrote ... in Profiles of the Future: An Inquiry
into the Limits of the Possible, first edition (New York: Harper
& Row, 1962).
... its properties limit all that we can do ...
For an account of some modern theories that attempt to unify all physics
in terms of the behavior of the vacuum, see "The Hidden Dimensions
of Spacetime," by Daniel Z. Freedman and Peter van Nieuwenhuizen (Scientific
American, Vol. 252, pp. 74-81, Mar. 1985).
... peculiarities far more subtle ...
For example, quantum measurements can affect the outcome of other quantum
measurements instantaneously at an arbitrarily great distance - but the
effects are only statistical and of a subtle sort that has been mathematically
proved to be unable to transmit information. See the very readable discussion
of Bell's theorem and the Einstein-Podolsky-Rosen paradox in Quantum
Reality by Nick Herbert (Garden City, New York: Anchor Press/Doubleday,
1985). Despite rumors to the contrary (some passed on in the final pages
of Quantum Reality), nothing seems to suggest that consciousness
and the mind rely on quantum mechanics in any special way. For an excellent
discussion of how consciousness works (and of how little consciousness we
really have) see Marvin Minsky's The Society of Mind (New York:
Simon & Schuster, 1986).
... Your victim might have said something vague
... But now physics can answer those questions with clear mathematics.
Calculations based on the equations of quantum mechanics show that air is
gaseous because nitrogen and oxygen atoms tend to form tightly bonded pairs,
unbonded to anything else. Air is transparent because visible light lacks
enough energy to excite its tightly bound electrons to higher quantum states,
so photons pass through without absorption. A wooden desk is solid because
it contains carbon atoms which (as shown by quantum mechanical calculations)
are able to form the tightly bonded chains of cellulose and lignin. It is
brown because its electrons are in a variety of states, some able to be
excited by visible light; it preferentially absorbs bluer, higher-energy
photons, making the reflected light yellowish or reddish.
... Stephen W. Hawking states ... In
"The Edge of Spacetime" (American Scientist, Vol.
72, pp. 355-59, Jul.-Aug. 1984).
... Few other stable particles are known ...
Electrons, protons, and neutrons have stable antiparticles with virtually
identical properties save for opposite charges and the ability to annihilate
when paired with their twins, releasing energy (or lighter particles). They
thus have obvious applications in energy storage. Further, antimatter objects
(made from the antiparticles of ordinary matter) may have utility as negative
electrodes in high-field electrostatic systems: the field would have no
tendency to remove positrons (as it would electrons), making mechanical
disruption of the electrode surface the chief limit to field strength. Such
electrodes would have to be made and positioned without contacting ordinary
matter, of course.
Various physical theories predict a variety of other stable particles (and
even massive, particle-like lines), but all would be either so weakly interacting
as to be almost undetectable (like neutrinos, only more so) or very massive
(like hypothesized magnetic monopoles). Such particles could still be very
useful, if found.
... Trying to change a nucleus ... The
molecular and field effects used in nuclear magnetic resonance spectroscopy
change the orientation of a nucleus, but not its structure.
... the properties of well-separated nuclei ...
It has been suggested that excited nuclei might even be made to serve as
the lasing medium in a gamma-ray laser.
... would present substantial difficulties ...
Before nuclei are pushed close enough together to interact, the associated
atomic structures merge to form a solid, metal-like "degenerate matter,"
stable only under enormous pressure. When the nuclei finally do interact,
the exchange of neutrons and other particles soon transmutes them all into
similar kinds, obliterating many of the patterns one might seek to build
and use.
... insulating against heat ... This
is a simple goal to state, but the optimal structures (at least where some
compressive strength is required) may be quite complex. Regular crystals
transmit heat well, making irregularity desirable, and irregularity means
complexity.
... the runners-up will often be nearly as good
... And in some instances, we may design the best possible system,
yet never be sure that better systems do not exist.
... Richard Barnet writes ... in The
Lean Years: Politics in the Age of Scarcity (New York: Simon
& Schuster, 1980).
... Jeremy Rifkin (with Ted Howard) has written
... Entropy: A New World View (New York: Viking Press,
1980).
... "The ultimate moral imperative, then
..." Despite this statement, Rifkin has since struck off on
a fresh moral crusade, this time against the idea of evolution and against
human beings' modifying genes, even in ways that viruses and bacteria
have done for millions of years. Again, he warns of cosmic consequences.
But he apparently still believes in the tightly sealed, ever dying world
he described in Entropy: "We live by the grace of sacrifice.
Every amplification of our being owes its existence to some diminution somewhere
else." Having proved in Entropy that he misunderstands
how the cosmos works, he now seeks to advise us about what it wants:
"The interests of the cosmos are no different from ours... How then
do we best represent the interests of the cosmos? By paying back to the
extent to which we have received." But he seems to see all human achievements
as fundamentally destructive, stating that "the only living legacy
that we can ever leave is the endowment we never touched," and declaring
that "life requires death." For more misanthropy and misconceptions,
see Algeny, by Jeremy Rifkin (New York: Viking, 1983).
For a confident assertion that genetic engineering is impossible in the
first place, made by Rifkin's "prophet and teacher," Nicholas
Georgescu-Roegen, see The Entropy Law and the Economic Process
(Cambridge, Mass: Harvard University Press, 1971).
... exponential growth will overrun ...
The demographic transition - the lowering of average birthrates with economic
growth - is basically irrelevant to this. The exponential growth of even
a tiny minority would swiftly make it a majority, and then make it consume
all available resources.
... exploding outward at near the speed of light
... The reason for this rests on a very basic evolutionary argument.
Assume that a diverse, competitive civilization begins expanding into space.
What groups will be found at the frontier? Precisely those groups that expand
fastest. The competition for access to the frontier provides an evolutionary
pressure that favors maximum speed of travel and settlement, and that maximum
speed is little short of the speed of light (see the notes to Chapter 6).
In a hundred million years, such civilizations would spread not just across
galaxies, but across intergalactic space. That a thousand or a million times
as many civilizations might collapse before reaching space, or might survive
without expanding, is simply irrelevant. A fundamental lesson of evolution
is that, where replicators are concerned, a single success can outweigh
an unlimited number of failures.
... need not contain every possible chemical
... Even the number of possible DNA molecules 50 nucleotides long
(four to the fiftieth power) is greater than the number of molecules in
a glass of water.
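The comparison is easy to verify. Four bases at each of fifty positions gives 4^50 possible sequences, while a 250-milliliter glass of water holds about 250/18 moles of molecules:

```python
# Number of distinct DNA sequences 50 nucleotides long (4 bases per position).
sequences = 4 ** 50

# Water molecules in a 250 ml glass: 250 g / (18 g/mol) * Avogadro's number.
avogadro = 6.022e23
water_molecules = (250 / 18) * avogadro

print(sequences)                    # about 1.27e30
print(sequences > water_molecules)  # True - over 100,000 times as many
```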
... The Limits to Growth ...
by Donella H. Meadows et al. (New York: Universe Books, 1972).
... Mankind at the Turning Point
... by Mihajlo D. Mesarovic and Eduard Pestel (New York: Dutton,
1974).
... trouble enough controlling viruses and fruit
flies ... We have trouble even though they are made of conventional
molecular machinery. Bacteria are also hard to control, yet they are superficially
almost helpless. Each bacterial cell resembles a small, rigid, mouthless
box - to eat, a bacterium must be immersed in a film of water that can carry
dissolved nutrients for it to absorb. In contrast, assembler-based "superbacteria"
could work with or without water; they could feed their molecular machinery
with raw materials collected by "mouths" able to attack solid
structures.
... AI systems could serve as ... strategists,
or fighters ... See "The Fifth Generation: Taking Stock,"
by M. Mitchell Waldrop (Science,
Vol. 226, pp. 1061-63, Nov. 30, 1984), and "Military Robots,"
by Joseph K. Corrado (Design News, pp. 45-66, Oct. 10, 1983).
... none, if need be ... To be precise,
an object can be assembled with a negligible chance of putting any atoms
in the wrong place. During assembly, errors can be made arbitrarily unlikely
by a process of repeated testing and correction (see the notes
for Chapter 4). For example, assume that errors are fairly common. Assume
further that a test sometimes fails, allowing one in every thousand errors
to pass undetected. If so, then a series of twenty tests will make the chance
of failing to detect and correct an error so low that the odds of misplacing
a single atom would be slight, even in making an object the size of the
Earth. But radiation damage (occurring at a rate proportional to the object's
size and age) will eventually displace even correctly placed atoms, so this
degree of care would be pointless.
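The probabilities quoted here multiply out as follows; the calculation conservatively assumes that every single atom placement initially goes wrong, so each must be caught by the tests:

```python
# Per-test probability that an error slips through undetected (from the text).
p_miss = 1e-3
tests = 20

# Probability an error survives all twenty independent tests.
p_undetected = p_miss ** tests        # 1e-60

# Atoms in an Earth-sized object, order of magnitude.
earth_atoms = 1e50

# Expected misplaced atoms, even assuming every placement starts out wrong.
expected_errors = earth_atoms * p_undetected
print(expected_errors)   # ~1e-10: effectively zero
```

Even at planetary scale, the expected number of surviving errors is about one in ten billion.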
... a cosmic ray can unexpectedly knock atoms
loose from anything ... It might seem that shielding could eliminate
this problem, but neutrinos able to penetrate the entire thickness of the
Earth - or Jupiter, or the Sun - can still cause radiation damage, though
at a very small rate. See "The Search for Proton Decay," by J.
M. LoSecco et al. (Scientific American, Vol. 252, p. 59, June
1985).
... Stratus Computer Inc., for example ...
See "Fault-Tolerant Systems in Commercial Applications," by Omri
Serlin (Computer, Vol. 17, pp. 19-30, Aug. 1984).
...design diversity ... See "Fault
Tolerance by Design Diversity: Concepts and Experiments," by Algirdas
Avizienis and John P. J. Kelly (Computer, Vol. 17, pp. 67-80,
Aug. 1984).
... redundancy ... multiple DNA strands ...
The bacterium Micrococcus radiodurans apparently has quadruple-redundant
DNA, enabling it to survive extreme radiation doses. See "Multiplicity
of Genome Equivalents in the Radiation-Resistant Bacterium Micrococcus
radiodurans," by Mogens T. Hansen, in Journal of Bacteriology,
pp. 71-75, Apr. 1978.
... other effective error-correcting systems
... Error correction based on multiple copies is easier to explain,
but digital audio disks (for example) use other methods that allow error
correction with far less redundant information. For an explanation of a
common error-correcting code, see "The Reliability of Computer Memory,"
by Robert McEliece (Scientific American, Vol. 248, pp. 88-92,
Jan. 1985).
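The flavor of such codes can be shown with the classic Hamming(7,4) scheme (a textbook example, not the specific code used on audio disks): four data bits plus three parity bits suffice to correct any single-bit error, far cheaper than keeping multiple full copies:

```python
# Minimal Hamming(7,4) sketch: 4 data bits + 3 parity bits correct any
# single-bit error in the 7-bit codeword.

def encode(d):                    # d: list of 4 data bits
    p1 = d[0] ^ d[1] ^ d[3]       # parity over codeword positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]       # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]       # parity over positions 4,5,6,7
    # Positions 1..7: p1 p2 d0 p3 d1 d2 d3
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):                    # c: 7-bit codeword, possibly corrupted
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based index of the flipped bit
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1          # correct the single-bit error
    return [c[2], c[4], c[5], c[6]]   # recover the data bits

word = encode([1, 0, 1, 1])
word[4] ^= 1                      # corrupt one bit "in storage"
print(decode(word))               # [1, 0, 1, 1]: original data restored
```

Here three check bits protect four data bits, a 75 percent overhead; copy-based redundancy would need 200 percent or more to correct the same error.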
... intelligence will involve mental parts ...
See The Society of Mind, by Marvin Minsky (New York: Simon
& Schuster, 1986).
... "The Scientific Community Metaphor"
... by William A. Kornfeld and Carl Hewitt (MIT AI Lab Memo No.
641, Jan. 1981).
... AI systems can be made trustworthy ...
Safety does not require that all AI systems be made trustworthy,
so long as some are trustworthy and help us plan precautions for
the rest.
... One proposal... This is a concept
being developed by Mark Miller and myself; it is related to the ideas discussed
in "Open Systems," by Carl Hewitt and Peter de Jong (MIT AI Lab
Memo No. 692, Dec. 1982).
... more reliably than ... human engineers ...
if only because fast AI systems (like those described in Chapter 5) will
be able to find and correct errors a million times faster.
... other than specially designed AI programs
... See "The Role of Heuristics in Learning by Discovery,"
by Douglas B. Lenat, in Machine Learning, edited by Michalski
et al. (Palo Alto, Calif: Tioga Publishing Company, 1983). For a discussion
of the successful evolution of programs designed to evolve, see pp. 243-85.
For a discussion of the unsuccessful evolution of programs intended to evolve
but not properly designed to do so, see pp. 288-92.
... neglect to give replicators similar talents
... Lacking these, "gray goo" might be able to replace
us and yet be unable to evolve into anything interesting.
... to correct its calculations ...
Calculations will allow the system to picture molecular structures that
it has not directly characterized. But calculations may lead to ambiguous
results in borderline cases; actual results may even depend on random tunneling
or thermal noise. In this case, the measurement of a few selected atomic
positions (performed by direct mechanical probing of the workpiece's surface)
should suffice to distinguish among the possibilities, thus correcting the
calculations. This can also correct for error buildup in calculations of
the geometry of large structures.
... Each sensor layer ... As described,
these sensor layers must be penetrated by wires, which might seem to present
a security problem: what if something were to get past the sensors by eating
its way along a wire? In practice, anything that can transmit signals and
power (including optical fibers, mechanical transmission systems, and so
forth) could be used in place of wires. These channels can be made secure
by basing them on materials with extreme properties: if a very fine wire
is made of the most conductive material, or if a mechanical transmission
system is made of the strongest material (and used near its breaking stress),
then any attempt to replace a segment with something else (such as an escaping
replicator) will show up as a greater electrical resistance or a fractured
part. Thus the transmission systems themselves can act as sensors. For the
sake of redundancy and design diversity, different sensor layers could be
penetrated by different transmission systems, each transmitting signals
and power to the next.
... If we destroy the records of the protein
designs ... But how could people be made to forget? This is not
really necessary, since their knowledge would be dispersed. In developing
modern hardware systems, different teams work on different parts; they need
not know what another team's part is (much less how it is made), because
only how it interacts really matters to them. In this way people reached
the Moon, though no one person or team ever fully knew how to do it; it
could be likewise with assemblers.
Since the first assembler designs will be historic documents, it might be
better to store them securely, rather than destroy them. Eventually they
can become part of the open literature. But hiding design information will
at best be a stopgap, since the methods used for the design of
the first assembler system will be harder to keep secret. Further, sealed
assembler labs might be used to develop and test machines that can make
assemblers, even machines that can themselves be made without assemblers.
... no fixed wall will be proof against large-scale,
organized malice ... Sealed assembler labs can work despite this.
They do not protect their contents from the outside; in fact, they are designed
to destroy their contents when tampered with. Instead, they protect the
outside from their contents - and their sealed work spaces are too small
to hold any large-scale system, malicious or not.
... giving the attacker no obvious advantage
... In the examples cited, organized entities were pitted against
similar entities. These entities could, of course, be vaporized by hydrogen
bombs, but faced with the prospect of retaliation in kind, no attacker has
yet seen an advantage in launching a nuclear strike.
... occupy hostile powers ... In principle,
this could be a minimal form of occupation, controlling only research laboratories,
but even this would require a degree of coercion roughly equivalent to conquest.
... as open as possible ... It may be
possible to devise forms of inspection that give a group great confidence
in what a system under development will (and will not) be able to do,
without letting that group learn how those systems are made. Compartmentalized
development of a system's components could, in principle, allow several
groups to cooperate without any single group's being able to build and use
a similar system independently.
... we naturally picture human hands aiming it
... For a discussion of autonomous spacecraft, see "Expanding
Role for Autonomy in Military Space," by David D. Evans and Maj. Ralph
R. Gajewski (Aerospace America, pp. 74-77, Feb. 1985). See
also "Can Space Weapons Serve Peace?" by K. Eric Drexler (L5
News, Vol. 9, pp. 1-2, Jul. 1983).
... while providing each with some protection
... Saying that a symmetrical, 50 percent effective shield would
be worthless is like saying that a bilateral 50 percent reduction in nuclear
missiles - a real breakthrough in arms control - would be worthless. The
practicality of such a shield is another matter. Until really good active
shields become possible, the question is not one of making a nuclear attack
harmless, but at best of making it less likely.
... limiting technology transfer ...
In fact, President Reagan has spoken of giving away U.S. space defense technology
to the Soviet Union. See the New York Times, p. A15, March
30, 1983. See also "Sharing 'Star Wars' technology with Soviets a
distant possibility, says head of Pentagon study group," by John Horgan
(The Institute,
p. 10, Mar. 1984). Richard Ullman, professor of international affairs at
Princeton University, has proposed a joint defense program with extensive
sharing of technology; see "U.N.-doing Missiles" (New York Times,
p. A23, Apr. 28, 1983).
In principle, a joint project could proceed with little technology transfer.
There is a great difference between (1) knowing what a device cannot
do, (2) knowing what it can do, (3) knowing what it is,
and (4) knowing how to make it. These define four levels of knowledge,
each (more or less) independent of the levels beyond it. For example, if
I were to hand you a plastic box, a superficial examination might convince
you that it cannot fly or shoot bullets, but not tell you what it can do.
A demonstration might then convince you that it can serve as a cordless
telephone. By inspecting it more closely, you could trace its circuits and
gain an excellent idea of what it is and of what its operating limits are.
But you still wouldn't necessarily know how to make one.
The essence of an active shield lies in what it cannot do - that
is, that it cannot be used as a weapon. To conduct a joint active-shield
project relying on high-technology components, one would need to share knowledge
chiefly on levels (1) and (2). This requires at least limited sharing on
level (3), but need not require any on level (4).
... basic issues common to all active shields
... Such as those of their control, purpose, and reliability, and
the fundamental issue of political understanding and acceptance.
... a U.S. National Science Foundation survey
... as quoted by NSF Director John B. Slaughter (Time,
p. 55, June 15, 1981).
... Advice and Dissent ...
subtitled "Scientists in the Political Arena," by
Joel Primack and Frank von Hippel (New York: Basic
Books, 1974).
... Hazel Henderson argues ... in Creating
Alternative Futures: The End of Economics (New York: Berkley Publishing,
1978).
... Harrison Brown likewise argues ...
in The Human Future Revisited: The World Predicament and Possible
Solutions (New York: Norton, 1978).
... Debates ... over the safety of nuclear power
... For a discussion of the failures at the Three Mile Island nuclear
power plant, and a discussion of (1) the remarkable degree of agreement
on the problems reached by an expert panel, (2) how the media mangled
the story, and (3) how the federal government failed to respond to reality,
see "Saving American Democracy," by John G. Kemeny, president
of Dartmouth College and chairman of the presidential commission on Three
Mile Island (Technology Review, pp. 65-75, June/July 1980).
He concludes that "the present system does not work."
... Disputes over facts ... Worse yet,
two people can agree on the facts and on basic values (say, that
wealth is good and pollution is bad) and yet disagree about building a factory
- one person may be more concerned about wealth, and the other about pollution.
In emotional debates, this can lead each side to accuse the other of perverted
values, such as favoring poverty or caring nothing for the environment.
Nanotechnology will ease such conflicts by changing the trade-offs. Because
we can have much more wealth and much less pollution, old opponents
may more often find themselves in agreement.
... AI researchers ... See "The
Scientific Community Metaphor," by William A. Kornfeld and Carl Hewitt
(MIT AI Lab Memo No. 641, Jan. 1981); see also the discussion of "due-process
reasoning" in "The Challenge of Open Systems," by Carl Hewitt
(Byte, Vol. 10, pp. 223-41, April 1985).
... procedures somewhat like those of courts
... These might use written communications, as in journals, rather
than face-to-face meetings: judging the truth of a statement by the manner
in which it is said is useful in courts, but plays a lesser role in science.
... Kantrowitz ... originated the concept ...
He did so in the mid-1960s. See his discussion in "Controlling Technology
Democratically" (American Scientist, Vol. 63, pp. 505-9,
Sept.-Oct. 1975).
... used (or proposed) as a government institution
... This is the original usage of the term "science court,"
and many criticisms of the due-process idea have stemmed from this aspect
of the proposal. The fact forum approach is genuinely different; Dr. Kantrowitz
is presently pursuing it under the name "Scientific Adversary Procedure."
... backed by the findings of an expert committee
... which in 1960 had drawn up a proposed space program for the
Air Force. It emphasized that learning to assemble systems in Earth orbit
(such as space stations and Moon ships) was at least as important as building
bigger boosters. During the subsequent debate on how to reach the Moon,
Kantrowitz argued that Earth-orbital assembly would be perhaps ten times
less expensive than the giant-rocket, lunar-orbit-rendezvous approach that
was finally chosen. But political factors intervened, and the matter never
received a proper public hearing. See "Arthur Kantrowitz Proposes a
Science Court," an interview by K. Eric Drexler (L5 News,
Vol. 2, p. 16, May 1977).
For an account of another abuse of technical decision-making during Apollo,
see The Heavens and the Earth: A Political History of the Space Age,
by Walter A. McDougall, pp. 315-16 (New York: Basic
Books, 1985).
... a proposed procedure ... This is
described in "The Science Court Experiment: An Interim Report,"
by the Task Force of the Presidential Advisory Group on Anticipated Advances
in Science and Technology (Science,
Vol. 193, pp. 653-56, Aug. 20, 1976).
... a colloquium on the science court ...
see the Proceedings of the Colloquium on the Science Court,
Leesburg, Virginia, Sept. 20-21, 1976 (National Technical Information Service,
document number PB261 305). For a summary and discussion of the criticisms
voiced at the colloquium, see "The Science Court Experiment: Criticisms
and Responses," by Arthur Kantrowitz (Bulletin of the Atomic
Scientists, Vol. 33, pp. 44-49, Apr. 1977).
... could move toward due process ...
The formation of the Health Effects Institute of Cambridge, Massachusetts,
created in 1980 to bring together adversaries in the field of air pollution,
has been a step in this direction. See "Health Effects Institute Links
Adversaries," by Eliot Marshall (Science,
Vol. 227, pp. 729-30, Feb. 15, 1985).
... knowledge is ... guarded ... An
open question is the extent to which non-public procedures embodying some
due process principles can improve the judging of classified information.
... an experimental procedure ... Reported
in "Science court would tackle knotty technological issues,"
by Leon Lindsay (Christian Science Monitor, p. 7, Mar. 23,
1983).
... Roger Fisher and William Ury ...
See Getting to Yes (Boston: Houghton
Mifflin Company, 1981).
... Both sides ... The procedures described
here treat issues as two-sided, but this may seem too limited, because "issues,"
as commonly understood, often have many sides. In the energy debate, for
example, gas, coal, nuclear, and solar power all have their advocates. Yet
multisided issues contain many two-sided questions: Is the probability of
a reactor meltdown low or high? Are the effects of coal burning on acid
rain small or large? Will a solar collector cost little or much? Are gas
reserves small or large? Multisided issues thus often resolve at their factual
roots into numerical micro-questions.
Judicious scientists and engineers will seldom argue for high or low numbers
as such; they will argue for the particular numbers they think most likely,
or simply state evidence. But since holding a forum presupposes a dispute,
advocates will be involved, and they will often wish to push far in one
direction - nuclear advocates would like to prove that reactors are very
cheap and safe; their opponents would like to prove that they are very expensive
and deadly. Because numbers measuring cost and risk can only be larger or
smaller, these micro-questions will tend to be two-sided.
... Tohru Moto-oka ... He is a professor
at Tokyo University and the titular head of Japan's Fifth Generation Computer
Project.
... one system's structure ... Their
approach to hypertext,
now in the demonstration stage, is called the Xanadu system. I have examined
the proprietary data structures on which their system is based, and it is
clear that powerful hypertext systems are indeed possible. For a less ambitious
yet still quite powerful system, see "A Network-Based Approach to Text-Handling
for the Online Scientific Community," a thesis by Randall H. Trigg
(University of Maryland, Department of Computer Science, TR-1346, Nov. 1983).
... Theodor Nelson's books ... See Computer
Lib/Dream Machines (self-published, distributed by The Distributors,
South Bend, Ind., 1974), and Literary Machines (Swarthmore,
Pa: Ted Nelson, 1981). Computer Lib is an entertaining and
idiosyncratic view of computers and their potential, including hypertext;
a new edition is in preparation. Literary Machines focuses
on hypertext.
... Time
magazine reports ... on p. 76, June 13, 1983.
... increasing the quantity of information available
... A hypertext
system might store the most commonly used information in the home, or in
a local branch library. Compact disks of the sort used for audio recordings
cost about three dollars to manufacture and can store as much text as about
500 books. See "Audio Analysis II: Read-only Optical Disks," by
Christopher Fry (Computer Music Journal, Vol. 9, Summer 1985).
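The 500-book figure is easy to check with rough arithmetic. The numbers below are assumptions, not from the original note: a compact disk of that era holds about 650 million bytes, and a typical book runs perhaps 400 pages of 3,000 characters each.

```python
# Rough check of the "500 books per compact disk" figure.
# Assumed values (not from the text): ~650 MB per disc;
# a book of ~400 pages at ~3,000 characters per page.
cd_capacity_bytes = 650_000_000
bytes_per_book = 400 * 3000          # pages * characters per page

books_per_disk = cd_capacity_bytes // bytes_per_book
print(books_per_disk)                # → 541, on the order of 500 books
```

The result depends on the assumed book length, but any reasonable choice lands in the same few-hundred range.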
... bring abundance and long life to all who
wish them ... But the limits to exponential growth ensure that
universal, unconditional abundance cannot last indefinitely. This raises
questions regarding the distribution and ownership of space resources. Three
basic approaches might be considered:
One is a first-come, first-served approach, like the claiming of homesteads
or mining sites through use. This has roots in the Lockean principle that
ownership may be established by mixing one's labor with a previously unowned
resource. But this might allow a person with a suitable replicator to turn
it loose in space to rework - and thus claim - every unclaimed object in
the universe, as fast as it could be reached. This winner-take-all approach
has little moral justification, and would have unpleasant consequences.
A second extreme would be to distribute ownership of space resources equally
among all people, and to keep redistributing them to maintain equality.
This, too, would have unpleasant consequences. In the absence of universal,
stringent, compulsory limitations on childbearing, some groups would continue
to grow exponentially; evolutionary principles virtually guarantee this.
In a surprisingly short time, the result of endless redistribution would
be to drag the standard of living of every human being down to
the minimum level that allows any group to reproduce. This would
mean hunger and poverty more extreme and universal than that of any Third
World country. If 99 percent of the human race voluntarily limited its birth
rate, this would merely allow the remaining one percent to expand until
it absorbed almost all the resources.
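The compound-growth dynamic behind this argument can be illustrated with a simple calculation. The growth rates here are hypothetical, chosen only to show how a small, steadily growing minority comes to dominate a static majority:

```python
# Illustrative only: population shares under differential growth.
# Hypothetical rates (not from the text): the 99% majority holds
# its numbers constant; the 1% minority grows at 3% per year.
majority = 99.0   # initial share
minority = 1.0

for year in range(500):
    minority *= 1.03   # 3% annual compound growth

minority_share = minority / (majority + minority)
print(f"After 500 years the minority is {minority_share:.1%} of the total")
```

Even at a modest 3 percent annual rate, the minority overtakes the rest of the population within a few centuries, which is the point of the note: exponential growth by any subgroup eventually swamps every fixed total.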
A third basic approach (which has many variations) takes a middle path:
it involves distributing ownership of the resources of space (genuine, permanent,
transferable ownership) equally among all people - but doing so only once,
then letting people provide for their progeny (or others') from their own
vast share of the wealth of space. This will allow different groups to pursue
different futures, and it will reward the frugal rather than the profligate.
It can provide the foundation for a future of unlimited diversity for the
indefinite future, if active shields are used to protect people from aggression
and theft. No one has yet voiced a plausible alternative.
From a socialist perspective, this approach means equal riches for all.
From a libertarian perspective, it violates no one's property rights and
provides a basis for a future of liberty. In Thomas Schelling's terms, equal
division is a focal point solution in a coordination game (see The
Strategy of Conflict, by Thomas Schelling, Cambridge, Mass: Harvard
University Press, 1960). What "equal division" actually means
is a messy question best left to lawyers.
For this approach to work, agreement will be needed not just on a principle
of division, but on a date. Space has been declared by treaty to be "the
common heritage of all mankind," and we need to choose an Inheritance
Day. Schelling's analysis suggests the importance, in a coordination game,
of finding a specific, plausible proposal and of making it visible
as soon as possible. Does a date suggest itself? A round-numbered space-related
anniversary would seem appropriate, if it were not tied exclusively to the
U.S. or U.S.S.R., or too soon, or too near a millennial date on the calendar.
These constraints can be met; the most plausible candidate is perhaps April
12, 2011: the thirtieth anniversary of the flight of the world's first reusable
spacecraft, the space shuttle, and the fiftieth anniversary of the flight
of the first human into space, Yuri Gagarin.
If, before this date, someone finds and employs a means to raise human reproduction
rates by a factor of ten or more, then Inheritance Day should immediately
be made retroactive to April 12 of the preceding year, and the paperwork
sorted out later.
... to secure a stable, durable peace ...
Active shields can accomplish this reliably only through the use of redundancy
and ample safety margins.
... but nature seems uncooperative ...
For a discussion of the apparent impossibility of time machines in general
relativity, see "Singularities and Causality Violation," by Frank
J. Tipler in Annals of Physics, Vol. 108, pp. 1-36, 1977. Tipler
is open-minded; in 1974 he had argued the other side of the case.
... patterns that resemble ... "fractals"
... Fractal patterns have similar parts on different scales - as a twig
may resemble a branch, which in turn resembles a tree, or as gullies,
streams, and rivers may all echo each other's forms. See The Fractal
Geometry of Nature, by Benoit B. Mandelbrot (San Francisco: W. H.
Freeman, 1982).
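Self-similarity can be made quantitative with the similarity dimension D = log N / log(1/r), for a shape built from N copies of itself each shrunk by a factor r. The Koch curve below is a standard example from the fractal literature, not from the note above; this is an illustrative sketch only.

```python
import math

def similarity_dimension(copies, scale):
    """Dimension of a self-similar shape made of `copies`
    smaller copies of itself, each shrunk by `scale`."""
    return math.log(copies) / math.log(1 / scale)

# Koch curve: each segment is replaced by 4 segments 1/3 the size.
print(similarity_dimension(4, 1/3))   # ~1.26, between a line and a plane

# A line cut into 3 thirds recovers the ordinary dimension 1.
print(similarity_dimension(3, 1/3))   # ~1.0
```

The fractional value between 1 and 2 captures the sense in which such a pattern is "rougher" than a line yet fills less space than a plane.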
... Several groups are now working ...
The information in this paragraph comes from Kevin Ulmer, formerly Director
of Exploratory Research at Genex and now director of the Center for Advanced
Research in Biotechnology (established by the University of Maryland and
the National Bureau of Standards, among others). The group at the NBS has
combined a quantum-mechanical simulation of about forty atoms near the active
site of an enzyme with a Newtonian simulation of the rest of the molecule;
this combination of techniques is the sort needed to describe both the mechanical
action of an assembler arm and the rearrangement of bonds by its tools.
... computers to plan molecular synthesis ...
Advances in this area are summarized by E. J. Corey et al. in "Computer-Assisted
Analysis in Organic Synthesis" (Science,
Vol. 228, pp. 408-18, Apr. 26, 1985).
... Forrest Carter's group ... (Personal
communication from Forrest Carter.)
... The Economist reports ... in "When
Chips Give Way to Molecules" (The Economist, Vol. 295, pp. 95-96, May
11, 1985).
... Arthur Kantrowitz has completed ...
These procedures examined the weapons and computer systems proposed for
ballistic missile defense systems. The first was conducted between Richard
Garwin and Edward Gerry and the second between Herbert Lin and Charles Hutchinson;
all four are widely known advocates of opposing positions on these issues.
Among the dozens of mutually agreed-on statements were: (1) that there is
no known fundamental limit to laser power and brightness other than cost,
and (2) that error-free programming is not required for defenses to work,
but (3) that no system has been publicly presented which would be both cost-effective
and able to survive attack. (Personal communication from Arthur Kantrowitz.)
... at Brown University ... See "Personal
Computers on Campus," by M. Mitchell Waldrop (Science,
Vol. 228, pp. 438-44, April 26, 1985).
... this was actually accomplished in 1988 ...
A good review of this and related work may be found in "Protein Design,
a Minimalist Approach," by William F. DeGrado, Zelda R. Wasserman,
and James D. Lear (Science,
Vol. 243, pp. 622-28, 1989).
... a Nobel prize was shared ... Of
particular interest are the Nobel lectures of the two currently active researchers.
See "Supramolecular Chemistry - Scope and Perspectives: Molecules,
Supermolecules, and Molecular Devices," by Jean-Marie
Lehn (Angewandte Chemie International Edition in English,
Vol. 27, pp. 89-112, 1988) and "The Design of Molecular Hosts, Guests,
and Their Complexes," by Donald
J. Cram (Science,
Vol. 240, pp. 760-67, 1988).
... observed and modified individual molecules
... See "Molecular Manipulation Using a Tunnelling Microscope,"
by J. S. Foster, J. E. Frommer, and P. C. Arnett (Nature,
Vol. 331, pp. 324-26, 1988).
... Computer-based tools ... Software
useful for computer-aided design of protein molecules has been described
in "Computer-Aided Model-Building Strategies for Protein Design,"
by C. P. Pabo and E. G. Suchanek (Biochemistry, Vol. 25, pp.
5987-91, 1986) and in "Knowledge-Based Protein Modelling and Design,"
by Tom Blundell et al. (European Journal of Biochemistry, Vol.
172, pp. 513-20, 1988); a program which reportedly yields excellent results
in designing hydrophobic side-chain packings for protein core regions is
described by Jay W. Ponder and Frederic M. Richards in "Tertiary Templates
for Proteins" (Journal of Molecular Biology, Vol. 193,
pp. 775-91, 1987). The latter authors have also done work in molecular modelling
(an enormous and active field); see "An Efficient Newton-like Method
for Molecular Mechanics Energy Minimization of Large Molecules" (Journal
of Computational Chemistry, Vol. 8, pp. 1016-24, 1987). Computational
techniques derived from molecular mechanics have been used to model quantum
effects on molecular motion (as distinct from quantum-mechanical modelling
of electrons and bonds); see "Quantum Simulation of Ferrocytochrome
c," by Chong Zheng et al. (Nature,
Vol. 334, pp. 726-28, 1988).
... A recent summary ... K. Eric Drexler,
"Machines of Inner Space," in 1990 Yearbook of Science and
the Future, edited by D. Calhoun, pp. 160-77 (Chicago: Encyclopaedia
Britannica, 1989).
... A variety of technical papers ...
These include the following papers (which will be collected and rewritten
as parts of my forthcoming
technical book): "Nanomachinery: Atomically Precise Gears and Bearings,"
in the proceedings of the IEEE Micro Robots and Teleoperators Workshop (Hyannis,
Massachusetts: IEEE, 1987); "Exploring
Future Technologies," in The Reality Club, edited by J.
Brockman, pp. 129-50 (New York: Lynx Books, 1988); "Biological and
Nanomechanical Systems: Contrasts in Evolutionary Capacity," in Artificial
Life, edited by C. G. Langton, pp. 501-19 (Reading, Massachusetts:
Addison-Wesley, 1989); and "Rod Logic and Thermal Noise in the Mechanical
Nanocomputer," in Molecular Electronic Devices III, edited
by F. L. Carter (Amsterdam: Elsevier Science Publishers B.V., in press).
For information on the availability of technical papers, please contact
the Foresight Institute at the address
given in the Afterword.
© Copyright 1986, K. Eric Drexler. All rights reserved.
Published and maintained by Russell Whitaker.
Last updated: 23 September 1996