
Attack of the Nanobots!

A look at the belief that nanotechnology may result in an army of self-replicating machines that consume society.

by Brian Dunning

Filed under Conspiracy Theories, Environment, General Science

Skeptoid Podcast #317
July 3, 2012

Today we're going to point the skeptical eye at nanotechnology, which, as you may have noticed, has popped up on the agenda of quite a few activist groups. Nanotechnology is one of those words that sounds impressive and perhaps even a little foreboding, but it turns out that not very many people have a clear idea of what it means. About the only thing most of us know is that it involves very small objects: possibly nanoscale robots, strange materials that leverage quantum effects, and perhaps the ability to run out of control and spread like some sort of synthetic virus. We're going to see what nanotechnology really is, we're going to compare that to some of the popular fears about it, and finally we'll see what actual dangers there may be.

The short definition of nanotechnology is the manipulation of matter at the atomic or molecular level. At this scale, we usually measure things in nanometers, which are billionths of a meter. Using our highest resolution imaging technologies — various types of scanning probe microscopes — we can actually "see" individual atoms. We can't resolve them visually, since they're so much smaller than wavelengths of light; but by oscillating a physical probe over a surface, we can measure the interaction between the tip of the probe and the surface, and thus detect the presence of individual atoms, or more commonly, larger structures made of atoms and/or molecules. The probe is so sharp that only a single atom sits at its tip. A scanning tunneling microscope (STM) detects changes in the electrical current as electrons jump between the sample and the conductive tip, through an effect called quantum tunneling, in which an electron disappears from one atom and reappears at another nearby. A newer variant is the atomic force microscope (AFM), in which a cantilever with an atomically sharp probe is deflected by various electrostatic and other forces as it interacts with the sample, and the cantilever's movements are measured with a laser. Having a friend who designs them, I've played with both machines quite a lot, and had hours of fun crashing the head and trying to move atoms around.
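To get a feel for why a single-atom tip works, consider how fast a tunneling current dies off with distance. Here's a quick back-of-the-envelope sketch in Python; the 4.5 eV barrier height is an assumed round value for a typical metal work function, not a measurement of any particular instrument:

```python
import math

# Rough look at how a tunneling current falls off with tip-sample distance:
# I(d) ~ exp(-2 * kappa * d), where kappa = sqrt(2 * m_e * phi) / hbar
# for a barrier of height phi. phi = 4.5 eV is an assumed typical value.

HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M_E = 9.1093837e-31    # electron mass, kg
EV = 1.602176634e-19   # joules per electronvolt

phi = 4.5 * EV
kappa = math.sqrt(2 * M_E * phi) / HBAR   # decay constant, ~1.1e10 per meter

for gap_angstroms in (4.0, 5.0, 6.0):
    d = gap_angstroms * 1e-10             # convert angstroms to meters
    relative_current = math.exp(-2 * kappa * d)
    print(f"gap = {gap_angstroms:.0f} A -> relative current {relative_current:.1e}")

# Every extra angstrom of gap costs roughly a factor of ten in current,
# so the signal is dominated by whichever single atom hangs closest.
```

That factor-of-ten-per-ångström sensitivity is the whole trick: almost all of the current flows through the one atom at the very apex of the tip.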

Perhaps the best known example of a nanomaterial is a carbon nanotube, basically a tube made of carbon atoms that join together in a hexagonal pattern and form the simplest, smallest jungle gym possible. It would never be practical to construct nanomaterials the way I did, by manually moving atoms around one at a time; so instead we rely on self-assembly. We create conditions under which the atoms or molecules naturally join together in just the way that we want. We can control the temperature, electrical conditions, the availability of the desired source chemicals, the presence of catalyst materials and templates, and other such conditions.
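Temperature is the easiest of those control knobs to picture. Here's a toy Monte Carlo sketch of the idea; the bond energy and temperatures are illustrative made-up numbers, not any real chemistry:

```python
import math
import random

# Toy picture of why temperature is a control knob in self-assembly:
# a formed bond survives unless a thermal fluctuation breaks it, which
# happens with Boltzmann probability exp(-E_bond / (k_B * T)).
# E_BOND = 0.5 eV is an assumed round number for illustration only.

random.seed(1)
E_BOND = 0.5          # assumed bond energy, eV
K_B = 8.617e-5        # Boltzmann constant, eV per kelvin

def bonded_fraction(temp_kelvin: float, sites: int = 100_000) -> float:
    bonded = 0
    for _ in range(sites):
        # the site stays bonded unless a thermal kick breaks the bond
        if random.random() > math.exp(-E_BOND / (K_B * temp_kelvin)):
            bonded += 1
    return bonded / sites

for T in (300, 1000, 3000):
    print(f"T = {T:4d} K -> bonded fraction ~ {bonded_fraction(T):.3f}")

# Cooler conditions lock structures in place; hotter ones keep atoms
# rearranging until they settle into the arrangement we've templated.
```

Tune the temperature, the feedstock, and the catalysts, and the structure you want becomes the structure the atoms fall into on their own.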

Self-assembly permits mass production of nanoscale materials, at least of those with limited complexity. There are well over a thousand such products on the market today, in just about every product category, with more appearing every week. The Project on Emerging Nanotechnologies maintains a public database of such products, covering everything from television screens to cosmetics to industrial products. For the most part, these products are in materials science: substances that are stronger, lighter, better adhering, lower in friction, or whatever quality a customer wants in a material. The basic idea behind nanotechnology is that it lets us custom design the exact, best material for the job, when that particular arrangement of molecules may not exist in nature, or may not be possible to construct using conventional engineering.

But of course, there is, potentially, a dark side. Any time we have a new technology, that technology can be used for good or for bad. Nanothermite is an explosive that gets its name from the super-fine particles of oxidizing agent it incorporates, which speed up the explosive reaction. Generally, any time a product with a weapons application incorporates a nanoscale material (which, as these materials come into wider and wider use, is increasingly common), that weapon can be (and often is) called a "nanoweapon". Far more often than not, a nanoweapon is not a swarm of tiny robots that invade your brain; it's a conventional weapon that incorporates a nanoscale material.

The civic and cultural opposition to nanotech that exists is not entirely uninformed. The city of Berkeley, California is the world's first to regulate nanotechnology. It's not that they have a goofy fear of clouds of brain-invading mini-robots; it's that they believe new materials may pose unknown dangers. The University of California, Berkeley and the Lawrence Berkeley National Laboratory are both located inside the city, and both do nanotech research. Some fear that tiny particles might be hazardous to the respiratory system in the same way as asbestos and particulate air pollution, which is a perfectly rational concern, and the reason the field of nanotoxicology exists. But any new material can be unexpectedly hazardous, and this is a problem that is in no way unique to nanotech. Singling out nanoscale materials as especially risky is logically comparable to singling out blue cars as especially likely to run out of gas.

It is the concept of the self-replicating nanobot that captures the imagination of the science fan, the science fiction fan (as ably depicted in Neal Stephenson's novel The Diamond Age), and the conspiracy theorist who fears a nanotech doomsday. The theoretical exponential growth of such machines — and their consumption of the entire planet in the process — is called ecophagy, consumption of the environment. Von Neumann machines take their name from the mathematician and computer scientist John von Neumann; such a machine pairs a universal computer, able to run any program it is given, with a universal constructor, able to build anything it is instructed to build. This is the basic self-replicator. But it was Eric Drexler's 1986 book Engines of Creation that first effectively put the concept before the public eye, and a famous Wired magazine article by Bill Joy, "Why the Future Doesn't Need Us," that refreshed it with even greater public awareness. Drexler coined the popular term "gray goo" to describe the proverbial living mass of nanobots. He famously gave an example in which such a runaway reaction could produce more gray goo than the mass of the entire Earth in less than two days; but here's the rub: that assumes the availability of both raw materials and a fuel source.
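Drexler's timeline is easy to check with a bit of arithmetic, granting his assumption of a 1000-second replication cycle. For a per-bot mass, I'll take (as an assumption) the roughly seventy-million-atom Freitas design discussed below, treated as pure carbon:

```python
import math

# Back-of-the-envelope check of Drexler's "less than two days" figure,
# assuming a 1000-second doubling time and unlimited raw materials and
# fuel. Per-bot mass is an assumption: ~70 million atoms of carbon.

ATOMS_PER_BOT = 70e6
CARBON_MASS = 12 * 1.66054e-27           # kg per carbon atom
bot_mass = ATOMS_PER_BOT * CARBON_MASS   # ~1.4e-18 kg

EARTH_MASS = 5.97e24                     # kg
DOUBLING_TIME = 1000.0                   # seconds per generation (Drexler)

doublings = math.ceil(math.log2(EARTH_MASS / bot_mass))
days = doublings * DOUBLING_TIME / 86400
print(f"{doublings} doublings -> about {days:.2f} days to exceed Earth's mass")
# ~142 doublings, about 1.6 days: exponential growth really is that fast,
# *if* nothing ever limits it.
```

That's the seductive power of the doubling: about 142 generations gets you from one invisible speck to more than a planet. Everything, then, turns on that "if".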

Lessons from the natural world seem to suggest that nanobots might be able to breed and multiply the way living things do. Termites are an obvious example: wood is a fuel, so termites are able to power their activity and breed wherever wood is available. Anywhere there is an available fuel source and suitable raw materials, a self-replicating robot is theoretically possible.

However, in Robert Freitas' paper Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations, he showed that nanobots would likely be highly sensitive to how optimal their environment is. He goes into great detail regarding chemical availability and efficiency, noting that carbon-based nanobots would do well consuming the ecosphere but would not be well suited to consuming the Earth's crust, suggesting that perhaps aluminum-, titanium-, or boron-based architectures would be better. He wrote:

Unlike almost any other natural material, biomass can serve both as a source of carbon and as a source of power for nanomachine replication. Ecophagic nanorobots would regard living things as environmental carbon accumulators, and biomass as a valuable ore to be mined for carbon and energy. Of course, biosystems from which all carbon has been extracted can no longer be alive but would instead become lifeless chemical sludge.

Nanoscale machines would have to do their construction in the same way that I did with the STM: picking and placing one atom at a time. Drexler proposed a tiny manipulator capable of performing atomic-precision operations. It would consist of approximately four million atoms, and would be capable of about one million operations per second. But the manipulator itself requires a substantial infrastructure. It requires command and control, mobility, and not insignificantly, a power source. Freitas expanded on Drexler's design and came up with a complete machine incorporating two Drexler manipulators plus the necessary power source, a nanoscale computer, and mobility; its only function would be self-replication. It would be made of diamondoid, a class of materials (mainly hydrocarbons) that are structured in the same basic way as diamond: a three-dimensional lattice that is structurally stiff. Freitas' complete nanobot, which is a reasonable minimum, requires at least seventy million atoms. If the nanobot is to be endowed with capabilities other than self-replication, the design would have to grow accordingly.
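Those figures imply a best-case replication time we can estimate directly, if we make the optimistic assumption that every manipulator operation places exactly one atom:

```python
# Rough best-case replication time for Freitas' minimal self-replicator,
# optimistically assuming one atom placed per manipulator operation.

ATOMS_IN_BOT = 70e6      # Freitas' ~70-million-atom complete machine
OPS_PER_SECOND = 1e6     # Drexler's figure, per manipulator
MANIPULATORS = 2         # Freitas' design pairs two Drexler manipulators

seconds_per_copy = ATOMS_IN_BOT / (OPS_PER_SECOND * MANIPULATORS)
print(f"One copy every {seconds_per_copy:.0f} seconds, at the very best")  # ~35 s
# Real replication would be slower: fetching feedstock, correcting errors,
# and delivering power all cost operations beyond the bare atom count.
```

On paper, then, a copy every half-minute or so; in practice, every overhead not counted here stretches that out.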

Such nanobots actually have a lot going against them. Their power requirements are high, and they'll generate a lot of waste heat. A handful of gray goo, assuming it is fueled and actively reproducing, would be hot enough to boil water and burn your hand. In a letter to Freitas, cryptology and nanotech expert Ralph Merkle suggested much more efficient nanobots that would not incinerate their environment, but at the cost of lower productivity.

Nanobots would be finicky eaters. Unless placed in an environment consisting solely of the carbon and hydrogen atoms needed to construct diamondoid clones, nanobots would quickly become mired in a swamp of useless, unwanted molecules, and even each other. Their ability to spread physically would likely always be confined to wherever one very specific resource is found.

Nanobots are also inherently inflexible. While living creatures are highly adaptable and flexible, machines are not. As Merkle noted in a talk at Stanford University, a Boeing 747 is simply a worthless hunk of scrap metal without the active support of fuel, maintenance, spare parts, and constant attention; deny it these things, and the 747 will no longer function. Realistic nanobots are likewise likely to fail without external support.

Key to this is what Merkle calls the broadcast architecture. While von Neumann machines contain their own blueprints and are able to replicate those instructions autonomously, machines built under the broadcast architecture are unable to do anything unless instructions are given to them by a single external computer. This lack of autonomous capability is an inherent safety measure: if we turn off its instructions, the nanobot stops. Onboard control systems such as those described by Drexler and Freitas, made up of a limited number of atoms, are too simple to do much of anything at all unless that instruction is provided.
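Here's a toy sketch of the idea in Python; every name in it is illustrative, not any real control system. The point is structural: the replicator carries no blueprint of its own, so when the broadcast ends, it simply stops:

```python
# Toy model of Merkle's broadcast architecture: the replicator holds no
# blueprint and acts only on instructions streamed from one external
# controller. All names here are illustrative, not a real API.

from typing import Iterator

def external_controller() -> Iterator[str]:
    """Streams build steps; stopping the stream stops every replicator."""
    blueprint = ["place_carbon", "place_carbon", "place_hydrogen"]
    for step in blueprint:
        yield step
    # No more instructions: every listening replicator halts here.

class BroadcastReplicator:
    def __init__(self) -> None:
        self.steps_executed = 0   # no onboard copy of the blueprint anywhere

    def run(self, instructions: Iterator[str]) -> None:
        for step in instructions:
            self.steps_executed += 1   # carry out one externally ordered step
        # When the broadcast ends, the machine is inert by construction.

bot = BroadcastReplicator()
bot.run(external_controller())
print(f"Steps executed: {bot.steps_executed}")  # 3; then the bot does nothing
```

The safety property doesn't depend on any kill switch working; it depends on the machine being incapable of acting without being told what to do next.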

Will nanotechnology result in gray goo sweeping our planet, turning ourselves, our constructions, and the very rocks into dust and sludge? It's not very likely. There is no useful benefit in building simple self-replicators, and von Neumann machines are both less feasible and less useful to build than Merkle's broadcast machines. No doubt someone's going to try it someday, but will the nanobots successfully walk where only theory has talked? If they do, we may never know.



References & Further Reading

Drexler, E. Engines of Creation. New York: Doubleday, 1986.

Eigler, D., Schweizer, E. "Positioning Single Atoms with a Scanning Tunneling Microscope." Nature. 5 Apr. 1990, Volume 344: 524-526.

Freitas, R. "Some Limits to Global Ecophagy by Biovorous Nanoreplicators, with Public Policy Recommendations." Robert A. Freitas Jr. Robert A. Freitas Jr., 1 Apr. 2000. Web. 30 Jun. 2012. <http://www.rfreitas.com/Nano/Ecophagy.htm>

Joy, B. "Why the Future Doesn't Need Us." Wired. 1 Apr. 2000, Volume 8, Number 4.

Lederberg, J. "Infectious History." Science. 14 Apr. 2000, Volume 288: 287-293.

Merkle, R. "Self Replicating Systems and Low Cost Manufacturing." Nanotechnology. Xerox PARC, 1 Dec. 1994. Web. 15 Jun. 2012. <http://www.zyvex.com/nanotech/selfRepNATO.html>

 
