
Nanotechnology Report Essay

Nanotechnology is an anticipated manufacturing technology that would give thorough, inexpensive control over the structure of matter. The term has sometimes been used to refer to any technique able to work at a submicron scale. Molecular manufacturing will enable the construction of giga-ops computers smaller than a cubic micron; cell repair machines; personal manufacturing and recycling appliances; and much more. Broadly speaking, the central thesis of nanotechnology is that almost any chemically stable structure that can be specified can in fact be built.

This possibility was first advanced by Richard Feynman in 1959 when he said: “The principles of physics, as far as I can see, do not speak against the possibility of maneuvering things atom by atom.” (Feynman won the 1965 Nobel Prize in Physics.) This concept is receiving increasing attention in the research community. There have been three international conferences directly on molecular nanotechnology, as well as a broad range of conferences on related subjects.

Science said: “The ability to design and manufacture devices that are only tens or hundreds of atoms across promises rich rewards in electronics, catalysis, and materials. The scientific rewards should be just as great, as researchers approach an ultimate level of control – assembling matter one atom at a time.” “Within the decade, Foster or some other scientist is likely to learn how to piece together atoms and molecules one at a time using the STM.” (Referring to John Foster of IBM Almaden labs, who spelled “IBM” by pushing xenon atoms around with a scanning tunnelling microscope.)

Eigler and Schweizer at IBM reported on “the use of the STM at low temperatures (4 K) to position individual xenon atoms on a single-crystal nickel surface with atomic precision. This capacity has allowed us to fabricate rudimentary structures of our own design, atom by atom. The processes we describe are in principle applicable to molecules also.” Drexler has proposed the assembler, a device having a submicroscopic robotic arm under computer control. It will be capable of holding and positioning reactive compounds in order to control the precise location at which chemical reactions take place.

This general approach should allow the construction of large atomically precise objects by a sequence of precisely controlled chemical reactions, building objects molecule by molecule. If designed to do so, assemblers will be able to build copies of themselves, that is, to replicate. Because they will be able to copy themselves, assemblers will be inexpensive. We can see this by recalling that many other products of molecular machines–firewood, hay, potatoes–cost very little. By working in large teams, assemblers and more specialized nanomachines will be able to build objects cheaply.

By ensuring that each atom is properly placed, they will manufacture products of high quality and reliability. Left-over molecules would be subject to this strict control as well, making the manufacturing process extremely clean. The plausibility of this approach can be illustrated by the ribosome. Ribosomes manufacture all the proteins used in all living things on this planet. A typical ribosome is relatively small (a few thousand cubic nanometers) and is capable of building almost any protein by stringing together amino acids (the building blocks of proteins) in a precise linear sequence.

To do this, the ribosome has a means of grasping a specific amino acid (more precisely, it has a means of selectively grasping a specific transfer RNA, which in turn is chemically bonded by a specific enzyme to a specific amino acid), of grasping the growing polypeptide, and of causing the specific amino acid to react with and be added to the end of the polypeptide. The instructions that the ribosome follows in building a protein are provided by mRNA (messenger RNA). This is a polymer formed from the four bases adenine, cytosine, guanine, and uracil. A sequence of several hundred to a few thousand such bases codes for a specific protein.

The ribosome “reads” this “control tape” sequentially, and acts on the directions it provides. In an analogous fashion, an assembler will build an arbitrary molecular structure following a sequence of instructions. The assembler, however, will provide three-dimensional positional and full orientational control over the molecular component (analogous to the individual amino acid) being added to a growing complex molecular structure (analogous to the growing polypeptide). In addition, the assembler will be able to form any one of several different kinds of chemical bonds, not just the single kind (the peptide bond) that the ribosome makes.
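The “control tape” picture can be made concrete with a toy sketch: a function that walks an mRNA string three bases at a time and executes each codon as an instruction. The three-entry codon table is a genuine fragment of the genetic code, but the function is only an illustration of sequential tape-reading, not a model of ribosome chemistry.

```python
# Toy sketch of the ribosome's sequential "control tape" reading: walk
# an mRNA sequence three bases at a time and append the encoded amino
# acid, halting at a stop codon. Illustration only, not a simulation.
CODON_TABLE = {
    "AUG": "Met",  # start codon (methionine)
    "UUU": "Phe",  # phenylalanine
    "GGC": "Gly",  # glycine
    "UAA": None,   # stop codon
}

def translate(mrna):
    """Read codons left to right, like a tape reader acting on directions."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue is None:  # stop codon: the "machine" halts
            break
        peptide.append(residue)
    return peptide

print(translate("AUGUUUGGCUAA"))  # → ['Met', 'Phe', 'Gly']
```

The assembler analogy in the text replaces each codon with a richer instruction (a 3-D position, an orientation, and a bond type), but the sequential read-and-act loop is the same.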

Calculations indicate that an assembler need not inherently be very large. Enzymes “typically” weigh about 10^5 amu (atomic mass units), while the ribosome itself is about 3 x 10^6 amu. The smallest assembler might be a factor of ten or so larger than a ribosome. Current design ideas for an assembler are somewhat larger than this: cylindrical “arms” about 100 nanometers in length and 30 nanometers in diameter, rotary joints to allow arbitrary positioning of the tip of the arm, and a worst-case positional accuracy at the tip of perhaps 0.1 to 0.2 nanometers, even in the presence of thermal noise.

Even a solid block of diamond as large as such an arm weighs only sixteen million amu, so we can safely conclude that a hollow arm of such dimensions would weigh less. Six such arms would weigh less than 10^8 amu. The assembler requires a detailed sequence of control signals, just as the ribosome requires mRNA to control its actions. Such detailed control signals can be provided by a computer. A feasible design for a molecular computer has been presented by Drexler.
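These mass figures are easy to sanity-check; the arithmetic below uses only the numbers quoted above:

```python
# Sanity-check the arm-mass estimate quoted in the text.
# solid_arm_amu is the text's own figure; the rest is arithmetic.
solid_arm_amu = 16e6               # solid diamond block the size of one arm
six_arms_amu = 6 * solid_arm_amu   # upper bound for six solid arms

# A hollow arm weighs less than a solid block of the same dimensions,
# so six arms stay comfortably under the 10^8 amu bound.
assert six_arms_amu < 1e8
print(f"six solid arms: {six_arms_amu:.1e} amu")  # → 9.6e+07 amu
```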

This design is mechanical in nature, and is based on sliding rods that interact by blocking or unblocking each other at “locks.” This design has a size of about 5 cubic nanometers per “lock” (roughly equivalent to a single logic gate). Quadrupling this size to 20 cubic nanometers (to allow for power, interfaces, and the like) and assuming that we require a minimum of 10^4 “locks” to provide minimal control results in a volume of 2 x 10^5 cubic nanometers (0.0002 cubic microns) for the computational element. (This many gates is sufficient to build a simple 4-bit or 8-bit general-purpose computer, e.g., a 6502.)
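The volume arithmetic can be reproduced directly (one cubic micron is 10^9 cubic nanometers):

```python
# Reproduce the volume budget for the rod-logic computer described above.
lock_volume_nm3 = 5                 # nm^3 per "lock" (one logic gate)
padded_nm3 = 4 * lock_volume_nm3    # quadrupled for power, interfaces, etc.
n_locks = 10**4                     # minimal control, per the text

total_nm3 = padded_nm3 * n_locks
total_um3 = total_nm3 / 1e9         # 1 cubic micron = 10^9 cubic nanometers

print(total_nm3, "nm^3 =", total_um3, "cubic microns")  # 200000 nm^3 = 0.0002
```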

An assembler might have a kilobyte of high-speed (rod-logic based) RAM (similar to the amount of RAM used in a modern one-chip computer) and 100 kilobytes of slower but more dense “tape” storage – this tape storage would have a mass of 10^8 amu or less (roughly 10 atoms per bit – see below). Some additional mass will be used for communications (sending and receiving signals from other computers) and power. In addition, there will probably be a “toolkit” of interchangeable tips that can be placed at the ends of the assembler’s arms.
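The “10 atoms per bit” tape estimate can be checked in a few lines; the 12-amu (carbon-like) atom is an assumption for illustration:

```python
# Check that 100 kilobytes of "tape" at roughly 10 atoms per bit fits
# within the 10^8 amu budget. The 12-amu carbon-like atom is an
# assumption for illustration, not a figure from the text.
bits = 100 * 1024 * 8          # 100 kilobytes of tape storage
atoms = bits * 10              # roughly 10 atoms per bit
mass_amu = atoms * 12          # assume carbon-like atoms at 12 amu each

assert mass_amu < 1e8
print(f"tape mass: {mass_amu:.2e} amu")  # → 9.83e+07 amu
```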

When everything is added up, a small assembler, with arms, computer, “toolkit,” etc., should weigh less than 10^9 amu. Escherichia coli (a common bacterium) weighs about 10^12 amu. Thus, an assembler should be much larger than a ribosome, but much smaller than a bacterium. It is also interesting to compare Drexler’s architecture for an assembler with the von Neumann architecture for a self-replicating device. Von Neumann’s “universal constructing automaton” had both a universal Turing machine to control its functions and a “constructing arm” to build the “secondary automaton.”

The constructing arm can be positioned in a two-dimensional plane, and the “head” at the end of the constructing arm is used to build the desired structure. While von Neumann’s construction was theoretical (existing in a two-dimensional cellular-automaton world), it still embodied many of the critical elements that now appear in the assembler. Should we be concerned about runaway replicators? It would be hard to build a machine with the wonderful adaptability of living organisms.

The replicators easiest to build will be inflexible machines, like automobiles or industrial robots, and will require special fuels and raw materials, the equivalents of hydraulic fluid and gasoline. To build a runaway replicator that could operate in the wild would be like building a car that could go off-road and fuel itself from tree sap. With enough work, this should be possible, but it will hardly happen by accident. Without replication, accidents would be like those of industry today: locally harmful, but not catastrophic to the biosphere.

Catastrophic problems seem more likely to arise through deliberate misuse, such as the use of nanotechnology for military aggression. Chemists have been remarkably successful at synthesizing a wide range of compounds with atomic precision. Their successes, however, are usually small in size (with the notable exception of various polymers). Thus, we know that a wide range of atomically precise structures with perhaps a few hundred atoms in them are quite feasible. Larger atomically precise structures with complex three-dimensional shapes can be viewed as a connected sequence of small atomically precise structures.

While chemists have the ability to precisely sculpt small collections of atoms, there is currently no ability to extend this capability in a general way to structures of larger size. An obvious structure of considerable scientific and economic interest is the computer. The ability to manufacture a computer from atomically precise logic elements of molecular size, and to position those logic elements into a three-dimensional volume with a highly precise and intricate interconnection pattern, would have revolutionary consequences for the computer industry.

A large atomically precise structure, however, can be viewed as simply a collection of small atomically precise objects which are then linked together. To build a truly broad range of large atomically precise objects requires the ability to create highly specific positionally controlled bonds. A variety of highly flexible synthetic techniques have been considered in . We shall describe two such methods here to give the reader a feeling for the kind of methods that will eventually be feasible. We assume that positional control is available and that all reactions take place in a hard vacuum.

The use of a hard vacuum allows highly reactive intermediate structures to be used, e.g., a variety of radicals with one or more dangling bonds. Because the intermediates are in a vacuum, and because their position is controlled (as opposed to solutions, where the position and orientation of a molecule are largely random), such radicals will not react with the wrong thing, for the very simple reason that they will not come into contact with the wrong thing. Normal solution-based chemistry offers a smaller range of controlled synthetic possibilities.

For example, highly reactive compounds in solution will promptly react with the solution. In addition, because positional control is not provided, compounds randomly collide with other compounds. Any reactive compound will collide randomly and react randomly with anything available. Solution-based chemistry requires extremely careful selection of compounds that are reactive enough to participate in the desired reaction, but sufficiently non-reactive that they do not accidentally participate in an undesired side reaction.

Synthesis under these conditions is somewhat like placing the parts of a radio into a box, shaking, and pulling out an assembled radio. The ability of chemists to synthesize what they want under these conditions is amazing. Much of current solution-based chemical synthesis is devoted to preventing unwanted reactions. With assembler-based synthesis, such prevention is a virtually free by-product of positional control. To illustrate positional synthesis in vacuum somewhat more concretely, let us suppose we wish to bond two compounds, A and B.

As a first step, we could utilize positional control to selectively abstract a specific hydrogen atom from compound A. To do this, we would employ a radical that had two spatially distinct regions: one region would have a high affinity for hydrogen, while the other region could be built into a larger “tip” structure that would be subject to positional control. A simple example would be the 1-propynyl radical, which consists of three collinear carbon atoms and three hydrogen atoms bonded to the sp3 carbon at the “base” end.

The radical carbon at the radical end is triply bonded to the middle carbon, which in turn is singly bonded to the base carbon. In a real abstraction tool, the base carbon would be bonded to other carbon atoms in a larger diamondoid structure which provides positional control, and the tip might be further stabilized by a surrounding “collar” of unreactive atoms attached near the base that would prevent lateral motions of the reactive tip. The affinity of this structure for hydrogen is quite high.

Propyne (the same structure but with a hydrogen atom bonded to the “radical” carbon) has a hydrogen-carbon bond dissociation energy in the vicinity of 132 kilocalories per mole. As a consequence, a hydrogen atom will prefer being bonded to the 1-propynyl hydrogen abstraction tool in preference to being bonded to almost any other structure. By positioning the hydrogen abstraction tool over a specific hydrogen atom on compound A, we can perform a site-specific hydrogen abstraction reaction. This requires positional accuracy of roughly a bond length (to prevent abstraction of an adjacent hydrogen).
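This energetic preference can be made concrete by comparing bond dissociation energies: abstraction is downhill whenever the tool’s C-H bond is stronger than the bond being broken. The 132 kcal/mol figure is from the text; the methane and isobutane values are approximate literature figures, included only for comparison.

```python
# Compare the abstraction tool's C-H bond strength with typical C-H
# bonds it might abstract from. Values in kcal/mol; the non-propyne
# entries are approximate literature figures, for illustration only.
BOND_ENERGY = {
    "propyne (acetylenic C-H)": 132,   # figure quoted in the text
    "methane C-H": 105,                # approximate
    "isobutane tertiary C-H": 96,      # approximate
}

tool = BOND_ENERGY["propyne (acetylenic C-H)"]
for name, bde in BOND_ENERGY.items():
    if name.startswith("propyne"):
        continue
    # Hydrogen transfer to the tool releases the difference in bond energy.
    print(f"{name}: abstraction exothermic by ~{tool - bde} kcal/mol")
```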

Quantum chemical analysis of this reaction by Musgrave et al. shows that the activation energy for this reaction is low, and that for the abstraction of hydrogen from the hydrogenated diamond (111) surface (modeled by isobutane) the barrier is very likely zero. Having abstracted a specific hydrogen atom from compound A, we can repeat the process for compound B. We can now join compound A to compound B by positioning the two compounds so that the two dangling bonds are adjacent to each other, and allowing them to bond.

This illustrates a reaction using a single radical. With positional control, we could also use two radicals simultaneously to achieve a specific objective. Suppose, for example, that two atoms A1 and A2, which are part of some larger molecule, are bonded to each other. If we were to position the two radicals X1 and X2 adjacent to A1 and A2, respectively, then a bonding structure of much lower free energy would be one in which the A1-A2 bond was broken, and two new bonds A1-X1 and A2-X2 were formed.

Because this reaction involves breaking one bond and making two bonds (i.e., the reaction product is not a radical and is chemically stable), the exact nature of the radicals is not critical. Breaking one bond to form two bonds is a favored reaction for a wide range of cases. Thus, the positional control of two radicals can be used to break any of a wide range of bonds. A range of other reactions involving a variety of reactive intermediate compounds (carbenes are among the more interesting ones) are proposed in , along with the results of semi-empirical and ab initio quantum calculations and the available experimental evidence.

Another general principle that can be employed with positional synthesis is the controlled use of force. Activation energy, normally provided by thermal energy in conventional chemistry, can also be provided by mechanical means. Pressures of 1.7 megabars have been achieved experimentally in macroscopic systems. At the molecular level, such pressure corresponds to forces that are a large fraction of the force required to break a chemical bond.
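The correspondence between megabar pressures and bond-scale forces can be checked with a rough conversion. The atomic-scale contact radius and the few-nanonewton bond-rupture force below are order-of-magnitude assumptions, not figures from the text.

```python
import math

# Convert 1.7 megabars acting over an atomic-scale contact area into a
# force, and compare with the force needed to break a covalent bond.
# The contact radius and rupture force are order-of-magnitude assumptions.
pressure_pa = 1.7e6 * 1e5          # 1.7 megabars; 1 bar = 10^5 Pa
contact_radius_m = 0.1e-9          # ~ one atomic radius (assumption)
area_m2 = math.pi * contact_radius_m**2

force_nN = pressure_pa * area_m2 * 1e9   # newtons -> nanonewtons
bond_rupture_nN = 6.0              # rough C-C single-bond rupture force

print(f"force at the contact: ~{force_nN:.1f} nN")
print(f"fraction of bond-rupture force: ~{force_nN / bond_rupture_nN:.0%}")
```

Under these assumptions the force comes out at several nanonewtons, i.e. a large fraction of a covalent bond’s rupture force, consistent with the claim above.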

A molecular vise made of hard diamond-like material, with a cavity designed with the same precision as the reactive site of an enzyme, can provide activation energy by the extremely precise application of force, thus causing a highly specific reaction between two compounds. To achieve the low activation energy needed in reactions involving radicals requires little force, allowing a wider range of reactions to be caused by simpler devices (e.g., devices that are able to generate only small force). Further analysis is provided in .

Feynman said: “The problems of chemistry and biology can be greatly helped if our ability to see what we are doing, and to do things on an atomic level, is ultimately developed – a development which I think cannot be avoided.” Drexler has provided the substantive analysis required before this objective can be turned into a reality. We are nearing an era when we will be able to build virtually any structure that is specified in atomic detail and which is consistent with the laws of chemistry and physics. This has substantial implications for future medical technologies and capabilities.

One consequence of the existence of assemblers is that they are cheap. Because an assembler can be programmed to build almost any structure, it can in particular be programmed to build another assembler. Thus, self-reproducing assemblers should be feasible, and in consequence the manufacturing costs of assemblers would be primarily the cost of the raw materials and energy required in their construction. Eventually (after amortization of possibly quite high development costs), the price of assemblers (and of the objects they build) should be no higher than the price of other complex structures made by self-replicating systems.

Potatoes – which have a staggering design complexity involving tens of thousands of different genes and different proteins directed by many megabits of genetic information – cost well under a dollar per pound. The three paths of protein design (biotechnology), biomimetic chemistry, and atomic positioning are parts of a broad bottom-up strategy: working at the molecular level to increase our ability to control matter. Traditional miniaturization efforts based on microelectronics technology have reached the submicron scale; these can be characterized as the top-down strategy.
