It’s all about scale. Once we shift our reference frame to the molecular or atomic level, a whole new world of possibilities emerges before us.
First raised by the famed physicist Richard Feynman in 1959, the idea of building things at the level of individual atoms and molecules has triggered a revolution in many fields, including physics, chemistry, biology, materials science, medicine, the life sciences, electrical engineering, bioengineering, chemical engineering, manufacturing, and military and space engineering. A nanometer (1 nm) is one billionth of a meter, and the nanoscale typically spans 0.1-100 nm. Most atoms are 0.1-0.2 nm wide; a DNA strand is about 2 nm; a blood cell is several thousand nm; and a strand of human hair is about 80,000 nm thick. At the nanoscale, quantum mechanics dominates, and matter can display many unique properties that are unavailable at larger scales.
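To get a feel for these numbers, here is a quick back-of-the-envelope sketch in Python, using only the rough figures quoted above:

```python
# Rough scale comparisons using the figures quoted above (all in nanometers).
NANOMETERS_PER_METER = 1e9

atom_nm = 0.2      # a typical atom
dna_nm  = 2        # width of a DNA strand
hair_nm = 80_000   # thickness of a human hair

print(f"atoms across a DNA strand: {dna_nm / atom_nm:,.0f}")    # ~10
print(f"atoms across a human hair: {hair_nm / atom_nm:,.0f}")   # ~400,000
print(f"hair thickness in meters:  {hair_nm / NANOMETERS_PER_METER:.1e} m")
```

Roughly 400,000 atoms fit across a single hair, which is exactly why manipulating individual atoms demands entirely new tools.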
Experimental nanotechnology did not come into tangible existence until 1981, when an IBM research lab in Switzerland built the first scanning tunneling microscope (STM), which made it possible to image individual atoms. In the following decade, moving single atoms became possible as well. Other techniques discovered around the same time also made the manipulation of atoms a true engineering reality, though it is no small feat even today. In 1991, the first carbon nanotube was reported: a cylinder formed by rolling up a single sheet of graphene (a one-atom-thick layer of carbon), roughly six times lighter and 100 times stronger than steel. Its strength and electrical conductivity make it a favored candidate as a nanoscale building block in many applications, especially in the high-tech world.
Today, one application area of nanotechnology that has attracted intense focus is the design of extremely small-scale electronic circuitry. As it becomes increasingly difficult to keep pace with Moore’s Law, doubling the density of transistors on a single IC while shrinking to the sizes modern electronics demand, the limitations of silicon chips, including heat and energy constraints, become more obvious and more costly to work around. Nanotechnology has rushed to this frontier as a promising basis for future chips, and many creative ideas are being tested at present.
Graphene chips have already been created in various forms, but they are too easily damaged during assembly to be a practical production choice for most computers. In early 2014, IBM released an advanced version of its graphene chip, using a new manufacturing technique to address the fragility problem. In the meantime, however, other nanomaterials have entered the stage to compete with graphene, for example, a new nanomaterial and assembly method demonstrated by Berkeley Lab this year. At the atomic level, once assembled properly, many particles or mixed structures could potentially display the electrical and optical properties needed for building nanochips. This area of chip manufacturing will likely see intense competition in the future.
Various techniques have long been investigated for overcoming the limitations of existing chip design by modifying the silicon structure, but many engineering challenges remain. Earlier this year, UC Davis researchers established a bottom-up approach that adds nanowires on top of the silicon, which could create circuitry of smaller dimensions, withstand higher temperatures, and allow light emission for photonic applications that traditional silicon chips are incapable of. They found a way to grow other nanomaterials on top of silicon crystals to form nanopillars that serve as stations where nanowires connect and function like transistors, thus forming complex circuits. The most appealing aspect of this method is that it does not require significant changes to today’s manufacturing process for silicon-based ICs.
Another completely new chip design concept at the nanoscale comes from using the quantum nature of particles, rather than silicon crystals, to define binary “0” and “1”. For example, when a single photon hits a single atom residing in one of two atomic states, the resulting direction of the photon can represent “0” or “1” logic. Quantum computing was thus born. If manipulated successfully, quantum states may allow the simultaneous existence of more than just “0s” and “1s”, which promises more powerful potential for future computers.
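To make the “more than just 0s and 1s” idea concrete, here is a minimal sketch in plain Python with NumPy, not tied to any particular hardware mentioned above: a qubit’s state is a pair of complex amplitudes rather than a single bit, and a Hadamard gate puts it into an equal superposition of 0 and 1 until it is measured.

```python
import numpy as np

# A qubit state is a 2-vector of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. The probability of measuring 0 is
# |alpha|^2; the probability of measuring 1 is |beta|^2.
ket0 = np.array([1.0, 0.0], dtype=complex)   # the classical "0" state

# The Hadamard gate rotates |0> into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                    # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2          # measurement probabilities [0.5, 0.5]

# Measurement collapses the superposition: repeated runs yield a random
# mix of 0s and 1s rather than one definite value.
rng = np.random.default_rng(seed=42)
samples = rng.choice([0, 1], size=10, p=probs)
print("P(0), P(1) =", probs)        # -> [0.5 0.5]
print("measurements:", samples)     # e.g. [0 1 1 0 ...]
```

With n such qubits, the joint state carries 2^n amplitudes at once, which is where the extra computational potential comes from.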
With the drastically different and innovative landscape of chip design enabled by nanotechnology today, what are the needs and implications for future software design? One near-term field where software will certainly play a significant part alongside nanotechnology is the development of logical algorithms to control and manage the “self-replication” process of “active” nanocells, especially for AI. For application software development, TriStrategist thinks that once nanoscale manufacturing becomes the norm in the computer and electronics industries, its flexibility and versatility imply that chips can be designed more adaptably around myriad human needs and the software applications that run on top of them.
*[Note: This blog was in fact drafted and published on August 27, 2014, to make up for the previous week’s vacation absence.]*