
Our Brightest Hopes for Keeping Up With Moore's Law

Scientists are trying out strange technological tricks to make computer chips tinier and more powerful.

Published Jun 29, 2010; updated Nov 20, 2019

Photo Credits: All text by Nina Bai; Image: Texas Instruments

Moore's Law will turn 45 this year, and it's facing a mid-life crisis of sorts. In 1965, Gordon Moore (the future co-founder of Intel) observed that the number of transistors that could fit onto an integrated circuit roughly doubled every year. His simple prediction, which he later adjusted to a doubling every two years, has become a de facto industry standard and, in some ways, a self-fulfilling prophecy.
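To see how quickly that doubling compounds, here is a rough back-of-the-envelope sketch in Python. The starting point, the Intel 4004 of 1971 with roughly 2,300 transistors, is our own illustration rather than a figure from Moore's paper.

```python
def projected_transistors(start_count, start_year, target_year, doubling_period=2.0):
    """Exponential projection: the count doubles every `doubling_period` years."""
    doublings = (target_year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Assumed starting point: the Intel 4004 (1971), roughly 2,300 transistors.
for year in (1971, 1981, 1991, 2001, 2010):
    print(year, f"{projected_transistors(2300, 1971, year):,.0f}")

# The 2010 projection lands around 1.7 billion transistors -- the same order
# of magnitude as the largest commercial processors of that year.
```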

Despite constant murmurs about Moore's Law's imminent demise (including by Moore himself), researchers keep churning out faster, more powerful computers right on schedule. How long can technology continue its exponential trajectory towards smaller, faster, cheaper? Here's a look at our brightest hopes for keeping up with Moore's Law.

The first integrated circuit, pictured here, was invented by Jack Kilby in 1958; it measured 1.6 by 11.1 millimeters and featured a single transistor. Compare that to the processors in your laptop, which contain hundreds of millions of transistors built with features just 45 nanometers across.

Photo Credits: Image: R. Stanley Williams, Hewlett Packard Laboratories

The most obvious solution to packing in more computing power is to shrink the computer chip. Commercial transistors are now as small as 32 nanometers, but that's closing in on the limits of current fabrication technology.

As any city dweller knows, the way to make the most of limited space is to build up. Researchers have come up with a crossbar design for computer chips, essentially building one layer of nanowires on top of another, perpendicular layer. Each intersection between the two sets of wires acts as a memristor, a new circuit element that can store information even after the current is turned off.
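To see why the crossbar layout packs in so much storage, consider this toy Python sketch; it is our own simplification, not HP's actual design, but it captures the idea that every wire intersection holds one bit as a resistance state.

```python
class CrossbarMemory:
    """Toy model of a crossbar array: rows x cols junctions, one bit each."""

    def __init__(self, rows, cols, high_resistance=1e6, low_resistance=1e3):
        self.high = high_resistance   # "off" state, stores a 0
        self.low = low_resistance     # "on" state, stores a 1
        # Every wire intersection starts out in the high-resistance state.
        self.junctions = [[self.high] * cols for _ in range(rows)]

    def write(self, row, col, bit):
        """Set one junction by selecting a top wire (row) and a bottom wire (col)."""
        self.junctions[row][col] = self.low if bit else self.high

    def read(self, row, col):
        """Read the stored bit back from the junction's resistance."""
        return 1 if self.junctions[row][col] == self.low else 0

# A 4 x 4 crossbar stores 16 bits using only 8 nanowires.
mem = CrossbarMemory(4, 4)
mem.write(2, 3, 1)
print(mem.read(2, 3))  # -> 1, retained even with no refresh current
```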

Researchers at HP have built prototype memristors, shown here, that may eventually allow computers to mimic the human brain in their ability to retain information and grow stronger with use.

Photo Credits: Image: IBM

As engineers figure out how to cram more chip power into smaller spaces, they also have to tackle the problem of excess heat. Built-in cooling fans and heat sinks (like the thermally conductive aluminum case of the MacBook Air) draw off some of the heat. The next step may be tiny plumbing systems that allow water to flow through and whisk away heat.

IBM has designed such a system of hermetically sealed, double-layered pipes of silicon and silicon oxide just 0.002 inches (about 50 micrometers) in diameter, illustrated here, which it hopes to make commercially available in a few years. IBM says short-circuiting won't be a problem.

Photo Credits: Image: Intel

The flow of current through a transistor is regulated by tiny switches, known as gates, which must be electrically isolated. As chips get smaller, the insulating material, traditionally silicon dioxide, has gotten thinner and thinner. Reduced to only a few atoms thick, silicon dioxide is apt to leak current, putting up a roadblock to further technical advancement.

A major innovation has been the development of better insulators made of an alloy of the element hafnium, which cuts current leakage, conserving power and reducing waste heat. Hafnium-based insulators are now used in the 45-nanometer generation of chips made by Intel, shown here below a chip from 1993.

Photo Credits: Image: Liang Pan and Cheng Sun, UC Berkeley

The intricate patterns on today's computer chips are made by a technique known as photolithography. First, a film of photosensitive material, or photoresist, is applied to a silicon wafer. Exposure to a pattern of intense light causes the photoresist to harden into a protective mask. The wafer is then washed in chemical baths, which etch away the unprotected areas. This process can be repeated many times to etch complex and tiny circuit patterns onto the silicon wafer.
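Conceptually, the expose-and-etch cycle works like a stencil. The toy Python illustration below (a deliberate simplification, not a real process simulator) treats the mask and wafer as small grids of cells.

```python
# 'X' marks where light reaches and hardens the photoresist.
mask = [
    "XX..XX",
    "XX..XX",
    "......",
    "XX..XX",
]

wafer = [[1] * len(row) for row in mask]            # 1 = material present
resist = [[c == "X" for c in row] for row in mask]  # hardened resist after exposure

# Etch step: the chemical bath removes material wherever the resist does not protect it.
etched = [
    [material if protected else 0 for material, protected in zip(wafer_row, resist_row)]
    for wafer_row, resist_row in zip(wafer, resist)
]

for row in etched:
    print("".join("#" if cell else "." for cell in row))

# The surviving '#' regions reproduce the mask pattern; repeating the cycle
# with different masks builds up a complete multilayer circuit.
```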

The limits of photolithography are set by the wavelength of light. By using ultraviolet light and superlenses capable of subwavelength imaging, scientists say they can push the photolithography boundary down to about 20 nanometers. To get beyond that resolution, researchers are investigating plasmonic lenses, illustrated here, that use excited electrons to focus light into even shorter wavelengths; theoretically, this technique could be used to etch circuit features as small as 5 to 10 nanometers.
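For a sense of scale, the classic diffraction limit puts the smallest printable feature at roughly half the illumination wavelength, setting aside immersion optics and other resolution-enhancement tricks. The quick calculation below uses our own textbook numbers, not figures from the researchers.

```python
def half_wavelength_limit(wavelength_nm):
    """Rough diffraction-limited estimate of the smallest printable feature."""
    return wavelength_nm / 2

for label, wavelength in [("deep-UV ArF laser (193 nm)", 193), ("extreme UV (13.5 nm)", 13.5)]:
    print(f"{label}: features down to ~{half_wavelength_limit(wavelength):.1f} nm")

# Plasmonic lenses aim below these figures by squeezing light into effective
# wavelengths shorter than its free-space value.
```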

Photo Credits: Image: Dmitry V. Kosynkin, Rice University

Though for decades silicon has been the standard material for transistors, there is much hype over the possibility of replacing it with new nanomaterials, such as flexible, one-atom-thick sheets of carbon known as graphene. However, for graphene to have the necessary semiconducting properties, it must be cut into ribbons less than 10 nanometers wide.
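The reason width matters is that a graphene ribbon's band gap grows roughly in inverse proportion to its width, so only very narrow ribbons behave like a semiconductor. The sketch below uses an illustrative constant of about 1 electron-volt-nanometer; the real value depends on how the ribbon's edges are cut.

```python
A_EV_NM = 1.0  # assumed illustrative constant, in electron-volts * nanometers

def approx_band_gap_ev(width_nm):
    """Rough inverse-width estimate of a graphene nanoribbon's band gap."""
    return A_EV_NM / width_nm

for width in (50, 20, 10, 6, 2):
    print(f"{width:>3} nm ribbon: band gap ~{approx_band_gap_ev(width):.2f} eV")

# Only ribbons well under 10 nm wide open a gap of a few tenths of an
# electron-volt -- roughly what a useful transistor channel needs.
```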

Materials scientists are making graphene ribbons by unfurling carbon nanotubes. One group stuck the nanotubes onto a polymer film and used ionized argon gas to slice open each tube, resulting in ribbons as narrow as 6 nanometers. Another group exposed nanotubes to sulfuric acid and potassium permanganate, a strong oxidizing agent, which strained the carbon bonds and caused the tubes to unzip lengthwise, as shown here.

Some researchers are investigating other promising ways to make graphene an effective semiconductor, such as pairing bilayer graphene with a special insulating polymer or punching holes in graphene to create a semiconducting "nanomesh," but it remains to be seen whether any of these techniques will produce viable chips.

Photo Credits: Image: IBM

As circuit elements shrink, the task of assembling them into the right structure requires new tricks too. Scientists are taking inspiration from one of Mother Nature's patented designs: DNA. Researchers at IBM have found a way to make viral DNA strands self-assemble into scaffolds on which millions of carbon nanotubes can be placed, which could yield a cheaper, more efficient alternative to today's silicon chips.

In a technique known as DNA origami, sequences of DNA are custom-designed so that the strands fold into predetermined two- and three-dimensional shapes. Researchers predict that chips assembled this way could be as small as 6 nanometers, though it may be a decade before the results go commercial.

Photo Credits: Image: Robert Lettow

A departure from the chip-shrinking pursuit of Moore's Law is optical computing, which instead aims to accelerate the transfer of information to the speed of light by replacing electrons with photons. So far, the most likely implementation of optical technology is optical interconnects that would replace the relatively slow copper wires now used to link processor chips to one another. These optical "data pipes" are made from light-emitting indium phosphide and silicon.

A completely light-based computer chip is still a long way off, but there's reason to hope. Recently, scientists designed the first transistor for laser beams. The transistor, made of a single dye molecule frozen in suspension and chilled to -272 °C, can act as a switch to control the power of a laser beam passing through it.

Photo Credits: Image: J. Jost, NIST

If we manage to keep pace with Moore's Law for a few more decades, the ultimate challenge will come at the level of a single atom, electron, or perhaps photon. At that scale, computing would be governed by quantum mechanics, an intimidating yet tantalizing prospect. Classical computing relies on logic gates that flip between two states, 1 or 0; quantum computing instead uses qubits, which can occupy a superposition of both states at once, making it vastly more efficient for certain problems. This would allow quantum computers to work through many calculations simultaneously instead of serially.
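The difference is easiest to see in miniature. This small Python sketch (purely illustrative) shows how a single quantum gate puts a qubit into a superposition of 0 and 1, and how an n-qubit register tracks 2^n amplitudes at once.

```python
import numpy as np

# One qubit starts in the definite state |0> ...
state = np.array([1.0, 0.0], dtype=complex)

# ... and a Hadamard gate puts it into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state
print(np.abs(state) ** 2)  # -> [0.5 0.5]: both outcomes are present at once

# Two qubits: the joint state tracks 4 amplitudes simultaneously; n qubits track 2**n.
two_qubits = np.kron(state, state)
print(np.abs(two_qubits) ** 2)  # -> [0.25 0.25 0.25 0.25]
```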

Traits that could be used as qubits include the different spins of electrons, the magnetic orientation of ions, or the photons emitted by ions. Researchers have already built the first solid-state quantum processor and a device, shown here, that uses lasers to manipulate the qubits of super-chilled beryllium ions.
