On Monday night's Terminator, part of the plot revolved around a new microprocessor that promised to work at the "12-nanometer node." The Connor clan became very interested in this chip, since it's exactly the kind of technology that might enable a cyborg to have an artificial intelligence system powerful enough to make it a lethal killing machine and deliver clever quips.

Engineers currently refer to different generations of microprocessor technologies as "nodes." Each node is labeled with a length that roughly corresponds to the size of the smallest components that can be etched into the surface of a silicon chip, measured in nanometers (nm). One nanometer equals about 0.00000004 inches. The first real microprocessor CPU, Intel's 4004, was built at the 10,000-nm node in 1971. The CPUs that go into today's desktop and laptop computers are built at the 45-nm node.

Smaller nodes mean you can cram more transistors and other components onto a microchip, and the ability to move to smaller and smaller nodes has been the driving force behind Moore's Law, which states that the amount of computing power you can buy for a given amount of money doubles every 18 to 24 months. Moore's Law is what has let us go from computers that took up entire rooms in the 1940s to the iPhone.

But Moore's Law is beginning to break down -- it's getting astronomically expensive to build the manufacturing plants that can handle each successively smaller node, and the engineering challenges keep mounting. It's estimated that the 16-nm node will be reached in 2018. Beyond that point, it is feared, silicon transistors cannot be fabricated at all: their electronic operation would be overwhelmed by bizarre quantum effects such as tunneling.

Engineers are working on ways to transcend the limitations of silicon chips -- proposed solutions include molecular electronics and devices that rely not on an electron's charge but on a quantum property known as spin. It seems that silicon, the material that has come to symbolize the high-tech future over the last 35 years, may well join vacuum tubes and ferrite core memories in the dustbin of computing history.
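For a rough sense of what those numbers mean, here is a quick back-of-the-envelope sketch in Python (my own illustration, not figures from the show or from any chipmaker). It counts how many 18-to-24-month doublings fit between the 4004's debut in 1971 and the present (taken here as 2009, an assumption), and compares the component density naively implied by the 10,000-nm and 45-nm nodes, assuming density scales with the square of the feature-size ratio.

```python
# Back-of-the-envelope sketch: Moore's Law doublings since the 4004 (1971),
# and the naive density gain from the 10,000-nm node to the 45-nm node.
# The 2009 end date and the square-law density assumption are illustrative only.

years = 2009 - 1971  # span since the Intel 4004

for months_per_doubling in (18, 24):
    doublings = years * 12 / months_per_doubling
    print(f"Doubling every {months_per_doubling} months: "
          f"~{doublings:.0f} doublings, a factor of ~{2**doublings:,.0f}")

# If feature size shrinks linearly in both dimensions, component density
# grows roughly with the square of the ratio of the two node sizes.
ratio = 10_000 / 45
print(f"10,000-nm node vs. 45-nm node: ~{ratio**2:,.0f}x more components per unit area")
```

The results are ballpark figures at best, since real chips don't track node names exactly, but they land somewhere between a factor of half a million and tens of millions in raw doublings, and tens of thousands of times the component density, which is why the industry fights so hard to reach each new node.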