Last April, at a meeting of the American Physical Society in Washington, D.C., representatives of three independent laboratories announced new high-precision measurements of the strength of the force of gravity. To the astonishment of the audience, the three measurements disagreed with one another by considerable amounts, and worse, none of them matched the value that physicists have accepted as correct for more than a decade. No one could offer so much as a hint to explain the discrepancies.
To illustrate the magnitude of the predicament, imagine a felon hunted by the police. They know that he is hiding somewhere along a street of ten blocks, with ten houses on each block. On the basis of previous information, the police have concentrated their surveillance on a particular house in the middle of the second block, when suddenly three new and presumably trustworthy witnesses appear. One places the miscreant in the very first house of the first block, the second singles out a dwelling near the end of the first block, while the third witness points to a house way across town at the other end of the street, more than eight blocks from the stakeout.
What are the cops to do? Go with the majority and move their operation over to the first block? Take an average and wait somewhere in the third block? Try to pick the most reliable witness and concentrate on a single house? Stretch their net to cover the entire ten-block street? Or stay put, discounting the new reports because they contradict one another? Physicists trying to make sense of the new measurements are facing the same unsatisfactory choices.
The goal of the measurements is easy to understand. According to Isaac Newton, any two material objects in the universe attract each other with a force that is proportional to the product of their masses and that diminishes with the square of their distance from each other. To quantify this phenomenon, physicists define as G the magnitude of the attraction that two one-kilogram masses, exactly one meter apart, exert on each other. Strictly speaking, G is an odd quantity with no intuitive meaning, so physicists take the liberty of referring to it in more familiar terms as a force. In those terms, the value of G is 15.0013 millionths of a millionth of a pound. (G is not to be confused with g, the acceleration of gravity near the surface of Earth, or with the g-force, the effect of an acceleration on a body.)
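In symbols, Newton's law and the definition of G read as follows; this is only a compact restatement of the paragraph above, with the accepted value of the era filled in for illustration:

$$F = G\,\frac{m_1 m_2}{r^2}, \qquad G \approx 6.6726 \times 10^{-11}\ \mathrm{N\,m^2/kg^2}.$$

With $m_1 = m_2 = 1$ kilogram and $r = 1$ meter, the force comes out numerically equal to G itself: about $6.67 \times 10^{-11}$ newtons, which works out to the 15 millionths of a millionth of a pound quoted above.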
The conceptual simplicity of measuring the strength of gravity contrasts sharply with the practical difficulty of carrying it out. There are two fundamental reasons for the elusiveness of G. For one thing, gravity is pathetically feeble. If the two one-kilogram masses were ten times closer, or about four inches apart, the force, though it would rise to 100 G, would still amount to no more than about a billionth of a pound, roughly the weight of a single grain of fine sand.
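Readers who like to check such numbers can do so in a few lines. The following sketch simply evaluates Newton's formula at the two distances discussed above; the conversion factor of 4.448 newtons to the pound is standard:

```python
G = 6.6726e-11             # gravitational constant, N m^2 / kg^2 (the accepted value of the era)
NEWTONS_PER_POUND = 4.448  # standard force conversion

def gravitational_force(m1_kg, m2_kg, r_m):
    """Newton's law: the attractive force, in newtons, between two point masses."""
    return G * m1_kg * m2_kg / r_m ** 2

force_1m = gravitational_force(1.0, 1.0, 1.0)    # two 1-kg masses, one meter apart
force_10cm = gravitational_force(1.0, 1.0, 0.1)  # ten times closer: 100 times the force

print(force_1m / NEWTONS_PER_POUND)    # ~1.5e-11 lb: 15 millionths of a millionth of a pound
print(force_10cm / NEWTONS_PER_POUND)  # ~1.5e-09 lb: about a billionth of a pound
```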
The other, more subtle, problem is that gravity, unlike all the other forces of nature, cannot be shielded. Electricity and magnetism, for example, which keep molecules from disintegrating, can be neutralized: positive charges cancel negative charges, south poles offset north poles. Shielding makes it possible to insulate electrical conductors so they can be handled safely even if they carry a lethal 220 volts, and for the same reason radios, which feed on electromagnetic radiation, fade in highway tunnels. No such shielding is available for gravity, and hence experiments to measure G are painfully sensitive to every stray gravitational influence, from sparrows flying over the laboratory roof to earthquakes in the antipodes.
Newton, who formulated the universal law of gravity and used it to explain a wealth of phenomena, including the orbits of planets, the tides of the ocean, and the flattening of Earth at its poles, did not need to know the value of G. Nor, for that matter, do NASA engineers who plot the paths of space probes with breathtaking precision. Most applications of the theory of gravity depend only on relative values, such as the ratio of the acceleration of the moon to that of an apple, which can be determined with much greater precision than the absolute value of G.
The first accurate measurement of G was not made, in fact, until 1797, more than a century after the discovery of the law of gravity, and it arose from a classic experiment performed by the English nobleman Henry Cavendish. Cavendish was an eccentric. Although he was said to be “the richest of all learned men, and very likely also the most learned of all the rich,” he lived frugally, spending his wealth only on books and scientific equipment. Morbidly taciturn and pathologically reclusive, he was such a confirmed misogynist that he communicated with his female housekeeper only by written notes.
[Figure: Winfried Michaelis's group in Brunswick, Germany, creates an electric field by means of two electrostatic generators to hold one end of a crossbar in place while the other end (not shown) is subjected to a minute gravitational tug. The crossbar floats on a pool of mercury.]
Yet for all his bizarre behavior, Cavendish was one of the most original and productive scientists of his generation. The ingenious device he employed for measuring G, called a torsion balance, had been invented and built by the clergyman and amateur naturalist John Michell, and devised independently by the French electrical pioneer Charles Coulomb; but in Cavendish's skillful hands it revolutionized the science of precision measurements. Almost all of the hundreds of subsequent determinations of G have used the torsion balance. Furthermore, it has been adapted for countless other applications, such as seismological measurements and electrical calibration—wherever precise control over very small forces is called for.
The conceptual basis of the torsion balance is the observation that it doesn’t take much force to induce a twist, or torsion, in a long, thin wire hanging from the ceiling. (A hanged man twists even in a faint breeze.) If a horizontal crossbar is hung from the lower end of the wire, in the manner of a rod in a mobile, it can serve as a pointer for indicating the angle through which the wire has been twisted. Once such a torsion balance has been calibrated, it becomes a measuring device for minuscule forces applied to one end of the crossbar: a small horizontal push results in a sizable angle of twist.
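In symbols, the calibration comes down to a single number, the torsion constant of the wire. Assuming a wire of torsion constant $\kappa$ and a horizontal force $F$ applied at a distance $\ell$ from the axis, the crossbar settles where the applied torque balances the wire's restoring torque:

$$\kappa\,\theta = F\,\ell \qquad\Longrightarrow\qquad F = \frac{\kappa\,\theta}{\ell}.$$

Because a long, thin wire has an extremely small $\kappa$, even a minuscule force $F$ yields a twist angle $\theta$ large enough to measure.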
Cavendish attached a small lead ball to one end of the crossbar, brought an enormous weight on a fixed support to a point slightly in front of the ball, and then watched the wire twist as the ball was attracted to the weight. (Actually, to balance his apparatus, he placed identical balls at both ends of the crossbar, dumbbell fashion, and doubled the attraction by mounting two large weights symmetrically as close to the balls as he could get without their touching.) By measuring the minute twist this contrivance induced in the wire, Cavendish could read off the actual force that caused it. From that force and the measured dimensions of the apparatus, he was able to deduce the value of G by means of simple proportions. The result was in the ballpark of the modern value, but what a huge ballpark it was. Cavendish estimated his precision at about 7 percent, which translates into locating the fugitive felon somewhere within a span of 100 blocks.
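Those simple proportions can be made concrete by running Newton's law backward. The sketch below uses rough, illustrative figures of the same order as Cavendish's apparatus (348-pound lead weights, 1.6-pound balls, centers about nine inches apart); they are stand-ins, not his recorded data:

```python
def deduce_big_g(force_n, large_mass_kg, small_mass_kg, separation_m):
    """Run Newton's law backward: G = F r^2 / (M m)."""
    return force_n * separation_m ** 2 / (large_mass_kg * small_mass_kg)

# Rough, illustrative stand-ins for Cavendish's setup (not his recorded data):
M = 158.0    # kg: a large lead weight of about 348 pounds
m = 0.73     # kg: a small lead ball of about 1.6 pounds
r = 0.225    # m: centers about nine inches apart
F = 1.5e-7   # N: the order of force such a balance must register

print(deduce_big_g(F, M, m, r))  # ~6.6e-11 N m^2/kg^2: the right ballpark
```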
Modern measurements are almost a thousand times better, pinning the culprit down to a specific house (although the disagreements among the new results take the bloom off this achievement). But the uncertainty in the value of G remains astronomical by today's exacting standards. Historically, G was the first universal constant of physics, and ironically it is by a wide margin the least well known. Modern physics is built on such numbers as the speed of light (c); the charge of the electron (e); and the quantum of action (h), which determines the sizes of atoms. Some of these constants have been measured to within one part in 100 million, others to a few parts per million. Compared with them, our ignorance of G is shocking in its crudeness.
The constants c, e, and h are entangled with one another in a tight web of interconnections that spans the microworld, in the sense that all measurements of atomic and nuclear properties must ultimately be expressed in terms of these and a small handful of other numbers. Such entanglement entails a complex system of cross-checks and mutual constraints that help fix the fundamental constants with impressive precision. Unfortunately, G does not participate in any of these relationships, because gravity plays no role in the atom. The gravitational attraction among atomic constituents is 30 or 40 orders of magnitude weaker than the competing electrical and nuclear forces and is thus completely irrelevant. In the end G stands naked and aloof, the ancient, unapproachable king of the fundamental constants.
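The 30-or-40-orders-of-magnitude claim is easy to verify with textbook constants. Here is the comparison for two protons; since both forces fall off as the square of the distance, the ratio does not depend on how far apart the protons are:

```python
G = 6.674e-11    # gravitational constant, N m^2 / kg^2
k = 8.988e9      # Coulomb's constant, N m^2 / C^2
m_p = 1.673e-27  # proton mass, kg
e = 1.602e-19    # elementary charge, C

# Gravitational vs. electrical force between two protons; the 1/r^2 factors cancel.
ratio = (G * m_p ** 2) / (k * e ** 2)
print(ratio)     # ~8e-37: gravity loses by about 36 orders of magnitude
```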
So why not just leave it alone? Why do scientists devote their energies and careers to a better determination of G instead of pursuing more profitable ends? At present there is no practical value in knowing its magnitude more precisely: neither astronomy nor geology nor space exploration would benefit from a better measurement. Instead scientists want to measure G as a matter of principle—just because it's there. And that is how science progresses. In the late nineteenth century astronomers struggled to tease out a tiny anomaly in the orbit of Mercury—an aberration that would never affect a calendar or the prediction of an eclipse. They measured it just because it was there, with no inkling that it would soon emerge as the sole experimental anchor of a revolutionary new conception of space and time—the general theory of relativity.