What Is the Scientific Method and How Did It Shape Science?

How careful observation, strict reasoning and clever hypotheses guided the great human endeavor of science.

By Cody Cottier
Published Feb 2, 2021 | Updated Mar 17, 2023
Statue of Aristotle, the great Greek philosopher. (Credit: Ververidis Vasilis/Shutterstock)

Around the turn of the 6th century B.C., beside the Aegean Sea in the city of Miletus, the first Greek philosopher concluded that “all is water.” His name was Thales. His pupil, Anaximander, disagreed — he believed the underlying substance of the universe was “indefinite stuff.” His own student, Anaximenes, thought it was air. 

These ideas sound fanciful, but in them the scientific mind is taking root. They’re arguably the first competing hypotheses, marking “a shift away from mythological explanations,” says Brian Hepburn, a philosopher of science at Wichita State University. Setting aside gods and supernatural forces, these philosophers instead base their understanding of nature on observation. In other words, they employ a rudimentary form of what we now call scientific method.

Thales and his disciples influenced Aristotle, who in turn deeply influenced every prominent Western philosopher for the next two and a half millennia — that includes Francis Bacon, who repackaged scientific method for the modern age and set the agenda for the Scientific Revolution of the 16th and 17th centuries.

Since then, science has earned its place among the most fruitful human enterprises. Not only is it “the most robust and rigorous tradition we have of applying scrutiny to stories about the world,” Hepburn says, but it also “allows you to do things like build an Internet or a GPS satellite, or send a rocket to the moon.” And, as the cosmologist Hermann Bondi put it, “There is no more to science than its method.” 

But for all its success, and all the legend surrounding it, this blueprint for knowledge-gathering isn’t as simple as it appears in textbooks. For 500 years scientists and philosophers have argued over how it ought to work, and these days many question whether it even makes sense to search for the scientific method — history suggests there are many.

The Beginning of Method

What makes science, science? The details vary tremendously across time, space and field of study. For Aristotle, its foundation was passive observation of nature. In the modern age, it often involves experimentation, too. Besides these, according to the Stanford Encyclopedia of Philosophy, the most common elements are “inductive and deductive reasoning, and the formation and testing of hypotheses and theories.”

Bacon — often called the “father of empiricism” — thought of these strategies as an intellectual toolset for when our own cognitive abilities fall short. “The unassisted hand and the understanding left to itself possess little power,” he writes in the opening lines of the Novum Organum (“New Organon” — a reference to Aristotle’s logical treatise, the Organon, in which he gave perhaps the first guidelines for scientific inquiry). “Effects are produced by the means of instruments and helps,” Bacon continues, “which the understanding requires no less than the hand.”

As the microscope and telescope reveal spheres of reality hidden to the naked eye, so scientific method grants us myopic humans a view into the deeper structure of the natural world. This is crucial, since science often deals with objects and processes that are inaccessible, whether physically (the center of the Earth), temporally (the evolution of life) or intellectually (quantum mechanics).

Apart from his method, Bacon made no major discoveries himself. But a contemporary of his, Galileo Galilei — also sometimes called the “father of science” — put the new method to good use in his famous motion experiments and astronomical observations. Then came science’s next superstar, Isaac Newton, with his monumental laws of motion and gravitation. In the Principia, Newton even formulated his own methodological rules, or “regulae philosophandi,” for scientific reasoning. 

Induction vs. Deduction

Scientists, and humans in general, use two modes of reasoning: inductive and deductive. Inductive reasoning moves from particular observations (all the birds I’ve seen have wings) to general claims (all birds have wings). Deductive reasoning works in the other direction, from information you already know (all men are mortal) to specific conclusions (Steve is a man, so Steve is mortal).
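
The distinction is easy to mimic in code. Here’s a toy sketch in Python (our illustration, not anything from the philosophers themselves): the inductive step generalizes a universal claim from a handful of observed cases, while the deductive step applies an already-accepted rule to a single case.

```python
# Inductive reasoning: generalize from particular observations to a
# universal claim. The conclusion outruns the evidence; one counterexample
# could overturn it.
observed_birds = [
    {"name": "sparrow", "has_wings": True},
    {"name": "pigeon", "has_wings": True},
    {"name": "crow", "has_wings": True},
]
all_birds_have_wings = all(bird["has_wings"] for bird in observed_birds)
print("Induction: all birds have wings?", all_birds_have_wings)

# Deductive reasoning: apply a general rule to a specific case.
# If the premises are true, the conclusion is guaranteed.
men = {"Socrates", "Steve"}  # premise: these beings are men

def is_mortal(being):
    """All men are mortal; Steve is a man; therefore Steve is mortal."""
    return being in men

print("Deduction: Steve is mortal?", is_mortal("Steve"))
```

Notice the asymmetry: the inductive conclusion could be overturned by the very next bird we see, while the deductive one holds as long as its premises do.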

Both Bacon and Newton considered themselves inductivists. Newton made a point of saying he “feigned no hypotheses,” using derisively what is now a perfectly conventional scientific term. Charles Darwin also claimed to operate inductively when he constructed the theory of evolution, building from the ground up with his studies of Galapagos finches and other animals.

But the century after Newton brought David Hume, an Enlightenment philosopher who framed one of the great epistemological dilemmas: the problem of induction. Basically, he argued, there is no justification for assuming that what you do know has any bearing on what you don’t know, or that the future will resemble the past. It makes no difference how many times a rock falls back to Earth after you throw it up — no logical necessity requires it to do so the next time. 

In science, inductivism’s counterpart is hypothetico-deductivism, in which you start with a hypothesis, deduce its implications, and test for them. A hypothesis can be as complex as Newton’s idea that “all matter exerts force on all other matter,” or as simple as “all knives are sharp.” From there, you predict that X will happen if your hypothesis is correct, and see if X does happen. In this case, X is that you will find only sharp knives, so if you find a dull one you must reject the hypothesis. On the other hand, if you find 100 knives and each one is sharp, you might move the hypothesis up a rung on the ladder of confirmation.
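
That testing loop is simple enough to sketch. Below is a minimal Python illustration of the knife example; the drawer’s contents are invented for the sake of the demo, and inspecting 100 sharp knives corroborates the hypothesis without ever proving it.

```python
# Hypothesis: "all knives are sharp." Prediction: every knife we inspect
# will be sharp. The drawer below is hypothetical data for illustration.
drawer = [{"sharp": True} for _ in range(100)]

def test_all_knives_sharp(knives):
    """One dull knife refutes the hypothesis; each sharp knife merely
    corroborates it (Popper's term) and never proves it outright."""
    corroborations = 0
    for knife in knives:
        if not knife["sharp"]:
            return False, corroborations  # falsified by a single counterexample
        corroborations += 1
    return True, corroborations

survived, count = test_all_knives_sharp(drawer)
if survived:
    print(f"Hypothesis corroborated by {count} observations, still not proven.")
else:
    print(f"Hypothesis falsified after {count} passing tests.")
```

Swap a single dull knife into the drawer and the hypothesis fails immediately: corroboration accumulates one observation at a time, but falsification takes only one counterexample.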

The Rigorous Road to Theory

It should be clear from this example that no hypothesis can ever be definitively confirmed (the investigator has yet to encounter a butter knife). In fact, the scientist’s goal should be to disprove the hypothesis, as emphasized by the influential 20th-century philosopher Karl Popper. If a claim can’t be disproven even in principle, it isn’t falsifiable. And, as he wrote in The Logic of Scientific Discovery, “in so far as it is not falsifiable, it does not speak about reality” (the existence of ghosts and God, for example, falls into this category).

But it’s equally important to understand that this inevitable uncertainty doesn’t diminish the aim of science. In Popper’s system of “falsificationism,” the more scientists test a theory without refuting it, the more likely it is to be accurate — the better “corroborated” it is, in laboratory lingo. If it survives enough scrutiny, it earns the title of “theory,” like Albert Einstein’s general relativity or Nicolaus Copernicus’s heliocentrism.

When scientists use the word “theory,” they don’t mean it in the colloquial sense of “just a theory.” Quite the opposite: It’s the highest honor an empirical explanation can receive. A theory may be wrong, of course, since the problem of induction prevents absolute certainty. But the rigors of scientific method do make it unlikely that any will gain consensus without formidable evidence. 

Scientists are especially vigilant in assessing evidence that contradicts a well-established theory. When a team of researchers claimed in 2011 to have detected neutrinos moving faster than light — doing the impossible, that is — the British physicist Jim Al-Khalili was so prepared to defend the cosmic speed limit that he vowed to “eat my boxer shorts on live TV” if it turned out to be true. Luckily, within a year, further analysis spared him the embarrassment.

Scientists can sometimes be too obstinate in resisting new theories. Near the turn of the 20th century, when data on Mercury’s orbit discredited Newton’s longstanding gravitational laws, many physicists were wary. Even after decades of evidence favored relativity over Newtonian physics, “there were some who never accepted it,” says Peter Vickers, a philosopher of science at Durham University. “They went to their graves saying, ‘The theory’s fine.’” Max Planck, the founder of quantum theory, cynically remarked that a new scientific truth doesn’t triumph by convincing everyone, “but rather because its opponents eventually die.”

Even so, new generations have always accepted the truth (or the best available version). That, Vickers says, is the point. Scientific method is no guarantee against error in individual cases — it just ensures the wider scientific world will set the record straight. “People are going in all sorts of different directions, and many are blind alleys,” he says. “But the idea is that the community as a whole moves forward.”

Back to the Scientists

Vickers counts himself among the “science-first” philosophers of science, a recent movement that focuses on how scientists actually practice their craft. Considering their “manifest successes,” if a philosophical view claims they should operate differently than they do, he’d sooner toss out the philosophical view. “That comes from the track record of science,” he says. “At some point it’s undeniable that scientists have got something right.”

Take virology. Virus particles are far too small to actually observe as they go about infecting hosts and ravaging their bodies. An extreme skeptic might say scientists don’t truly understand these processes, but that didn’t prevent the eradication of polio (or the rapid development of a COVID-19 vaccine). At the astronomical end of the spectrum, no one has viewed the sun from close enough to be sure it’s a star, yet “no scientist in the world would doubt that,” Vickers says. “We’ve established some things beyond all reasonable doubt.” 

All that’s to say: Science is doing fine. Indeed, it’s unclear to what extent the shifting philosophical trends have shaped the course of science. (“Probably not much,” Vickers says, although some prominent scientists have cited Popper as a major influence. Bondi’s above quote continues: “There is no more to science than its method, and there is no more to its method than Popper has said.”)

To accept a science-first philosophy is also to abandon the notion of the one true scientific method, as many philosophers have in recent decades. Nicolas Rasmussen, a historian of science at the University of New South Wales, likened the “ongoing quest” for a single method (or even a few) to “the forlorn leaping of salmon against an insurmountable dam.”

At the broadest level, of course, science deploys all the methodological elements. But zooming in on specialized fields and individual scientists, it’s clear they differ. As a theoretical physicist, Einstein used deductive reasoning to arrive at his theories, experimenting only in his own mind. Then there’s Alexander Fleming, who discovered penicillin inductively, by noticing a peculiar mold in a petri dish and examining its properties.

Two celebrated scientists, yet “the kinds of practice they’re doing from day to day are completely different,” Vickers says. “I think a lot of philosophers now would look at all those theories of scientific method and say all of those get used sometimes, by somebody, in some context. And that’s what you want.”
