10 Breakthrough Moments in Medicine

Ideas and innovations that solved some of medicine’s most confounding mysteries.

By Christian Millman
Published Oct 23, 2019; updated May 9, 2020
In the 1800s, hysteria was a diagnosis for anything doctors didn’t understand about female patients. (Credit: Wellcome Library, London/Wellcome Images)

Here’s a simple question to ponder, although not an easy one to answer: Which medical advances from the past 100 years have made the greatest impact? And if you posed that question to some of the world’s leading physicians and top intellectuals, how would they respond? What would they say about a question whose answer holds untold millions of lives?

Luckily, this is not a rhetorical exercise. Just last year, the Medical Research Council in London, a research-funding body with global reach, posed this very question to these very types of people. Many of the survey results appear below, alongside other breakthroughs from throughout the centuries of such profound importance that they changed the very core of medical practice.

Antibiotics and Their Incalculable Impact

When the Medical Research Council conducted its survey of medical advances of greatest import, the largest number of responses was for the discovery of antibiotics by Alexander Fleming. “Without antibiotics, modern medicine as we know it would be unrecognizable,” wrote Stephen Whitehead, chief executive of the Association of the British Pharmaceutical Industry.

A dramatic statement for a dramatic discovery — and one that owes its existence to the decidedly nondramatic fact that Fleming was a slob. In 1928, Fleming was researching the properties of the well-known Staphylococcus bacterium, which continues to haunt us today in the form of MRSA, the antibiotic-resistant superbug.

One September morning, he entered his messy lab to begin work and noticed that one of his staph cultures had been overgrown by a fungus. Ordinarily, such a thing would have necessitated nothing more than throwing out the petri dish.

But this fungus was different. It was from the Penicillium genus, and all the staph colonies near it had died while those farther away were normal. At first, he called the bacteria-killing substance it was secreting “mould juice” before finally settling on the more formal name of penicillin.

After determining penicillin’s ability to kill many kinds of gram-positive bacteria — such as those that caused scarlet fever, meningitis, diphtheria and bacterial pneumonia — Fleming abandoned most of his work with the new drug because of the difficulties in producing large amounts of it. The job of mass-producing penicillin fell to two Oxford researchers roughly 10 years later: Howard Florey and Ernst Chain.

So while Fleming continues to receive the lion’s share of recognition for penicillin, all three researchers actually won the 1945 Nobel Prize in Medicine. Florey and Chain receded into historical anonymity while Fleming’s reputation lives on. So does his original laboratory, which has been turned into a museum in London.

It’s still pretty messy.

The World-Changing Application of Germ Theory

It’s a peculiarity that most people today know the name Lister only from the label of a tear-inducing mouthwash. That’s a loss of historical significance on par with Einstein one day being remembered as nothing more than the name of a bagel franchise.

Although germ theory — the understanding that microorganisms cause many diseases — was first proposed in the 16th century and honed by the work of Louis Pasteur 300 years later, it wasn’t until Sir Joseph Lister actually began applying that knowledge in the 1860s that medicine changed for the better because of it.

Lister was a surgeon in Scotland during a time when most of his peers considered it a status symbol to sport unwashed hands and bloodstained gowns as they moved from operation to operation. Lister, who was familiar with the work of Pasteur and others, made the connection between the lack of sanitation and “ward fever,” the high rate of patient infections and deaths unrelated to the preceding surgeries.

In an attempt to control infections, he was the first to implement the kind of sterile procedures that are the norm today. He changed gowns and gloves, and he washed his hands thoroughly between patients. He also sterilized surgical instruments and operating rooms by using a “donkey engine” to spray everything with a fine mist of carbolic acid, a known disinfectant.

Many other surgeons scoffed at Lister — until the rate of infections and ward fevers fell dramatically after his surgeries.

These days, the dangers of hospital-acquired infections are well known, and hospitals and other health providers that do not follow sanitary procedures are held accountable through regulatory actions and lawsuits. All because of a renegade Scottish surgeon whose contributions to medicine have saved many millions of lives.

Prevention, Not Treatment

Since the time of Galen and Hippocrates, medicine’s purpose has been to heal the sick. While that remains the noblest of undertakings, a British doctor named Edward Jenner thought medicine could be something more. What if, he surmised, you could prevent people from getting sick in the first place?

That idea took root in 1796, when he noticed something unusual about milkmaids. Those who worked closely with cows and contracted an illness called cowpox didn’t contract the horror that was smallpox. Exceptionally contagious, smallpox had killed hundreds of millions, perhaps billions, of people since prehistory, sometimes causing the collapse of entire civilizations.

Cowpox, by contrast, caused many of the same symptoms as smallpox, but in a much milder form, and it was not fatal. So Jenner tried something that would change history: He drained some pus from a milkmaid’s active cowpox blisters and persuaded a farmer to let him inject it into the arm of the farmer’s son.

Then, in a move that would get him barred for life from any modern medical association, Jenner injected the boy with smallpox pus. The boy became mildly ill but did not develop smallpox, and he fully recovered in a few days.

Thus was born the smallpox vaccine, and a vaccination campaign that lasted until the World Health Organization declared the disease — one of humanity’s greatest scourges — eradicated in 1980.

Born alongside the smallpox vaccine on that day in 1796 was its fraternal twin, vaccine therapy, otherwise known as immunology. Since Jenner’s discovery, vaccines have been developed for many other diseases. To name a few: measles, rubella, diphtheria, mumps, polio, meningitis, hepatitis A and B, influenza, rabies, yellow fever and tetanus.

The impact of immunology on the human race is incalculable — almost. Early in 2014, the Centers for Disease Control and Prevention quantified it just a little bit. It estimated that the vaccines given to American infants and children over the past 20 years will prevent 322 million illnesses, 21 million hospitalizations and 732,000 deaths over the course of those lifetimes.

A Big Dig Through Data Uncovers Epidemiology — and a Cesspool 

Medicine can be so intense: the frenzy of the ER while saving a trauma victim. The claustrophobia-inducing hum and clank of an MRI. The triumphant high-five between surgeons after a delicate operation.

Some aspects of medicine are much softer in their walk and talk, yet no less important. Such is epidemiology — the use of observation and statistics to find patterns, causes, sources and effects of illnesses in populations. It’s a field akin to accounting and actuarial science — more Ernst & Young than Young Dr. Kildare.

Yet that’s the point: Epidemiology finds strength in numbers. The medical specialty’s origin dates to an 1854 cholera outbreak that swept the city of London. John Snow, a doctor and early advocate of the then-controversial germ theory, suspected the cholera bug was being spread by polluted water.

Snow investigated the source of the outbreak, interviewing locals to determine the circumstances of the cholera victims. Then he did something pivotal. He marked up a map with the location of every death and found a shared water pump in the middle of a cluster of victims. Even victims who lived outside the cluster, he learned, had drunk from the same pump while passing through the area.

When city officials removed the handle from the pump, which had been dug next to an old cesspool, the outbreak stopped. 
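
At its heart, Snow’s approach was a counting exercise: associate each death with the nearest water source and see where the cases pile up. The short Python sketch below illustrates only that logic; the pump coordinates and death locations in it are invented for demonstration, not drawn from Snow’s actual data.

```python
# Toy illustration of John Snow's method: tally cholera deaths by the
# nearest water pump and see which pump dominates the count.
# All coordinates below are hypothetical, used only to show the idea.
from collections import Counter
from math import dist

# Hypothetical pump locations on an arbitrary street grid (x, y).
pumps = {
    "Broad Street": (3.0, 4.0),
    "Other Pump A": (9.0, 1.0),
    "Other Pump B": (0.0, 9.0),
}

# Hypothetical death locations gathered from door-to-door interviews.
deaths = [(2.8, 4.1), (3.2, 3.9), (3.1, 4.3), (2.9, 3.8), (8.7, 1.2), (0.4, 8.8)]

def nearest_pump(location):
    """Return the name of the pump closest to a given death location."""
    return min(pumps, key=lambda name: dist(pumps[name], location))

# Count how many deaths fall nearest to each pump.
tally = Counter(nearest_pump(loc) for loc in deaths)
for pump, count in tally.most_common():
    print(f"{pump}: {count} deaths")
# A single pump dominating the tally points investigators to the likely source.
```
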

Although unrecognized in his time, Snow is regarded by today’s disease detectives as the father of epidemiology, and his work greatly influenced public sanitation and other public health measures put in place worldwide.

The Ugly Face of War Leads to Modern Plastic Surgery

Plastic surgery may conjure images of Hollywood starlets and their enlarged and enhanced bodies, but it was developed and advanced for far less cosmetic reasons. During World War II, aircraft and their crews were deployed in unprecedented numbers. Also unprecedented were the ghastly burns many crew members suffered when their planes were shot down and the fuel ignited.

Archibald McIndoe, a New Zealand doctor, was among those charged with the daunting task of treating those men. In 1938, he was appointed consulting plastic surgeon to the Royal Air Force, one of four in the nascent field in Britain.

Medical convention at the time was to treat a burn with a burn. Acid was applied to remove the damaged skin, followed by a two-month waiting period to allow the area to heal enough to tolerate surgery. Not surprisingly, that was eight weeks of agony for those patients. It also left those burn victims with scars so severe that they often avoided going out in public for the rest of their lives.

For McIndoe, such radical wounds called for a radical departure from convention. The first new method he developed was a saline bath for crew members with extensive burns. The idea came from pilots who ditched at sea and therefore ended up in saltwater: Their burns healed noticeably better than those of airmen who bailed out over land.

Next up for McIndoe was to operate immediately, excising the damaged tissue and developing a new skin-grafting technique to replace it, also immediately. Not only did this leave patients with much less scarring, it also allowed them to begin using the burned area far sooner in the healing process.

Aside from his surgical prowess, McIndoe also became much loved for his recognition of the psychological impact of burns. He stopped the practice of dressing patients in convalescent gowns, and instead he insisted they be allowed to continue wearing their usual military uniforms. He also recruited local families and had them invite patients over for meals and other gatherings, which helped his patients reintegrate into society, rather than hide from it.

His patients quickly dubbed themselves The Guinea Pig Club, as an affectionate and tongue-in-cheek acknowledgement of how McIndoe’s pioneering methods had helped them. In 1947, he received a knighthood for his work healing the bodies and psyches of his wartime patients. And his methods, including the skin graft he invented, are still in use today in reconstructive surgeries.

Making Blood Transfusions Work, Finally

You only need to read a book set in the 1800s or earlier to know that, throughout history, women often died in childbirth. One of the most common reasons was uncontrolled bleeding after delivery.

James Blundell, a British obstetrician, knew that transfusing blood into these women could save them. He also knew that others had been experimenting with transfusions for almost 200 years, often with fatal results, mostly because of the practice of using animal blood.

After successful experiments transfusing blood from one animal of the same species to another, Blundell made his first human attempt in 1818 on a woman who was hemorrhaging after childbirth. With her husband as a donor, he transfused 4 ounces of blood into the woman. 

She survived, but not all of Blundell’s subsequent patients were so fortunate. Although Blundell was the first to understand that human blood needed to be used on other humans, no one yet knew that blood came in different types — and that a transfusion with the wrong type would lead to immune rejection and, often, death.

Transfusions remained a dicey affair until 1901 when an Austrian doctor, Karl Landsteiner, discovered the different blood groups and which ones could be safely mixed with others.
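
Landsteiner’s blood groups boil down to a small set of rules: a donor’s red cells are safe only if they carry no ABO antigen that the recipient lacks, and Rh-positive blood should not be given to an Rh-negative recipient. The Python sketch below encodes those textbook rules as a simple illustration; it is a toy model, not a clinical tool, and real transfusion practice also depends on crossmatching and antibody screening.

```python
# Simplified sketch of ABO/Rh red-cell compatibility, the rules that grew
# out of Landsteiner's discovery. Not a clinical tool.

# Each ABO group mapped to the antigens present on its red cells.
ABO_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def compatible(donor: str, recipient: str) -> bool:
    """Return True if red cells of the donor type can go to the recipient.

    The donor is compatible when the recipient sees no foreign antigens:
    the donor's ABO antigens must be a subset of the recipient's, and an
    Rh-positive donor requires an Rh-positive recipient.
    """
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    abo_ok = ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]
    rh_ok = d_rh == "-" or r_rh == "+"
    return abo_ok and rh_ok

print(compatible("O-", "AB+"))  # True: O-negative is the universal red-cell donor
print(compatible("A+", "O-"))   # False: a type O recipient rejects the A antigen
```
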

Continuing research by others gave doctors the ability to bank blood, separate it into such components as plasma and screen for blood-borne pathogens. Today, about 15 million transfusions take place in the United States each year.

The End of Hysteria and the Advent of Women’s Health

“You’re hysterical!” Funny stuff, huh? Well, not during the Victorian era.

Female hysteria was a widely used medical diagnosis, particularly during the 1800s and early 1900s, although the term is attributed to Hippocrates, who based it on the ancient Greek word for “uterus” (hysteron) in the fifth century B.C.

Hysteria took on many meanings over the centuries, and by the time of its demise as a medical diagnosis, it had served as a catchall for anything male doctors (and they were almost all male doctors) didn’t understand about their female patients.

The symptoms of hysteria were, well, anything. Some examples: fainting, nagging, irritability, sexual dissatisfaction, loss of appetite, insomnia, laziness and a loss of speech but, weirdly, not of singing.

In the 20th century, the diagnosis began to be more heavily scrutinized. Unsurprisingly, it did not withstand that scrutiny. It was finally abandoned as a diagnosis through its removal from the 1980 DSM-III, the third edition of The Diagnostic and Statistical Manual of Mental Disorders, the medical world’s widely agreed-upon way to classify mental disorders.

The demise of hysteria incidentally overlapped a rise in women’s health as a separate field in medicine. Throughout the 1960s and 1970s, more and more women entered the field of medicine, to the point where newly minted physicians are now almost equally split by gender. Between the early 1900s and the early 2000s, the proportion of female graduates from obstetrics and gynecology residency programs grew from zero to about 80 percent. And in 1991, the U.S. Department of Health and Human Services established the Office on Women’s Health.

Not a bad list of accomplishments for a gender once thought to be overwhelmingly incapacitated by hysteria.

Where There’s Smoke ...

Most histories of the link between smoking tobacco and lung cancer attribute the discovery to a British doctor, Richard Doll, who made the claim in 1950 amid an epidemic of lung cancer in the postwar United Kingdom.

He eventually proved the connection unequivocally: A 50-year longitudinal study he began in 1951 showed that half of smokers died from their addiction, and that quitting was remarkably effective at reducing or eliminating that risk. Still, he wasn’t actually the first to notice the link.

German physician Fritz Lickint published a 1929 paper that showed lung cancer patients were also overwhelmingly smokers. But because that research appeared during the time of unrest in Germany that preceded World War II, it remained an overlooked, if not ignored, contribution to medicine for many years.

Not that it mattered. In the face of a powerful tobacco industry and associated lobby, it would take until 1964 for the U.S. surgeon general to issue his first report educating Americans on the incredibly toxic effects of smoking, including being the primary cause of lung cancer.

In the interim, the tobacco industry had been busy promoting the health benefits of their product. “More doctors smoke Camels,” boasted one 1946 advertisement. “Smoke a Lucky to Feel Your Level Best!” said another one, from a 1949 Lucky Strike advertisement with a 17-year-old girl as a model.

Cigarettes were credited with better digestion, keeping a slender figure and creating an all-around sophisticated image. For a time, even the TV show The Flintstones was sponsored by Winston. Every episode ended with Fred and Wilma firing up a smoke together to show a Winston “tastes good like a cigarette should,” even in the Stone Age.

It was in this environment that the 1964 surgeon general’s report was released. It was sent to media outlets on a Saturday, in order to minimize the effect on the stock markets and maximize coverage in Sunday newspapers.

The surgeon general at the time, Luther Terry, later said the report “hit the country like a bombshell.” But it worked. A 1958 Gallup poll showed only 44 percent of Americans thought smoking might cause cancer; by 1968, another Gallup poll pegged that number at 78 percent.

In January 2014, the Journal of the American Medical Association commemorated the 50th anniversary of that report by issuing a sober statistic: Over 8 million American lives have been saved by anti-smoking efforts since the release of the 1964 report. 

We’ve come a long way, baby.

From Grinding Organs to Transplanting Them

Nowhere is the interconnected nature of medical advances more on display than in the field of organ transplantation. When doctors began to understand how blood came in different types, they also began to understand the nature of immune rejection and what made donors incompatible with their recipients.

A doctor who benefited greatly from this knowledge was Joseph Murray, an American physician who, like Archibald McIndoe (see “The Ugly Face of War Leads to Modern Plastic Surgery,” above), served as a plastic surgeon in World War II. Murray gained additional experience with tissue rejection while trying to graft the skin of deceased donors onto the badly burned areas of his patients.

After the war, Murray’s focus turned to suppressing or avoiding the immune response that caused tissue rejection. If he could solve this problem, doctors could at last pursue the long-sought goal of transplanting organs.

A Ukrainian surgeon had attempted to transplant a cadaver kidney into a patient with renal failure in the 1930s, ending up with two dead bodies after the surgery. When Murray made medicine’s next attempt to transplant a kidney in 1954, he did so by taking a healthy one from his patient’s identical — and living — twin brother. Because there was no immune rejection of the genetically identical kidney, both brothers survived the operation and made full recoveries.

Murray then refocused his time on helping to find drugs that would suppress the immune response enough to allow transplants between less-compatible donors and recipients. With his guidance, others in the field of immunosuppressive drugs soon came up with such agents as azathioprine (Imuran) and prednisone, allowing Murray to perform the first kidney transplants between donors and recipients who were not identical twins, beginning in 1959.

Murray won a Nobel Prize in Medicine in 1990 for his work in organ and cell transplantation. In 2012, he suffered a stroke at home at the age of 93. Murray died in Brigham and Women’s Hospital, the same place where he performed his first organ transplant operation.

Since that first successful operation, the field of organ transplantation has advanced exponentially. About 30,000 transplants are performed in the U.S. each year, including lung, heart, liver, pancreas, bowel and bone transplants, among others.

Bedlam Is Now Just an Expression

Chances are you’ve said this: “Man, it’s bedlam in here.” It’s just a saying, right? Yes, and that’s exactly the point.

Although the hospital that was once called Bedlam — the Bethlem Royal Hospital in London — still exists, the period of its history when it earned that nickname is long gone.

Uproar, confusion, screeching, wailing, chains worn indefinitely, madness unchecked — all were attributes of the place where the worst practices in treating the mentally ill were used over hundreds of years.

While it’s easy to chalk that up to a simple lack of any kind of compassion for the mentally disturbed, there’s a larger point at play: There weren’t any good options for treating mental illness.

That only began to change in the 1950s with the development of the first antipsychotic drugs, foremost of which was chlorpromazine, also known as Thorazine. Although nowhere close to a perfect drug, Thorazine at least gave struggling doctors an effective option for treating such mental illnesses as schizophrenia and the manic phase of bipolar disorder.

Thorazine’s success in mitigating the worst behaviors of such diseases led to rapid and ongoing development of many other drugs for mental ailments, including antipsychotics and antidepressants. Many critics believe the mentally ill are overmedicated, and some demonize psychoactive drugs outright, but few would want to return to the days before these drugs were available.

“Without the discovery of chlorpromazine, we might still have the miserable confinements witnessed [in the time] of desperate remedies,” wrote Trevor Turner, a psychiatrist at the Homerton Hospital in London, in his nomination of the drug as one of the most significant medical advances of recent history. “It is hard not to see chlorpromazine as a kind of ‘psychic penicillin.’ ”
