Chapter 2 — Biology & Medicine, 1926–2026: The Century That Made Life Longer
The World Left Behind
In 1926, a scratch could kill you.
Not a deep wound or a serious infection—a scratch. If bacteria entered through broken skin and your body couldn't fight them off, there was nothing medicine could do. Sepsis would set in. Fever would rise. Within days or weeks, you would be dead. It happened to presidents (Calvin Coolidge's son died from a blister in 1924), to children (routine childhood infections killed roughly one in five before their fifth birthday), and to soldiers (more men died of infection than combat in most pre-20th-century wars).¹
Childbirth was dangerous. Maternal mortality in the United States was around 600 per 100,000 live births—roughly 1 in 170 women died giving birth.² Postpartum infection (childbed fever) was common and often fatal. Cesarean sections were last-resort procedures with high mortality.
Cancer was a death sentence. Surgery was the only treatment, and it was crude—radical mastectomies removed entire chest walls. Radiation existed but was poorly understood and often caused as much harm as the disease. The five-year survival rate for most cancers was below 20%.³
Mental illness was hidden, feared, and barely treated. Patients were institutionalized in asylums where conditions ranged from neglectful to abusive. The treatments available—hydrotherapy, insulin shock, lobotomies—were often worse than the disease.
The human body was largely opaque. Doctors could listen to your heart and lungs, feel for lumps, and take your temperature. Beyond that, diagnosis was guesswork. The interior of a living body was invisible—a black box that could only be opened after death.
This was the state of medicine one hundred years ago. Not in the Dark Ages, not in ancient history—within living memory. There are people alive today who were born into this world.
Understanding where humanity has come from is essential to understanding where it might go. The acceleration thesis argues that AI could compress a century of progress into a decade. To grasp what that means for biology and medicine, one must appreciate the magnitude of the century just passed.
2026 Snapshot — Where Medicine Stands Today
Modern medicine is powerful, sophisticated, and—by historical standards—miraculous. It is also expensive, unequal, often reactive rather than preventive, and struggling with challenges its own success created.
Life expectancy in developed countries has risen from roughly 55-60 years in 1926 to over 80 years today.⁴ In the best-performing countries (Japan, Switzerland, Singapore), it exceeds 85. Even in lower-income countries, life expectancy has approximately doubled over the century.
Infant mortality has plummeted. In the United States, it's fallen from roughly 70 per 1,000 live births in 1926 to under 6 today—more than a ten-fold improvement.⁵ Similar or greater improvements have occurred globally. A child born today has a better chance of reaching adulthood than at any point in human history.
Infectious disease, once the dominant killer, has been largely tamed in developed countries. Vaccines have eliminated or controlled smallpox, polio, measles, and dozens of other diseases. Antibiotics handle most bacterial infections. Antivirals have transformed HIV from a death sentence to a manageable chronic condition.
Surgery has become precise, minimally invasive, and routinely successful. Organ transplants, joint replacements, and cardiac procedures that would have been science fiction in 1926 are now standard. Surgical mortality has dropped by orders of magnitude.
Imaging has made the body transparent. X-rays, CT scans, MRIs, PET scans, and ultrasound let doctors see inside living patients with extraordinary detail. Cancer is detected earlier. Heart disease is visualized. Brain abnormalities are identified.
Genomics has decoded the instruction manual for human biology. Anyone's complete DNA sequence can be read for under $500 and a few days' turnaround. Genetic testing identifies disease risks, guides treatment selection, and enables precision medicine approaches that would have been unimaginable a generation ago.
Immunotherapy has transformed cancer treatment. Checkpoint inhibitors, CAR-T cells, and cancer vaccines are producing durable responses in cancers that were uniformly fatal. Not for everyone, not for every cancer—but the principle is proven: the immune system can be reprogrammed to fight cancer.
And yet.
Chronic disease dominates the disease burden in developed countries. Heart disease, cancer, diabetes, Alzheimer's, and other conditions of aging kill far more people than infections. These diseases are managed but rarely cured. Patients live with them for decades, requiring ongoing treatment.
Healthcare costs are staggering. The United States spends over $4 trillion annually on healthcare—roughly 18% of GDP—yet achieves worse outcomes than many countries spending half as much.⁶ Medical bills are a leading cause of bankruptcy. Access is deeply unequal.
Care is reactive rather than preventive. Most healthcare spending goes to treating disease after it appears rather than preventing it in the first place. Prevention is less profitable, less dramatic, and harder to measure than acute intervention.
Data is fragmented. Despite decades of digitization, health records remain siloed, inconsistent, and difficult to use for research or coordination. A patient's complete health history is rarely available in one place.
Drug development is slow and expensive. The average new drug takes 10-15 years to develop and costs over $2 billion.⁷ Ninety percent of candidates that enter clinical trials fail. The industry has struggled with declining research productivity even as investment has increased.
Access is profoundly unequal. A child in Sub-Saharan Africa has a life expectancy 20-30 years shorter than a child in Japan.⁸ Within wealthy countries, poor and marginalized populations have significantly worse health outcomes than the privileged. Medicine's miracles are not evenly distributed.
This is the landscape AI enters: extraordinary capability coexisting with extraordinary limitations; life-saving technology coexisting with systemic dysfunction; a century of progress that has solved many problems and created new ones.
Notable Players
The medical ecosystem is vast. Here are key categories of actors shaping the current landscape and the AI-driven transformation to come.
Health Systems and Academic Medical Centers
The large integrated health systems—Mayo Clinic, Cleveland Clinic, Kaiser Permanente, Mass General Brigham, Johns Hopkins—combine research, clinical care, and medical education. They have the data, patient volume, and institutional capacity to pioneer AI applications.
Academic medical centers at major universities (Harvard, Stanford, Penn, UCSF, Johns Hopkins, MIT) drive basic research and train the next generation. Their affiliated hospitals are often where new treatments are first tested.
The UK's National Health Service represents a different model: a single-payer system with unified data infrastructure. The NHS has become a testbed for population-level health AI, with projects analyzing millions of patient records.
Pharmaceutical and Biotechnology
The major pharmaceutical companies—Pfizer, Johnson & Johnson, Roche, Novartis, Merck, AstraZeneca, Sanofi, GSK, Lilly, AbbVie—remain dominant in drug development and commercialization. They have the capital, regulatory expertise, and distribution networks to bring drugs to market globally.
Biotechs occupy the innovation frontier. Moderna and BioNTech demonstrated mRNA's potential; Regeneron and Vertex have produced breakthrough therapies; Genentech (now part of Roche) pioneered biologic drugs. Hundreds of smaller biotechs pursue specific targets or modalities.
AI-native drug discovery companies are increasingly prominent: Recursion, Insilico Medicine, Exscientia, Relay Therapeutics, Isomorphic Labs (DeepMind's spinout). These companies bet that AI can fundamentally change how drugs are discovered.
Genomics and Diagnostics
Illumina dominates DNA sequencing hardware and has enabled the collapse in sequencing costs from $100 million per genome (2001) to under $500 today. Pacific Biosciences and Oxford Nanopore offer alternative technologies with different strengths.
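The pace of that cost collapse is worth making concrete. A back-of-envelope sketch, using the approximate figures cited above ($100 million in 2001, under $500 today) and an assumed 24-year span, gives the implied rate of decline:

```python
import math

# Approximate sequencing costs per genome, per the figures cited above.
cost_2001 = 100_000_000   # dollars, 2001
cost_now = 500            # dollars, mid-2020s
years = 24                # elapsed time (assumed)

fold_drop = cost_2001 / cost_now                    # total improvement factor
annual_rate = fold_drop ** (1 / years)              # per-year improvement factor
halving_time = years * math.log(2) / math.log(fold_drop)  # years per halving

print(f"{fold_drop:,.0f}-fold cheaper overall")
print(f"~{annual_rate:.1f}x cheaper per year")
print(f"cost halved roughly every {halving_time * 12:.0f} months")
```

By this rough estimate, sequencing costs halved every sixteen months or so—faster than Moore's Law, which is why the "$100 million to $500" comparison outruns anything in computing.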
Liquid biopsy companies—Grail (detecting cancer from blood draws), Foundation Medicine (genomic profiling of tumors), Guardant Health—are pushing toward earlier, less invasive cancer detection.
Multi-omics companies analyze not just DNA but RNA, proteins, metabolites, and other molecular data to build comprehensive pictures of patient biology.
Digital Health and Wearables
Apple (Watch, HealthKit), Google (Fitbit, health data platform), Samsung, and various startups have put health sensors on hundreds of millions of wrists. Continuous heart rate, ECG, blood oxygen, sleep tracking, and activity monitoring generate unprecedented volumes of health data outside clinical settings.
Continuous glucose monitors from Dexcom, Abbott, and others are transforming diabetes management and finding broader applications in metabolic health monitoring.
Telemedicine exploded during COVID-19 and remains a significant care delivery channel. Companies like Teladoc, Amwell, and numerous specialty-focused platforms connect patients with providers remotely.
AI and Health Tech
Google DeepMind has produced landmark work in protein structure prediction (AlphaFold) and medical imaging analysis. Google Health is deploying AI for clinical decision support.
Microsoft is investing heavily in biomedical AI through partnerships and internal research, with a particular focus on large language models for medical applications.
The major electronic health record vendors—Epic, Cerner (now Oracle Health), Meditech—are integrating AI capabilities into clinical workflows, with mixed success and significant implementation challenges.
Startups in AI-assisted diagnosis, clinical documentation, drug discovery, and care coordination number in the thousands. Most will fail; some will transform the field.
The Century in Medicine: A Brief History
The Antibiotic Revolution
In 1928, Alexander Fleming noticed that mold had killed bacteria in a petri dish he'd left out before vacation. This accidental observation led to the isolation of penicillin, though it took more than a decade to develop manufacturing processes for mass production.
When penicillin became widely available during World War II, it transformed medicine. Infections that had been death sentences became curable. Surgical mortality plummeted. The "golden age of antibiotics" followed: streptomycin (1943), chloramphenicol (1947), tetracycline (1948), and many others.⁹
By mid-century, some physicians believed infectious disease would be conquered entirely. This proved premature—bacteria evolved resistance, and new infections emerged—but the basic revolution held. Bacterial infection went from leading cause of death to manageable problem.
The Vaccine Triumph
Vaccines preceded antibiotics but accelerated alongside them. The smallpox vaccine had existed since 1796, but the 20th century brought vaccines for diphtheria (1923), pertussis (1926), tetanus (1938), influenza (1945), polio (1955), measles (1963), mumps (1967), rubella (1969), and eventually hepatitis, HPV, rotavirus, and more.¹⁰
The campaign to eradicate smallpox—the only human disease ever deliberately eliminated—succeeded in 1980 through coordinated global vaccination. Polio, once paralyzing hundreds of thousands annually, is now reduced to a handful of cases in a few countries.
The mRNA vaccines developed for COVID-19 (Pfizer-BioNTech and Moderna) represent the latest chapter: a new platform enabling rapid response to novel pathogens. From viral sequence to authorized vaccine took less than a year—a process that previously took a decade.¹¹
Sanitation and Public Health
Drugs get the credit, but infrastructure may have mattered more. Clean water, sewage treatment, food safety regulation, and basic hygiene prevented more deaths than any single medical intervention.
The chlorination of drinking water, widespread by the 1920s, eliminated waterborne diseases that had killed millions. Pasteurization of milk reduced infant mortality. Food safety inspection prevented outbreaks.
Public health measures—contact tracing, quarantine, health education—controlled epidemics that would otherwise have spread unchecked. The dramatic decline in infectious disease mortality began before antibiotics, driven largely by these non-medical interventions.¹²
Imaging and Diagnostics
Wilhelm Röntgen discovered X-rays in 1895, but the technology matured over the following century. By 1926, X-rays were standard for detecting fractures and tuberculosis. Over subsequent decades, the technology improved: better image quality, lower radiation doses, specialized applications.
The revolution accelerated in the 1970s. CT (computed tomography) scans, invented by Godfrey Hounsfield, used X-rays and computer processing to create cross-sectional images of the body. MRI (magnetic resonance imaging), developed by Paul Lauterbur and Peter Mansfield building on Raymond Damadian's early work, created detailed soft-tissue images without radiation. PET (positron emission tomography) scans showed metabolic activity.¹³
By the 2020s, doctors could visualize virtually any structure in the living body: blood vessels, organs, tumors, brain activity. Diagnosis went from external examination and inference to direct observation.
The Molecular Revolution
The discovery of DNA's structure in 1953 (Watson, Crick, Franklin) launched molecular biology. Understanding that genes were sequences of nucleotides that encoded proteins provided the conceptual foundation for modern medicine.
Progress was initially slow. Sequencing DNA was laborious; modifying genes was difficult; the complexity of biological systems resisted simplification.
The Human Genome Project, completed in 2003 after 13 years and nearly $3 billion, sequenced the first complete human genome. It was a proof of concept more than a practical tool—but the costs would plummet. By 2020, a genome could be sequenced in days for hundreds of dollars.¹⁴
Parallel advances enabled modification, not just reading. Recombinant DNA technology (1970s) allowed genes to be cut and pasted between organisms. Gene therapy attempts began in the 1990s, with mixed results. Then CRISPR arrived.
CRISPR-Cas9, described in 2012 by Jennifer Doudna and Emmanuelle Charpentier (based on earlier work by Francisco Mojica and others), provided a precise, programmable way to edit genomes.¹⁵ Within a decade, CRISPR-based therapies were entering clinical trials for sickle cell disease, cancer, and hereditary blindness. The era of programmable biology had begun.
Transplantation and Replacement
In 1954, Joseph Murray performed the first successful kidney transplant between identical twins. Over subsequent decades, transplantation expanded: liver (1963), heart (1967), lung (1983), hand (1998), face (2005). Immunosuppressive drugs (cyclosporine, 1983) made transplants between non-twins viable.¹⁶
Today, organ transplantation is routine but limited by supply. Over 100,000 Americans are on transplant waiting lists; thousands die waiting each year. The fundamental constraint is the availability of donor organs.
Prosthetics and implants partially address the gap. Artificial joints (hips, knees, shoulders) restore mobility to millions. Pacemakers and defibrillators regulate hearts. Cochlear implants restore hearing. Artificial lenses replace clouded ones.
More sophisticated devices are emerging: ventricular assist devices that support failing hearts, artificial pancreas systems that automate insulin delivery, and early-stage work on artificial kidneys. The boundary between biological and artificial is blurring.
Cancer: From Death Sentence to Chronic Disease
In 1926, cancer treatment meant surgery—often radical, disfiguring surgery—and hope. The five-year survival rate for most cancers was below 20%.
Radiation therapy matured through mid-century, offering an alternative to surgery for some cancers. Chemotherapy emerged from chemical weapons research: nitrogen mustard, a derivative of World War I chemical weapons, proved effective against lymphoma (1942). The chemotherapy era followed, with dozens of drugs that could kill cancer cells—along with healthy cells.¹⁷
Targeted therapies began with tamoxifen (1977) for breast cancer and Gleevec (2001) for chronic myeloid leukemia—drugs that attacked specific molecular targets rather than all rapidly dividing cells. The approach proved transformative for cancers with identifiable driver mutations.
Immunotherapy—harnessing the immune system to fight cancer—had been attempted for a century with little success. Then checkpoint inhibitors emerged. Ipilimumab (2011) and pembrolizumab (2014) blocked proteins that tumors use to evade immune attack. For some cancers, patients who would have died within months survived for years.¹⁸
CAR-T cell therapy went further: extracting a patient's immune cells, genetically engineering them to recognize cancer, and reinfusing them. The first CAR-T therapy was approved in 2017 for leukemia; others followed.
Today, cancer remains the second-leading cause of death in developed countries. But the trajectory has shifted. Many cancers caught early are curable. Others have become chronic diseases—managed for years or decades rather than killing within months. The five-year survival rate across all cancers has risen to roughly 70%.¹⁹
Neuroscience: Mapping the Final Frontier
The brain remained the most mysterious organ longest. Understanding progressed from gross anatomy (identifying brain regions) to cellular (neurons and synapses) to molecular (neurotransmitters and receptors) to circuit (how neurons work together).
Electroencephalography (EEG, 1924) provided the first window into brain electrical activity. Brain imaging followed: CT scans (1970s), MRI (1980s), functional MRI (1990s), and increasingly sophisticated techniques for mapping structure and activity.
Psychiatric medications transformed mental health treatment—not by curing diseases but by managing symptoms. Chlorpromazine (1950s) emptied asylums by controlling psychotic symptoms. Lithium (1970s) stabilized bipolar disorder. SSRIs (1987 onward) treated depression without the side effects of earlier drugs.²⁰
The limitations remained profound. Science still does not understand how neural activity produces consciousness, how memories are formed and retrieved, or why psychiatric medications work for some patients and not others. Neurodegenerative diseases—Alzheimer's, Parkinson's—have resisted effective treatment despite decades of effort. The brain remains the hardest problem.
Modern Bottlenecks
A century of progress has created new challenges:
Cost: Healthcare spending in developed countries ranges from 10% to 18% of GDP and continues to rise. New treatments are often extraordinarily expensive—gene therapies can cost millions of dollars per patient. The system is financially unsustainable in its current form.
Access: The benefits of modern medicine are distributed extremely unequally. Between countries, life expectancy varies by 30+ years. Within countries, socioeconomic status predicts health outcomes. Marginalized populations receive worse care and die younger.
Chronic disease: Success against infectious disease and acute conditions has left chronic disease as the dominant problem. Heart disease, cancer, diabetes, dementia, and other conditions of aging cannot be cured with a pill or procedure. They require ongoing management for decades.
Drug development: Despite exploding knowledge, the productivity of pharmaceutical R&D has declined. The cost to bring a new drug to market has increased roughly 100-fold since 1950 (inflation-adjusted).²¹ More money yields fewer drugs. Something is structurally wrong.
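The scale of that decline can be sketched with simple compounding, using the roughly 100-fold figure cited above over an assumed 75-year span (1950 to the mid-2020s); exact values vary by analysis:

```python
import math

# Eroom's Law: real (inflation-adjusted) cost per approved drug, approximate.
cost_ratio = 100          # ~100-fold increase since 1950, per the estimate above
years = 75                # 1950 -> mid-2020s (assumed)

annual_growth = cost_ratio ** (1 / years) - 1             # implied compound rate
doubling_time = years * math.log(2) / math.log(cost_ratio)  # years per doubling

print(f"~{annual_growth:.1%} real cost growth per year")
print(f"cost per approved drug doubled roughly every {doubling_time:.0f} years")
```

A steady 6–7% real annual increase, compounding for three-quarters of a century, is what turns a tractable cost into a structural crisis.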
Time-to-trial: Clinical trials are slow, expensive, and often fail. The gap between laboratory discovery and approved treatment is measured in decades. Patients with fatal diseases die waiting for drugs that might save them.
Fragmentation: The health system is a collection of disconnected silos. Primary care doesn't talk to specialists. Hospitals don't share records. Patients navigate a maze of providers, insurers, and bureaucracies. Data that could save lives sits unused in incompatible systems.
These are the problems AI will encounter. The question is not whether AI can improve medicine—that's already clear. The question is whether it can address these structural challenges or will simply layer new capabilities on a dysfunctional foundation.
The AI Transition Begins
AI has already entered medicine. The question is how deeply it will transform the field.
Current Applications
Medical imaging is the most mature application. AI systems read X-rays, CT scans, mammograms, and retinal images with accuracy matching or exceeding human specialists in narrow tasks. These systems are in clinical use, typically as second readers supporting (not replacing) human radiologists.²²
Clinical documentation is being transformed. AI scribes listen to patient encounters and generate clinical notes, reducing the hours physicians spend on paperwork. The technology is imperfect but improving rapidly.
Drug discovery is where the most dramatic claims are made. AI identifies drug candidates, predicts molecular properties, and designs clinical trials. Several AI-designed drugs have entered clinical trials, though none have yet been approved.²³
Administrative functions—scheduling, billing, prior authorization, patient communication—are being automated. These applications are less glamorous than clinical AI but may save more time and money.
What's Different Now
Why might AI transform medicine more deeply than previous waves of healthcare IT?
General capability: Previous medical software was narrow—a system for reading ECGs, a database for drug interactions, an algorithm for one diagnostic task. Foundation models are general-purpose. The same underlying technology that analyzes medical images can also read clinical notes, answer patient questions, and assist with research.
Integration with workflows: AI that augments existing workflows is easier to adopt than AI that requires new workflows. Modern AI can be embedded in the tools clinicians already use—adding suggestions to existing interfaces rather than requiring new processes.
Data scale: Medicine generates enormous amounts of data—imaging, genomics, clinical notes, lab results, wearable sensor streams. AI thrives on data. The medical system has been collecting data for decades; AI is beginning to make sense of it.
Economic pressure: Healthcare costs are unsustainable. Payers, providers, and policymakers are actively seeking ways to reduce costs while maintaining or improving quality. AI offers at least the possibility of doing more with less.
What Remains Hard
Validation: Medical AI must be proven safe and effective before deployment. This requires clinical trials, regulatory approval, and real-world validation—processes that take years and cost millions. The speed of AI development runs headlong into the pace of medical validation.
Liability: When AI makes a mistake that harms a patient, who is responsible? The physician who relied on the AI? The hospital that deployed it? The company that developed it? The regulatory framework is unclear.
Integration: Healthcare IT is notoriously fragmented. Electronic health records don't talk to each other. Different hospitals use different systems. Inserting AI into this mess is technically challenging and politically fraught.
Trust: Physicians are trained to take personal responsibility for patient care. Many are uncomfortable delegating judgment to systems they don't understand. Patients may feel similarly. Building appropriate trust—neither blind acceptance nor reflexive rejection—takes time.
Equity: AI trained on data from wealthy academic medical centers may not work well for populations underrepresented in that data. Deploying biased AI could worsen existing health disparities rather than reducing them.
Looking Forward
This chapter has surveyed a century of medical progress—from a world where scratches killed to one where genomes can be edited, organs transplanted, and the living brain imaged. The advances are extraordinary. The remaining challenges are substantial.
The next chapters in this section will explore how AI might transform medicine over the coming decade:
Chapter 3 examines the shift from reactive treatment to continuous prevention—how wearables, multi-omics, and AI could make disease increasingly rare rather than merely treatable.
Chapter 4 takes on the big killers—cancer, heart disease, neurodegeneration—and asks what AI-accelerated research might achieve against diseases that have resisted decades of effort.
Chapter 5 explores repair and regeneration—spinal cord injury, blindness, organ failure—areas where the body's own limitations might finally be overcome.
Chapter 6 ventures into brain interfaces, neural prosthetics, and the speculative possibility of cognitive enhancement or continuity beyond biological bodies.
Chapter 7 addresses human augmentation—genetic, physical, cognitive—and the ethical terrain this creates.
Chapter 8 confronts longevity directly—the possibility that aging itself might become optional, and what that would mean for individuals and society.
The century just passed proved that medicine could be transformed beyond recognition. The decade ahead may prove that such transformations can happen far faster than previously assumed.
Endnotes — Chapter 2
1. Childhood mortality data from historical vital statistics. Pre-antibiotic era mortality from infection varied by region but remained extremely high by modern standards. Calvin Coolidge Jr. died in 1924 from sepsis following a blister on his foot.
2. Maternal mortality data from CDC historical vital statistics and World Bank databases. The current US rate is approximately 23 per 100,000—still higher than most developed countries but roughly 25x better than 1926.
3. Historical cancer survival data from National Cancer Institute SEER database and historical medical literature. Five-year survival rates for most solid tumors were below 20% prior to modern chemotherapy and targeted therapy.
4. Life expectancy data from Our World in Data, UN Population Division, and national vital statistics. Global life expectancy has increased from approximately 32 years in 1900 to over 72 today.
5. Infant mortality data from CDC, World Bank, and Our World in Data. The decline represents one of the most dramatic public health improvements in history.
6. US healthcare spending from CMS National Health Expenditure Data. International comparisons from OECD Health Statistics and Commonwealth Fund analyses.
7. Drug development costs and timelines from DiMasi et al. studies and industry analyses. The $2.6 billion figure (2015 dollars) is widely cited though methodologically contested.
8. Life expectancy gaps from WHO Global Health Observatory and Our World in Data. The gap between Japan (84+ years) and several Sub-Saharan African countries (50-60 years) exceeds 25 years.
9. History of antibiotics from medical history sources. The "golden age" produced most major antibiotic classes; relatively few new classes have been discovered since the 1980s.
10. Vaccine development history from CDC, WHO, and medical history sources. The elimination of smallpox in 1980 remains the only deliberate eradication of a human disease.
11. The Pfizer-BioNTech COVID-19 vaccine timeline from initial sequence sharing (January 2020) to emergency use authorization (December 2020) was approximately 11 months—unprecedented for vaccine development.
12. McKeown thesis and subsequent analyses debate the relative contributions of medicine vs. public health. The decline in infectious disease mortality began before most medical interventions were available.
13. Medical imaging history from radiology literature and medical history sources. Hounsfield and Cormack shared the 1979 Nobel Prize for CT; Lauterbur and Mansfield shared the 2003 prize for MRI.
14. Human Genome Project costs and completion from NIH. Current sequencing costs from National Human Genome Research Institute tracking data.
15. CRISPR history and development from Doudna and Charpentier's Nobel Prize lectures (2020) and scientific literature. The technology's rapid adoption across biological research has been extraordinary.
16. Transplantation history from medical literature. Joseph Murray won the 1990 Nobel Prize for his pioneering work.
17. Chemotherapy history from oncology literature. The development from nitrogen mustard through modern targeted therapies spans roughly 80 years.
18. Immunotherapy history from oncology literature. James Allison and Tasuku Honjo shared the 2018 Nobel Prize for checkpoint inhibitor discovery.
19. Overall cancer survival rates from SEER database. The improvement reflects both earlier detection and better treatment, though progress varies dramatically by cancer type.
20. Psychopharmacology history from psychiatric literature. The "deinstitutionalization" enabled by antipsychotic medications had complex consequences—improving life for many while leaving others without adequate community support.
21. Eroom's Law (Moore's Law spelled backward) describes declining pharmaceutical R&D productivity. Various analyses estimate costs have risen roughly 100-fold since 1950 while output has declined.
22. FDA-cleared AI medical devices database shows hundreds of AI-based imaging tools. Clinical evidence varies; deployment remains limited though growing.
23. Several AI-discovered drug candidates have entered Phase I/II trials, though as of early 2026 none have received full approval. Companies including Insilico Medicine, Recursion, and Exscientia have candidates in clinical development.