Our understanding of prehistoric medical practice comes from the study of ancient pictographs depicting medical procedures, as well as from surgical tools uncovered at archaeological sites of ancient societies.
Serious diseases were of primary interest to early humans, although they were not able to treat them effectively. Many diseases were attributed to the influence of malevolent demons who were believed to project an alien spirit, a stone, or a worm into the body of the unsuspecting patient. These diseases were warded off by incantations, dancing, magic charms and talismans, and various other measures. If the demon managed to enter the body of its victim, either in the absence of such precautions or despite them, efforts were made to make the body uninhabitable to the demon by beating, torturing, and starving the patient. The alien spirit could also be expelled by potions that caused violent vomiting, or could be driven out through a hole cut in the skull. This procedure, called trepanning, was also used as a remedy for insanity, epilepsy, and headache.
Surgical procedures practiced in ancient societies included cleaning and treating wounds by cautery (burning or searing tissue), poultices, and sutures, resetting dislocations and fractures, and using splints to support or immobilize broken bones. Additional therapy included laxatives and enemas to treat constipation and other digestive ills. Perhaps the greatest success was achieved by the discovery of the narcotic and stimulating properties of certain plant extracts. So successful were these that many continue to be used today, including digitalis, a heart stimulant extracted from foxglove.
Several systems of medicine, based primarily on magic, folk remedies, and elementary surgery, existed in diverse societies before the rise of the more advanced Greek medicine around the 6th century BC.
Egyptian medicine was marked by a mystical approach to healing, as well as a more empirical or rational approach that was based on experience and observation. Common diseases of the eyes and skin were usually treated rationally by the physician because of their accessible location; internal disorders continued to be treated by the spells and incantations of the priest-magician.
The physician emerged around 2600 BC as an early form of scientist, a type distinct from the sorcerer and priest. The earliest physician whose name has survived is Imhotep (lived about 2600 BC), renowned for his studies of pathology and physiology as well as his expertise as a pyramid builder and an astrologer. The Egyptian physician normally spent years of arduous training at temple schools in the arts of interrogation, inspection, and palpation (examining the body by touch). Prescriptions contained some drugs that have continued in use through the centuries. Favorite laxatives were figs, dates, and castor oil. Tannic acid, derived principally from the acacia nut, was valued in the treatment of burns.
Although Egyptians practiced embalming to preserve bodies after death, their knowledge of anatomy was minimal. As a result, they attempted only minor surgical procedures, with the exception of trepanning. According to reports of the Greek historian Herodotus, the ancient Egyptians recognized dentistry as an important surgical specialty.
Medicine in Assyria and Babylonia was influenced by demonology and magical practices. Surprisingly accurate terra-cotta models of the liver, then considered the seat of the soul, indicate the importance attached to the study of that organ in determining the intentions of the gods. Dreams also were studied to learn the gods’ intentions.
While magic played a role in healing, surviving cuneiform tablets indicate a surprisingly empirical approach to some diseases. The tablets present an extensive series of medical case histories, indicating that a large number of medical remedies were used in Mesopotamia, including more than 500 drugs made from plants, trees, roots, seeds, and minerals. Emollient enemas were given to reduce inflammation; massage was performed to ease gastric pain; the need for rest and quiet was stressed for some diseases; and some attention was paid to diet. Water was regarded as particularly important, since it was the sacred element of the god Ea, the chief among the numerous healing gods. The serpent Sachan was also venerated as a medical deity.
Hebrew medicine was mostly influenced by contact with Mesopotamian medicine during the Assyrian and Babylonian captivities. Disease was considered evidence of the wrath of God. The priesthood acquired the responsibility for compiling hygienic regulations, and the status of the midwife as an assistant in childbirth was clearly defined. Although the Old Testament contains a few references to diseases caused by the intrusion of spirits, the tone of biblical medicine is modern in its marked emphasis on preventing disease. The Book of Leviticus includes precise instructions on such varied subjects as feminine hygiene, segregation of the sick, and cleaning of materials capable of harboring and transmitting disease. Although circumcision, the surgical removal of the foreskin of the penis, is the only surgical procedure clearly described in the Bible, common medical practices included dressing wounds with oil, wine, and balsam. The leprosy so frequently mentioned in the Bible is now believed to have embraced many skin diseases, including psoriasis.
The practices of ancient Hindu, or Vedic, medicine (1500-1000 BC) are described in the works of two later physicians, Charaka (lived about 2nd century AD) and Susruta (lived about 4th century AD). Susruta gave recognizable descriptions of malaria, tuberculosis, and diabetes. He also wrote about Indian hemp, or Cannabis, and henbane for inducing anesthesia, and included specific antidotes and highly skilled treatments for bites of venomous snakes. An ancient Hindu drug derived from the root of the Indian plant Rauwolfia serpentina was the source of the first modern tranquilizer. In the field of surgery, the Hindus are acknowledged to have attained the highest skill in all antiquity. They were probably the first to perform successful skin grafting and plastic surgery for the nose.
With the rise of Buddhism the study of anatomy was prohibited, and with the Muslim conquest of India, beginning around AD 1000, the field of medicine further declined and ultimately stagnated. Nevertheless, much valuable knowledge concerning hygiene, diet, and surgery was passed to the West through the writings of Indian physicians.
Chinese physicians believed that diseases result from imbalances in two life forces, Yin and Yang, that flow through the body. Drugs and other treatments were intended to restore this balance. Hundreds of ancient herbal medicines, including iron for anemia, mercury for syphilis, arsenic for skin diseases, and opium, are still used in traditional Chinese medicine. Other Chinese medicines and techniques, including acupuncture, are now commonly used in Western medicine. Most Chinese medicine was based on a famous textbook, the Nei Ching, attributed to Emperor Huang Ti and probably compiled between 479 and 300 BC. Chinese physicians specialized in treating wounds, fractured bones, allergies, and other conditions. They diagnosed patients by asking questions about symptoms, diet, and previous illnesses, and by checking the patient’s pulse.
Greek culture, renowned for its masterpieces of art, poetry, drama, and philosophy, also made great advances in medicine. The earliest Greek medicine still depended on magic and spells. Homer considered Apollo the god of healing. Homer’s Iliad, however, reveals a considerable knowledge of the treatment of wounds and other injuries by surgery, already recognized as a specialty distinct from internal medicine.
By the 6th century BC, Greek medicine had begun to move away from magic and religion, stressing instead clinical observation and experience. In the Greek colony of Crotona the biologist Alcmaeon (lived about 6th century BC) identified the brain as the physiological seat of the senses. The Greek philosopher Empedocles elaborated the concept that disease is primarily an expression of a disturbance in the perfect harmony of the four elements—fire, air, water, and earth—and formulated a rudimentary theory of evolution.
Kos and Cnidus are the most famous of the Greek medical schools that flourished in the 5th century BC. Students of both schools probably contributed to the Corpus Hippocraticum (Hippocratic Collection), an anthology of the writings of several authors, although popularly attributed to Hippocrates, who is known as the father of medicine. Hippocrates was the greatest physician in antiquity. He convinced physicians that disease had identifiable causes and was not due to the supernatural. His writings were used in medical textbooks well into the 19th century. Greek physicians introduced such modern ideas as prognosis, or outcome of disease, and the use of case histories of actual patients to teach students. The highest ethical standards were imposed on physicians, who took the celebrated oath usually attributed to Hippocrates and still used in modified form today.
Although not a practicing physician, the Greek philosopher Aristotle contributed greatly to the development of medicine by his dissections of numerous animals. He is known as the founder of comparative anatomy. Further progress in understanding anatomy flourished by the 3rd century BC in Alexandria, Egypt, which was firmly established as the center of Greek medical science. In Alexandria the anatomist Herophilus performed the first recorded public dissection, and the physiologist Erasistratus did important work on the anatomy of the brain, nerves, veins, and arteries. The followers of these men divided into many contending sects. The most notable were the empiricists, who based their doctrine on experience gained by trial and error. The empiricists excelled in surgery and pharmacology; a royal student of empiricism, Mithridates VI Eupator, king of Pontus, developed the concept of inducing tolerance of poisons by the administration of gradually increased dosages.
Alexandrian Greek medicine influenced conquering Rome despite initial resistance from the Romans. Asclepiades of Bithynia was important in establishing Greek medicine in Rome in the 1st century BC. Asclepiades taught that the body was composed of disconnected particles, or atoms, separated by pores. Disease was caused by restriction of the orderly motion of the atoms or by the blocking of the pores, which he attempted to cure by exercise, bathing, and variations in diet, rather than by drugs. This theory was revived periodically and in various forms as late as the 18th century.
Galen of Pergamum, also a Greek, was the most important physician of this period and is second only to Hippocrates in the medical history of antiquity. His view of medicine remained undisputed into the Middle Ages (5th century to 15th century). Galen described the four classic symptoms of inflammation and added much to the knowledge of infectious disease and pharmacology. His most important work, however, was in the field of the form and function of muscles and the function of the areas of the spinal cord. He also excelled in diagnosis and prognosis. Some of Galen’s teachings tended to hold back medical progress, however, such as his theory that the blood carried the pneuma, or life spirit, which gave it its red color. This theory, coupled with the erroneous notion that the blood passed through a porous wall between the ventricles of the heart, delayed the understanding of circulation and did much to discourage research in physiology. The importance of Galen’s work cannot be overestimated, however, for through his writings knowledge of Greek medicine was subsequently passed to the Western world by the Arabs.
While the Romans learned most of their medical knowledge from Egypt, Greece, and other countries that they conquered, their own contributions involved sanitation and public health. Roman engineers built aqueducts to carry pure water to residents of Rome, a sewage system to dispose of human wastes, and public baths. These measures helped to prevent infectious diseases transmitted by contaminated water.
The gradual infiltration of the Roman world by a succession of barbarian tribes was followed by a period of stagnation in the sciences. These invasions destroyed the great medical library in Alexandria (see Library of Alexandria), and many of its books and medical manuscripts were lost. Western medicine in the Middle Ages consisted of tribal folklore mingled with poorly understood remnants of classical learning. Even in sophisticated Constantinople (now İstanbul), a series of epidemics served only to initiate a revival of magical practices, superstition, and intellectual stagnation.
In the 7th century AD a vast portion of the Eastern world was overrun by Arab conquerors. In Persia (now Iran), the Arabs learned of Greek medicine at the schools of the Nestorian Christians, a sect in exile from the Byzantine Empire. These schools had preserved many texts lost in the destruction of the Alexandria Library. Translations from Greek were instrumental in the development of an Arabic system of medicine throughout the Arabic-speaking world. Followers of the system, known as Arabists, did much to elevate professional standards by insisting on examinations for physicians before licensure. They introduced numerous therapeutic chemical substances and excelled in the fields of ophthalmology and public hygiene.
Important among Arabist physicians was al-Razi, who was the first to distinguish smallpox from measles and to suggest blood as the cause of infectious diseases. Avenzoar was the first to describe the parasite causing the skin disease scabies and was among the earliest to question the authority of Galen. Maimonides wrote extensively on diet, hygiene, and toxicology, the study of chemicals and their effects on the body. Al-Quarashi, also known as Ibn al-Nafīs, wrote commentaries on the writings of Hippocrates and treatises on diet and eye diseases. He was the first to describe the pathway of blood from the right to the left ventricle via the lungs.
In early medieval Europe, religious groups established hospitals and infirmaries in monasteries and later developed charitable institutions designed to care for the victims of vast epidemics of bubonic plague, leprosy, smallpox, and other diseases that swept Europe during the Middle Ages. The Benedictines were especially active in this work, collecting and studying ancient medical texts in their library at Monte Cassino near Salerno, Italy. St. Benedict of Nursia, the founder of the order, obligated its members to study the sciences, especially medicine. The abbot of Monte Cassino, Bertharius, was himself a famous physician.
During the 9th and 10th centuries Salerno became Europe’s center for medical care and education and was the site of the first Western school of medicine. By the 12th century other medical schools were established at the universities of Bologna and Padua in Italy, the University of Paris in France, and Oxford University in England.
In the 13th century, medical licensure by examination was endorsed and strict measures were instituted for the control of public hygiene. Representative scientists of this period include the German scholastic St. Albertus Magnus, who engaged in biological research, and the English philosopher Roger Bacon, who undertook research in optics and refraction and was the first scholar to suggest that medicine should rely on remedies provided by chemistry. Bacon, often regarded as an original thinker and pioneer in experimental science, was strongly influenced by the authority of Greek and Arabic medicine.
The period of the Renaissance, which began at the end of the 14th century and lasted for about 200 years, was one of the most revolutionary and stimulating in the history of mankind. The invention of printing and gunpowder, the discovery of America, the new cosmology of Copernicus, the Reformation, the great voyages of discovery—all these new forces were working to free science and medicine from the shackles of medieval stagnation. The fall of Constantinople in 1453 scattered the Greek scholars, with their precious manuscripts, all over Europe.
The revival of learning in Western civilizations brought great advances in human anatomy. Some resulted from the work of artists, including the Italian Leonardo da Vinci, who dissected human corpses to portray muscles and other structures more accurately. Andreas Vesalius, a Belgian anatomist, clearly demonstrated hundreds of anatomical errors introduced by Galen centuries earlier. Gabriel Fallopius discovered the uterine tubes named after him (see Fallopian Tube) and diagnosed ear diseases with an ear speculum; he also described in detail the muscles of the eye and the tear ducts. Italian physician Girolamo Fracastoro recognized that infectious diseases are spread by invisible so-called seeds that can reproduce themselves. He founded modern epidemiology, the study of how diseases spread. The term syphilis, applied to the virulent disease then devastating Europe, was derived from his famous poem, “Syphilis sive Morbus Gallicus” (Syphilis, or the French Disease, 1530). Ambroise Paré introduced new surgical techniques and helped to found modern surgery.
The Dawn of Modern Medicine
The event that dominated 17th-century medicine and marked the beginning of a new epoch in medical science was the discovery of how the blood circulates in the body by the English physician and anatomist William Harvey. Harvey’s “Essay on the Motion of the Heart and the Blood” (1628) established that the heart pumps the blood in continuous circulation. The Italian anatomist Marcello Malpighi advanced Harvey’s work by his discovery of the tiny blood vessels called capillaries, and the Italian anatomist Gasparo Aselli provided the first description of the lacteals, the small lymphatic vessels of the intestine. In England the physician Thomas Willis investigated the anatomy of the brain and the nervous system and was the first to describe diabetes mellitus. The English physician Francis Glisson advanced knowledge of the anatomy of the liver, described the nutritional disorder rickets (sometimes called Glisson’s disease), and demonstrated that muscles do not increase in volume when they contract. The English physician Richard Lower studied the anatomy of the heart, showed how blood interacts with air, and performed one of the first blood transfusions.
The French mathematician and philosopher René Descartes, who also made anatomical dissections and investigated the anatomy of the eye and the mechanism of vision, maintained that the body functioned as a machine. This view was adopted by the so-called iatrophysicists, such as the Italian physician Sanctorius, who investigated metabolism, and the Italian mathematician and physicist Giovanni Alfonso Borelli, who worked in the area of physiology. Their opponents, the iatrochemists, regarded life as a series of chemical processes; they included Jan Baptista van Helmont, a Flemish physician and chemist, and the German-born Dutch anatomist Franciscus Sylvius, who studied the chemistry of digestion and emphasized the treatment of disease by drugs.
The English physician Thomas Sydenham, called the English Hippocrates, and later the Dutch physician Hermann Boerhaave reestablished the significance of bedside instruction in their emphasis on the clinical approach to medicine. Sydenham carried out extensive studies of malaria and championed its treatment with quinine, obtained from cinchona bark, which had been introduced into Europe in 1632. After the invention of the first compound microscope in 1590, Dutch scientist Antoni van Leeuwenhoek used microscopy in 1676 to identify organisms later called bacteria. This was the first step toward the recognition that microbes cause infectious disease.
The 18th century continued to be marked by unsupported theories. The German physician and chemist Georg Ernst Stahl believed that the soul is the vital principle and that it controls organic development; in contrast, the German physician Friedrich Hoffmann considered the body a machine and life a mechanical process. These opposing theories of the vitalists and the mechanists were influential in 18th-century medicine. The British physician William Cullen attributed disease to the excess or deficiency of nervous energy; and the physician John Brown of Edinburgh taught that disease was caused by weakness or inadequate stimulation of the organism. According to his theories, known as the Brunonian system, stimulation should be increased by treatment with irritants and large dosages of drugs. In opposition to this system, the German physician Samuel Hahnemann developed the system of homeopathy late in the 18th century, which emphasized small dosages of drugs to cure disease.
Other unusual medical practices that developed toward the end of the 18th century included phrenology, a theory formulated by the German physician Franz Joseph Gall, who believed that examination of the skull of an individual would reveal information about mental functions. The theory of animal magnetism developed by the Austrian physician Franz Mesmer was based on the existence of a magnetic force having a powerful influence on the human body.
Despite these unorthodox medical practices, the end of the 18th century was marked by many true medical innovations. British physicians William Smellie and William Hunter made advances in obstetrics that established this field as a separate branch of medicine. The British social reformer John Howard furthered humane treatment for hospital patients and prison inmates throughout Europe. In 1796 British physician Edward Jenner introduced vaccination to prevent smallpox. His efforts helped control this dreaded disease and established the science of immunization.
Many discoveries made in the 19th century led to great advances in the diagnosis and treatment of disease and in surgical methods. Medicine’s single most important diagnostic tool, the stethoscope, an instrument used to detect sounds in the body such as the heartbeat, was invented in 1819 by French physician René-Théophile-Hyacinthe Laënnec. A number of brilliant British clinicians studied and described diseases that today bear their names. British physician Thomas Addison discovered the disorder of the adrenal glands now known as Addison’s disease; Richard Bright described the kidney disorder now called Bright’s disease; British physician Thomas Hodgkin described a cancer of lymphatic tissue now known as Hodgkin’s disease; British surgeon and paleontologist James Parkinson described the chronic nervous system disorder called Parkinson’s disease; and the Irish physician Robert James Graves diagnosed the thyroid disorder exophthalmic goiter, sometimes called Graves’ disease.
Medicine, like all other sciences, is subject to influences from other fields of study. This was particularly true during the 19th century, renowned for its great scientific innovations. For instance, the evolutionary theory proposed by Charles Darwin in On the Origin of Species by Means of Natural Selection (1859) revived interest in the science of comparative anatomy and physiology. And the plant-breeding experiments of the Austrian biologist Gregor Johann Mendel in 1866, although initially overlooked, eventually had a similar effect in stimulating studies in human genetics (see Heredity).
German pathologist Rudolf Virchow pioneered development of pathology, the scientific study of disease. Virchow showed that all diseases result from disorders in cells, the basic units of body tissue. His doctrine that the cell is the seat of disease remains the cornerstone of modern medical science. In France, physiologist Claude Bernard performed important research on the pancreas, liver, and nervous system. His scientific studies, which emphasized that an experiment should be objective and prove or disprove a hypothesis, were the basis for the scientific method used today. Bernard’s work on the interaction of the digestive system and the vasomotor system, which controls the size of blood vessels, was developed further by the Russian physiologist Ivan Petrovich Pavlov, who developed the theory of the conditioned reflex, the basis of human behaviorism.
A milestone in medical history occurred in the 1870s when French chemist Louis Pasteur and German physician Robert Koch separately established the germ theory of disease. Important in the development of this theory was the pioneering work of the American physician and author Oliver Wendell Holmes and of the Hungarian obstetrician Ignaz Philipp Semmelweis, who showed that the high rate of mortality in women after childbirth was attributable to infectious agents transmitted by unwashed hands (see Puerperal Fever).
Soon after the germ theory was recognized, the microbes responsible for such age-old scourges as anthrax, diphtheria, tuberculosis, leprosy, and plague were isolated. In 1885 Pasteur developed a vaccine to prevent rabies. In the last decade of the 19th century, German physician Emil von Behring and German bacteriologist Paul Ehrlich developed techniques for immunizing against diphtheria and tetanus.
New understanding of infectious diseases made surgery safer. Until the 1800s, surgeons operated in their street clothes, often without even washing their hands. Operating rooms, like other parts of hospitals, were filthy. About half of all patients who survived an operation died of infections that developed afterward. The era of aseptic surgery, in which physicians used sterilized instruments and techniques to avoid infecting patients, was heralded by British surgeon Joseph Lister. With his introduction of an effective antiseptic, carbolic acid, Lister greatly reduced mortality from wound infection (see Antiseptics). Rubber gloves were first worn during surgery in 1890, and gauze masks in 1896.
Another great advance in surgery came with the discovery of anesthesia. Until the 19th century, doctors used alcohol, opium, and other drugs to relieve pain during surgery. These medications could sometimes dull pain but could never completely mask it—patients often suffered from shock and died during surgery. In the United States, physician Crawford Long discovered the anesthetic effects of ether in 1842, and the dentist William Morton used ether in a tooth extraction in 1846. Ether and other anesthetics reduced surgical mortality and enabled surgeons to perform longer, more complicated operations.
A new tool for diagnosing internal diseases became available in 1895 when German scientist Wilhelm Roentgen discovered X-rays. The Danish physician Niels Ryberg Finsen developed an ultraviolet-ray lamp, which led to an improved prognosis for some skin diseases (see Ultraviolet Radiation). In 1898 in France, Marie and Pierre Curie discovered radium, which was later used to treat cancer.
In 1898 British physician Ronald Ross proved the role of the mosquito as a carrier of the parasite that causes malaria, a disease that has been widespread and often fatal for most of human history. In 1900 United States Army physician Walter Reed and his colleagues, acting on a suggestion made by the Cuban biologist Carlos Juan Finlay, demonstrated that the mosquito is the carrier of yellow fever. This finding led to better sanitation and mosquito control, resulting in the virtual elimination of the disease from Cuba and other areas.
Medicine’s most revolutionary advances have occurred since 1900. By the end of the 20th century, medical advances helped to increase the average person’s life expectancy by almost 30 years. As people lived longer, new medical challenges emerged. Heart disease, cancer, stroke, and other conditions often associated with aging replaced infectious diseases as the leading causes of death. Physicians began to devote greater attention to preventing disease and keeping patients healthy into advanced age. Biomedical research also shifted focus to the most basic causes of diseases, including defects in individual genes.
Infectious diseases that historically had killed millions of people each year were largely conquered during the 20th century by improved sanitation, antibiotics, and vaccines.
German physician Paul Ehrlich showed around 1910 that a chemical compound, arsphenamine, could treat syphilis. He opened the era of chemotherapy, in which physicians use chemical compounds that act selectively to target specific diseases.
In the early 1930s, German and French scientists showed that sulfonamide was effective in treating streptococcal bacterial infections. This discovery led to the first family of so-called wonder drugs, the sulfonamides. In 1938 the pathologist Howard Florey and the biochemist Ernst Chain, working in Britain, purified penicillin, the bacteria-destroying compound that Alexander Fleming had observed in mold ten years earlier. Streptomycin, the first antibiotic effective against tuberculosis, was discovered in 1944 by American microbiologist Selman Waksman. Dozens of other antibiotics were subsequently discovered, many of them more potent or effective against a broader range of bacteria.
Scientists learned more about how the body’s immune system protects against infection, resulting in new tests for diagnosing infectious diseases and new vaccines to prevent them. The Wassermann blood test for syphilis was developed in 1906, and the tuberculin skin test for tuberculosis appeared in 1908. By the 1930s new techniques for growing viruses in the laboratory led to vaccines against viral diseases. These included a yellow fever vaccine in the late 1930s and the first effective influenza vaccine in the 1940s. The American physician Jonas E. Salk developed a polio vaccine in 1954. Later the virologist Albert B. Sabin developed an oral polio vaccine, easier to administer, which was in wide use by the 1960s. Later still came vaccines for other childhood diseases, including measles, German measles, mumps, and chicken pox.
Infectious diseases, once thought conquered by antibiotics, became a major concern again in the 1990s. New forms of tuberculosis and other diseases resistant to antibiotics spread. Concerns also arose over new or newly recognized microbes, such as human immunodeficiency virus (HIV), the cause of acquired immunodeficiency syndrome (AIDS), which became epidemic in 1981. As human populations grow and expand into wilderness areas, humans and animals come in closer contact. A number of diseases transmitted from animals have become problematic in recent years, including the hemorrhagic fevers caused by the Ebola and Marburg viruses, Hantavirus pulmonary syndrome, and Lyme disease. In other areas, physicians recognized that an easily curable bacterial infection caused most peptic ulcers, a disease once blamed on stress and diet.
Polish-born American biochemist Casimir Funk introduced the term vitamine in 1912. Researchers later identified the vitamins needed by the body to prevent deficiency diseases such as beriberi, rickets, scurvy, and pellagra. As nutrition and the quality of life improved, these diseases almost disappeared from industrialized countries (see Human Nutrition). By the end of the 20th century, however, other nutritional disorders had emerged. Studies conducted in the United States in the 1990s showed that more than 97 million Americans were overweight and risked health problems, such as heart disease and diabetes mellitus, commonly associated with obesity.
Operations that people once regarded as impossible became routine in the 20th century. Many of these surgical advances resulted from improved drugs or medical technology. Better drugs to prevent rejection made possible the transplantation of hearts, kidneys, livers, lungs, and other organs removed from donors. Patients were kept alive with artificial kidneys and temporary artificial hearts while awaiting a transplant (see Medical Transplantation). The heart-lung machine made it possible to stop and restart the heart during coronary bypass surgery. Small fiber-optic instruments called endoscopes led to the new field of minimally invasive surgery. These new tools made it possible to remove a diseased gallbladder or appendix, for example, through small slits rather than large incisions, greatly reducing the amount of anesthesia required during the surgery and shortening recovery time. Transfusions of blood and plasma and infusions of saline solutions, which went into use in the 1930s, helped prevent deaths from shock in surgery patients. In the 1990s, physicians even began performing surgery to repair defects in unborn infants.
New methods for viewing diseased structures inside the body improved the diagnosis of disease beginning in the 1970s (see Radiology). A gamma camera detects radioactive medication that attaches to certain forms of cancer cells. Computed tomography (CT) scanners use X rays to produce lifelike three-dimensional images of body structures. Magnetic resonance imaging (MRI) scanners produce highly detailed images without X rays. Positron emission tomography (PET) detects very early warning signs of disease. Sonograms, produced with high-frequency sound waves (ultrasound), are used to diagnose disease and monitor the progress of pregnancies. X rays and high-energy particles emitted by linear accelerators are also used to treat cancer. Lithotripsy uses high-frequency sound waves to destroy some kidney stones and gallstones, conditions that once required surgery.
Even in the early part of the 20th century, mental illness was almost a sentence of doom, and mentally ill persons were handled with cruel confinement and little medical aid. In the latter half of the century, successful therapy for some mental illnesses greatly improved the prognosis for these disorders and partly removed their stigma.
The theories advanced by Austrian physician Sigmund Freud were among the first attempts to understand malfunctioning of the mind, but the methods of psychoanalysis advocated by Freud and modified by his followers proved ineffective for treating certain serious mental illnesses. Two early attempts to treat psychotic illness were the destruction of parts of the brain in a procedure called lobotomy, introduced in 1935, and electroconvulsive therapy, devised in 1938. Lobotomy and less severe forms of psychosurgery are now used only rarely, and electroconvulsive therapy is primarily a treatment for depressive illness that has not responded to drug therapy.
A new era in the treatment of schizophrenia, a severe form of mental illness, began in the early 1950s with the introduction of phenothiazine drugs. These drugs led to a new trend, deinstitutionalization, in which patients were released from mental hospitals and treated in the community. Valium (see Diazepam) and other benzodiazepine drugs went into wide use in the 1970s for treating anxiety and other emotional illnesses. Late in the century, there was growing awareness of the importance of diagnosing and treating clinical depression, a leading cause of suicide. Advanced imaging techniques that show structural and functional differences in the brains of people with certain mental illnesses have opened the door to new treatment options.
Genetics and Biotechnology
The discovery of genes and their role in heredity and disease was one of the most important medical advances in history (see Genetics). In 1953 British biophysicist Francis Crick and American biochemist James Watson identified the double-helix structure of deoxyribonucleic acid (DNA). This discovery helped to explain how DNA carried genetic information. In the 1960s American biochemist Marshall Nirenberg added key details about how DNA determines the structure of proteins.
Indian-born American biochemist Har Gobind Khorana was the first to synthesize a gene in the laboratory, in 1970, forging the way for scientists to develop methods to isolate, alter, and clone, or copy, genes. They applied these genetic engineering techniques to the diagnosis and treatment of disease. Researchers identified genes associated with cancer, heart disease, mental illness, and obesity, and worked on ways of modifying those genes to treat the diseases. Gene therapy emerged as an experimental medical field in which modified genes are introduced into a patient’s cells to treat disease. In 2003 scientists completed the sequence of the human genome, identifying essentially all the genes needed to make a human being (see Human Genome Project).
Genetic engineering techniques enabled production of scarce human hormones and other materials for use as drugs. A new biotechnology industry started producing these materials for medical use. Scientists also began genetically modifying sheep and other animals to produce drugs in their milk.
In 1905, British scientist Ernest H. Starling introduced the word hormone to describe substances secreted by the endocrine glands that regulate body functions (see Endocrine System). The discovery of adrenaline, or epinephrine, in 1901 led to identification and isolation of other hormones. One of the most important advances was the discovery of insulin by Canadian scientists Frederick Banting and Charles H. Best and Scottish physiologist John J. Macleod in 1921. For years people with diabetes mellitus used insulin extracted from animal pancreases. In 1981, human insulin produced using biotechnology became available. American physicians made another major advance in endocrinology in 1949. They discovered that cortisone, an adrenal gland hormone, relieved inflammation. New discoveries about human sex hormones later led to the first birth control pills.
Pregnancy and Childbirth
Great advances were made in birth control with the improvement of intrauterine devices in the 1950s and the development of the birth control pill in 1960 by the American biologist Gregory Pincus. By the 1990s long-lasting hormonal implants and contraceptive injections such as Depo-Provera had been developed. These options gave women more control in deciding whether to become pregnant. Voluntary sterilization, involving vasectomies in men and tubal sterilization in women, emerged as a popular method of permanent birth control. Unwanted pregnancies, however, remained a serious problem in the late 1990s, and researchers continued to seek more convenient and safer methods of birth control, including a male birth control pill.
By 1975 physicians were able to diagnose some congenital or inherited diseases before childbirth. Doctors could take samples of placental cells or of the amniotic fluid around the fetus to determine whether hereditary blood diseases, Down syndrome, defects of the spine, or other congenital disorders were present. Even the sex of a fetus could be known in advance.
In addition to advances in early diagnosis, progress occurred in identifying the causes of some birth defects. Excess alcohol consumption during pregnancy was linked to fetal alcohol syndrome, and inadequate intake of the vitamin folic acid was linked to spina bifida and other neural tube defects.
Advances in treating infertility, which prevents couples from having children, began with the world’s first so-called test-tube baby, born in 1978 through in vitro fertilization. Other forms of assisted reproduction soon became available. In 1997 researchers cloned a lamb from cells taken from an adult ewe, an achievement that led to speculation that human cloning could become another option in human reproduction.
Heart disease emerged as one of the leading causes of death in Western countries by the end of the 20th century. Great advances occurred in diagnosis, treatment, and prevention of this widespread disease.
Diagnosis improved with the widespread use of cardiac catheterization in the 1950s. This procedure involves threading a slender tube into the heart to take measurements and identify blocked arteries. Less invasive diagnostic methods, such as thallium scans in which a special imaging camera detects the movement of thallium in heart muscle, provided additional diagnostic improvements.
These techniques led to a new era in the surgical treatment of coronary heart disease, the artery blockages that cause most heart attacks. Physicians began treating blocked coronary arteries with a variety of new techniques. The first bypass operation, performed in 1967, created a new route for blood to reach blood-starved heart muscle. In balloon angioplasty, developed in 1977, a deflated balloon is inserted into a narrowed artery and then inflated at the site of the narrowing to widen it. Other surgical advances included replacement of diseased heart valves with artificial valves, implantation of pacemakers that maintain normal heart rhythm, use of temporary artificial hearts, and better methods for correcting hereditary defects of the heart.
New drugs were developed to treat angina pectoris, the chest pain of heart disease; high blood pressure; dangerous abnormalities in heart rhythm; and high blood cholesterol levels. Studies showed that drug treatment could reduce the risk of a heart attack or stroke. In the 1980s, aspirin went into wide use to prevent blood clots that cause many heart attacks. Emergency medical personnel also began using drugs that dissolve clots and stop a heart attack if given soon after symptoms develop.
Advances have been made in the prevention of heart disease. Studies have identified risk factors such as high blood pressure, high blood cholesterol, cigarette smoking, diabetes, obesity, and lack of exercise. Government health agencies and public health groups began public education programs to help people reduce heart disease risks. These preventive methods seem to be working—according to the American Heart Association, the death rate from coronary heart disease declined 26.3 percent between 1988 and 1998.
Early detection and better treatment have resulted in major improvements in survival of patients with cancer. By 2000, 59 percent of people diagnosed with cancer were alive five years later, compared with only 25 percent in 1940. New drugs, surgical procedures, and ways of treating cancer with X rays and radioactive isotope radiation contributed to the improvement. In the 1990s, physicians used new knowledge about the human immune system to develop immunotherapy for some kinds of cancer, in which the immune system is stimulated to produce antibodies against specific invaders. Another form of immunotherapy is the use of monoclonal antibodies, genetically engineered antibodies that target specific cancer cells.
Screening tests for early detection of cancers of the cervix, prostate, breast, and colon and rectum became widely available. Researchers also made progress in identifying cancer genes that are associated with an increased risk of the disease and developed screening tests for some cancer genes. Advances in gene therapy also offered promise for new cancer treatments.
Health groups placed great emphasis in the second half of the century on cancer prevention through avoiding smoking and eating a diet rich in fresh fruits and vegetables. Despite these advances, the percentage of deaths from cancer increased from about 2 percent in 1900 to about 20 percent in 2000. Much of the rise, however, resulted from an increased proportion of older people, who are more vulnerable to cancer, and from cigarette smoking.
Advances in computer and Internet technologies created new possibilities for doctors and their patients in the early 1990s. Using computers to send live video, sound, and high-resolution images between two distant locations, doctors could examine patients in offices thousands of miles away. Rural patients no longer had to make long trips to urban centers to consult specialists.
In telemedicine, a computer fitted with special software and a video camera turns a live video image of a patient into a digital signal. This signal is transmitted over high-speed telephone lines to similar equipment at the doctor’s office, where it is converted back into a format that can be viewed live on a television screen. Telemedicine also includes machines specially designed to measure and record a patient’s vital signs at home and then transmit the information directly to a hospital nursing station. This electronic remote home care enables health care professionals to monitor a patient’s heart rate, temperature, blood pressure, pulse, blood-oxygen levels, and weight several times a day, without the patient ever having to leave home.
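The vital-signs data flow just described can be summarized in a short sketch. The following Python fragment is purely illustrative: the record fields, the JSON payload format, and the hospital endpoint URL are assumptions chosen for demonstration, not part of any real telemedicine standard (production systems use dedicated protocols such as HL7).

```python
# Illustrative sketch of remote home monitoring: a home device samples
# vital signs, packages them as a structured record, and would transmit
# the record to a hospital nursing station. Field names, the JSON
# payload, and the endpoint URL are hypothetical.

import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from urllib import request

@dataclass
class VitalSigns:
    patient_id: str
    heart_rate_bpm: int
    temperature_c: float
    blood_pressure_mmhg: str   # systolic/diastolic, e.g. "120/80"
    pulse_ox_percent: float    # blood-oxygen saturation
    weight_kg: float
    recorded_at: str           # ISO 8601 timestamp

def transmit(reading: VitalSigns, endpoint: str) -> None:
    """Serialize a reading to JSON and POST it to the nursing station."""
    payload = json.dumps(asdict(reading)).encode("utf-8")
    req = request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # requires a live server
        print("station acknowledged:", resp.status)

if __name__ == "__main__":
    reading = VitalSigns(
        patient_id="demo-001",
        heart_rate_bpm=72,
        temperature_c=36.8,
        blood_pressure_mmhg="118/76",
        pulse_ox_percent=97.5,
        weight_kg=70.2,
        recorded_at=datetime.now(timezone.utc).isoformat(),
    )
    # Printed here rather than sent, since no hospital endpoint exists:
    print(json.dumps(asdict(reading), indent=2))
    # transmit(reading, "https://hospital.example/api/vitals")  # hypothetical URL
```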
In addition to providing a vehicle for doctors and patients in remote locations to interact, telemedicine also enabled doctors in distant locations to share information. Patient charts, X rays, and other diagnostic materials can be transmitted between doctors’ offices. Moreover, doctors in rural areas of the world can observe state-of-the-art medical procedures that they would otherwise have had to travel thousands of miles to witness. Still in its infancy in the late 1990s, telemedicine may one day alleviate some of the regional inequalities inherent in modern medicine, not just between regions of North America, but also between developing countries and urban medical centers in the industrialized world.
History of Medicine
Evidence of medical practice has been found in the earliest human settlements. Individuals who took the role of healer combined religion with their primitive knowledge of science and created rituals to aid healing. Knowledge such as incantations, magical spells, and which plants had healing properties was usually passed down from generation to generation.
From the earliest times, trial-and-error revealed plants and parts of animals to be poisonous, edible, or useful in disease; this led to medical folklore and herbal remedies. Prior to the scientific revolution of the last century, attempts to cope with serious disease were frustrated by the lack of a satisfactory theory of disease or knowledge of causes.
Although the study of anatomy grew rapidly from the time of Aristotle and the Alexandrian medical school around 300 BC, physiology and ideas of organ function remained rudimentary. Speculations untested by experiment were by present-day standards grotesque, and gave rise to such practices as trephining, bleeding, cupping, and purging, often accompanied by magic rituals and incantations.
Nevertheless, in a few societies doctors accurately recorded relevant events; Indian physicians, for example, were listing the features of several common disabling diseases by 1000 BC. Chinese medicine is documented from 600 BC, and the first Chinese medical treatise dates from the 1st-c BC. The major contribution of Greek medicine was in the field of medical ethics, and the Hippocratic code of conduct is still invoked today.
The man regarded as the father of “modern” medicine was Hippocrates, a Greek scholar who is best known for setting forth principles that he believed healers should follow, including the belief that a physician should work not for personal gain but for the love of humanity. His guidelines for a physician’s conduct became known as the Hippocratic Oath, which newly graduated physicians still swear to.
Roman medicine was pre-eminent in public health, with its emphasis on clean water, sewage disposal, and public baths. The emphasis by early Christians on miracles was balanced by the impulse to comfort and nurse the sick. Arabian medicine made significant contributions to chemistry and drugs, and its learning passed to the first organized medical school, at Salerno. From there the torch was passed to Padua, where Vesalius corrected the anatomical misconceptions of Galen, and thereafter to Montpellier, Leyden, Edinburgh, and London.
The major medical discovery of the 17th-c was the circulation of the blood; 100 years later came the discovery of oxygen and its relationship to the blood. In the 18th-c, clinical bedside teaching became the favoured method of doctor training, as it is today. The value of post-mortem studies was demonstrated by Morgagni in Padua. New methods of examination were introduced, notably the stethoscope (by Laënnec) and percussion of the chest.
Jenner showed the benefit of vaccination to prevent smallpox (though such procedures were known in 16th-c China). The germ theory of disease dominated the 19th-c, and Pasteur virtually created the science of bacteriology, from which Lister was inspired to develop the concept of antisepsis. By the end of the century, mosquitoes were known to carry malaria and yellow fever. Roentgen discovered X-rays, and the Curies radium. Freud founded psychoanalysis.
Progress in the 20th-c has been unparalleled, being distinguished by the growth in modern technology and the development of rigorous experimental testing. Thus the claim for the efficacy of a new drug, for example, does not rest on anecdote, but on carefully planned double-blind animal and human trials in statistically controlled populations.
Progress was stimulated rather than hindered by World Wars I and II, in such areas as rehabilitation after injury, blood transfusion, anesthesia, and chemotherapy, including the development of antibiotics and vitamins and the discovery of insulin and cortisone. New concepts have included genetic disease, the baleful effects of some lifestyles and of environmental pollution, vaccination against the majority of infectious diseases, artificial organ and life-support systems, organ transplantation, and the science of immunology.
Up to the 20th century, most physicians were general practitioners, serving all the medical needs of their communities. The first area of specialization in medicine was surgery. As technological and scientific advances deepened our understanding of health and the human body, more specialized services became practical.