If astronomy teaches us anything, it’s that man is not the center of existence. Through the centuries, we have progressively demoted our own role in the universe. Ptolemy and Aristotle held sway over Western thinking for more than two millennia with the idea of a geocentric, or Earth-centered, universe. Nicolaus Copernicus’s sun-centered, or heliocentric, model of the solar system was an improvement, but initially only a marginal one. Heliocentrism gained ground through the work of Johannes Kepler, who showed that planets travel in ellipses, and Galileo, who turned his telescope skyward and recorded such phenomena as the phases of Venus. But even a sun-centered universe proved to be in error, as 19th-century astronomers calculated the distances to the nearest stars and realized that our sun was merely one of millions of stars orbiting the center of our galaxy. Earth was demoted yet again in the 1920s, when Edwin Hubble measured the first distance to a nearby galaxy and realized that our galaxy is but one of millions in the universe. Even that figure proved to be a vast underestimate. The telescope that bears Hubble’s name, the Hubble Space Telescope, has revealed more stars and galaxies than astronomers dreamed possible; by one estimate, there could be as many as 500 billion galaxies in the universe.
4. Lamarckian Evolution
If heliocentrism removed man from the physical center of the universe, then Darwinian evolution by natural selection struck a second blow, removing man from the biological center as well. A competing theory in Darwin’s day was advanced by Jean-Baptiste Lamarck, who posited that physical stressors experienced by one generation were somehow transferred to the next. Lamarck used the example of the giraffe’s long neck: each successive generation stretched to reach food higher in the trees, and this acquired stretch was somehow “imprinted” on the next generation. Darwin countered that environmental pressures instead selected the giraffes already born with longer necks, which could reach food, survive, and leave long-necked offspring. One can see how Lamarck was headed in the right direction, but critics pointed out that if Lamarckian inheritance were true, injuries and handicaps acquired by parents would routinely appear in their offspring, which they do not. What’s truly fascinating is that in the 19th century of Darwin and Lamarck, Mendelian genetics, DNA, and mutation (what Darwin called “Monstrosities”) were not yet understood.
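Darwin’s alternative mechanism can be illustrated with a toy simulation (the numbers and the fitness rule below are illustrative assumptions, not part of the historical record): if neck length is heritable with small random variation, and longer-necked animals are merely likelier to survive to breed, the population’s average neck length drifts upward over generations, with no Lamarckian “imprinting” of acquired stretching required.

```python
import random

random.seed(42)

def simulate_selection(generations=50, pop_size=200):
    """Toy Darwinian selection: neck length is inherited with small
    random mutation; longer-necked giraffes are likelier to survive."""
    population = [random.gauss(2.0, 0.2) for _ in range(pop_size)]  # metres
    for _ in range(generations):
        # Survival odds scale with neck length (fitness-proportional pick).
        survivors = random.choices(population, weights=population,
                                   k=pop_size // 2)
        # Offspring inherit the parent's neck length plus mutation noise --
        # a parent's lifetime "stretching" is never passed on.
        population = [max(0.1, parent + random.gauss(0, 0.05))
                      for parent in survivors for _ in range(2)]
    return sum(population) / len(population)

print(simulate_selection())  # mean neck length ends above the starting 2.0 m
```

The only inputs are variation, inheritance, and differential survival; the upward trend emerges from selection alone.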
3. Radiation Therapy For Minor Maladies
Radioactivity was discovered through the pioneering work of Henri Becquerel and Marie Curie, following Wilhelm Röntgen’s discovery of X-rays. This early lab work was messy and dangerous; researchers such as Curie ultimately died from the effects of long-term radiation exposure. Yet despite the inherent dangers, radioactive potions were promoted as cure-alls well into the 20th century. One such concoction, Radithor, was a weak solution of radium salts pitched as a cure for everything from stomach ailments to mental illness. To this day, tourists flock to abandoned mines hoping for cures via “radon bathing.” Targeted radiation treatment has, of course, found a legitimate modern niche in cancer therapy, though even this may one day give way to gene therapy; with luck, the brute-force methods of chemotherapy and radiation, which kill healthy cells along with cancerous ones, will be relegated to the past.
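The decay these pioneers were studying follows a simple exponential law, N(t)/N₀ = 2^(−t/T½), where T½ is the half-life. A minimal sketch, using the textbook half-life of radium-226 of roughly 1,600 years (a figure assumed here, not given in the article):

```python
def remaining_fraction(t_years, half_life_years):
    """Fraction of a radioactive sample remaining after t years:
    N(t)/N0 = 2 ** (-t / half_life)."""
    return 2 ** (-t_years / half_life_years)

RADIUM_226_HALF_LIFE = 1600.0  # years (approximate textbook value)

# After one half-life, half the sample remains; after two, a quarter.
print(remaining_fraction(1600, RADIUM_226_HALF_LIFE))  # 0.5
print(remaining_fraction(3200, RADIUM_226_HALF_LIFE))  # 0.25
```

The slow half-life is why radium contamination from the patent-medicine era lingers in old factory sites today.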
2. Phlogiston Theory
The concept of phlogiston was born of alchemy and died hard in the world of chemistry. The basic idea was that all combustible substances possessed an unknown property or essence (phlogiston) that they released during a chemical change such as burning, leaving them in a “dephlogisticated” state. Yes, the theory sounds ridiculous today, but it held sway in the scientific world for almost a century. German physician J.J. Becher proposed it in 1667 to explain two states of chemical change long observed by alchemists: combustion and oxidation, or rusting. He believed even the decomposition of matter could be explained by “dephlogistication.” A kindred idea existed in early medicine: that the very essence of life could be distilled from blood, bile, or other bodily fluids. That all changed with the discovery of oxygen by Joseph Priestley in 1774. Priestley collected oxygen in a glass tube by exposing mercuric oxide to sunlight, and he noted that the collected gas allowed a contained candle to burn more brightly and a mouse in a sealed jar to live longer. He also conducted experiments the preferred 18th-century way: by huffing the unknown gases himself and noting the effects. The discovery of oxygen directly countered phlogiston theory: rather than something being removed from a burning substance, oxygen from the air combines with it to drive the reaction. It’s curious, then, that metallurgists had known since antiquity that forcing air into a fire stokes it.
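The case against phlogiston can be made with simple bookkeeping: a metal that burns gains mass, because oxygen from the air combines with it, whereas shedding phlogiston should leave it lighter. A sketch of that arithmetic using textbook molar masses for magnesium (an example chosen here for illustration, not one of the original experiments):

```python
# Approximate textbook molar masses in g/mol
MG = 24.31  # magnesium
O = 16.00   # oxygen

# 2 Mg + O2 -> 2 MgO: burning one mole of magnesium yields one mole of MgO.
mass_metal = MG       # grams of Mg burned
mass_oxide = MG + O   # grams of MgO produced

print(f"{mass_metal:.2f} g of Mg burns to {mass_oxide:.2f} g of MgO")
# The product is heavier than the metal -- the opposite of what
# "losing phlogiston" would predict.
print(f"mass gained from the air: {mass_oxide - mass_metal:.2f} g")
```

Careful weighings of exactly this kind, by Antoine Lavoisier and others, are what finally buried the theory.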
1. Miasma Theory Of Disease
In 1673, Antonie van Leeuwenhoek turned his first crude microscope toward a drop of water and noted a curious array of creatures he dubbed “animalcules.” Most observers at the time believed disease was produced either by bad air, known as “miasma,” or by spontaneous generation. The discovery of bacteria and viruses brought the realization of their role in the spread of infectious disease. It’s amazing to think that a huge scientific advance was made through something as simple as having people wash their hands, as Ignaz Semmelweis demonstrated in the maternity wards of 1840s Vienna. The battle was joined in earnest when penicillin was found to combat bacterial infection in 1928. On the viral front, Edward Jenner had noted that milkmaids rarely contracted smallpox, and he constructed the first vaccine from the cowpox virus. It’s sobering to think that in most wars right up to modern times, more soldiers died of infection than of combat wounds. True, the battle continues as “superbugs” such as MRSA evolve to outpace new antibacterial drugs, but few would argue that we should go back to the bad old days of bloodletting and the miasma theory of disease.