By Sarah Thompson at Brownstone.org.
In the history of medicine, there have been two primary means of determining whether a substance has a medicinal application: theory and observation. The use of drugs in medicine has generally followed a pattern of trial and error: a substance comes into use until it is determined to be harmful, at which point it is quietly removed from circulation, usually because something new has been discovered or invented to take its place.
In this era of controlled drug trials and regulatory bodies, there is a pretense of attempting to determine whether a drug works and is safe before it is given to patients. In practice, however, the definitions of "control," "efficacy," and "safety" are loose and malleable, as evidenced by the problem of reproducibility, which requires that an experiment, repeated as described in a study, yield the same or statistically similar results. Often, the repeated experiment does not.
Why, then, do so many people continue to trust the curated results of such research? This stems from the perception, in the popular mind, that institutionalized contemporary medicine has a strong track record of empirical successes that justify continued faith in its structure and outcomes.
This belief forms the emotional receptivity to pro-materialist medical narratives, conditioning the intellect to assume that whatever is printed or said in favor of that approach to disease is accurate.
There are three main pillars upon which the defense of contemporary mechanistic medicine rests in the popular mind: vaccines, antibiotics, and anesthesia. These three combined, we are told, have so greatly extended the average lifespan that any deleterious effects of the medical system are outweighed by orders of magnitude.
Medical error is acknowledged as real, as are iatrogenic (physician-caused) injury and death, but these costs, while tragic, are treated as minor negatives along the meteoric curve of positives.
Vaccines have been the subject of debate since their introduction in the late 18th century; a lengthy catalog of harm is well documented, and the disagreements revolve around both the extent of these injuries and the ratio of costs to benefits. Antibiotics, too, have come under scrutiny because the profligate prescribing of them has resulted in treatment-resistant infections of increasing severity and lethality, particularly in environments such as hospitals and nursing homes.
The indiscriminate use of antibiotics has been challenged both inside and outside the medical field.
Anesthesia for surgery remains the one unassailable, and undebated, triumph of modern medicine. When asked what the current mainstream medical system is useful for and does well, people across the entire spectrum of medical modalities will acknowledge surgical intervention, much of which is only tolerable because of anesthesia. Anesthesia has made possible the judicious application of surgery without patients dying of shock.
This is unequivocally positive.
But it has also made surgery more palatable, increasing the readiness of physicians to recommend it and the willingness of patients to endure it; the injudicious use of surgery is rarely discussed. This creates secondary dangers that are often ignored or minimized.
The earliest anesthetics were alcohol and various herbal intoxicants, joined by opium upon its introduction to Western Europe and, later, by its derivative morphine. In the 19th century, ether and chloroform came into use, as did cocaine and nitrous oxide gas. These substances reduce sensitivity to pain, but none of them reliably renders a person unconscious for a fixed period of time.
The word "anesthesia" itself has Greek roots meaning "without sensibility" or "without sensation;" the divorcing of the senses from the physiological experiences of the body removes essential feedback loops within both physical and psychic integration of impingements.
Addiction to morphine ("soldier's joy") became commonplace for infantrymen in 1...