The French Revolution was the culmination of Enlightenment thinking. But rather than celebrating science, the revolutionaries suppressed ideas and killed the people who held them.
Jones, Steve. Revolutionary Science: Transformation and Turmoil in the Age of the Guillotine. 2017. ISBN 1681773090.
Israel, Jonathan. Revolutionary Ideas: An Intellectual History of the French Revolution from The Rights of Man to Robespierre. 2015. ISBN 0691169713.
In the last decade of the eighteenth century, one of the greatest experiments ever was being conducted. It wasn’t a scientific experiment, even though it was the culmination of Enlightenment thinking. It wasn’t led by scientists, although many of its most prominent leaders had discoveries and inventions to their names. And even though it was conducted in the name of progress and reason, it ended up suppressing ideas and killing the people who held them. I am talking, of course, about the French Revolution.
Hello and welcome to A History of Science: Episode 6: Revolutionary Science.
France was not a backward country before the revolution. Rather, it was closer to being the intellectual center of the world. During the eighteenth century, French academies had transformed from rigid schools teaching ancient concepts to breeding grounds for theorizing and experimentation. French public debate was alive with new ideas on natural philosophy and humanist thinking. Periodical publications, including the world’s first ever academic journal, Le Journal des Savants, disseminated these exciting new thoughts to eager French audiences.
The kings of France were among the enthusiastic patrons of this new scientific community. They very much liked to style themselves as enlightened monarchs of a modern state. As such, many institutions, observatories, and laboratories enjoyed the royal prefix – most notably the prestigious Royal Academy of Sciences – and were financially sponsored by the state. But individual researchers, too, were known to benefit from royal patronage. One alumnus of the University of Paris, who had become a renowned experimenter in his own right, was Pierre Polinière. He was known for exciting his audiences with demonstrations of scientific principles. At the height of his fame, King Louis XV invited him to lecture at his court. He allegedly dazzled the noble crowd by making an apple explode with an air pump. Even Louis XVI, the ill-fated last king of France before the revolution, was known to have a keen interest in technology. Under his watch, scientists experimented with balloon flight in the gardens of Versailles, while he himself tried his hand at lock-making in his state-of-the-art workshop.
But it was in this enlightened society, inspired by thinkers such as Voltaire, Rousseau, and Montesquieu, where, in 1789, the revolutionary spark hit the powder. Frustrated by famine, taxes, and oppressive government, the Third Estate overthrew the political order and thoroughly reformed the state. In quick succession, parliament abolished feudal obligations, noble titles, tax exemptions, and corporal punishment. They introduced a constitution promising equality before the law, freedom of speech, and democratic representation. At the stroke of a pen, France had entered the age of modernity.
Or so the story goes. The ancien régime, the old order, was denounced as a tyrannical dictatorship, based on the arbitrary whims of a decadent aristocracy. The new France was to be built on the pillars of liberty, equality, and brotherhood – ideas that had somehow sprouted in that tyrannical intellectual climate of the ancien régime. Above all, though, the revolutionaries appealed to reason as the foundation of their ideal state. They envisioned a nation of enlightened people – citizens, not subjects – who formed a virtuous meritocracy, where no man would bow to another.
That hopeful assembly of equals in 1789 would soon be displaced by a wartime dictatorship that wielded terror as a revolutionary principle. Universities would be closed, scientists would be guillotined – all in the name of reason. In this episode, we’ll explore that paradox. And we’ll start with the daily struggles of a peasant, just moments before the ominous summer of 1789.
Imagine a peasant in pre-revolutionary France. Let’s call him Jacques. Jacques works his landlord’s fields. After the grain harvest, he stops by his landlord’s estate to pay his feudal dues. The landlord owns the scale and the weights; he determines how the shares are split. With the grain he doesn’t need to feed his family, Jacques tries to make a profit on the weekly market in a neighbouring town. Along the way, he pays a toll to a local customs officer – another share of his grain gone, once again determined by someone else’s weights and measures. On the market square, Jacques has to abide by the town’s local units of measurement when making his sales. Poor Jacques: he has to depend on a lot of different measures, and he has to put a lot of faith in the authorities controlling them.
Jacques’ situation was not uncommon in early modern France. Measures were not standardized, and as many as seven or eight hundred different ones were in use throughout the country. You could measure weight in onces, livres, or talents, and volume in litrons – for dry goods – or pintes – for liquids. Or you could resort to units of measurement based on labour rather than output. A journal, for example, referred to one peasant’s day of work in the field, rather than the produce he yielded. Distances were commonly expressed in lieues, or leagues; there were multiple variants of the league, however. The oldest, the lieue ancienne, measured a little over three kilometers, while the Parisian league measured just under four. The postal league beat them both with a metric equivalent of well over four kilometers.
Many people realized this was far from ideal. Various French kings, starting all the way back with Charlemagne, had tried to introduce a nation-wide measure. This had culminated in the toise becoming a more or less reliable royal standard. Charlemagne himself had defined it as the length between a man’s outstretched arms, possibly his own. An iron bar representing this measure was kept safe at the Châtelet in Paris. Not safe enough, though: at one point it fell off the wall and had to be hammered back in place by builders. Unfortunately, they deformed it with their blows, causing Charlemagne’s arms, and France’s official royal measure, to lose a couple of millimeters.
The revolutionary French government was determined to settle this age-old problem once and for all. It was, after all, a perfect example of enlightened democrats achieving what no French king ever could. They envisioned a nation-wide system of measures that would not be based on the length of some arbitrary monarch’s limbs, but on an eternal natural constant: the size of the earth.
A committee of some of the greatest minds of the age was assembled to define this new system of measurement. It included illustrious names such as the mathematician Nicolas de Condorcet and the brilliant chemist Antoine Lavoisier. They determined that the new measure should be exactly one ten-millionth of the distance from the North Pole to the equator, measured along the meridian that passed right through Paris. Agreeing on a definition wasn’t the greatest challenge, though. Measuring the meridian’s length was.
Measuring such enormous distances was no small feat. It required delicate equipment to be set up on elevated points throughout the country, such as mountaintops or towers. The scientists’ plan was to measure the distance between Dunkirk in the north and Barcelona in the south, and then extrapolate from that arc to find the length of the earth’s meridian. This turned out to be a massive undertaking that would take the better part of a decade to complete. The surveyors traveled through a country without proper roads, in bouncing carriages in which their expensive equipment was easily damaged. Circumstances during the tumultuous revolutionary years didn’t help either. To their despair, tall structures such as fortresses or churches had often been razed to the ground by the shelling of invading armies or the zeal of anti-Christian republicans. One surveyor recounted that, when he inquired about the location of a village’s church tower, the inhabitants told him that they had “demolished those steeples that arrogantly elevated themselves above the humble dwellings of the French people”.
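To get a sense of the arithmetic waiting at the end of all that surveying, here is a minimal sketch in Python – my own illustration with rounded figures, not the committee’s actual computation, which also had to correct for the flattening of the earth:

```python
# A toy version of the extrapolation step: scale the measured arc up to
# the full 90-degree pole-to-equator quadrant, then take one ten-millionth
# of it as the metre. Works in whatever unit the arc was measured in.

def metre_from_arc(arc_length: float, latitude_span_deg: float) -> float:
    """Extrapolate a meridian arc to the quadrant and derive the metre."""
    quadrant = arc_length * (90.0 / latitude_span_deg)
    return quadrant / 10_000_000

# The Dunkirk-Barcelona arc spanned roughly 9.7 degrees of latitude and
# came out near 551,600 toises (rounded, illustrative figures):
print(metre_from_arc(551_600, 9.67))  # ~0.513 toises per metre
```

That comes out at roughly 0.513 toise, in line with the metre’s eventual legal definition of 443.296 lignes.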
In spite of these hardships, however, the surveyors did ultimately succeed. At the dawn of the new century they presented their preliminary findings to an international conference of scientists. Perhaps the only blemish on that significant moment was that their newly proposed meter just happened to be exactly the same size as Charlemagne’s outstretched arms.
Use of the metric system would become widespread in the decades that followed. Napoleon would spread the meter throughout Europe with every country he conquered. Later, those countries would, in turn, introduce it to their colonies all over the world. In its early years, the French republic had achieved a major scientific success. The revolutionaries had applied reason to decisively solve a problem that had eluded the world for centuries. They were convinced it would be the first of many. And in the darker days of the French Revolution, science would be put to use for ever more eccentric and sinister goals.
Ancient local measurements of weight, volume, and distance had been successfully transformed into mathematically sound units. Time would be next. No longer would twenty-four hours of sixty arbitrary minutes constitute a day. Nor would years be counted from the approximate birth of a messiah worshipped by a superstitious, decadent clergy.
For a people reborn into reason, a day would have ten hours. An hour would have a hundred minutes. A minute would have a hundred seconds. There would be ten days in a week, three weeks in a month, and twelve months in a year, with a handful of festival days tacked onto the end to complete the 365. Years would be counted from the birth of the republic in 1792 – or, as the revolutionaries said, year I of the French Republic.
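To make the arithmetic concrete, here is a minimal sketch in Python – my own illustration, not anything the revolutionaries wrote down – of what reading a decimal clock involved. A decimal day has 10 × 100 × 100 = 100,000 decimal seconds against the conventional 86,400, so a decimal second lasted 0.864 conventional seconds:

```python
# A minimal sketch of converting a conventional clock reading to French
# decimal time. A decimal day has 100,000 decimal seconds; a conventional
# day has 86,400.

def to_decimal_time(hours: int, minutes: int, seconds: int) -> tuple[int, int, int]:
    """Map a 24-hour clock reading onto the 10-hour decimal clock."""
    conventional = hours * 3600 + minutes * 60 + seconds
    decimal_seconds = round(conventional / 86_400 * 100_000)
    dh, rest = divmod(decimal_seconds, 10_000)  # 100 x 100 per decimal hour
    dm, ds = divmod(rest, 100)
    return dh, dm, ds

print(to_decimal_time(12, 0, 0))   # (5, 0, 0): noon is five decimal hours
print(to_decimal_time(18, 30, 0))  # (7, 70, 83)
```

Noon, halfway through the day, falls at exactly five decimal hours – one of the few conversions that comes out neatly.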
To give this new calendar some poetic decoration, playwright Fabre d’Eglantine was recruited to propose the names of the new months. He came up with names based on seasonal characteristics; for example, Thermidor – from Greek thermon, meaning heat – would overlap with July, the warmest period in summer; Floreal – from French fleur, or flower – would be roughly the equivalent of May, the springtime when flowers blossomed. In Great Britain, commentators at the time ridiculed the new months of the overzealous French republicans with ironic English ‘translations’ such as Wheezy, Sneezy, and Freezy for the winter months, and Hoppy, Poppy, and Croppy for summer.
The British were not entirely unjustified in their mockery, though. The decimal calendar was decidedly more difficult to use than the Gregorian one, and it wasn’t popular with anyone. French peasants hated it, because it prolonged their working week by three days. Administrators hated it, because it was notoriously difficult to apply correctly; many aspects, such as leap years, had been inadequately thought out and had to be fixed as time went by. The clergy, many of whom had been ardent supporters of the early revolution, hated it, because it set aside their sacred calendar and reduced them to second-rate revolutionaries.
At first sight, the metric system and the republican calendar seem to have a lot in common: both strive to create mathematically sound systems for everyday use. The differences between them, however, perfectly illustrate the changing role of reason throughout the early revolutionary years. In 1789, the rationalization of the old, inefficient state, with its hodgepodge of laws and taxes, was a leading principle of the revolution. A few years later, though, science had been reduced to merely serving the megalomaniacal visions of a new elite.
Work on the metric system started early in 1790, in a spirit of liberty, equality, and progressive change. It was an ambitious initiative that demonstrated clearly how science could create fair, practical systems of thought. It would ensure that no feudal overlord would ever again demand a greater share of the harvest simply because he happened to own the scale. The decimal calendar, on the other hand, was introduced in 1793, by a paranoid wartime government that saw counter-revolutionaries everywhere. It was purposefully created as a political instrument to increase control over the French people. Longer working weeks would make peasants more productive. The abolition of Sundays would make them less religious. Whoever controlled the calendar controlled religion, the economy, and ultimately the nation itself.
Seen from a political perspective, the two systems of measurement were polar opposites. The metric system was designed to serve the people; the decimal calendar was created to make the people subservient to the state.
In the remainder of this episode, we’ll take a closer look at how the role of reason changed over the revolutionary years. We’ll do so through the eyes of one of the most detested characters in the French Revolution, who, surprisingly, started out as a scientist himself: Jean-Paul Marat.
Jean-Paul Marat was not French by birth. He was born in Switzerland, to a family of religious refugees. His father, despite being well educated, had a hard time providing for his family. Marat must have realized at a young age that, as a Huguenot, he was a marked man. Whatever success he sought, he would have to gain through tireless perspiration.
Marat left for Paris when he was still in his teens. There he studied medicine, receiving his medical degree for a treatise on curing gonorrhea, with which a friend of his was reportedly afflicted. To increase his social standing, he entered into the world of Parisian intellectuals. He started publishing on the philosophical issues of the day: liberty, equality, slavery. His sharp pen did get him noticed, and before long he found himself in the employ of the comte d’Artois, the younger brother of king Louis XVI.
Using his new-found wealth as a court physician, Marat concentrated his efforts on other fields of science: fire, electricity, and light. He performed countless experiments, wrote down their results in voluminous tomes, and, to his amazement, found that not all perspiration pays off.
When Marat completed his first scientific book, in which he argued that fire was a fluid rather than a material element, he sent it to the Royal Academy of Sciences for appraisal. The academy reviewed his work and was impressed with the 166 “new, precise, and well-executed experiments” Marat described in it. They did not, however, endorse his conclusions. That did not stop Marat from publishing his book with the academy’s supposed endorsement prominently displayed. Infuriated members of the academy, among them the renowned chemist Antoine Lavoisier, demanded a rectification. Yielding to the pressure, Marat removed the endorsement from his book. It would be the start of his tenuous relationship with the scientific establishment.
When he started conducting a second set of experiments, this time on light, Marat involved the academy early on. Over a period of seven months, he performed all his experiments in front of a delegation of scientists. His peers were again impressed by his experimental zeal and persistence, but felt that his experiments did not justify the conclusions he drew. Once again, Marat did not gain the endorsement of the academy that he so desperately sought. Disillusioned, he published his book in England rather than France. When completing his third and final academic book, on electricity, he didn’t even bother to involve the academy at all.
When the revolution broke out in 1789, Marat discarded science altogether to devote himself entirely to politics. He would not, however, forget the scientific community, or how he felt they had mistreated him.
A few years later, things had changed dramatically for Marat, and for France. Marat had started publishing a popular daily newspaper, l’Ami du Peuple – the Friend of the People – in which he expressed his increasingly radical political views. His popularity as a journalist got him elected to the new parliament. Radical politicians, Marat included, tried to appeal to popular opinion with ever more extreme proposals, culminating in the vote to execute the deposed King Louis XVI. As foreign powers invaded, determined to restore order and absolutism to the country, the French people descended into a state of constant fear and paranoia. Marat fueled the fire by constantly publishing accusations against suspected counter-revolutionaries, especially within the old noble ranks. This culminated in the notorious September Massacres, in which over a thousand prisoners were lynched by mobs of fired-up citizens.
Gone were the days of equality and brotherhood. The will of the people, carefully molded by Marat’s newspaper and catered to by extremist politicians, ruled. Anyone not considered a member of the common French people was suspect. This hatred of the aristocracy did not end with the old nobility. The aristocracy of intelligence was denounced just the same.
Marat allied himself with Maximilien Robespierre, the orchestrator of the revolution’s most extreme and violent period. Together, they declared war on what they saw as the scientific aristocracy. “No race of men is more dangerous to liberty, more an enemy to equality,” Robespierre declared, “than the aristocrats of science.” Science should directly serve and benefit the people, or it should not be conducted at all. “What do the diverse hypotheses with which certain scientists explain natural phenomena matter to legislators?”, he continued. Common sense should rule instead. In 1793, by order of Robespierre’s government, all universities, academies, and literary societies were closed down.
So here we are. In 1793, the darkest days of the French Revolution. Universities have been closed. Scientists, many of them born into noble families, are being executed under the guillotine. And in a perverse irony, the people are encouraged to literally worship reason. A Cult of Reason has been invented to replace Christianity. It encourages the worship of concepts such as truth and liberty, usually portrayed by young women in Roman attire, complete with white togas and laurel wreaths. During the Festival of Reason, churches are converted into Temples of Reason, their altars replaced with ones dedicated to philosophy. The irony could not be greater: places of worship converted into farcical temples of reason, while science’s real temples, its universities and academies, stand empty.
Was the French Revolution, at this point, still devoted to scientific principles? I think not. For all the revolutionaries’ talk of reason, rationality, and truth, scientific culture had become oppressive, dogmatic, and anti-intellectual. There was no longer a free public discourse, a sharing of ideas, or a culture of investigation. By 1793, the revolutionaries’ use of scientific words had become mere decorum.
Now, there’s no denying that the French Revolution was strongly based on philosophical ideas. It also developed forms of state civics and nation building that greatly influenced modern states, and which are still relevant today. In the short term, however, France’s scientific community suffered. Science became a political instrument, rather than an independent entity. Scientists were expected to cater to the expectations of the populace, rather than follow their own academic interests. The intellectual culture that had formed Rousseau, Voltaire, and Montesquieu, whose ideas had made the revolution possible in the first place, was suppressed and replaced with dogma. It would take a decade before the universities were again officially recognized by Napoleon, and many more before France regained a scientific culture to form the likes of Louis Pasteur and Marie Curie.
How did it all play out for the characters in our story? Jean-Paul Marat was murdered in his bath by Charlotte Corday, a sympathizer of the moderate faction he had helped purge. For a short time afterwards, busts of Marat were used to instill a sense of martyrdom in the followers of the Cult of Reason. In 1794, however, the Cult’s prominent members were guillotined in one of Robespierre’s power plays, practically ending its existence. A few months later, Robespierre himself was guillotined, finally ending the revolution’s darkest days. The Terror had cost the lives or careers of dozens of renowned scientists, among them all the revolutionary-era scientists mentioned in this episode. Gilbert Romme, mathematician and creator of the Republican Calendar, committed suicide after being sentenced to death. Nicolas de Condorcet, mathematician and instigator of the metric system, was found dead in his cell. Antoine Lavoisier, chemist and co-creator of the metric system, was guillotined because of his noble background. When he appealed for his life, the judge’s reply perfectly demonstrated the arrogance and hypocrisy of the time: “The Republic has no need for scientists”.
If you enjoyed this episode, and want to know more about the strange paradox of science during the French Revolution, I recommend two books that served as the main sources for this episode. The first is an easy read: Steve Jones’ No Need for Geniuses, published in the US as Revolutionary Science, ideal for an accessible overview. If you want to go all in, try Jonathan Israel’s Revolutionary Ideas, a dense academic work detailing the philosophy that shaped the revolution. Oh, and if you’re at all interested in the French Revolution, do yourself a favor and listen to Mike Duncan’s Revolutions podcast.
Thanks for listening. Hopefully until another History of Science.
In the 16th century, alchemy became a victim of its own success. The more it achieved, the more its reputation suffered.
Moran, Bruce T. Distilling Knowledge. 2006. ISBN 9780674022492.
Cheating death. Curing every disease. Possessing unlimited amounts of gold. Such fantastic objectives were pursued all throughout medieval Europe by alchemists. They burned, melted, distilled, and vaporized any substance they could get their hands on. They tirelessly hunted for encrypted ancient scrolls in an attempt to unlock their hidden formulas. They drank, inhaled, and injected their mysterious concoctions. They suffered from lead and mercury poisoning. Their hands trembled as they picked up their flasks, their minds forgetful, anxious, and paranoid. Rather than the key to eternal life, they mostly found early graves.
In the Scientific Revolution, this superstitious alchemy died out. Modern, rational chemistry was born. Or so the story goes. In reality, there was a very thin line between the art of alchemy and the science of chemistry.
Hello and welcome to A History of Science: Episode 5: The Philosopher’s Study.
“Alchemy is a fine occupation. Not only is it very useful to human need and convenience, it gives birth every day to new and splendid effects,” says Vannoccio Biringuccio, a mining engineer from Siena, writing in 1540. He continues: “The art of alchemy is the origin and foundation of many other arts. It should be held in reverence, it should be practiced. The practitioner should enjoy that pleasing novelty which it reveals to him in operation.”
That pleasing novelty is the most direct reward for experimentation. That shiver up your spine as you peek through the microscope, or pour a few drops from a pipette into a test tube, is one of the cornerstones of science. It satisfies human curiosity, long before practical applications of discoveries, and any riches that may come with them, present themselves.
The joy of discovery was an essential ingredient of the Scientific Revolution. And as we have seen in our episode on Columbus’ voyage to America, discovery, strangely enough, had to be discovered itself. In the quote above, some fifty years after Columbus set foot on the new continent, our Italian mining engineer lovingly applies this new concept to alchemy. In his days, alchemy was the closest anyone could get to discovering new things and unraveling the threads of Creation. As such, alchemy sounds like a natural precursor to the Scientific Revolution, a promising token of what was to come.
That’s not how we remember alchemy, though, is it? In our collective memory it is much closer to the image I sketched in the introduction: crazy sorcerers working in damp basements pursuing their fever dreams of eternal life or unlimited gold. In this episode we will follow the transformation of medieval alchemy into a modern science. And especially, we’ll explore when, how, and why its reputation was utterly destroyed in the process.
What exactly alchemy was, is difficult to pinpoint. It was as much a craft as it was an art; as much esoteric mysticism as proto-science; and as much learning by doing as passing on ancient knowledge. People from any walk of life could find themselves in the elusive profession of alchemy: physicians and apothecaries, blacksmiths and masons, painters and sculptors, and peasants and noblemen alike.
Among them are many familiar names, often of people you would not expect to see in relation to alchemy. Leonardo da Vinci, who, among his many other occupations, dabbled in alchemy, left countless notes describing alchemical procedures he had used in his art. Among these is a recipe for making mixed gold alloys look purer than they really are, an invaluable procedure for a Renaissance artist. Even Isaac Newton, arguably the hero of the Scientific Revolution, spent decades searching for alchemical texts, in a futile effort to reorganize ancient alchemical wisdom into a coherent collection; not a story often recounted about the poster boy of modern physics.
But alchemical craft was a highly sought-after commodity in medieval and early modern Europe. Most alchemists applied their work to solve everyday problems; they extracted medicines from natural ingredients, mixed copper alloys to produce stronger metals, or developed durable paints with brighter colors. Alchemists were commonly employed by kings and noblemen. Much like astronomers, who predicted their patrons’ futures in return for a free hand at their research, alchemists forged weapons or administered mercurial medicine alongside their intellectual pursuits.
And it is in these intellectual pursuits that alchemy first crosses the line into esoteric mysticism. The wildly ambitious objectives of alchemy – all-curing potions, turning base metal into gold – are often cited as examples of medieval backwardness, and as grounds for excluding alchemy from the realm of the real early sciences. But was alchemy really that irrational? Was there no method to the magic? And are its theories any more obsolete than those of pre-Copernican astronomy or humoral medicine? Let’s take a closer look at two of its most mysterious objectives: the elixir of life, and the Philosopher’s Stone.
The elixir of life, also known as aqua vitae, or life water, was believed to be a super-medicine that could purify impure bodies, thus curing disease and prolonging life. This elixir could be obtained by purifying natural ingredients, usually through distillation; the closer a natural ingredient could be brought to its divine perfection, the stronger its qualities would be. The theory was similar to the one behind distilling alcohol: distillation of wine, for example, yields hard liquor, which is both purer in alcohol and stronger to the taste. Distilling any medicine, then, should create a purer form that worked better.
As nonsensical as searching for this super-medicine may sound to us, the methods alchemists employed do share similarities with those of modern chemistry. The process of purifying natural ingredients to find their essence – or, as we would say, their elements – is still the defining characteristic of modern chemistry. Early nineteenth-century medicines were relatively simple compounds of recently discovered elements; exactly what fifteenth-century alchemists were trying to achieve with their elixirs of life.
So, modern chemistry and medieval alchemy share the assumption that nature can be reduced to its original building blocks. The name alchemists gave to this elusive, perfectly pure ingredient of natural compounds was the quintessence, or fifth essence. Why the fifth? Because it came after the four basic elements of antiquity that somehow seem to pop up in virtually every episode of this series: earth, water, air, and fire. As we have talked about time and again, ancient and medieval philosophers held that everything – people, trees, rocks, oceans – was made out of these four elements. And alchemists, being the practical artisans that they were, went the extra mile.
They reasoned that if nature could be distilled into its original components, these pure building blocks could be recombined to form natural compounds. Any natural compound. This is the central idea that fueled alchemists’ fever dreams of finding the Philosopher’s Stone. If earth and water could be mixed to create mud, couldn’t other natural substances be created by combining their quintessential elements? How about metals? Silver? Gold? There had to be a formula out there that would allow alchemists to transmute quintessential natural elements into pure gold.
So where did alchemy fit in medieval Europe? Everywhere and nowhere at the same time; it was part practical craft, part theoretical philosophy, part magical make-believe, and it could be practiced by anyone. And while it was a mostly respectable trade, it was solidly kept out of universities. Alchemy’s rise into academia would start with a man who, ironically, did much to destroy its reputation in the process.
Philippus Aureolus Theophrastus Bombastus von Hohenheim is better known under his professional moniker, Paracelsus. Bombastus does serve him better, though. By all accounts Paracelsus was an arrogant, impossible man. Born in Switzerland in 1493, he traveled all over Europe, working as a physician, astrologer, and alchemist, among other professions. His unorthodox ideas and difficult persona meant he never stayed in one place long.
Paracelsus had grand ideas on many subjects, including alchemy. And, true to his nature, he rejected everything that was conventionally held about them. In his works on alchemy, he discarded the four classical elements that had dominated natural philosophy since antiquity, and came up with three new ones instead: the tria prima, or three principles, of sulphur, salt, and mercury. Sulphur represented combustible ingredients, salt unburnable ones, and mercury metallic elements. He applied this idea not only to alchemy, but to medicine as well. In a single stroke he rejected Hippocrates’ humoral theory, and claimed that all bodily organs possessed a kind of inner alchemist, which he called the archeus. If someone fell ill because his or her archeus malfunctioned, a physician had to take the job of physical alchemist upon himself and purify the body by administering medicine.
Fascinating ideas, but they did not win him many friends. In the age of Martin Luther, unorthodoxy was less than welcome in sixteenth-century Europe. And even though the unorthodoxy of Paracelsus was not religious, he wasn’t any less passionate about it. Allegedly, he burned copies of Galen’s works on the town square in Basel, an act considered nothing less than heretical in medical circles at the time. He criticized local physicians and apothecaries wherever he went, outing them as quacks who made their patients sicker rather than better. His egalitarianism came through in his disregard for titles – and his consequent disrespect for anyone holding one – and in his opening his courses to common people; he lectured to barber surgeons and midwives in German, rather than to physicians and apothecaries in Latin.
In dealing with his academic peers, he did not hold back either. In true Paracelsian form, he denounced university professors for behaving as though ‘they were the ones upon whom all belief depends, as if heaven and earth would fall apart without them.’ Paracelsus was a strong proponent of experience over authority, practically ridiculing anyone who held on to unproven theories of ancient writers. Passing down ancient knowledge, with no regard for reality, was no way to understand Creation. True knowledge of nature, Paracelsus believed, lay in combining philosophy and alchemy; in theorizing about nature through observations and experience.
Up to this point you may be forgiven for thinking that Paracelsus was utterly modern, a true scientific thinker before his time. Someone who definitely deserves a place in the pantheon of early thinkers of the Scientific Revolution. He was, after all, an advocate for experimentation. He lectured to laymen in the vernacular. He believed the key to knowledge to lie in nature rather than books. But that is only half the story; let’s dive beneath the surface.
When Paracelsus spoke of learning through experiments, he didn’t mean it in the way modern scientists do. Most of his writing is ingrained with magical associations. He often spoke of his proposed theories in an abstract, allegorical sense, not as reusable concepts. His tria prima of salt, sulphur, and mercury were as much symbolic categories as they were principal elements of matter. His organic archeus was a metaphor for how he believed the human body to work, rather than an accurate description of his observations. And rather than believing the quintessence of elements to be similar to the alcohol in wine, Paracelsus believed it to be an elusive spiritual energy that could be unlocked through transmutation. In his medical practice, he believed his patients should seek medical redemption; the spiritual vitality of their inner alchemists should be restored, so they could resume their function of alchemical purification.
That sounds a lot like Paracelsus had found the recipe for magic mushrooms rather than gold, doesn’t it? From an academic point of view, he certainly did not adhere to an early modern mechanical philosophy. Paracelsus believed the alchemist to be a vital component of the process itself. Knowledge of Creation was attained through an elusive mix of perspiration and divine inspiration. And even though he held that the world was made up of smaller particles, he by no means believed that these were simply physical building blocks. He never implied that anyone following an alchemical recipe could turn base metal into gold.
In spite of his alternative ideas, or perhaps because of them, Paracelsus acquired a modest following. For the next century, Paracelsian alchemy and medicine would be considered interesting albeit unorthodox perspectives. By the end of the century, however, calling any practitioner Paracelsian would be an insult.
One of the men keen to drag that name through the mud was Andreas Libavius, a German physician born ten years after Paracelsus died. He was as far removed from the innovative Paracelsus as any alchemist could be. Libavius was fiercely traditional in all his convictions. He followed Aristotelian natural philosophy and humoral medical theory, and rejected virtually all of Paracelsus’ new insights. He criticized Paracelsus’ disrespect for ancient writers, and thought he overstated the importance of natural experimentation. Yet he also rejected Paracelsus’ less rational convictions: his invoking of magic at every step, and his over-reliance on the alchemist’s personal experience.
Libavius did share one trait with Paracelsus, though: he was good at insulting others. In one of his books, Libavius asserted that Paracelsian alchemists were ‘enemies of nature’, positively ‘insane’, and ‘studious of vanities’. He asked his readers rhetorically: ‘Could you stand to walk with such a fellow? Would he even be worthy of life?’
I think it is safe to assume Libavius really did not like Paracelsian alchemists. He did have a point there, though. The importance Paracelsus had placed on personal experience, divine inspiration, and magical formulas made alchemy more elusive than it had ever been. Among his followers were countless showmen who dazzled their audiences with chemical tricks. Snake oil salesmen sold elixirs of life with exotic quintessences that would cure any disease. Paracelsus had inadvertently opened the door of alchemy to any shrewd businessman styling himself as a wizard.
Libavius, instead, was a traditional, scholarly alchemist. He hated seeing his craft being diluted by frauds and imposters. He wanted to see alchemy in universities rather than market places. And for that to become reality, he had to remove it as far as he could from Paracelsian wizardry. And he had to act quickly, for, against all odds, Paracelsian alchemy was gaining a foothold in academia.
In 1609, the Paracelsian alchemist Johannes Hartmann became professor of chymiatry – chemical medicine – in the German town of Marburg. His preserved lecture notes reveal a lot about what he intended to teach his students; half of the course material deals with magical relationships in nature, and with vague antipathies and sympathies in medicine. Actual recipes for alchemical procedures appear only interspersed among these quasi-theoretical notions. His code of conduct, in sharp contrast, leaves nothing to the imagination: students were required to swear an oath of obedience, loyalty, diligence, discretion, and gratitude to him personally. He also made clear that nothing of what they saw, heard, or experienced in his courses was ever to be made public.
That was not the academic alchemy that Libavius had in mind. He envisioned an alchemy deeply rooted in ancient theory, with a shared body of knowledge, progressing through experiments that were clearly described and could be reproduced by any scholar so inclined. To that end, he separated alchemy from Paracelsian magic and linked it to Aristotelian theory. He based his argument on a single assumption: ‘nothing is to be received into chemistry which is not of chemistry’. This deceptively simple criterion ruled out any supernatural, magical, or divine nonsense that a Paracelsian might bring to the table. With this philosophical view of alchemy he convinced professor Zacharias Brendel of the University of Jena to accept chemistry as an academic field in its own right. “Would you not,” he asked the eloquent professor, “like to do for alchemy what Tycho Brahe did for astronomy?”
Professor Brendel certainly felt like following in the footsteps of the man who drew astronomy into the realm of science. During the remainder of the seventeenth century, alchemy would increasingly be taken seriously in academic circles, and accepted as a subject worthy of study in its own right. All throughout Europe famous scholars took up the subject from within the confines of academia: Robert Boyle in England, Antoine Lavoisier in France, Herman Boerhaave in the Netherlands. They no longer called it alchemy, though. As part of the demystification process, it had been rebranded as chemistry.
So there we have it; the story of how the respectable art of alchemy fell into disrepute and was adopted into academia under the moniker chemistry. A rather arbitrary name change, and one that doesn’t do justice to the knowledge and discoveries that alchemy produced. An unprecedented name change, too. Astronomy was called astronomy when scholars believed the world to be the center of the universe; medicine was called medicine when physicians tried to cure patients by bleeding them dry. Alchemy, as we have seen, was certainly not beyond redemption.
None of the people we discussed in this episode held scientific views that still hold up today, but all of them advanced knowledge from their own perspectives. Regardless of his magical views, Paracelsus stressed the importance of experimentation, which would be essential to the Scientific Revolution. Andreas Libavius, though an advocate of authority over experience, did emphasize the importance of sharing knowledge. Alchemists really do deserve that place in the pantheon of science.
The artificial dichotomy between alchemy and chemistry is perhaps best exemplified by the question of who the first chemist was. Amazingly, all the people we have talked about in this episode have been called the first chemist at one point or another. Paracelsus is called the father of toxicology, because of his correct realization that all substances are poisonous in the right dosage. Libavius wrote a voluminous work that is considered by some to be the first textbook of chemistry. Even Antoine Lavoisier, who lived two centuries later, at the time of the French Revolution, is regularly branded the father of modern chemistry.
The answer to the question is, of course, irrelevant. For centuries, countless scholars contributed to the field we now call chemistry. Whoever the first chemist was is of no concern. But we do know that whoever he was, he was certainly an alchemist.
If you enjoyed this episode and want to know more about the complex relationship between chemistry and alchemy, I warmly recommend Bruce T. Moran’s Distilling Knowledge. A link to his book, which served as the main source for this episode, is on the website, ahistoryof.science.
Thanks for listening. Hopefully until another History of Science.
In the 16th century, great advances were made in anatomy. Amazingly, this didn’t lead to a single improvement in surgery, which remained crude, cruel, and lethal.
Wootton, David. Bad Medicine. 2007. ISBN 9780199212798.
Vesalius, Andreas. De humani corporis fabrica. 1543. Online.
Are you as intrigued by the skeleton in this show’s logo as I am? Probably not. It does have its place in the history of science, though. It’s a five-hundred-year-old drawing from the first modern anatomy book ever written, and it has mesmerized people ever since.
Nowadays, the writer of the book, the Fleming Andreas Vesalius, is hailed as a revolutionary proto-scientist and the founder of modern anatomy. And while those claims are not exaggerated, there is one thing Vesalius’ book did not have any impact on: medical practice.
His exquisitely detailed drawings of the human body did not cast doubt on the ancient practice of bloodletting, nor did they stop anyone from trepanning skulls as a treatment for insanity. They didn’t even improve amputations, Cesarean sections, or any other type of common surgery.
Hello and welcome to A History of Science: Episode 4: Between Skin and Bones.
The Scientific Revolution saw an unprecedented increase in knowledge in a wide range of subjects, from physics to astronomy to biology. By the second half of the seventeenth century, anatomists had dissected our bodies and learned the purpose of most of their organs; physicians had fiercely debated William Harvey’s theory of blood circulation – and yielded to his evidence; microscopists had viewed the makeup of our tissue through their lenses, and had seen with their own eyes the tiny organisms that lived on it.
All was in place for a medical revolution. But nothing happened.
For centuries to come, the constantly increasing understanding of nature and anatomy did not change medical practice one bit. Until the late nineteenth century, doctors still relied on Hippocrates for their diagnoses and their cures. Why?
Now that’s a complex question to answer. One that can be approached from many different angles, involving an array of interesting characters and events. It is a question worth coming back to time and again, and we certainly will.
In this episode, we will explore this question from the perspective of surgery: why didn’t it change with advances in anatomy? And why was real progress – when it happened – so slow to spread? In particular, we will dive into the stories of two pioneers who made important discoveries in the very same decade, yet never met: Andreas Vesalius and Ambroise Paré.
In 1538, Andreas Vesalius stole a corpse. In the dead of night, he sneaked to the field of gallows in his hometown of Brussels. There, the tortured corpses of executed criminals were left to rot in gibbets, until nature devoured them. Vesalius cut down one of the unfortunate men’s corpses, broke it into smaller pieces, and took it home. There, he boiled the limbs one by one, until the flesh fell off and only the blackened bones remained. He then set himself to the task of painstakingly reassembling the body, bone by bone. When he was done, he had created the first anatomical skeleton since antiquity. It would make him world famous – and unpopular with his neighbors, no doubt.
Vesalius had to operate in secrecy, as the dissection of corpses was outlawed virtually everywhere in Europe. Killing someone was one thing, but depriving him of his body on Resurrection Day was a step too far in Christian Europe. Because of that, the last dissections had been performed in pre-Christian Greece. In the fourth century BC, Herophilus and Erasistratus had dissected the bodies of condemned criminals. Their texts had been lost, and what remained of their knowledge was Galen’s account of their work. That account, combined with the knowledge he had gained from animal dissections, formed the basis of medieval anatomy. And it was deeply flawed.
When Vesalius published his book De humani corporis fabrica, or On the Fabric of the Human Body, he took full advantage of the technology of his time. Thanks to the printing press, it was richly illustrated with deeply detailed depictions of the human body. Bones, muscles, and organs were not only well described, but were clearly visible. Gone were the sketchy illustrations in Galen’s reprints.
Almost immediately, Vesalius’ book had a great impact; he corrected countless mistakes in the antique texts, and had the evidence to prove his claims. He singlehandedly sparked the study of anatomy, which has remained a mainstay of medical teaching to this day.
Andreas Vesalius is often regarded as a scientific revolutionary before his time, marking the beginning of the movement that would steadily discredit the writers of antiquity and replace their dogmas with rational science. Unfortunately, however, his work did nothing to dethrone Galen and Hippocrates. The world was not ready to fully embrace the new science and throw out the old. And Vesalius was not out to dethrone anyone.
Remarkably, Vesalius did not see himself as the revolutionary he is usually portrayed as. We know he invented something – medical anatomy – that had arguably never existed before: we know virtually nothing of the extent of Herophilus’ and Erasistratus’ work, and we do know that Galen only dissected animals, not people. That makes Vesalius a pioneer if ever there was one.
Andreas Vesalius was not modest: he proudly included a picture of himself in his famous book. But he did not think he was discovering anything the ancients hadn’t known. He was convinced – as many of his contemporaries were – that Galen had dissected human cadavers himself. Any errors in his writing, then, must have crept in through the manual copying of books over some fifteen hundred years. Vesalius placed himself in the long tradition of rediscovering knowledge the ancients had had; he was not out to prove them wrong.
So as far as scholarship went, Vesalius’ contemporaries saw him as picking up where Galen had left off. There was no sudden realization of Hippocrates’ shortcomings, nor an unshakeable belief in experimental methods. It would be a very gradual process that – bit by bit – replaced the ancients’ doctrines over the course of centuries.
So the scientific establishment did not radically change with Vesalius’ discoveries. But how about medical practice? Shouldn’t surgery, for example, benefit greatly from this new anatomical knowledge? Or, at the very least, from the exquisitely detailed drawings in his book? Surprisingly, no. There are no records indicating that surgery became any better, safer, or more common after Vesalius’ publication. To understand this lack of practical impact, we have to look at where surgery fit in medieval society.
It was a last resort, for starters. In a time before anaesthetics, surgery was extremely crude and traumatizing. Patients were dragged in, held down on the operating table, and mercilessly sawn into. If they didn’t die from shock or blood loss, infection was a likely third option. Obviously, surgery took a high toll on everyone involved; not just the patient, but the surgeon as well.
And because medieval surgery resembled a butcher shop more than an operating theatre, it was considered beneath the status of learned men. Physicians with medical degrees would never perform manual labor of any kind, so why should surgery be an exception? Surgical operations were left to so-called barber surgeons: barbers who already had knives for shaving and cutting hair, and who might as well use them for the odd amputation. Barbers, of course, did not have an academic education, nor did they speak Latin. The revelations in Vesalius’ work, then, were not accessible to them and saw no application on the operating table.
One of the barber surgeons for whom the discoveries of Vesalius remained inaccessible was the Frenchman Ambroise Paré. Paré was practising in Paris when, in 1537, he was sent to serve in the French army as a battlefield surgeon. Now, as a barber surgeon he can’t have been squeamish; he must have performed countless bloodlettings and occasional amputations during his time in Paris. But what he saw on the battlefield positively horrified him. Gunpowder-flung projectiles maimed soldiers in such horrible ways that their comrades often put them out of their misery rather than subject them to even more cruelty at the hands of Paré and his colleagues.
Now, the only respite surgeons could give their patients was speed. The common practice for amputation was to take off the limb as quickly as possible, then make sure the bleeding stopped before the patient bled to death. In the 1500s, cauterization was the preferred method of stopping the bleeding after amputations. That meant that either red-hot irons were pressed against the veins, or boiling oil was poured into the wound. Paré realized that his patients were as likely to die from the shock of cauterization as they were from the wound infecting afterwards. He started looking for an alternative.
He found it in ligatures: short pieces of thread tied around blood vessels. Paré found that tying silk thread around the vessels effectively stopped the blood flow, saving the patient from death by blood loss, not to mention sparing him the immense pain of cauterization.
Paré wrote about his discovery in his 1545 treatise on gunshot wounds. Like Vesalius, Paré used many detailed illustrations to embellish his writing. Unlike Vesalius’ book, though, his was meant as a guideline for other barber surgeons working in similar conditions, not as a scholarly work. It was written in the vernacular rather than in Latin, and was never referenced in academic circles. Paré himself was mostly forgotten in the centuries after his death; he was only recognized as a surgical pioneer in the twentieth century.
Paré’s work, however, would have settled a pressing theoretical dispute, had any scholar bothered to read it. In the 1500s, the way blood flows through the body was badly understood. Going all the way back to Galen, even the distinction between veins and arteries was in dispute. It was commonly held that arteries, unlike veins, contained air rather than blood. Experience with dead animals showed that arteries were mostly empty, save for a couple of drops of blood. Now, this is actually due to the fact that without the heart actively pumping, the liquids in the arteries drain away. It wouldn’t be until academics accepted Vesalius’ work, and that of William Harvey on blood circulation half a century later, that this theory was finally discarded. Ambroise Paré, who had lots of experience tying ligatures around both veins and arteries, could easily have confirmed that both were indeed blood vessels.
To us it may seem unfathomable that something as mundane as a lack of communication was the main cause of centuries-long surgical stagnation. The absence of any discourse between practitioners and scholars was not just caused by their differences in status, however. It reveals a very different relation between theory and practice, one that permeated all fields of proto-science. A relation that both Vesalius and Paré were pioneers in shattering.
In medieval Europe, theory dictated science. Practice served merely as a demonstration of theory. So while we’re on the topic of anatomy, imagine this scenario: one of the few places where human dissections were allowed before the work of Vesalius was Padua, in Italy. There, students were taught anatomy while a corpse was being dissected before their very eyes. Their professor would recite Galen’s work on anatomy while a barber surgeon cut open the body and showed the appropriate organs to the students. That meant that everyone in the room could see with their own eyes how, for example, the lower jaw consists of a single bone – not two, as Galen claimed – or how the liver has only two lobes – not five, as Galen claimed – or how the heart’s two chambers are not connected – and thus have no holes, as Galen claimed. Everyone could see that the body did not conform to Galen’s descriptions. So did anyone point out his errors? No, they all agreed the body was flawed.
With our mindset it is all but impossible to follow such a line of reasoning. But medieval science was deeply influenced by the ancient Greeks, who were obsessed with purity. After Plato, all visible objects were held to be imperfect imitations of their ideal original forms. That is why planets, for example, had to follow perfectly circular orbits, or at the very least tend towards a perfect shape. This idea blended nicely with the Christian vision of a perfectly created, but corrupted, world. God had not created a natural chaos. There was a perfectly created original man towards whom every living person naturally tended.
And so the students would agree that the person whose corpse was being dissected was just very far removed from perfection. He must be deformed due to his flawed character and his sinful behavior. The extent to which people approached Creation’s perfection was simply reflected in their bodies. And since most dissections were performed on executed criminals, it was easy to attribute their ‘deformities’ to the impurity of their natures.
Even though Vesalius’ and Paré’s immediate influence on surgical practice was limited, they were pioneers in redefining this relationship between theory and practice in medicine. They both were convinced, well before their time, that practical experiment should be the leading feature in science.
After his first grave robbery, Andreas Vesalius would dissect many other bodies, often publicly. Before the eyes of mesmerized onlookers, he would make grand shows of his handiwork, laying bare the threads of Creation itself. He actively encouraged his colleagues and critics to repeat his experiments, and see for themselves how he had reached his conclusions. He argued by example, not by authority.
Ambroise Paré went even further than that. During a military campaign in 1537, he invented the clinical trial, by accident. While cauterizing wounds, Paré ran out of oil. In a desperate attempt to help his patients, he turned to an old recipe he had heard of before: a tincture of egg yolks, rose oil, and turpentine. To his amazement, he found that the wounds of the patients treated with this tincture healed much better than those treated with the conventional burning oil. Even though he did not understand why – it worked because of the antiseptic properties of his tincture – he kept using it. More importantly, he started applying this comparative approach to his patients more and more often. In one case, in which a soldier’s face was burned by gunpowder, he allegedly applied the conventional burning paste to the man’s left cheek, and an alternative tincture of rosebud oil to the other.
Paré’s unconventional interest in medical experiments was not followed up on, however. Even though he published his methods and results, the practice died with him. His approach of experiment over understanding would be essential to the success of the medical revolution. When it finally happened, four centuries after Vesalius and Paré, evidence of successful treatments often came before theoretical understanding.
Perhaps unsurprisingly, progress started to be made once the practice of surgery and the study of medicine blended together. Around the turn of the nineteenth century, the profession of barber surgeon was disappearing, replaced by that of the academically trained surgeon. In England, the Royal College of Surgeons required its members to have medical degrees from 1800 onwards. Four decades later, anesthetics were discovered and quickly embraced by the surgical community. In the fifty years following that, advances in germ theory, antiseptics, and the rediscovery of the clinical trial made surgery painless, effective, and relatively safe.
Make no mistake: both Vesalius and Paré were extremely influential. Vesalius’ work formed the basis for modern anatomy, while Paré’s contributions made medical knowledge accessible to practitioners outside academia. They could have achieved so much more, however, had they joined forces. Before that could even be imagined, though, the relations between scholars and practitioners, authority and experience, and theory and practice had to be redefined. As firm believers in experiment over understanding, Andreas Vesalius and Ambroise Paré led the way towards this new science.
If you enjoyed this episode, I’d greatly appreciate it if you left a rating on iTunes, or any other app you use for listening to me. For more information on the medical revolution that almost happened, check out David Wootton’s Bad Medicine, a rather provocative book that served as the main source for this episode. You’ll find a link to it on the website: ahistoryof.science.
Thanks for listening. Hopefully until another History of Science.
Computing began long before the twentieth century. Mechanical calculators ran on cogs, wheels, and steam engines.
Padua, Sydney: The Thrilling Adventures of Lovelace and Babbage. Book, 2015.
MechanicalComputing: How the Pascaline Works. Online, 2012.
Scoble, Robert: A Demo of Charles Babbage's Difference Engine. Online, 2010.
How would you like a job as a computer? Not a programmer, not even a mathematician. A computer. Someone who makes calculations. By hand. All day, every day.
I guess that doesn’t sound appealing. Until well into the previous century, however, those jobs did exist. And so did the word computer. It referred to these people, who worked patiently in the back offices of every factory, bank, and government department. There they calculated everything, from mortgages to railroad bridges to government budgets.
In the decades after the Second World War, they were quickly being superseded, first by mechanical calculators and later by electronic computers. To us in the twenty-first century, the idea that those jobs ever existed seems utterly ridiculous. But it took centuries of increasingly complex inventions and visionary realizations before manual computing was finally a thing of the past.
Hello and welcome to A History of Science: Episode 3: Enchanting Numbers.
Until the invention of the modern electronic computer in the 1940s, calculating had always been hard manual labor. Already in the sixteenth century, Tycho Brahe, the astronomer renowned for the accuracy of his data, complained about it in the preface to one of his books. He strikingly reminded his readers that if they got tired of the many calculations in his writing, they had better take pity on the author, who had had to do them all three times over.
The burden of calculating could be reduced somewhat using an abacus, but it remained tiring and mind-numbing work. During the Enlightenment, when the whole world was increasingly being interpreted in terms of math, the first attempts at mechanizing math itself were made. These early mechanical calculators marked the beginning of the long journey that culminated in the invention of the computer.
In this episode, we will meet the inventors of the mechanical computer. It is a story of men and women who lived centuries apart but were all ahead of their time. And who, in contrast to the heroes of our previous episodes, all applied science that was actually correct.
Let’s start off with a little thought experiment. Imagine a clock. But instead of twelve numbers, this one has ten. It starts at zero and counts up all the way to nine. And instead of just two arms, this clock has ten: one for each number. All arms move on the same wheel, so if one moves, they all move. And finally, the arms are all black, except for the one pointing to zero: that one is bright red. Congratulations, you have just invented a calculator.
Don’t see it yet? Let’s try a sum: three plus four. Imagine moving the red arm from the zero to the three. Now move the black arm that happens to point to the zero to the four. The red arm now points to seven: three plus four.
What I have just described is the Pascaline, the first ever mechanical calculator. It was invented in 1642 by the Frenchman Blaise Pascal. Pascal was something of a child prodigy, who would go on to become an influential mathematician and philosopher in his own time. When he was nineteen, however, he saw his father struggling with tax collection, and decided to invent a machine to help him with additions and subtractions.
Pascal’s machine was more advanced than the one you just imagined, even though the basic principles were the same. As you may have noticed, the clock-like calculator we discussed only works for sums under ten. Above that, you need some mechanism that, at every full turn, increments a similar clock representing the tens by one. That way, a sum that passes nine will move on to zero on the same clock, but will add one digit to another clock on its left. Pascal designed an elegant lever for that operation, one that relied on its own weight to increment the clock to its left.
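To make that carry mechanism concrete, here is a minimal sketch of the dial-with-carry idea in Python. The function and digit layout are my own illustration, not Pascal’s design; the real machine did all of this with gears and a weighted lever rather than arithmetic.

```python
# A minimal sketch of the Pascaline's principle: each digit is a wheel
# counting 0-9, and a wheel passing nine pushes the wheel to its left
# one step forward -- the job of Pascal's weighted carry lever.

def pascaline_add(wheels, amount):
    """wheels: list of digits, least significant first."""
    for position in range(len(wheels)):
        wheels[position] += amount % 10      # turn this wheel forward
        amount //= 10
        if wheels[position] > 9:             # the wheel passes nine...
            wheels[position] -= 10
            amount += 1                      # ...and carries leftward
    return wheels

wheels = [0, 0, 0]          # a three-digit machine, reset to 000
pascaline_add(wheels, 3)    # dial in three...
pascaline_add(wheels, 4)    # ...then four
print(wheels)               # [7, 0, 0]: the wheels read 007
```

Note that the wheels only ever turn forward; on the real machine, subtraction had to be done by the same forward motion, using complements.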
Blaise Pascal’s Pascaline turned out to be dependable and accurate, and it was the first notable success for its young inventor. It never became a commercial success, however. It required more knobs and wheels than most contemporary clocks, and before the Industrial Revolution, these had to be hand-made by craftsmen. That made Pascal’s machine too expensive to mass-produce, and only about twenty are known to survive. Among those who used the machine, however, it gained an excellent reputation. Rather than automate math, the Pascaline remained mostly a mechanical novelty.
In the centuries that followed, Blaise Pascal’s design served as an inspiration for many like-minded inventors. Most tried to build similar devices with additional abilities for multiplication and division, mostly with limited success. Some of these designs never left the drawing board; others were built but didn’t come close to the reliability of Pascal’s original. At the dawn of the nineteenth century, Blaise Pascal’s calculator had still not been surpassed.
The world had changed dramatically since its invention, however, and the prominence of math had grown with it. The Industrial Revolution was in full swing in 1800. Factories had split manufacturing processes into bite-sized chunks that no longer required experienced and skillful craftsmen. These tasks required no skill at all on the worker’s part, except for near-mechanical precision. Products made from cotton, wool, or metal thus no longer depended on workers’ craftsmanship, but instead on calculations that described how raw materials should be combined into final products.
Factories were not the only places where calculations were at the forefront. Modern nation states kept expanding their branches of government throughout the nineteenth century. In the wake of the American and French revolutions, countries started becoming more democratic. With democratic governments, early forms of social security and public health programmes saw the light of day. Implementing these policies required unheard-of amounts of data in the form of censuses, measurements, and statistics. Whole armies of civil servants were occupied with counting votes, determining unemployment benefits, and calculating population growth.
The cry for automated calculation grew deafening. It was heard loud and clear by Charles Babbage.
Charles Babbage was a mathematician with more than a knack for mechanical engineering. He was born into a prominent London-based family in 1791, and went on to have a successful academic career as a professor of astronomy and mathematics.
While working at the Astronomical Society, Babbage experienced the problems of manual computing first-hand. The society’s most celebrated publication was the Nautical Almanac, a multi-volume book of astronomical tables used by navigators at sea. While checking the numbers in the latest version, Babbage noticed discrepancies. Lots of them. And not just the odd typo, but worse: systematic errors.
Calculations in the almanac, as in many other fields of science and engineering, were done with the help of log tables: tables that listed the results of logarithmic functions for a huge number of integers. Log tables were invaluable when performing complex multiplications. But, like any other calculation, they were done by hand. And so, unavoidably, they contained mistakes. Any calculation based on a mistaken log table value was bound to be wrong as well.
Now, interestingly, the logic involved in calculating logarithms is relatively simple. The bottleneck is that the calculation of any row in a log table needs the result of the previous row as input. An error in any row, then, practically cripples the whole table. A nightmare for anyone whose calculations depend on it. Like Charles Babbage.
Babbage realized that the errors in these calculations were all caused by mundane reasons: calculators getting tired and sloppy, typesetters who couldn’t read the calculators’ notes. None of it was caused by the difficulty of the math; the calculations themselves were straightforward enough. A machine, he reasoned, did not get tired or sloppy, and should be able to generate flawless log tables. And the fewer people it required, the smaller the chance any errors would slip into its output. Babbage named his invention the Difference Engine, after the method of divided differences with which log tables were calculated.
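To give a feel for why this appealed to Babbage, here is a minimal sketch, in modern Python and in my own notation, of the difference method the engine mechanized: once the starting values are set up, every further row of the table comes out of additions alone, with no multiplication and nothing for a tired human computer to fumble.

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial from its initial differences.

    initial_differences: [f(0), first difference, second difference, ...]
    For a polynomial of degree n, the n-th difference is constant, so
    the whole table unrolls from just n+1 starting numbers.
    """
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each column absorbs the one to its right, like the engine's
        # stacked columns of wheels adding into each other on every crank.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = 2x^2 + 3x + 5 gives f(0) = 5, f(1) = 10, f(2) = 19,
# so the first difference starts at 5 and the second is a constant 4.
print(tabulate([5, 5, 4], 6))   # [5, 10, 19, 32, 49, 70]
```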
Nothing like the Difference Engine had ever been imagined, let alone built. Sure, mechanical calculators had existed for centuries, but they still required the user to perform every step. On the Pascaline, the user had to input each number by hand, then crank a handle to reset the numbers before the next sum. A machine that would perform all these steps – reading input, manipulating data, printing output – without human intervention was unprecedented.
These days we remember Charles Babbage as a visionary genius, but not as a great project leader. He was the archetypal brilliant professor: full of ideas, but without the perseverance to realize them. Whenever he came up with a better idea, Babbage would take his machine apart and start from scratch. Even worse, when it came to dealing with people he was truly his own worst enemy. He was condescending to his subordinates, academic peers, and – worst of all – his patrons. When any of them did not show enough enthusiasm for yet another rebuild of the engine to accommodate Babbage’s latest great idea, he didn’t hesitate to ridicule them for their short-sightedness.
This behavior did not make him many friends. After working on the Difference Engine for a decade, and spending a small fortune of government grants on it, Babbage could only show a small prototype to his investors. And instead of giving them every possible assurance that he was close to finishing it, he boldly proposed abandoning the machine altogether. Charles Babbage had had an even better idea.
Now that he did. There is no doubt about it. But no matter how good it was, it did not get funded. Nor did it get built. It would linger in obscurity until the invention of the modern electronic computer, when – in hindsight – it turned out to be brilliant.
So what was this brilliant machine Charles Babbage invented? Well, the Difference Engine was great and all, but it was still specifically built to perform one kind of calculation: log tables. What if there was a machine that could handle any calculation, no matter its complexity, precision, number of variables, or conditionals? A calculator, in short, that did math regardless of its application in the real world?
Babbage had envisioned the general-purpose computer, or – as we know it – the PC. And as it was still the first half of the 1800s, he designed it as a steam-driven mechanical monster.
Now this ‘Analytical Engine’, as he called this new machine, has never been built. Even though Babbage described its workings in countless documents and letters, what exactly it would have looked like is anyone’s guess. Someone who made an inspired guess is Sydney Padua, who wrote a graphic novel on Charles Babbage and his machines. Her drawing of the Analytical Engine may help you get an understanding of what it looked like. A link to her interpretation is on the website: ahistoryof.science.
When you get past the first impression of the Analytical Engine as a seven-meter-long mechanical steam engine, it is remarkable how much its parts resemble those of modern computers. It has a processor, a memory bank, a card reader for input, and a printer for output. In recent years, it has been argued that the Analytical Engine ticks all the boxes that computer science pioneer Alan Turing would later propose for computers. That means it is, conceptually, a complete computer – Turing complete, in modern terms.
So how did it work? Well, the ‘sequences of calculations’ – or as we would say, software – would be inserted into the machine on punched cards. Punched cards had been used with great success by Joseph Marie Jacquard in his automatic textile manufacturing machine, the Jacquard loom. The instructions on the cards would be executed on the numbers put into the ‘store’, the engine’s memory. The store made up the largest part of the engine, and consisted of thousands of columns that could each hold a number of up to fifty decimal digits. Long rods would transfer the values in the store to the ‘mill’, the processor. There, the calculations would be performed and the output written to the printer. The whole process would be driven by a steam engine.
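As a rough caricature of that architecture – with an instruction set invented here for illustration, since Babbage’s actual card formats were far more intricate – the whole arrangement can be sketched in a few lines of Python: a store of numbered variables, and a mill that executes one card at a time.

```python
# A toy caricature of the Analytical Engine's layout: the 'store' holds
# numbered variables, and the 'mill' executes punched-card instructions.

store = {0: 7, 1: 5, 2: 0}            # store columns V0, V1, V2

cards = [                              # the chain of punched cards
    ("MUL", 0, 1, 2),                  # mill: V2 <- V0 * V1
    ("PRINT", 2),                      # send V2 to the printer
]

def run(cards, store):
    for card in cards:
        if card[0] == "MUL":
            _, a, b, out = card
            store[out] = store[a] * store[b]   # the mill at work
        elif card[0] == "PRINT":
            print(store[card[1]])              # the printer at work

run(cards, store)                      # prints 35
```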
Now, as you can imagine, this giant machine cost a fortune to build. And there were no investors left who were eager to fund Charles Babbage, the man who hadn’t made good on his promise to build the much smaller Difference Engine. As the years passed him by, Babbage grew increasingly frustrated; why did nobody see the revolutionary impact his machine could have? Why did no one believe in him?
There was one person who believed in him, though. And no story about the Analytical Engine would be complete without her.
Lady Ada Lovelace was born to the poet, womanizer, and all-around enfant terrible Lord Byron. She never met him, though. Byron and Ada’s mother separated shortly after her birth, and he died when she was just eight years old. Lord Byron was not remembered fondly in the household in which Ada grew up. Her mother painted him as a debauched layabout, and she was not even shown his portrait until her twentieth birthday.
Afraid that Ada shared her father’s red-hot passionate blood, her mother ensured she had a purely rational education. Unusual for a girl in her time, Ada was taught astronomy, physics, and math. Anything but poetry, really. Nonetheless, she was supposed to get married and play the role of aristocratic housewife, as was expected of any woman in British society. And while Ada did get married and raised a family – and became Lady Lovelace in the process – her interest in math was never far away.
During a society dinner in 1833, Lovelace met Babbage, and the two instantly connected. The socially inept professor and the young society lady may not seem like a perfect match, but they both spoke the language of math. Babbage invited her over for a demonstration of his prototype of the Difference Engine, and she was positively mesmerized. Over the years, the two would keep up an intense correspondence, and Lovelace grew to become Babbage’s closest intellectual peer and confidante. In time, Babbage would become so enamored with Lovelace that he admiringly called her ‘the enchantress of numbers’.
Besides her letters to Babbage, Ada Lovelace published only a single text on her work with him. Unlike Babbage, who left countless documents, letters, and scribbles about the Analytical Engine, Lovelace set out her vision of the machine as supplementary material to an article she translated. This article, written by Italian mathematician Luigi Menabrea, was the first publication that tried to explain this general purpose calculator to a wider audience. Lovelace, knowing everything there was to know about the machine, added so many translator’s notes that her seven appendices ended up being three times longer than the original article.
Lovelace’s notes are very insightful; she explained clearly – albeit in verbose Victorian English – what the machine did, how it worked, and what it could be used for. She didn’t get bogged down in details, like Babbage often did, but instead painted a rather visionary image of this wonderful machine. One might even call it poetic.
Of the seven appendices, there are two that stand out. In the final note, she provides an example of how to write punched card instructions for the machine. She chops up an algorithm into separate logical expressions, and then describes its initial values, variables, and the conditions upon which specific calculations are performed. In other words, she writes code for the Analytical Engine. And even though it was never executed, she is rightfully recognized as the first ever computer programmer.
Impressive as that achievement may be, there is something else in her notes which I think is even more visionary. In the first appendix to the article, she talks about how the Analytical Engine could be applied to tasks other than math. As an example she describes how, if music were expressed numerically, ‘the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent’.
This statement reveals an understanding of the power of computing that went beyond what even Charles Babbage had envisioned. Babbage always talked about his machine in terms of math. In his mind, he had designed a calculator; granted, an extremely advanced calculator, one that worked automatically and could be used for any equation, but still: it was a calculator. Ada Lovelace seems to have realized that the Analytical Engine could perform any task, as long as it could be expressed numerically.
A phrase that is often mentioned regarding modern computers is that they work with ones and zeros. And at a very low level, that is true. The smallest form in which computers store data is a bit, a value that is either true or false, one or zero, on or off. By itself, a bit has no meaning. Only when it is interpreted within a certain context can it represent meaningful information. Let me explain. Any data on your computer, whether it’s a text document, a picture, or a podcast, is stored as a long sequence of bits. From such a sequence alone, it is impossible to tell what kind of data is stored, let alone its content. The software you use to read that data interprets it as a collection of characters, a rectangle of colored pixels, or thirty minutes of audio. In theory, the same sequence of bits may represent the Bible, the Mona Lisa, and God Save the Queen. All at the same time.
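A tiny Python demonstration makes the point: one and the same sequence of bits means nothing until a program decides how to read it. The example bytes here are arbitrary.

```python
data = bytes([72, 105, 33])          # one and the same sequence of 24 bits

print(data.decode("ascii"))          # read as text: Hi!
print(list(data))                    # read as three numbers: [72, 105, 33]
print(int.from_bytes(data, "big"))   # read as one big integer: 4745505
```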
Unlike Charles Babbage, Ada Lovelace appears to have understood this layer of abstraction. Her notes give a brief insight into what she imagined the Analytical Engine to be capable of. And had it been built, the digital age would have begun a hundred years before its time.
It is difficult to assess the impact of these early mechanical calculators. Pascal’s calculator was too expensive to mass-produce, and he shifted his attention to other projects. Charles Babbage grew increasingly erratic near the end of his life, and he and his machines became somewhat of a laughing stock in intellectual circles. Ada Lovelace, as a woman, was never really taken seriously as an intellectual, no matter what she did.
The Pascaline never became a commercial success, and it would be the 1850s before the first commercially successful mechanical calculator came onto the market. After that they quickly became widespread, and they remained in use until the invention of the microchip made electronic pocket calculators possible in the 1970s.
A machine as innovative as the Analytical Engine would not be built until the 1940s. When Howard Aiken proposed his idea for an electromechanical calculator to IBM, however, it was rejected time and again. It was only after he had explicitly referenced Babbage’s work and working prototypes that he could convince his superiors of the feasibility of his machine.
The legacy of these early mechanical calculators lives on in computing in other ways, too: the first electronic computers still used punched cards for their input, just like the Jacquard loom had 150 years earlier. Until the 1960s, computers’ memory was referred to as their store, a term coined by Babbage for the Analytical Engine. And these days still, the command found in most programming languages to write output to the screen is print, as if computers still communicate with paper.
If you enjoyed this episode, check out the videos on the website; you can see working prototypes of the Pascaline and the Difference Engine there. For more on Babbage and Lovelace, be sure to check out the graphic novel by Sydney Padua. A link to her book – including its great picture of the Analytical Engine – is on the website: ahistoryof.science.
Thanks for listening. Hopefully until another History of Science.
Columbus discovers America. But more importantly, he discovers discovery itself.
Wootton, David: The Invention of Science. Book, 2016, ISBN: 978-0141040837.
Britannica, Encyclopædia: Art From The Encyclopædia Britannica (1768–71). Online, 1768.
Clavius, Christoph: In sphaeram ioannis de Sacro Bosco commentarius. Online, 1570.
In 1492, Columbus sailed west to prove that the world was round. His sailors, terrified of falling off the edge of the world, were on the brink of mutiny just as America’s coastline came into view.
Except they weren’t. Ever since antiquity, nobody who had given it a moment’s thought actually believed the world was flat. Especially sailors, who could see the world’s curvature on the horizon with their own eyes.
This myth of medieval people believing the world to be flat was constructed after the fact. It was a story of ignorance that neatly fitted the newly defined ‘Dark Ages’. It showed just how far men had come, and how far the Renaissance man stood head and shoulders above his ancestors.
Or is there more to the story than that? The discovery of America may not have proven to anyone that the earth was round, but it did completely uproot everyone’s concept of what the world looked like.
Hello and welcome to A History of Science. Episode 2: Beyond the Edge of the World.
Around 500 BC, Pythagoras was the first of the ancients to call the world round. In the centuries that followed, revered Greek writers including Plato and Aristotle all endorsed the idea, until in the second century AD the Alexandrian writer Ptolemy settled the issue once and for all. In his Almagest, he summarized all the arguments made over the centuries for a spherical earth. The book would remain the standard on astronomy for the next 1400 years, and ensured that the spherical earth was known throughout medieval Europe.
Among the proofs that Ptolemy gave were theoretical and practical ones. To his more learned readers, he argued that every part of the earth’s surface tended towards the center, and so logically formed a perfectly round globe. Yet he also described how mountains seem to rise out of the sea to sailors on an approaching ship, indicating that they must have been hidden by the curved surface of the sea. And it was imaginative explanations like these that ensured that no fifteenth century sailor was afraid of falling off the edge of the world.
So Columbus’ voyage did not convince his supposedly backwards contemporaries of the sphericity of the earth. His discovery of the American continent, however, did completely change the perspective medieval people had of the earth. Not from a flat disc to a round globe, but from perfectly shaped spheres of land and water to a seemingly random pattern of continents. In this episode, we will explore how Columbus’ discovery changed that perspective, and how it planted into the minds of fifteenth century people the seed of discovery.
What did the world look like to Columbus and his contemporaries? It was as round to them as it is to us, but apart from that?
Well, for starters, it was the center of the universe. Ptolemy summarized earlier philosophers’ arguments for the earth being the universe’s center in his Almagest. This idea fitted neatly with the Christian idea of man and his habitat being the core of God’s Creation, and was eagerly embraced in medieval Europe.
The ancients envisioned the universe as a geometrically based system, built up from perfect circles, triangles, and other forms. They reasoned that for any movement to continue indefinitely, such as the planets’ orbits, only perfect mathematical shapes would suffice. Any other shape would be inefficient, causing the planets to lose their speed and eventually grind to a halt.
A map of the Ptolemaic system, then, closely resembles a piece of clockwork. The earth forms an orb in the center, with all other planets moving around it in perfect circles. Now, in reality the planets move in egg-like ellipses, not in perfect circles. And earth follows an ellipse of its own, of course. And, just for the record, all planets revolve around the sun, not the earth. So obviously, this geocentric perfect-circle model did not fit at all with observations of the planets’ movements. Seen from the earth, planets seem to randomly change their speed, distance, and brightness in relation to our planet. As instruments became more accurate, mapping these seemingly random orbits to the Ptolemaic system became increasingly difficult.
The ancient Greeks did try, though. To account for the apparent anomalies in their model, they conceptualized circles-within-circles. Planets moved around the earth in orbits resembling an old phone cord; constantly looping and looping again. By adding enough loops to planets’ orbits, they could make the model fit the measurements. And while the system remained the standard for more than a millennium, it was creaking at the joints with every new measurement.
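If you want to see those phone-cord loops for yourself, the circle-on-a-circle construction is easy to sketch: the planet rides a small circle whose center rides a big one around the earth. The radii and speeds below are arbitrary illustration values, not historical parameters.

```python
import math

R, r = 10.0, 3.0        # radius of the big circle and of the small one
W, w = 1.0, 7.0         # angular speeds of the two rotations

def position(t):
    """Planet's position as seen from a central, stationary earth:
    two circular motions simply added together."""
    x = R * math.cos(W * t) + r * math.cos(w * t)
    y = R * math.sin(W * t) + r * math.sin(w * t)
    return x, y

# Sampling the path traces the looping, phone-cord-like curve.
path = [position(step / 50) for step in range(315)]
print(path[0])          # (13.0, 0.0): both circles start aligned
```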
Now before we get back to Columbus, let’s take a closer look at what the world itself looked like in the Ptolemaic system. To begin answering that question, we first have to determine what Columbus’ contemporaries believed the world was made of. And that, unsurprisingly, brings us back to the same four classical elements we introduced in the previous episode. The ancient Greeks believed not just that human bodies were made of earth, water, air, and fire, but everything else as well. Earth and water make clay. Air and fire create smoke. All other things out there – plants, rocks, stone, animals, metal – consisted of similar combinations of these four elements, just in more complex variants.
If you look at the world from this pre-modern point of view, these four elements are abundant; we live on earth, breathe air, are surrounded by oceans, and see fire bursting through the skies during thunderstorms. Although medieval scholars were not sure about the exact proportions in which the classical elements related to each other, they were definitely sure that the world was made up of them.
Ptolemy did not limit his model of perfect circles to planetary orbits. The world consisted of perfect circles as well; four of them, to be precise, each containing one of the four classical elements. Ptolemy called these circular bodies spheres, and described their relations in detail.
As the ‘heaviest’ of the classical elements, the sphere of earth was located at the center of the universe. Immediately surrounding it was the sphere of water, forming the world’s oceans and rivers. Above both of those was the sphere of air, which formed the sky. Above that, and mostly out of sight, was the sphere of fire, which was only briefly visible through bursts of lightning.
Now, I can imagine you having difficulty visualizing these admittedly nonsensical maps. If you do, take a look at the website, where I have collected some of these maps for reference: ahistoryof.science. In the meantime, try and visualize this idea of the world as a Russian Matryoshka doll. The outer doll forms the sphere of fire, the smaller one inside forms the sphere of air, the one after that water, and the smallest and final doll represents the sphere of earth.
So what does this have to do with Columbus’ voyage? Well, let’s think this model through, shall we? If we are living on the innermost sphere of earth, why are we not surrounded by water? The smaller orb of earth should be completely engulfed by oceans from the adjacent sphere of water, right? But, obviously – as we are not all swimming – this is not the case. So what’s happening here?
There were conflicting theories on this matter. The most prominent of them envisioned the sphere of earth as an apple floating in a bucket of water. Being light, the apple floats to the surface, with roughly half of it being exposed to the air. This made perfect sense to fifteenth century Europeans: they lived on an enormous island surrounded on all sides by one giant ocean.
Now, Columbus did not set out to prove the world was round. What he did want to show, however, was that there was a western sea route to the Indies. Since before the birth of Christ, Europeans had traded with the Far East through the Silk Road. The Silk Road was an extensive network of trading communities connecting Europe, the Middle East, India, and China to each other. Traveling this road, while lucrative, was dangerous. The way was long, hazardous, and – most important of all – expensive because of the many middlemen skimming off profits. There had to be a better way to get to that silk, over water: if Asia was on the other edge of the apple, you could simply sail all the way around the globe until you hit land. And that is what Columbus intended to do.
He had a hard time finding backers for his expedition. And who can blame them? His potential patrons weren’t afraid of him falling off the earth with their precious ships, but they were convinced nobody could circumnavigate the earth and live. There could be no land between Europe and Asia; no place to reprovision, get fresh water, or perform repairs. It was suicide.
Famously, of course, Ferdinand and Isabella of Spain took a gamble and funded Columbus’ expedition. And when, on that fateful day in October 1492, Columbus hit land, he was convinced he had circumnavigated the globe. He must have landed on the eastern coast of India, he reasoned, and so he called the inhabitants Indians. It wasn’t lack of imagination that kept him from considering that he had landed on another continent altogether; he simply couldn’t fathom the idea of other continents existing. Columbus died in 1506, never having realized that the land mass he had famously set foot on was not connected to Asia at all.
When scholars arrived in this unknown land, however, they quickly came to realize that it could not possibly be where it was supposed to be. The observations of the stars in this new land were completely different from what they were used to from Silk Road expeditions. This was not Asia. So where were they, then?
In 1503 Amerigo Vespucci, an explorer who had sailed west in the wake of Columbus’ discovery, became the first person bold enough to say it out loud. In a letter tellingly titled Mundus Novus, he claimed the newly discovered land mass was nothing less than a new continent. From then on, the land was commonly referred to as the New World, until it eventually received its official name from Vespucci’s first name, Amerigo.
Vespucci claimed that the New World was located roughly halfway around the globe from the known land. To fit this fact into the already bursting Ptolemaic system, increasingly creative theories had to be thought up.
Scholars at the time argued that this New World was an antipode to the known sphere of earth, in other words that it lay exactly opposite to it on the globe. This possibility had been discussed once before by the ancients, but had never received much attention. Now that this new continent had to be shoehorned into the Ptolemaic system, a theory with some ancient credibility suddenly looked appealing.
In order to make a continent on the other side of the globe possible, they imagined the sphere of earth as an ellipse instead of a circle. If it was long and thin enough, it was theoretically possible for land to pop out of the water on both sides of the globe. Variations on this theory required the sphere of water to be egg-shaped as well, to allow for more of the earth to be above the water’s surface.
This theory only held if the New World lay exactly opposite the known earth. And as more measurements were made, it turned out that it did not. It lay roughly a quarter of the globe west of the Eurasian continent. Fanciful theories were proposed to make this fact fit the model; one of them envisioned two spheres of earth instead of one. But to no avail. It was too late. The New World could not be explained by the Ptolemaic system.
Cartographers were the first to come to terms with this new reality. In 1517 the first sophisticated map of the world consisting of three spheres instead of four was published. The spheres of earth and water were collapsed into a single one. Notably, the word terra, which had previously been used to denote the sphere of earth, was increasingly being used to refer to the combined sphere of earth and water. This represents a major step towards our modern concept of the world as a single globe, instead of a combination of overlapping orbs.
From there it went quickly. Theories on how this natural phenomenon had to be explained ran wild. The Viennese scholar Vadianus suggested that inhabitable earth was scattered randomly across the globe, floating on one giant ocean. Another hypothesis claimed that God had opened up the sphere of earth through caves and rivers, and had allowed the water to fill these cavities, essentially making the two spheres one. Whatever the case, though, the four spheres were forgotten as if they had never existed.
In 1492 everybody believed the spheres of earth and water to be separate. By 1550 every scholar believed they were one. This sudden change in belief was absolutely unprecedented.
It is difficult for us to imagine how great the implications of this seemingly rational shift in opinion were. Medieval science relied much more upon authority than it did on evidence, experiment, or observation. The wisdom of the ancient Greek philosophers was never called into question. Measurements that conflicted with their teachings were waved away as having been performed carelessly. If that didn’t satisfy more persistent minds, scholars claimed nature had simply changed since the time of Aristotle.
The discovery of the New World, however, was too big to ignore. The facts were too abundant, and too decisive. Within the span of half a century, and after 1400 years of prominence, the four spheres were discarded. It was the first time ever that ancient philosophy was accepted to be wrong. The first time that facts dictated science.
It was not just the immediate implications that were great. In short order, other aspects of the Ptolemaic system were put to the test. The most famous of these is Copernicus’ theory that the earth revolves around the sun, instead of the other way around. Even though this would not be widely accepted until Galileo provided evidence for it, the spark had been struck. Copernicus could never have made this claim sixty years earlier, in a time when Ptolemy still had unquestioned authority. Now, it was a viable point of view.
The discovery of the American continent was a catalyst for change in virtually all fields of science, not just astronomy. All of a sudden there were lots of different kinds of plants, animals, minerals, even people, that somehow had to fit into the existing systems of thought. How did ‘Indians’ get to the New World if there was no land bridge connecting Asia and America? Were they somehow not descendants of Adam and Eve? Or what about the giant dinosaur bones that lay scattered all over the new continent? Had these animals gone extinct? Did that imply that Creation wasn’t perfect? Was God still tinkering with a finished product?
This seemingly endless stream of unknown flora and fauna from the New World uprooted every notion held since antiquity. How could theories on life not be contested, when the ancient Greeks could never have taken these completely unknown species into account? In order to make sense of all these facts, observations, and measurements, a more systematic method of inquiry was needed. Over the centuries, these systematic routines of experimental testing and logical deduction would come to form what is perhaps the greatest discovery of all: the scientific method.
In 1492, Columbus discovered America. But more importantly, he discovered discovery itself.
If you enjoyed this episode, I encourage you to read David Wootton’s The Invention of Science, a magnificent book that is a must-read for anyone listening to a podcast on the history of science. Much of the content of this episode comes from chapter four of his book, so be sure to read it if you want to know more about the impact Columbus’ discovery had on the medieval worldview. A link to his book is on the website, ahistoryof.science.
Thanks for listening. Hopefully until another history of science.
Daring 17th century doctors try their hand at blood transfusion. With fatal consequences.
Tucker, Holly: Blood Work: A Tale of Medicine and Murder in the Scientific Revolution. Book, 2012, ISBN: 978-0-393-34223-9.
Harvey, William: Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus. Online, 1628.
Blood.
The mere sight of it is enough to make many people faint. Blood has long been thought to be the magical ingredient to life. It has been used in rituals, cures, and potions. It has been believed to contain the essence of our being – our very soul, itself.
On average, five and a half liters of blood flow through our bodies. Lose two of them, and you die.
Hello and welcome to A History of Science. Episode 1: Bloody Beginnings
Blood transfusion has been a mainstay of medical practice over the course of the last century. Ever since the discovery of blood types in 1901, blood has been safely shared between people. The enormous need for blood donors that grew out of the industrialized warfare of the First World War led to the invention of blood banks. And now, a hundred years later, blood transfusion is amongst the most likely medical procedures that anyone will undergo at some point in their life: for anyone suffering major trauma, it is a life-saving procedure.
The transfusion of blood is an idea that long predates modern medical science. It was not thought of as a remedy for stabbed soldiers bleeding to death on the battlefield. It was practiced by doctors who could not fathom the existence of blood types, DNA, red or white blood cells. Indeed, it was pioneered just after the discovery of blood circulation.
In this episode we will explore these early pioneers of blood transfusion. Who were they? What drove them? And perhaps most importantly, what could they possibly hope to accomplish?
In 1628, British physician William Harvey made history. In April of that year, he published his masterpiece De Motu Cordis, or An Anatomical Exercise on the Motion of the Heart and Blood in Living Beings. In it, he describes the results of years of painstaking experiments, measurements, and observations, and concludes that blood is pumped through the body by the heart.
As intuitive as his discovery may sound to us, it was very far removed from accepted medical science at the time. Before Harvey, blood was believed to be generated in the stomach, as a byproduct of the digestion of food. It would make its way through the body, constantly being warmed up by the heart. After reaching its boiling point, it would then evaporate and leave the body through the lungs. Breathing was the exhaling of fumes of vaporized blood. The human body as a steam engine, with the heart as the furnace and the mouth as a steam whistle.
Fortunately, William Harvey lived in an exciting time, one we now call the Scientific Revolution. Classic ideas about how the world worked were increasingly being turned on their heads by experimental science. Ancient writers, such as the Greek philosopher Aristotle and the Roman physician Galen, had dominated medieval intellectual life for centuries. Their works were treated as gospel – literally. Just like the Bible contained all knowledge one could possibly need about morality, Galen was the only textbook one could ever need on medicine. And if the Reformation was an out-of-hand conflict about the Bible’s interpretation, medieval medical science was a continuing feud about interpreting Galen and Aristotle. If your patient died, you had simply misinterpreted Galen’s instructions. They could not possibly have been wrong.
Now that the ancients were slowly but surely falling out of favor as experimental science proved them wrong, William Harvey’s words did not fall on deaf ears. In the decades following the publication of his book, Harvey’s work became the accepted theory on blood circulation. And by some, his work was not just read, but positively internalized.
Richard Lower was a London-based physician who, like so many young aspiring scholars of his day, was determined to make a name for himself as a bold experimenter. Already a renowned surgeon, Lower was well-acquainted with the sight of blood. And he had no scruples about sharing that sight with others. Like many medical practitioners of his time, he was keen to showcase his skill dissecting bodies and demonstrating nature’s mysterious workings. Before the eyes of undoubtedly aghast audiences, he would carefully lay bare tissue, veins, and bones, presenting an astonishing insight into Creation itself.
Richard Lower usually dissected the bodies of executed criminals. But whenever the corpses of unfortunate crooks were in short supply, Lower was happy to have animals take their place. Legend has it that not even his own pets were safe: one of his dogs was apparently called Spleen, because Lower had separated the poor creature from its spleen.
And so perhaps it was to no one’s surprise when, on a cold winter morning in February 1665, Richard Lower firmly strapped two balking dogs to a table. He carefully exposed the carotid artery of the first, cut it open with surgical precision, then immediately put in a quill to stop the blood from pouring out. He then turned his attention to the unwilling onlooker and performed the same operation, this time on the jugular vein. Lower then quickly strapped the two quills together, and watched as his contraption began to work.
The blood of the first dog slowly but surely made its way out of its pulsating artery, through the system of quills, until it reached the other dog’s vein, where it promptly became one with the recipient’s circulation. Lower would later write an intricate account of his experiment to the Royal Society of London, describing every detail, from the tools he used to the positioning of the table. He does not seem to have registered exactly how long his ghastly procedure took, however. We can only imagine him distantly observing the constant stream of blood making its way through his tubular contraption.
At some point, however, the donor dog began to howl, writhe, fall into convulsions, and then fall silent. Lower removed his apparatus from the recipient’s veins, stitched up the dog’s wounds, and watched as it immediately leapt up from the table, escaping the room ‘as if nothing ailed him’. Richard Lower had performed the first successful blood transfusion in history. It would be the first of many. And he would not limit himself to animals.
But before we get into that, let’s explore what blood meant to Richard Lower and his contemporaries. What did he think blood actually was?
His views on that no doubt originated from the ancient Greeks. And, in contrast to blood circulation, their views on this topic would not be challenged for another two centuries. The ancient Greeks believed the body to work through an intricate balance of four bodily fluids, which they called humors. They differentiated between black bile, yellow bile, phlegm, and blood.
These humors were a philosophical construct as much as a medical one. They did not just have physical properties, but were intimately connected to the Greeks’ beliefs about the four basic elements, the seasons, emotions, and life stages. Blood, for example, was thought to be warm and moist and was produced mostly in spring. Overexcitable youngsters who had trouble controlling their passions in springtime clearly had an overabundance of blood. Black bile, in contrast, was believed to be cold and dry, and was produced in abundance during autumn. People who suffered from depression when the leaves began to fall must have an excess of black bile.
So how would a physician practically use this theory of humorism in treating his patients? Well, a patient who suffered from depression due to an excess of cold black bile could be treated with regular warm baths. Similarly, blood was tied to warm, moist air; fever, then, obviously resulted from an excess of blood. The easiest remedy was a bloodletting.
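The reasoning is mechanical enough that it can be caricatured in a few lines of code. This little table is a modern invention, of course, not a period source; the two diagnoses are just the examples mentioned above.

```python
# A toy encoding of humoral reasoning: each humor has its qualities and
# season; treatment targets the humor blamed for the symptom.

humors = {
    "blood":       {"qualities": ("warm", "moist"), "season": "spring"},
    "yellow bile": {"qualities": ("warm", "dry"),   "season": "summer"},
    "black bile":  {"qualities": ("cold", "dry"),   "season": "autumn"},
    "phlegm":      {"qualities": ("cold", "moist"), "season": "winter"},
}

diagnoses = {
    "fever":      ("blood", "bloodletting"),             # purge the excess
    "depression": ("black bile", "regular warm baths"),  # counter cold and dry
}

def treat(symptom):
    humor, remedy = diagnoses[symptom]
    warm_or_cold, wet_or_dry = humors[humor]["qualities"]
    return f"excess of {humor} ({warm_or_cold} and {wet_or_dry}): prescribe {remedy}"

print(treat("fever"))        # excess of blood (warm and moist): prescribe bloodletting
print(treat("depression"))   # excess of black bile (cold and dry): prescribe regular warm baths
```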
Bloodletting was one of the most prescribed treatments for any illness in medieval Europe. Blood was the easiest of the four humors to purge, and with many illnesses producing fevers, an excess of blood was a likely culprit. Bloodletting could be performed either by cutting directly into a vein, or by applying leeches to the patient’s body. After sucking themselves full of blood, leeches would detach from the body, and so were easily prescribed in daily doses. The practice of bloodletting continued well into the nineteenth century, by which time the demand for leeches grew so high that leech farming became a profitable line of business.
With bloodletting at the forefront of everyday medical practice, it is easy to see why blood transfusion felt like a silver bullet. After all, instead of carefully adjusting some poor soul’s excess of boiling feverish blood, would not replacing his blood with that of a healthy person cure him right then and there?
News traveled fast in seventeenth century Europe. Details of Richard Lower’s breakthrough experiment had been published in the Philosophical Transactions, which was read not just in London, but in Paris as well. There it fell into the hands of Jean-Baptiste Denis, a man who was in many ways the French counterpart to Richard Lower. A young physician himself, Denis too felt he was destined for glory. What better way to earn the respect of the conservative old guard of Parisian medicine than a groundbreaking experiment?
Wasting no time, Denis acquired some canine research subjects of his own and got to work replicating Lower’s experiments. He quickly mastered the skill of transfusion, and went one better: unlike Lower, he managed to keep both dogs alive by having the blood flow two ways. Eager for recognition, Denis immediately announced a public demonstration of a blood transfusion on the banks of the river Seine. And so, on a Saturday afternoon in 1667, Parisians curiously crowded around Jean-Baptiste Denis, who promised to ‘transform the blood of a young and healthy dog into the veins of an old and mangy one’. And so he did. Jean-Baptiste Denis’ star was unmistakably on the rise.
Back in England, the same could not be said for Richard Lower’s. He was in very real danger of having the credit for his invention stolen by a French imitator. Especially when the disturbing news reached him that Denis had apparently transfused a lamb’s blood into a man’s arm, he knew it was time to retake the crown. The Royal Society had voiced its concerns about the ethical implications of this new procedure; did Lower really know what he was doing, meddling with God’s Creation like this? This and similar objections had made further experimentation with blood transfusion difficult in England, and would come back to bite both Lower and Denis later.
But for now Richard Lower would not let pesky moral qualms stand in the way of science. It was time for a blood transfusion into an Englishman’s veins. Denis had indeed transfused some blood from a lamb into the veins of a young man who suffered from an unrelenting fever. The boy had survived, and had reportedly felt somewhat better the morning after, but details were sketchy. Outdoing the Frenchman with a more impressive achievement would not be that difficult. Lower just needed a patient whose cure would be plain for all to see. And so he decided to cure a madman.
Madmen in seventeenth century London were easy to find. Without proper psychiatric wards or social security, confused people desperate to do anything for a few coins were all over the city. Lower’s choice fell on Arthur Coga, an ideal candidate in more ways than one. For starters, Coga was only slightly strange, not completely out of his mind. He also came from a respectable family and was well-educated, enthusiastically – and oddly – speaking Latin whenever he could. That meant that his patient would be able to rationally convey his experiences to his doctor. And last but not least, Coga was a butcher, which meant that – just like Lower – he was well-accustomed to the sight of blood.
And, as it turned out, not just to its sight. Arthur Coga was so curious during the whole proceeding that he actually tasted the blood as it was dripping out of his veins. ‘It is of good relish’, he is reported to have remarked. During the procedure, Richard Lower transfused eleven ounces of blood from a lamb into Coga’s arm. Coga remained the ideal patient throughout, feeling so ‘well and merry’ afterwards that he delighted his audience with Latin anecdotes.
But had it worked? The next morning, Lower’s colleagues went to check on their patient and found him to be calm and well-composed. A much different sight from the extravagant character they had operated on the day before. During a visit later that week, Coga was still ‘speaking very reasonably and very well’, reporting that he ‘felt like a new man’. Emboldened by this apparent success, Lower repeated the operation, replacing another eight ounces of the man’s blood. Once again to great success.
Or was it? Lower and his colleagues were already planning a third operation, but Coga suddenly balked. No longer the well-composed rational man, Arthur Coga wrote a poignant letter in which he accused the physicians of having transformed him into a sheep. He was only interested in another experiment if Lower promised that he would be transformed completely, ‘without as well as within’; that is, if Lower could make him grow wool. This disheartening token of the man’s returned insanity was tellingly signed ‘Agnus Coga’, Coga the Sheep.
Coga’s letter was obviously a return to form for a man suffering from delusional tendencies. But his fears about being transformed into a sheep were not as far-fetched to seventeenth century Europeans as they are to us. And much of the more intellectual criticism of blood transfusion revolved around these same fears.
As we talked about earlier in this episode, the ancient concept of humorism was doctors’ main framework for diagnosis and treatment. It provided a holistic approach to disease, relating it to nature’s building blocks: earth, water, air, and fire. This fragmentation of nature into smaller interchangeable pieces had long ago sparked the idea of changing one element into another. Enter the alchemists.
Alchemists in early modern Europe did exactly that. They tried transmutation with any material they could get their hands on; lead, mercury, even urine. These days they are mainly remembered for their fruitless pursuit of the Philosopher’s Stone: a formula that was rumored to turn base metal into gold. But in their own time, alchemists were nothing short of technological pioneers who were unraveling the threads of Creation.
Now what if this transmutation, so evident when creating quicksilver by heating cinnabar rock, was applied to the human body? Blood was closely tied to the element of air, after all, and was an organic building block in and of itself. Would changing a man’s blood with that of an animal not change his constitution? Would the animal’s properties not become his own? Would he not become innocent as a lamb, fat as a pig, or sick as a dog?
These ideas were partly what drove the pioneers of blood transfusion when selecting animals in the first place. Richard Lower indeed hoped to transfer some of the lamb’s gentleness into the extravagant Arthur Coga. But desecrating the ‘blood of the lamb’ like this did not sit well with his more conservative contemporaries.
The Bible had clearly given man dominion over the animals. Man himself answered only to God. Creating all kinds of half-breeds or animal-man hybrids was seen as tinkering with this perfect natural hierarchy. Who knew where it could lead? Left unchecked, this kind of uncontrolled experimentation could lead to hordes of horned demons roaming the earth!
Meanwhile in Paris, Jean-Baptiste Denis disregarded such criticism as superstitious nonsense. His own lamb-to-man experiments had seen great success so far. And with the British gaining on him, it was time to cure a madman of his own. His choice fell on a Parisian named Antoine Mauroy, who was widely known as the local madman. Mauroy was definitely madder than Coga had been. Having once been a valet to the Marquise de Sévigné, he had lost his mind over being rejected by his lover. He had reportedly torn his uniform to pieces and run naked and screaming through the streets. He had even resorted to violence, threatening to kill his former noble employers and setting fire to their properties.
Now, as Mauroy had once been of good standing, several benevolent nobles had tried to help him. Doctors had prescribed all sorts of treatments for his madness, including countless cold baths and bloodlettings from virtually every part of the unfortunate man’s body. Nothing had helped. Antoine Mauroy was mad as a hatter. And so he was a perfect candidate for Denis’ procedure.
Too mad to be reasoned with, Antoine Mauroy could not be asked to participate, as Coga had been. Instead, he was rounded up and found himself tied to a chair in Denis’ operating theatre. Indeed, Jean-Baptiste Denis too was determined not to let any moral qualms stand in his way. During the first procedure, Mauroy reacted mildly. But during the second, it became evident that Mauroy was not as easy a patient as Coga had been. As the blood was being pumped into his vein, his face went red and he started to sweat all over his body. Afterwards, Mauroy was feverish and nauseous, and suffered from diarrhea and nosebleeds. His urine was black as ‘chimney soot’. But, just like Coga, in the week following the operations he was generally well-composed and calm.
And that was good enough for Jean-Baptiste Denis. Immediately, he sent out letters about his miraculous cure to all corners of Europe, including – of course – to England. To his great delight, his findings were published in the Philosophical Transactions. Finally he had received the same acclaim as Richard Lower.
The story does not end on this happy note, however. Nor does it end well. Because just as with Coga, a few weeks after being cured Mauroy’s violent tendencies returned with a vengeance. His distressed wife urged Denis to come to their home in the countryside and try to cure her husband one last time. Denis found Mauroy tied to the bed, struggling and cursing, his wife’s face bruised from the beatings. Next to the bed lay crude surgical instruments, and bowls that had been used for bloodletting.
Denis hesitated. It was very far removed from an ideal situation for any surgical procedure, even by seventeenth century standards. He ultimately gave in, though. Denis used one of the farm’s calves as a donor, and the procedure did not take long. Before he had well and truly started, Mauroy fell into a violent fit, every limb of his body trembling uncontrollably. Fearing for his patient’s life, Denis pulled out the quills and stitched the veins back up. It was too late. That same night, Antoine Mauroy died.
And with him died blood transfusion. The idea had been frowned upon from the beginning because of its arrogant meddling with Creation itself. Even the usually adventurous Royal Society in London had voiced its concerns over the safety of the procedure. And now, with one patient dead and none of the others having been cured, they had all had enough. Blood transfusion was a bridge too far. France officially banned the practice. In England it fell out of favor.
Both Richard Lower and Jean-Baptiste Denis seem to have been rudely awakened by Mauroy’s death. They tried their hand at less hazardous experimental pursuits, aimed at comprehending how blood circulates through the body. And so, the blood transfusion craze died down as quickly as it had begun, lasting only half a decade. It would be exactly one hundred and forty-nine years before anyone would try his hand at it again.
If you enjoyed this episode, keep an eye on my website for more to come: ahistoryof.science. If you want to learn more about early blood transfusion and its daring pioneers, I wholeheartedly recommend Holly Tucker’s Blood Work, which served as the main source for this episode. A link to her book is on the website, ahistoryof.science.
Thank you for listening. Hopefully until another history of science.