COLD, by David McNicoll
An Age in Need of Reason
In 1645, in what can only be described as a truly bizarre episode, the people of Chamonix in France called upon the Bishop of Geneva to slip on the old ceremonial robes and pull out all the bells and whistles to perform a demonic exorcism in their small community. But this was no visitation sent from the fires of Hell, nor some Linda Blair-style possession; it was, of all things, focused on a valley glacier.
During the previous few decades, glaciers such as the Mer de Glace in the Alps around Mont Blanc had been expanding rapidly, and by 1645 this expansion had accelerated dramatically. Farms had been wiped out, small villages swamped, and now Chamonix itself was under threat – the ice had not stretched this far down the valley since the end of the last Ice Age, a span of 10,000 years; so, no wonder people were baffled and frightened. The bishop's intervention coincidentally matched the glacier's termination point and the town was “saved”. A couple of steins for the priest, and a fluke for sure, but it was just one part of a story being played out across Europe and other regions of the globe – what geographers and climatologists refer to as “The Little Ice Age”. It would trigger the biggest change since the dawn of time in how we understand temperature and, with it, the weather, physics and indeed medicine.
This is not the story of climate change or global warming – heaven knows there are enough books on that subject to fill the British Library. No, this is the story of how the calculation and control of temperature itself, and its effects, were brought to bear by a handful of individuals whose work we still rely on today. Now, weather and climate are very different things, although obviously connected, and temperature plays a very significant role in predicting and monitoring both. But temperature also works at the micro level, from superconductors at astonishingly low temperatures to the shields on space shuttles that had to withstand equally astonishingly high ones as they re-entered the earth's atmosphere. We use temperature measurements to bake bread, roast chicken or even make decent whisky. Without thinking we switch the AC or the heating in our houses to a specific setting – something so mundane, but that would seem magical to our grandparents; and computers control the heat, or otherwise, in the most sophisticated of airliners, and even my car. Temperature is a core dimension in our lives, from what clothes to wear as we scrape the car windows, to which factor of suntan lotion to slather on at the beach; and yet while we know what is hot, cold or lukewarm, it took science time, patience and ingenuity to allow us to take for granted what we see every day on the weather map: the metrics of degrees, scales and how to factor for them.
When the weatherman points to the map, showing where the clouds are, the high and low pressures, etc., there's a raft of numbers across the board as well. For most of the world these are graded in degrees Celsius, and in the United States in degrees Fahrenheit. There's a notion that one is more old-fashioned than the other, but not so – Anders Celsius and Daniel Fahrenheit lived at the same time, and growing up in the UK there was a cross-usage of both. The ‘Magic 90’ is still a term used – Scotland rarely reaches 90°F – and I can remember a headline: “Scotland sizzles in the 70s”, but on a day-to-day basis Celsius is the common currency. But that's Britain for you – we buy our petrol by the litre and still work out our miles to the gallon. Each range has features that better suit different circumstances, and as both are centigrade scales (a hundred increments between two fixed points) they are compatible. But this is the tip of the iceberg. Speaking of which.
The city of Minneapolis in the United States is located on the west bank of the Saint Anthony Falls, the only major set of rapids on the mighty Mississippi River. This great cascade, now trammeled and domesticated beyond recognition from when the first Europeans arrived in Minnesota, powered the great timber and flour mills that made the city wealthy and famous; but the falls were not formed by a twist in local geology. They were created by incredible events and forces that took place over time and space a long way away – a story that would be unlocked by a man who would lend his name to the source of that power.
Born in 1807, Louis Agassiz was a Swiss polymath who dedicated himself to what today we'd call paleontology – he discovered numerous fossils and was able to determine their place in the record at a time when people still believed the world was built in seven days. Growing up in the Alps, his mind also turned to how the various landforms – the deep valleys and serrated mountains – came to be. Working and studying with others he concluded that at one point the Alpine glaciers must have extended to an incredible size, stretching way beyond their contemporary limits. Boulders and mounds found miles and miles away confirmed this in his mind; and in 1837 he proposed in a talk to the Helvetic Society – a sort of gentleman's club of amateur natural scientists – that the Alps had once been covered by a much larger ice sheet. It received a 50-50 reception. However, in Britain the geologist William Buckland had come to a similar conclusion but had no evidence. Together, they travelled to Scotland, and in the Highlands Agassiz was staggered to see the same patterns of boulder deposition, moraines, terraces and so on – but this time in a country with no glaciers whatsoever. The logical conclusion was that vast amounts of ice must have once flowed in the mountains of Scotland – it was seismic. But it ran counter to common beliefs at the time, and so Agassiz packed his bags and headed to the USA in 1846. There he focused mainly on the paleontological work of his younger years, but he again recognized in the northern states the tell-tale signs of long-extinct glacial activity. And again, he was right. The once enormous Laurentide Ice Sheet (bigger than the Antarctic sheet today) covered most of Canada, spread through states like Minnesota and Wisconsin, and even reached Manhattan (indeed, the ridge running through Brooklyn's Prospect Park and along the length of Long Island is its terminal moraine).
As this colossal sheet began to melt around 10,000 years ago, massive lakes formed at its edges, some dammed in by stubborn, city-sized blocks of ice – and one, which would dwarf all of the Great Lakes combined, covered much of Manitoba and northern Minnesota. When its ice dam failed it released a biblical torrent of water the like of which no one has seen since. Named after the esteemed glaciologist, this was Lake Agassiz, and when it flooded the water poured through what is now the Minnesota River Valley, cutting a huge ditch through the drift and bedrock all the way to the Mississippi and beyond. This was Glacial River Warren, and when it sliced past Mendota on its way to cutting the bluffs above Saint Paul, the Mississippi, a tiny river in comparison, cascaded over a 100ft-high cliff into the Warren. Of course, once the lake drained the waters subsided, leaving what is essentially a glorified creek in its stead, and the Mississippi has since cut the falls back through the soft rock far enough to reach St. Anthony. It is a set-in-stone reminder that the earth is in constant motion, that nothing stays the same, and of how temperature has played its part. But it would take a Scotsman and a Serbian to unlock the mysteries that lay behind Agassiz's theories and explain why temperature could create ice ages and wipe them out just as easily. And all the while, the measuring, quantifying and codifying of temperature itself – a science needed to work out these most massive of geological questions, all the way down to running steam engines, boiling kettles and reducing fevers – was moving apace. But there was a long way to go.
For the bottom line is that neither the folks in a town in France in 1645, nor a wheeled-in Catholic bishop, had any more idea why the climate was doing what it was doing than they knew the temperature on any given day, or more importantly, what it would be tomorrow; nor did they really care. But they did understand the weather, and its patterns – their livelihoods, and indeed their lives, depended on it.
Red Sky at Night – Shepherd’s Delight
These Alpine farmers may not have known the specific temperature, for they would have worked day-to-day reading the clouds – looking for the telltale signs of rain, hail, snow or blistering sunshine. My gran, a true old Highlander, reveled in her sayings and especially her weather prophecies. And all of them tended to poke towards some shade of bad weather: “horsehair clouds = rain”; “no horsehair clouds = rain”. Growing up in Scotland, I think you can hedge your bets either way: there will be rain. But it was the spookier stuff – “The Northern Lights, that's a sure sign of cold weather coming in; blue fire seen on the hills, that's snow” – that never failed to get it right. There can be some reasoning behind this. If you see the Northern Lights, chances are there's no cloud cover and it's clear skies – well, in autumn and winter that suggests frosty mornings and the car scraper.
My gran was a shepherd's daughter, and these old farmer boys knew the weather back to front: well, they had to, it was a key part of their every working day, survival even. In a heartbeat they could read the sky, check the winds and smell the air, and tell you what was coming tomorrow better than any university-educated meteorologist or physicist. They and their fathers had been doing it since time immemorial: it was instinctive, and it had to be. Sailors will tell you the same story – it wasn't about instruments; it was about an education not taught in any school. The concept of weather fronts, cold, warm and so on, was not developed until the First World War, when understanding air flow was important for early aviation and for deciding whether it was timely to launch a ground assault. This is why we call them ‘fronts’ – their mapping looked like the maps of the Western Front battlegrounds. It was a first step towards a more methodical approach, and the one we use today – the age of the weathervane was on the wane. Although not quite – you can still read a lot into what's coming at you by wind direction and speed. Today, however, this tends to decide whether the washing goes in the dryer or on the line. Our grandfathers did not have the advantage of doppler radar or weather balloons, and sadly their special kind of inherited knowledge is passing from popular interpretation.
From the work done by Agassiz, geographers began slowly to realize that the processes that shape the land happen over eons of time, and repeatedly. In the 19th century the view that the world was created in seven days and was around 6,000 years old, as calculated by Archbishop James Ussher in 1650 (and which was widely accepted at the time, including by notables such as Isaac Newton), became increasingly discredited. Substantial work done by the Scottish geologists James Hutton and Charles Lyell had all but proven that the world was millions of years old, and Agassiz's theories of ancient glaciation, along with Darwin's evolutionary work, put the final nail into this biblical coffin. The slow, revolutionary realization that the world was incredibly old also opened the door to the suggestion that it may have experienced multiple ice ages, which in turn led to the question: why?
James Croll – and yes, another Scotsman – was a very unlikely candidate to shed light on our understanding of episodic glacial events. Born in Perthshire in 1821, the son of a mason-farmer, he seems to have been a wandering soul when it came to work, drifting through a string of jobs, one of which was managing a temperance hotel (a hostelry that didn't sell or pour alcohol), before finally becoming a janitor at the museum of the Andersonian University in Glasgow in 1859. Working in the museum gave him access to a cornucopia of literary and scientific work, and he became self-taught in physics and astronomy. He became so accomplished that he was elevated to membership of prestigious academic societies. A more remarkable change in the direction of a life is hard to conceive.
Croll had read the works of the French astronomer Urbain Le Verrier, who, while studying the wobbles in the orbit of Uranus, had deduced that an eighth planet must be out there wandering the ether, and in 1846, using mathematics alone, he predicted its location – it took about a month for this to be verified by observation. It was Neptune. What interested Croll was the mechanics of how planets orbited the sun, and how those orbits could fluctuate over time, sometimes over vast periods of time. From this, he concluded that the earth's orbit must also oscillate: when further from the sun it would produce extended ice growth, and when nearer it would signal the end of any ice age – and this had been going on for hundreds of thousands of years. There were obvious miscalculations in the timings of his seminal work, published in 1875 – “Climate and Time, in their Geological Relations” – but the overall theory was sound and correct. The planet's often eccentric travels around the sun do indeed affect our climate in a truly dramatic way.
Also building on Le Verrier's work on celestial motion, the Serbian Milutin Milanković took the theory a step further through research and papers published in the 1920s and 30s. He was commissioned to predict fluctuations on a million-plus-year scale, and delving slightly deeper than Croll he discovered that there were several variables at play. Croll's orbital eccentricity was right – sometimes the earth orbits the sun in a near circle, sometimes in an egg-shape and sometimes as an elongated ellipse, a pattern that seems to repeat itself every 100,000 years. Then, based on work by Kepler, he figured out that the tilt of the earth also varies – sometimes the north pole is tilted at a higher angle than today, sometimes lower; this up-and-down movement seems to have a 41,000-year cycle. And finally, using old Greek records dating back to 150BC, he determined that the northern summer can fall in December and the winter in June as the earth wobbles on its axis. This is known as the Precession of the Equinoxes, and it circles around every 23,000 years. Because there is more land in the Northern Hemisphere, if the north is tilted away from the sun in summer, finds itself further from the solar heat source, and sits angled differently in relation to its orbital plane, then the combination can trigger an ice age. This was the smoking gun, and while there are other factors, such as a permanent landmass over the south pole and volcanic activity, these cycles, like the hands of a clock, accurately fit the pattern of glacial growth and retreat on the monumental scale. Formally confirmed in 1976, they are known as the Milanković Cycles; but credit must surely be given to James Croll for even suggesting the idea in the first place.
Around 13,000 years ago the clock's cogwheels clicked once more, and the earth warmed – and warmed rapidly. Across the British Isles, Scandinavia and the Alps the once-massive ice caps were gone within a couple of thousand years or so. In North America, due to the sheer size of the Laurentide ice sheet, it took until about 8,000 years ago for the mighty wall of solid ice that once stood over 1,000ft thick above Times Square to melt back to its current locations across northern Canada and Greenland. The world had entered the Holocene epoch, and everything changed in a cosmic heartbeat.
For our ancestors, as the planet turned up the thermostat, some of these changes may have been felt within a couple of generations – and, of course, their consequences too. Global sea levels rose dramatically, in the order of 400ft. Britain became an island, mainland Australia parted company with Tasmania and New Guinea, and the Mediterranean Sea crashed through the Bosphorus to flood the Black Sea basin, then occupied by a freshwater lake – an event that may have prompted stories of the biblical flood. In Africa, with more water in the atmosphere, the Sahara region entered what's known as the ‘African Humid Period’, when it was a lush, semi-tropical landscape full of rivers, lakes, giraffes and elephants – a climatic band that then stretched across Mesopotamia and inspired the earliest of cultivations and agricultural societies. In North America, as the climate warmed significantly, the tree line moved to the north of Canada, and great oak, maple and chestnut forests expanded across the continent, replacing the vast grasslands in many areas. This, together with the explosion of the human population, saw the very spectacular extinction of many of the continent's large animals – from huge bears and sloths, through mammoths and mastodons, to horses, dire wolves and saber-toothed cats. The landscapes the indigenous hunter-gatherer peoples lived in were changing before their eyes and must have proven a challenge for sure. But it was one they successfully mastered in each area, from the forested east to the arid and mountainous west.
The idea of giraffes and elephants lazing around fertile river valleys and vast lakes in what is now the Sahara Desert seems surreal – and human populations too: there is early rock art in Egypt depicting these animals and folks hunting them – a Garden of Eden, so to speak. Well, apart from the Nile valley and delta, northern Africa does not look like this today, and has not done so for 6,000 years. Some refer to the Holocene as the Long Summer, but the climate has fluctuated up and down with monotonous regularity throughout the period and has affected society along the way, albeit regionally and sometimes sporadically. Knowing the temperature is the single most important factor in determining both the forecast for tomorrow and whether to plant barley – or whether it's worth planting barley at all for the next ten years, or better to just hitch up the pony and move on. And it is an incredibly difficult problem to solve and put a number on – and therefore, as a farmer from Cairo to Cornwall, you relied on cloud shapes and wind direction, methods as old as the hills.
The ancient Greeks, for all their innovation, couldn't really solve the problem either, although in mastering some of the rudimentary facts of thermodynamics they did stumble upon one that would prove fundamental – that as things heat up, they expand. However, whether and in what manner they looked to employ this information is lost to time. The Romans might have tried, but during their most triumphant years the weather was on their side, so maybe there was no need to probe too far. Arabic and eastern alchemists had by the 8th century understood that certain liquids boil, evaporate and then condense again given different amounts of heat from the fire – the root of modern medicine and single malt scotch through distillation. They were eyeballing it using bubble measurements, oils and other means, but had nothing to truly calibrate these heats. Indeed, they would design the apparatus to guide accuracy rather than hunt for a gradient scale. Back on the farm, after a couple of dire centuries of near-perpetual rain, frost and lost summers, Europeans struggled through to the Medieval Warm Period around 1000AD, when it was all sunshine and light again, and once more no one really cared that much about temperature – not even the monks and barbers who were employed to use distilled spirits to cure or kill. But things start to go catastrophically wrong in the 14th century, and the slide probably began with a truly ground-shaking volcanic eruption in far-off Indonesia.
Based on extensive research by Museum of London Archaeology (MOLA) in the early part of the 21st century at the site of the ancient St. Mary Spital cemetery (the largest excavated grave site in the world), a staggering number of bodies were found dating to a famine period around 1258 – something had gone seriously wrong with the weather. The excavations have revealed over 10,000 bodies, and maybe up to a third come from that single year. This was just one graveyard on the eastern fringes of London, and can surely represent only a fraction of fatalities across Britain that year and in the years that followed. London wasn't alone: contemporary reports from Paris describe brutal weather and thousands freezing to death. Other indicators, from tree rings and ash deposits, point to the culprit behind this massive human catastrophe as being volcanic in origin. But where?
Possible theories have ranged from North America to the African Rift, but detective work – albeit not 100% conclusive – points to a monumental eruption of an Indonesian island mountain called Samalas; an explosion that makes Krakatoa look like a damp squib. Much more research is required, but the effects seem not only to have been longer-term, but to have kicked off a chain reaction. From English records we can see that crop failures became more frequent, and unrest against the king and authority more intense as resources ran dry. This was a theme repeated across Europe, and while there may have been some respite in the first decade of the 14th century, everything fell off the cliff in 1315, and we usher in the Little Ice Age.
Medieval Warm Period and the Little Ice Age
The opening scene of Shakespeare's tragedy Macbeth, with its thunder and lightning crackling away, brings ashore on the storm the three witches who are to foretell the doom of the Scottish king. The rain lashes, the fog hangs low, and there amid the tempest they conspire their next get-together, amid yet more lightning and rain, to ensnare Macbeth with their prophecies of power and wealth. Roman Polanski's 1971 film version, with its storm-battered heathland and bleak, wind-swept, desolate moors, captures the mood perfectly – it is indeed a foul day to win such a fair victory in battle. It captures what many would consider Scotland to have looked like in the Middle Ages: soaking wet, cold, treeless, gloomy and forbidding. It frames the dark atmosphere of the play brilliantly but, like the story itself, it is a work of fiction – this is not what Scotland looked like nine hundred years ago. By the time Shakespeare came to write the play in the early 1600s, though, he may be excused for thinking that it might have done. Sixteenth-century Scotland was a pretty wet, cold and often dreary place.
By contrast, when the real King Macbeth ruled Scotland in the middle of the eleventh century it was, like most northern European countries, experiencing what would later be called the Medieval Warm Period, a climatic optimum of long warm summers and reliable harvests, and probably the best weather seen since Roman times – wheat was harvested in Scandinavia close to the Arctic Circle, and grapevines were viable in England. Also, and let's be clear here, Macbeth the man did not go creeping around castle corridors in the wee small hours looking for pensioners to slay, and then spend the rest of his days with a mad wife awaiting mobile forests and Caesarean regicides. Although he was a warlord and lived in a violent world, he commanded his kingdom for over seventeen years unmolested, and part of the reason for his success, apart from his own personality and ruthlessness, was that he lived through an age of relative arable prosperity and thus economic stability, known as the ‘Age of Ale and Bread’. These were the good times, and they rolled. Europe's population doubled between 1000 and 1300, famine and pestilence barely registered, and the brutality of earlier centuries was now but tales to be told, such as the Viking Sagas and Beowulf. This was the calm, for no one was braced for the storm that was about to be visited upon them from the early 1300s, and would barely let up for the next four hundred years.
The term ‘Little Ice Age’ is a bit misleading, questionable even; and clearly there is more to four hundred years' worth of history than a run of bitterly cold winters, non-existent summers, devastating storms and wrecked harvests. But it is a constant feature, a side character in the play that tends to pop up and swing the balance, influence decisions, force innovation and discovery on one hand and wreak havoc on the other. It's not the decider in the game, but an important undercurrent throughout.
It wasn't defined by a centuries-long, Narnia-style winter, but characterized by a marked increase in the volatility of the weather. Yes, it got colder – a lot colder – but it was the inconsistency from one year to the next that kept you on your toes and made long-term agricultural planning impossible. The knock-on effect of this would be shorter leases, and with that less investment in making the garden grow, so to speak. In one given year the harvest might fail simply because there was snow on the ground until April, followed by a summer when it never stopped raining before the hard frosts kicked in; the next year might be blisteringly hot and the crops simply withered in the fields. Either way, you were in trouble – big trouble – especially if you lived hand to mouth in a subsistence agrarian world, like Europe's Atlantic coast. It is reckoned that on an average day a late medieval farmer consumed just about the same number of calories as he expended growing and harvesting them. In short, for a good chunk of the last thousand years people were mostly hungry. A couple of consecutive failed harvests and it could be game over if you'd run out of alternatives, help or provisions for a rainy day (indeed the idioms ‘Save for a Rainy Day’ and ‘Make Hay While the Sun Shines’ both originate from the unpredictable weather patterns that came to define the opening salvos of the Little Ice Age, when being prepared for random changes had become crucial in a way it hadn't been in the past). Expect the unexpected became the new normal.
Like all climatic patterns, there is no one single reason why the Little Ice Age happened or lasted as long as it did; no single answer to why regular high-octane storminess replaced relative stability, or why bitterly cold winters and roasting hot summers hallmarked a lottery of variability from one year to the next, one place to the next. No wonder our medieval ancestors from every corner of society placed their faith in divine intervention and acts of God. It was just so baffling – like the Titans were playing blackjack with mankind: and losing. Today, with modern understanding and computer models, we can deduce most of what was going on, and much of the information we've gleaned comes from scientific research on polar ice cores and old tree-ring data that give us a guide to temperature; from art, literature and anecdotal evidence that describe the human response and what people observed happening around them; and from economic evidence for grain prices, population movements and, of course, mortality figures.
Although on average cold throughout, there were two particular phases that really came to define the period as a whole by the intensity of their conditions: from 1315 to around 1420, and again from 1650 through to 1715, and both are ingrained in the psyche of the European collective memory. The first episode was the big shock to the system, and to compound the confused misery nature threw in the Black Death to mix things up. It must have been awful on a scale we simply cannot fathom at this remote distance in time.
In the aftermath of Samalas, and driven by other factors we still don't understand, the first tangible effects would have been felt in the fields, and they left all of Europe – forest, town and hill – with a lasting legacy. The ominous unreliability of the harvest in the 1320s led not only to grain shortage but also to cattle disease, which in northern Europe simply made the already grim famine conditions worse. Malnutrition in children weakens their immune systems permanently, and those who were children during these hard, lean years were therefore the immune-compromised adults who faced the hellish apparition of the plague when it arrived in the late 1340s, exacerbating the death toll and altering the very fabric of the societies that followed. Both climate and disease would be key vectors in determining the course of much of Europe's history and its impact globally. But what had happened?
Over the last few thousand years, under normal, average conditions, there has been a pressure balance in the North Atlantic, with a permanent low system anchored over Iceland, or a little to the west, and a solid high-pressure bubble sitting over the Azores and stretching across towards Bermuda at a latitude of around 30°N. This relationship controls the flow of westerly winds across the Atlantic and draws warm, moist air into Western Europe and the British Isles, making summers cool and winters milder than they ought to be given their northerly position. It also controls a conveyor belt of Atlantic depressions that bring frequent rainfall and occasional stormy weather, which in the UK we tend to generalize as the effects of the almost mythical ‘Gulf Stream’.
Over time this correlation between the low over Iceland and the high over the Azores can wobble and fluctuate in both position and intensity, and this is known as the North Atlantic Oscillation (NAO). The strength and position of the system varies from year to year and decade to decade, and can be influenced by external conditions such as volcanic activity, the El Niño pattern in the Pacific and even sunspot cycles; it has a huge impact on the climate of Europe over both the short and the longer term. A large difference in strength between the two zones leads to an increase in westerly airstreams and the attendant weather patterns they control; the converse is true if the index, as it is called, is low and the westerlies reduced. If those westerlies are reduced it leads to seasonal extremes, with cold winters and very hot summers, and the added effect of reduced overall rainfall, particularly in the north and west. With this reduction in strength, Atlantic storms also tend to track across the Mediterranean zone rather than the North Sea region, bringing increased rainfall over North Africa, Spain and Italy.
In addition, if the Azores High happens to shift slightly to the south it has the effect of deflecting tropical cyclones into the Gulf of Mexico; if in a more northerly position, it drags the tail ends of Caribbean hurricanes and flashes them across the Atlantic as deep, storm-bearing lows. Writing in 2015, K. Jan Oosthoek notes: “it is now thought that during the Little Ice Age the North Atlantic Oscillation index was generally in a more negative pattern.” This is no doubt the case, but it would also appear to have flipped in a frustratingly unpredictable manner, creating sudden shifts in the weather, which everyone had to deal with without much warning.
With a southward movement in the frigid polar air mass, there also appears to have been a significant delay in the seasonal shift from deep winter to the welcome spring-like conditions that today we expect from March through April. Spring, and indeed summer itself, was also cooler, with some year-round night frosts often mentioned in well-kept records from the south of England and France. There was also an increase in storm frequency and intensity, as the main track for significant Atlantic depressions moved south to around 50–60° North – slap-bang over Scotland – where today that conveyor blows through higher latitudes north of Shetland before petering out over the Norwegian Sea. Autumns seem for the most part to have been on a par with today, but with a sharp, cliff-fall plummet into a far more brutal, frosty and snowy winter. The resultant shortening of the time between the start of spring and the end of autumn, and thus of the growing season, would have dealt a blow to any farmer; but factoring the effects of altitude, slope aspect and existing field layouts into the equation, we find the disruption pattern was not consistent across the country. Nor was there consistency in how it was handled.
During the most severe periods there was extensive starvation and multiple crop failures on a massive global scale, leading to civil unrest, and by the sixteenth century society had altered irrevocably. Europe would not emerge the same: both politically and institutionally the old order was swept away, culminating in the bloodletting of the witch-hunts, the Protestant Reformation and the destructive Thirty Years War. The back story to all of these consequences is complex and multi-layered, with no single cause, but without a weather-beaten, war-weary, half-starving, downtrodden, fed-up and at times angry demographic, perhaps the severity of these seismic events wouldn't have been as pronounced, or their effects as lasting.
Based on the width and density of tree rings collected by researchers in both the UK and Ireland, it's been demonstrated that 1695 to 1704 was the coldest single decade in the last 1,000 years, if not longer – a cold brought home by four completely failed harvests, resulting in severe shortages without relief and leading to utter destitution of a kind that had not been seen in Scotland for over a century and would never be seen again. The rings are also punctuated by the big eruptions of Hekla (1693), Serua (1693) and Aboina (1694), and the huge volumes of ash they ejected certainly contributed to the 1690s being demonstrably the coldest phase of the entire Little Ice Age period, when in places the growing season shrank by two months.
As things cooled globally, Dutch whalers bound for the Arctic had begun noticing and recording that the climate of the high north was altering rapidly. Up until then the sailors had sat in port awaiting the spring melt of the pack ice to access the hunting grounds and their summer stations, where they processed the much-sought-after oils and blubber. However, from the 1660s the ice, it seemed, was taking longer to melt, and from a more southerly point. The whales had been forced south too, but access to the whalers' facilities was becoming increasingly restricted and at times unreachable, with the ice not melting at all, even in the middle of the summer. By the 1680s, with the pack ice freezing south of Iceland, it looked like their industry was finished. But, ever ingenious and adaptable, rather than caving in to the problem, they solved it. First they turned their hunting ships into on-board factories, and then, having figured out how to transport the oil long distance, they brought the cargo back to Holland, where they refined it under much better working conditions, with the result being the highest-grade whale oil in the world. It made them fortunes. This was no one-off, and no one in Europe adapted to the cold – and used it almost as a muse – more than the Dutch: famous paintings of rich merchants on skates zipping across the frozen River Amstel while families have snowball fights or roast chestnuts are the classic image of Little Ice Age Holland. The Dutch response and adaptability came in conjunction with their independence from Spain and rise as a mercantile superpower – it was a perfect storm, and one they took full advantage of. Their ‘can-do’ attitude was one of problem-solving rather than counting sorrows, and it served them well. And while the 1690s brought horrific famine from Finland to France and seven-figure death tolls, they also saw the flourishing of science and the Age of Reason – though again, many of the advances made came out of the necessity of dealing with the world around them, a necessity that included the people who forged the paths ahead; and at the forefront of this discovery was the goal of taming temperature itself. Putting a number to it doesn't change the weather, but it changes your ability to adapt and prepare for it.
Daniel Fahrenheit & the Magic of Mercury
In 1708, right in the middle of this bitterly cold period, Daniel Fahrenheit, a skilled thermometer maker from Danzig, travelled to Copenhagen to meet with the Danish astronomer Ole Christensen Rømer – it would prove fortuitous. For personal usage, and we have no idea in what capacity, Rømer had created a homemade temperature scale, undoubtedly based on more basic earlier attempts by folks like the oddly named Santorio Santorio, whereby he set the freezing point of ammonium-chloride brine at 0° (this was the coldest measurable liquid that could be reliably reproduced at the time) and then fixed the freezing point of pure water at 7.5° Rø. The light bulb flashed in Fahrenheit's head, and he realized he could combine his thermometer-making with a practicable and usable scale of temperature.
Rømer wasn't the first to invent a scale, and Fahrenheit was not the first to make thermometers, but the combination was revolutionary. By 1706 Fahrenheit was making barometers and other apparatus based on the Florentine system, and it wasn't a huge leap to go from this to making accurate and calibrated instruments.
The Florentine system really began when Galileo, in the 1590s, started making observations of bubbles rising in a glass vial when heated and falling when cooled – an extension of the old Greek realization that objects, air and so forth expand as they get warmer. This was further developed in Florence by the glass-makers Antonio and Jacopo Mariani, who filled a tube with alcohol spirit and glass beads and recorded to which level, or number of beads up the tube, the spirit travelled when heated. There was no scale involved, but by simply counting the beads a gradient could be ascertained. At around the same time in England, the physicist Robert Hooke was experimenting with mercury, and the polymath Isaac Newton suggested three fixed points to quantify this gradient: 0° being the freezing point of water, 12° the heat of blood, and 34° the boiling point of water. Rudimentary it may have been, but these were the shoulders that Fahrenheit, and then Anders Celsius, would stand upon.
Originally known as hydrargyrum (hence the elemental symbol Hg), mercury is, well, mercurial. The only known metal that is liquid at room temperature, its old name meant ‘water silver’, and the term ‘quicksilver’ is often used as well. It's a dense, heavy metal with some rare properties, one of which is how it reacts to fluctuations in heat or cold. The Roman god Mercury was the messenger of the gods due to his lightning speed, and he lent his name to the planet that races across the night skies. At one time most of the key metals were named for the gods of the sky; now only mercury is. It is also a rare element, and is usually extracted from its ore, cinnabar – also the source of the colour and pigment vermilion.
The Spanish would use mercury to help extract New World silver, and it was applied to the relief of toothache and as a believed cure for syphilis, until it was realized that the health consequences of ingesting the metal were worse than the symptoms of the disease itself. Milliners would use mercury for hat-making, and over years of exposure it got into their brains and affected their sanity – hence the phrase ‘Mad as a Hatter’. But, as well as a raft of industrial applications, it must be most famous for its use in accurate thermometers. Indeed, we still say the ‘mercury is climbing’ as summer weather comes in. The UK, along with most of Europe, banned the production of mercury thermometers in 2009 due to the environmental impact of their disposal.
In 1659 the French astronomer and mathematician Ismael Boulliau, who was also trying to grapple with the temperature problem, abandoned the use of mercury in his gauges in favour of alcohol, as he felt it wasn't as responsive. In 1713, inspired by Rømer's scale, Fahrenheit was developing quality, precision mercury thermometers – by 1717 he'd ironed out the issues and, with well-calibrated glass tubing, the use of a bulb reservoir and all contained within a vacuum, he was producing them commercially, and this was the trick to his success. Fahrenheit wasn't making one-offs for laboratories and amateur scientists, he was selling to the mass market. With his longer, finer and, more importantly, reliable scale it was an instant hit, and the Fahrenheit system of degrees of temperature became adopted everywhere.
Following Rømer, he marked his 0° F as effectively the freezing point of brine, and 32° F as the freezing point of pure water. For his 100° he, like Newton, went for the temperature of blood (or body heat). We know that this is not quite accurate, as most people come in around 98.6° F – but it is possible that, testing at a time when it was much colder and fevers were more prevalent, the average temperature might actually have been higher across the population. And, to complete Newton's three-point proposal, the boiling point of water was pegged at 212° F. The applications of his system were enormous.
The official yardstick of the UK is literally a brass rod held at Westminster, the standard which all others must copy to comply commercially. It was set at 36 inches at 62° F and at sea-level atmospheric pressure. Coins across Europe were minted according to the same exacting controls, and alcohol was produced and taxed in varying degrees – lager being made colder than whisky, for example. For the first time in human history, temperature was quantified, measurable, and applied.
Daniel Fahrenheit was, however, not alone – René-Antoine Ferchault de Réaumur was also working on a scale that included body temperatures and freezing points, but also the temperature of silkworms and, strangely, oranges. That notwithstanding, his name is less catchy in the public imagination, which probably doomed his scale. He and others would fade into obscurity; but then enter Anders Celsius.
Anders Celsius
Born in Uppsala, Sweden in 1701, Anders Celsius was raised in an eminent family of talented mathematicians and astronomers, and he followed in those footsteps, graduating from the local university. As an astronomer he took part in a series of observations and expeditions mounted at the time to confirm that the earth was indeed flattened at the poles rather than a perfect sphere; but his legacy is of course his scale for determining temperature. While the two men lived at the same time, there is no evidence that Celsius ever met or corresponded with Fahrenheit, but in 1742 he proposed a slightly different set of gradients, which on paper were simpler to work with – and in many ways they are.
For some reason he set his 100° mark to be the freezing point of pure water (well, actually he marked it as the ‘melting point of ice’, which is effectively the same thing) and zero as the boiling point of water. For the purposes of the weather this makes perfect sense, with 100 increments; but for more accurate, precision temperatures Fahrenheit is better, as you have 180 increments between the same two points. Today, with digital thermometers calculating to the decimal point, it's a moot point, but on a mercury thermometer it did make a difference.
In 1745, a year after Celsius died, Carl Linnaeus reversed the scale so that zero became the freezing point of water, and so on. Jean-Pierre Christin also used the system inverted, as we know it today, and called it ‘centigrade’, a term that is still used even today instead of Celsius to mean the same thing. As mentioned, both °C and °F are centigrades, for they climb between zero and a hundred between two base points. The two scales are interchangeable – to go from Celsius to Fahrenheit you multiply by 9, divide by 5 and add 32 (and simply reverse the equation to go the other way). They converge at a bitterly cold -40°.
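That conversion is simple enough to sanity-check in a few lines of code. Here is a minimal sketch (in Python, with hypothetical function names) of the two equations and of that curious meeting point at -40°:

```python
def c_to_f(c):
    # Celsius to Fahrenheit: multiply by 9, divide by 5, add 32.
    return c * 9 / 5 + 32

def f_to_c(f):
    # Fahrenheit to Celsius: the same equation run in reverse.
    return (f - 32) * 5 / 9

print(c_to_f(0))     # 32.0  – water freezes
print(c_to_f(100))   # 212.0 – water boils
print(c_to_f(-40))   # -40.0 – the two scales converge
print(f_to_c(98.6))  # 37.0  – blood heat, give or take
```

Note how the 100-increment Celsius span maps onto the 180-increment Fahrenheit span between the same two fixed points – exactly the difference in fineness discussed above.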
Kelvin
As mentioned, both Celsius and Fahrenheit have their merits and their limitations, but how do we feel the cold, and how cold can it really get? There is a limit to how cold mercury can record, as its freezing point is around -38° C (which is about the same in Fahrenheit), so going colder needed another mechanism. The coldest temperature recorded on earth comes from the Vostok Station in Antarctica, at an eye-watering -89.2° C, and the coldest permanently inhabited town, Oymyakon in Russia, has an average winter temperature of -58° F (-50° C); so another scale was needed, and a different technology.
Humans, like most mammals, will recoil at scorching heat and bitter cold, but can feel a variety of more moderate temperate sensations – we enjoy cool breezes in the middle of the summer and warm sunshine lying on the garden lawn; but we're not so keen on sub-zero temperatures or sweltering conditions accompanied by that joy of high humidity. This is known as ‘thermoception’, and relies on a relay of skin-to-nerve signals, probably regulated for consistency by our smaller hairs. We cannot cope with -50° C, and through our evolution during the last ice age our ancestors developed thick clothing and certain physiological adaptations. But, with having to live and work below -38° C, we not only needed an updated system, but also a way to comprehend what would become known as ‘Absolute Zero’: the final-final.
Although born in what is now Estonia, Thomas Johann Seebeck was a German physicist who, after much experimentation and up against competition, figured out that a temperature difference across the metal wires used in a circuit produced an electric current – and a voltage in direct proportion to that difference – and showed that conductors (and now superconductors) have a strong relationship with the temperature scale. Known as the Seebeck Effect, or thermoelectric effect, this thermal technology allowed scientists to pursue and record temperatures below what could be achieved by mercury alone. It was the first step towards finding the holy grail of absolute zero.
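The relationship is, to a first approximation, linear, which is what makes it so useful as a thermometer. Here is a toy sketch of the idea – the coefficient is an assumed, illustrative value (roughly that of a modern type-K thermocouple), not a figure from Seebeck's own work, and real probes rely on calibration tables rather than a single constant:

```python
SEEBECK_UV_PER_C = 41.0  # assumed sensitivity, microvolts per degree Celsius

def junction_voltage_uv(t_hot_c, t_cold_c):
    # The Seebeck effect: the voltage across two junctions of dissimilar
    # metals is proportional to the temperature difference between them.
    return SEEBECK_UV_PER_C * (t_hot_c - t_cold_c)

def temperature_from_voltage_c(v_uv, t_cold_c):
    # Run the relation backwards to read a temperature off a voltmeter.
    return t_cold_c + v_uv / SEEBECK_UV_PER_C

print(junction_voltage_uv(100.0, 0.0))          # ~4100 µV across a 100 °C gap
print(temperature_from_voltage_c(4100.0, 0.0))  # 100.0 °C recovered
```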
In 1802 Sir Humphry Davy theorized that electricity and magnetism were connected, and that electricity could be ‘made’ using both chemical and mechanical mechanisms. He was right, but while he could demonstrate it in the lab and the lecture theatre, he had no formula or way to prove why. In a truly amazing work published in 1865, the Scottish scientist James Clerk-Maxwell produced the formulas that showed not only that Davy was right, but that electricity, magnetism and light were essentially different expressions of the same thing – where atomic movement and vibration could be converted into energy. This would be instrumental in understanding the next stage in temperature, building on the Seebeck Effect.
Born in Belfast, Ireland in 1824, William Thomson moved to Scotland with his family at the age of six, when his father got a position as mathematics professor at Glasgow University. After an incredible education, which took him across Europe, Thomson junior followed his father into the university as both mathematician and natural philosopher – a role he would hold for 53 years. So incredible was his academic contribution that he was the first scientist to be elevated to the British peerage and take a seat in the House of Lords, as Lord Kelvin – named for the river that runs past the ancient halls of the university. Working with Joule on energy and collaborating with Clerk-Maxwell, Thomson figured out that temperature, like light, was a function of atomic activity, and that if you run the clock back, so to speak, you get to a point zero, regardless of the substance – the point at which everything stops and neither light nor heat is emitted. He calculated it as -273.15° C (-459.67° F). This is Absolute Zero. Thomson aligned his scale to Celsius, and it is known as the Kelvin Scale – so the freezing point of water (0° C / 32° F) is 273.15 K. This is the international standard unit for academic and industrial gradients of temperature; even weather forecasters in the United States figure out the temperature in kelvin and then convert to Fahrenheit to deliver to the public.
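Since kelvin is simply Celsius shifted so that zero sits at absolute zero, the conversions chain together neatly. A minimal sketch, with hypothetical function names and the same arithmetic as before:

```python
ABSOLUTE_ZERO_C = -273.15  # Thomson's floor, expressed in Celsius

def c_to_k(c):
    # Kelvin is Celsius shifted so that 0 K is absolute zero.
    return c - ABSOLUTE_ZERO_C

def k_to_f(k):
    # Go via Celsius, then the familiar multiply-by-9, divide-by-5, add-32.
    return (k + ABSOLUTE_ZERO_C) * 9 / 5 + 32

print(c_to_k(0))        # 273.15 – water freezes
print(c_to_k(-273.15))  # 0.0    – absolute zero, nothing moves
print(k_to_f(273.15))   # 32.0   – back to Fahrenheit's fixed point
```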
Maxwell’s Demon
Without Maxwell's formulas and equations Marconi couldn't have invented the radio, John Logie Baird the television, nor could Alan Turing have decoded the Enigma machine – Albert Einstein had his photograph on the wall of his office at Princeton University, New Jersey, and famously said that he stood not on the shoulders of Newton, but on the shoulders of Clerk-Maxwell. He simply was the man who changed everything. In 1867 he proposed a thought experiment whereby a theoretical being acts as a gatekeeper of molecules between two boxes of different temperatures, allowing some to pass depending on their speed of vibration and heat – “a being who can play a game of skill with the molecules”. According to Thomson's Second Law of Thermodynamics, two spaces with different temperatures will reach an equilibrium, and heat cannot pass from a cold chamber into a hotter one, only the reverse. Clerk-Maxwell suggested otherwise. As he had earlier proved, molecules move at different speeds – so if at the beginning we have two chambers at the same temperature, with molecules moving in all directions within each, the gatekeeper can let the fast molecules through into one chamber, slowly warming it up, and let the slower molecules move in the other direction, cooling the first down – and the second law does not allow heat to transfer like this. Lord Kelvin would call this hypothetical being “Maxwell's Demon”. It was a thought experiment of the kind Einstein enjoyed and would later employ, and it threw them all for a loop. However, in order for the little demon to have the energy to open and shut the trapdoor it would have to draw it from the warmer molecules, and so entropy couldn't be violated; to date no one has proven that the second law is wrong. But it made for interesting discussions and led to a far better understanding of what was going on.
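The demon's sorting rule is simple enough to simulate. Below is a toy sketch – the speeds, counts and the ‘fast’ threshold are arbitrary illustrative numbers – showing how a gatekeeper that merely sorts molecules by speed drives the two chambers apart in temperature, which is exactly the apparent violation Maxwell described:

```python
import random

random.seed(1)
left = [random.uniform(0, 1) for _ in range(1000)]   # molecular speeds
right = [random.uniform(0, 1) for _ in range(1000)]  # same starting mix
FAST = 0.5  # the demon's notion of a "fast" molecule

for _ in range(5000):
    # A molecule approaches the trapdoor from each side; the demon only
    # lets fast ones pass left-to-right and slow ones right-to-left.
    if left:
        m = random.choice(left)
        if m > FAST:
            left.remove(m); right.append(m)
    if right:
        m = random.choice(right)
        if m <= FAST:
            right.remove(m); left.append(m)

# Mean speed stands in for temperature: the right chamber heats up while
# the left cools down, with no work apparently done on the gas itself.
print(sum(left) / len(left), sum(right) / len(right))
```

Run it and the two averages diverge – which is precisely why the demon's own energy bill, the cost of watching and sorting, matters so much to rescuing the second law.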
Although he probably got this one wrong, it opened the door to thermodynamics, and with the influence of people like Kelvin and Clerk-Maxwell, among many others such as Rankine and Planck, the role of temperature was heading off down a road that would lead to scientific discovery, uncharted invention and, of course, medical advancement.
Since antiquity, the greatest killer to stalk the land has always been malaria; even today it carries off more people (sadly, mostly children) than any other disease. According to Nature and National Geographic magazines, malaria has been responsible for the deaths of over half of all people who have ever died. From the Italian meaning ‘bad air’, it was especially feared by the Romans, who would lay sacraments and offerings to the goddess Dea Febris – the Goddess of Fever, the tell-tale sign. One of her daughters, sired by Saturn, was Dea Quartana – the goddess of malaria, as the fever returned every four days or so. Black Death, smallpox, tuberculosis and all the other horrors of medieval Europe notwithstanding, it was still malaria that caused the greatest human catastrophe. In 15th-century London you were statistically better off staying in the city during a visitation of the plague than heading to the local countryside, where you would be scythed down by the ‘Great Fever’. Solving the malaria problem (well, in Europe at least) meant a whole raft of societal changes, but it also meant thermometers in your mouth, armpit or, most likely, elsewhere. The correlation between accurate body-temperature readings and illness was a vital step forward.
The ancients believed that fever was a curse, or a symbol of divine displeasure, but it is of course the body's natural response to infection or disease – the contractions of the muscles and the upturn in metabolism help the immune system to kill off what has invaded. This can go haywire, and a runaway effect can kick in, which ironically can be even more dangerous. The Greeks, not entirely convinced it didn't have something to do with the gods, suspected a more earthly explanation, and while the Romans were offering up gifts to Dea Febris, their scholars and physicians also concluded that fever had more to do with the response to illness. As noted above, the experience of the Great Fever of malaria must have brought about such conclusions. But, again like the weather, a fever can be mild, hot or roastingly dangerous; and there was no way to put a number on it, and therefore no way to tailor the medicine to the bug or the fever it engendered. They obviously had no notion of pathogens, bacteria or viruses, which doesn't really help when trying to cure someone – but, unlike us, they did have a vast experience of horrific disease, and so a cure was more about symptoms – and fever was one of them. All basic cures were herbal in nature, and any decent holistic, home-schooled village practitioner would have been able to estimate the scale of a fever by touch and levels of sweat alone. But herbs and cold compresses can only take you so far. Knowing the heat value of a fever is a crucial component of modern medicine – from a couple of Tylenol tablets to hospital-grade anti-inflammatories – and an accurate thermometer is a crucial part of the doctor's tool-kit.
Everyone is different, but generally speaking we run our internal engine at around 37.5° C; hitting 38.5 or 39° C (around 101 - 102° F) does need attention, but until the 19th century these numbers were impossible to gauge accurately or understand in terms of importance. If you hit 104° F you need to call the ambulance before organ failure kicks in – back in the day, how would you even know that was the figure? The problem with either the mercury or the alcohol thermometer was that the second you took it out of the patient's mouth the liquid fell back down the tube towards room temperature, so you couldn't get an accurate reading. In the 18th century the physicians Anton de Haen and Gerard Van Swieten had established that the progression of illness matched the ups and downs of temperature, so getting this right was vitally important.
In 1867 the British doctor Sir Thomas Allbutt (although I'm sure he preferred the all-mouth method) designed a 6”-long mercury thermometer, based on the Fahrenheit scale, that could take a mouth reading in five minutes. The key to this innovative success was a pinch or bend in the inner tube, which prevented the mercury from retracting unless the vial was shaken. We've all seen a doctor shake a thermometer; this is why. It led, for the first time, to an actual accurate bedside temperature reading. It must have saved countless lives. During the Second World War, Thomas Hannes developed the ear thermometer, which gives a far quicker and less intrusive reading. The digital age wasn't far behind. We are all very aware of the figure 100.4° F, or 38° C, as the firewall that prevented you going anywhere from the shops to a funeral during Covid. The accuracy which can now be achieved would have been bewildering to the founding fathers who conquered Cold.
Every time we get a scratchy throat or feel feverish and rubbish; every time we set our washing machine for the right load or our coffee filters for the right beans; every time we board a plane, decide coat or no coat, gloves or no gloves, or even send men to the moon and bring them back – we need to thank the likes of Anders Celsius, Daniel Fahrenheit, Ole Rømer and all the other pioneers of temperature for making it possible, and in no small degree.