COP26

Snow Models, Ice Models and Climate Models Generate ‘Data’ According to Scientists

Arctic Sea-Ice

This is what Professor Johan Rockstrom posted on Twitter 2 days ago:

Here is Rockstrom’s profile. As you can see he’s an earth science bigwig on ‘global sustainability’ and ‘planetary boundaries’ and he’s also Director of the Potsdam Institute, so he’s definitely an ‘expert’ who we should take very seriously. When he says that the Arctic sea ice ‘tipping element’ is fast approaching a ‘tipping point’ of no return, we should put our fingers to our lips and tremble with trepidation whilst whispering ‘Oh my God’, over and over, in barely audible, abject, stupefied terror.

Here’s what that Graun article says:

Arctic sea ice thinning twice as fast as thought, study finds

Less ice means more global heating, a vicious cycle that also leaves the region open to new oil extraction

Sea ice across much of the Arctic is thinning twice as fast as previously thought, researchers have found.

Arctic ice is melting as the climate crisis drives up temperatures, resulting in a vicious circle in which more dark water is exposed to the sun’s heat, leading to even more heating of the planet.

OMG, ‘climate crisis, vicious circle, even more heating’. We’re all going to DIE!

So what’s the evidence, where’s the data for this imminent irreversible planetary catastrophe? Well, it’s models, innit:

Calculating the thickness of sea ice from satellite radar data is difficult because the amount of snow cover on top varies significantly. Until now, the snow data used came from measurements by Soviet expeditions on ice floes between 1954 and 1991. But the climate crisis has drastically changed the Arctic, meaning this information is out of date.

The new research used novel computer models to produce detailed snow cover estimates from 2002 to 2018. The models tracked temperature, snowfall and ice floe movement to assess the accumulation of snow. Using this data to calculate sea ice thickness showed it is thinning twice as fast as previously estimated in the seas around the central Arctic, which make up the bulk of the polar region.

Robbie Mallett of University College London (it's gone right downhill since I left, I can tell you), who led the study, says:

The Soviet-era data was hard won, Mallett said. “They sent these brave guys out and they sat on these drifting stations and floated around the Arctic, sometimes for years at a time, measuring the snow depth.” But the Intergovernmental Panel on Climate Change identified the lack of more recent data as a key knowledge gap in 2019.

Yep, those hardy Russians actually went out and collected real data from the real world. They got off their arses and endured arduous conditions for long periods in order to physically measure sea ice thickness. This is what used to exclusively be called ‘data’. But now ‘data’ can be obtained by sitting on your lazy backside in a nice warm room in front of a computer screen, using ‘models’. Weather models, climate models, snow models, ice models, you name it, they’ve got models for everything these days and they generate ‘data’. You can probably even download them as an app on your iPhone, so you can now do what those brave, intrepid Russians did even whilst sipping your soy latte in some cafe in Islington. It’s great. Way back in 2019, even the IPCC admitted that there was a lack of real data on sea ice thickness. Now, 2 years into the post normal, post empiricist, post colonial, post Enlightenment, computer generated era of ‘Science’ (which governments religiously ‘follow’ to produce allegedly ‘evidence-based policy’ on stuff as diverse as public health in a pandemic, bad weather and sea level rise), we have new data which ‘evidences’ an imminent tipping point in Arctic sea-ice decline due to the fast approaching anthropogenic fossil fuel carbon-based Thermageddon.

Here are a few quotes from the actual UCL paper:

To investigate the impact of variability and trends in snow cover on regional sea ice thickness we use the results of SnowModel-LG (Liston et al., 2020a; Stroeve et al., 2020). SnowModel-LG is a Lagrangian model for snow accumulation over sea ice; the model is capable of assimilating meteorological data from different atmospheric reanalyses (see below) and combines them with sea ice motion vectors to generate pan-Arctic snow-depth and density distributions.

SnowModel-LG exhibits more significant interannual variability than mW99 in its output because it reflects year-to-year variations in weather and sea ice dynamics.

SnowModel-LG creates a snow distribution based on reanalysis data, and the accuracy of these snow data is unlikely to exceed the accuracy of the input. There is significant spread in the representation of the actual distribution of relevant meteorological parameters by atmospheric reanalyses (Boisvert et al., 2018; Barrett et al., 2020). The results of SnowModel-LG therefore depend on the reanalysis data set used.

So basically, their new model which relies upon meteorological reanalysis data (more models) shows that interannual variability in weather conditions in the Arctic is much greater than thought, and this results, curiously, in the regional trend in sea ice thickness decline also being larger than previously estimated in some areas.
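For what it's worth, the Lagrangian bookkeeping at the heart of such a model is simple enough to sketch. This is not SnowModel-LG (which assimilates full reanalysis fields and combines them with ice motion vectors); it is a toy accumulation loop with an invented melt rule, just to show what 'generating snow data' from forcing data means in practice:

```python
# Illustrative sketch only (not SnowModel-LG): a minimal Lagrangian
# snow-accumulation loop. A parcel of drifting sea ice picks up daily
# snowfall from a (hypothetical) reanalysis series along its track,
# with a crude temperature-dependent melt step.

def accumulate_snow(snowfall_mm, temps_c, melt_rate_mm_per_degc=2.0):
    """Accumulate daily snowfall on a drifting floe.

    snowfall_mm: daily snowfall (mm water equivalent) along the track
    temps_c: daily air temperature (C) along the track
    Returns the snow water equivalent (mm) after each day.
    """
    swe = 0.0
    history = []
    for snow, temp in zip(snowfall_mm, temps_c):
        swe += snow                           # snowfall adds to the pack
        if temp > 0.0:                        # above freezing: simple melt
            swe = max(0.0, swe - melt_rate_mm_per_degc * temp)
        history.append(swe)
    return history

swe_series = accumulate_snow([2.0, 0.0, 5.0, 0.0], [-10.0, -5.0, -2.0, 1.0])
```

Feed a loop like this a different reanalysis and you get different 'data' out, which is precisely the dependence the authors themselves concede.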

4.3 New and faster thickness declines in the marginal seas

As well as exhibiting higher interannual variability than mW99, SnowModel-LG values decline over time in most regions due to decreasing SWE values year over year. Here we examine the aggregate contribution of a more variable but declining time series in determining the magnitude and significance of trends in SIT.

We first assess regions where SIT was already in statistically significant decline when calculated with mW99. This is the case for all months in the Laptev and Kara seas and 4 of 7 months in the Chukchi and Barents seas. The rate of decline in these regions grew significantly when calculated with SnowModel-LG data (Fig. 10; green panels). Relative to the decline rate calculated with mW99, this represents average increases of 62 % in the Laptev Sea, 81 % in the Kara Sea and 102 % in the Barents Sea. The largest increase in an already statistically significant decline was in the Chukchi Sea in April, where the decline rate increased by a factor of 2.1. When analysed as an aggregated area and with mW99, the total marginal seas area exhibits a statistically significant negative trend in November, December, January and April. The East Siberian Sea is the only region to have a month of decline when calculated with mW99 but not with SnowModel-LG.

We also analyse these regional declines as a percentage of the regional mean sea ice thickness in the observational period (2002–2018; Fig. 11). We observe the average growth-season thinning to increase from 21 % per decade to 42 % per decade in the Barents Sea, 39 % to 56 % per decade in the Kara Sea, and 24 % to 40 % per decade in the Laptev Sea when using SnowModel-LG instead of mW99. Five of the 7 growth-season months in the Chukchi Sea exhibit a decline with SnowModel-LG of (on average) 44 % per decade. This is much more than that of the 4 significant months observable with mW99 (25 % per decade). We find the marginal seas (when considered as a contiguous, aggregated group) to be losing 30 % of its mean thickness per decade in the 6 statistically significant months when SIT is calculated using SnowModel-LG (as opposed to mW99).
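Those '% per decade' figures are nothing more exotic than a fitted linear trend expressed as a fraction of the period-mean thickness. A quick sketch with invented numbers (not the paper's data) shows the arithmetic:

```python
# Back-of-the-envelope check on a "% per decade" figure: fit a linear
# trend (m/yr) to a thickness series, divide by the period-mean
# thickness, scale to ten years. The series below is made up for
# illustration, not taken from the paper.

def thinning_percent_per_decade(years, thickness_m):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(thickness_m) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(years, thickness_m))
             / sum((x - mean_x) ** 2 for x in years))   # m per year
    return 100.0 * slope * 10.0 / mean_y                # % of mean, per decade

# A hypothetical marginal sea losing 0.04 m/yr from a ~1 m mean thickness:
years = list(range(2002, 2019))
thickness = [1.3 - 0.04 * (y - 2002) for y in years]
rate = thinning_percent_per_decade(years, thickness)    # roughly -41 %/decade
```

Note that the same absolute trend reads as a bigger percentage where the mean thickness is small, which is partly why the thin marginal seas produce such dramatic-sounding numbers.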

So it's the marginal seas, more than the central Arctic region, which, according to this study, are declining in sea-ice thickness even faster than previously estimated. So let's take a look at the map of sea-ice thickness for this year, May 2021, and compare it with 10 years ago, May 2011.

Can you spot the significant decline in sea-ice thickness? Here is what marine biologist Susan Crockford says about this year’s sea-ice thickness:

Surprising sea ice thickness across the Arctic is good news for polar bears

This year near the end of May the distribution of thickest sea ice (3.5-5m/11.5-16.4 ft – or more) is a bit surprising, given that the WMO has suggested we may be only five years away from a “dangerous tipping point” in global temperatures. There is the usual and expected band of thick ice in the Arctic Ocean across northern Greenland and Canada’s most northern islands but there are also some patches in the peripheral seas (especially north of Svalbard, southeast Greenland, Foxe Basin, Hudson Strait, Chukchi Sea, Laptev Sea). This is plenty of sea ice for polar bear hunting at this time of year (mating season is pretty much over) and that thick ice will provide summer habitat for bears that choose to stay on the ice during the low-ice season: not even close to an emergency for polar bears.

Thick ice along the coasts of the Chukchi and Laptev Seas in Russia seems to be reasonably common, see closeup of the 2021 chart below:

Note that the Chukchi Sea and Laptev Sea both have thick ice this year. These two were singled out by the study above as showing the fastest declines in sea-ice thickness; indeed the Chukchi provides the Graun headline ‘Arctic ice thinning twice as fast as thought’. Perhaps it is just interannual variability and these regions will show a marked decline next year, placing polar bears once again at risk of extinction. Alarmists can but hope.

Matt Ridley in the Telegraph

Matt also takes aim at the epidemiological and climate modellers, who are so fond of their worst case scenarios, in the Telegraph. He says:

The Government’s reliance on Sage experts’ computer modelling to predict what would happen with or without various interventions has proved about as useful as the ancient Roman habit of consulting trained experts in “haruspicy” – interpreting the entrails of chickens.

Again and again, worst-case scenarios are presented with absurd precision, sometimes deliberately to frighten us into compliance. The notorious press conference last October that told us 4,000 people a day might die was based on a model that was already well out of date.

Pessimism bias in modelling has two roots. The first is that worst-case scenarios are more likely to catch the attention of ministers and broadcasters: academics are as competitive as anybody in seeking such attention. The second is that modellers have little to lose by being pessimistic, but being too optimistic risks ruining their reputations. Ask Michael Fish, the weather forecaster who in 1987 reassured viewers that hurricanes hardly ever happen.

Then he identifies the tendency I have criticised here, namely the false assumption that the output of models can be treated as ‘data’:

As Steve Baker MP has been arguing for months, the modellers must face formal challenge. It is not just in the case of Covid that haruspicy is determining policy. There is a growing tendency to speak about the outcomes of models in language that implies they generate evidence, rather than forecasts. This is especially a problem in the field of climate science. As the novelist Michael Crichton put it in 2003: “No longer are models judged by how well they reproduce data from the real world: increasingly, models provide the data. As if they were themselves a reality.”

Examine the forecasts underpinning government agencies’ plans for climate change and you will find they often rely on a notorious model called RCP8.5, which was always intended as extreme and unrealistic. Among a stack of bonkers assumptions, it projects that the world will get half its energy from coal in 2100, burning 10 times as much as today, even using it to make fuel for aircraft and vehicles. In this and every other respect, RCP8.5 is already badly wrong, but it has infected policy-makers like a virus, a fact you generally have to dig out of the footnotes of government documents.

I was pointing out the parallels between climate and Covid modelling in April last year:

“They got it wrong the second time because they relied upon an epidemiological model (adapted from an old ‘flu model) which predicted 510,000 deaths from a virus which we knew virtually nothing about.

Climate change modellers never get it wrong, simply because even when their models don’t agree with reality, this is either because the observations are wrong, or because they still ‘do a reasonable job’ of modelling past and present climate change (especially when inconvenient ‘blips’ are ironed out by retrospective adjustments to the data), but principally because the subject of their claimed modelling expertise lies many years off in the future – climate change to be expected in 2050 or 2100, when the real impacts will begin to be felt. Imperial’s and IHME’s worst case scenarios look way off, just weeks after they were proposed and after governments acted on the modellers’ advice. Their assumptions are being rapidly challenged by new data and research. Nothing similar happens in climate change land. Their worst case scenario (RCP8.5), though comprehensively debunked, still lives on and is still being defended by Met Office scientists on the basis that ‘carbon feedbacks (however unlikely) cannot be ruled out’.”

Ice models and climate models combined are data points

At least, they are according to Dr Tamsin Edwards of King’s College London, writing in the Graun:

Sea levels are going to rise, no matter what. This is certain. But new research I helped produce shows how much we could limit the damage: sea level rise from the melting of ice could be halved this century if we meet the Paris agreement target of keeping global warming to 1.5C.

The aim of our research was to provide a coherent picture of the future of the world’s land ice using hundreds of simulations. 

Connecting parts of the world: the world’s land ice is made up of global glaciers in 19 regions, and the Greenland and Antarctic ice sheets at each pole. Our methods allow us to use exactly the same predictions of global warming for each. This may sound obvious, but is actually unusual, perhaps unique at this scale. Each part of the world is simulated separately, by different groups of people, using different climate models to provide the warming levels. We realigned all these predictions to make them consistent.

Connecting the data: at its heart, this study is a join-the-dots picture. Our 38 groups of modellers created nearly 900 simulations of glaciers and ice sheets. Each one is a data point about its contribution to future sea level rise. Here, we connected the points with lines, using a statistical method called “emulation”. Imagine clusters of stars in the sky: drawing the constellations allows us to visualise the full picture more easily – not just a few points of light, but each detail of Orion’s torso, limbs, belt and bow.
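The 'join-the-dots' emulation being described can be pictured in one dimension. The actual study uses Gaussian-process regression over many simulator inputs; this toy version just interpolates between made-up (warming, sea-level-contribution) pairs, which is all 'connecting the points with lines' amounts to:

```python
# Toy illustration of "emulation": fit a cheap statistical model
# through scattered simulator runs so you can predict at warming
# levels no single simulation used. The real study uses Gaussian-
# process regression in many dimensions; this sketch is 1-D
# piecewise-linear interpolation over invented numbers.

def emulate(points, warming):
    """Piecewise-linear prediction from (warming_C, slr_cm) points."""
    pts = sorted(points)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= warming <= x1:
            t = (warming - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("warming level outside the range of the simulations")

# Hypothetical simulator outputs: cm of sea level rise by 2100
runs = [(1.5, 13.0), (2.0, 16.0), (3.0, 23.0), (4.0, 32.0)]
prediction = emulate(runs, 2.5)   # between the 2.0C and 3.0C runs
```

The 'data points' being joined here are themselves model outputs, which is rather the point of this post.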

Not only are model outputs ‘data’; they are also stars in the firmament! Tamsin and the other eighty-four authors of this study are also very fond of focusing on worst case scenarios:

So, for those most at risk, we made a second set of predictions in a pessimistic storyline where Antarctica is particularly sensitive to climate change. We found the losses from the ice sheet could be five times larger than the main predictions, which would imply a 5% chance of the land ice contribution to sea level exceeding 56cm in 2100 – even if we limit warming to 1.5C. Such a storyline would mean far more severe increases in flooding.

How did they generate this particular set of ‘data points’? This is explained in the actual paper:

Given the wide range and cancellations of responses across models and parameters, we present alternative ‘pessimistic but physically plausible’ Antarctica projections for risk-averse stakeholders, by combining a set of assumptions that lead to high sea level contributions. These are: the four ice sheet models most sensitive to basal melting; the four climate models that lead to highest Antarctic sea level contributions, and the one used to drive most of the ice shelf collapse simulations; the high basal melt (Pine Island Glacier) distribution; and with ice shelf collapse ‘on’ (i.e. combining robustness tests 6 and 7 and sensitivity tests 6 and 10). This storyline would come about if the high basal melt sensitivities currently observed at Pine Island Glacier soon become widespread around the continent; the ice sheet responds to these with extensive retreat and rapid ice flow; and atmospheric warming is sufficient to disintegrate ice shelves, but does not substantially increase snowfall. The risk-averse projections are more than five times the main estimates: median 21 cm (95th percentile range 7 to 43 cm) under the NDCs (Fig. 3j), and essentially the same under SSP5-8.5 (Table 1; regions shown in Extended Data Figure 4: test 11), with the 95th percentiles emerging above the main projections after 2040 (Fig. 3d). This is very similar to projections under an extreme scenario of widespread ice shelf collapses for RCP8.5 (median 21 cm; 95th percentile range 9 to 39 cm).

I’m sorry Tamsin, but model output is not data, and your worst case scenario of glacier melt and resultant sea level rise is not physically or socio-economically ‘plausible’. Climate scientists and epidemiological modellers do not live in the same world as the rest of us, but they insist that we make plans and real sacrifices to prepare for the nightmarish world which they do inhabit, if only on a part-time basis.

Climate Crisis Update March 2021: It’s now as hot on planet earth as it was on average throughout the 30-year period 1991-2020.

This means that, this month at least, according to UAH satellite data, the world is now no warmer than it was in 2012, at the end of the long global warming ‘pause’ from 1998 to 2012. In other words, all of the global warming in the El Nino years after 2012 has been reversed. If the current run of cool months continues, it will not be long before the running 13-month mean coincides with the new 30-year climate normal of 1991-2020. This may happen around November 2021, when the great and the good of the UN IPCC and climate-concerned world leaders meet in Glasgow for COP26. They’ll be discussing how to avoid man-made Thermageddon in a world which will have refused to warm significantly in 30 years, telling us all that we must give up our cars, our jet set lifestyles (which the fascist vaccine passports will probably already have severely curtailed), our gas boilers and any hope of selling our old houses because of the introduction of new green insulation standards which make them prohibitively expensive to upgrade.
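For reference, the 'running 13 month mean' is nothing mysterious: a centred moving average over thirteen monthly anomaly values. A minimal sketch (the anomaly numbers here are invented, not UAH's):

```python
# Minimal sketch of a running 13-month mean of temperature anomalies:
# average each consecutive window of 13 monthly values.

def running_mean(values, window=13):
    """Moving average; returns one value per full window."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Invented monthly anomalies (deg C vs the 1991-2020 normal):
anomalies = [0.2, 0.1, 0.3, 0.0, -0.1, 0.2, 0.1,
             0.0, 0.1, -0.2, 0.0, 0.1, 0.0]
smoothed = running_mean(anomalies)   # one full 13-month window here
```

An odd window (13 months rather than 12) is commonly chosen so the average can be centred on a single month.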