Wednesday, December 9, 2020

Lomborg’s 2020 study: ‘Data shows trend towards ‘smaller area’ in drought’ – Also floods, hurricanes, wildfires and sea-level rise are not following climate activist claims




The only way to characterize this climate review is with the phrase 'same old, same old.' The global climate is not changing in any way that could ever be called significant. There are still large cycles at play out there, but they have always been at work.

We continue to be at risk from the chilling effect of massive volcanic events. Long term, we will need to establish better protection against all that. Yet we already have it, in the form of our huge global food reserves and our ability to ship those reserves all over the world.

A cold snap in Europe can be met by railing southern farm production north quickly. And Europe is actually the region most at risk according to the historical record. Nowhere else is this so significant.




https://www.climatedepot.com/2020/12/05/lomborg-data-shows-trend-towards-smaller-area-in-drought-also-floods-hurricanes-wildries-and-sea-level-rise-are-not-following-climate-activist-claims/





By: Admin, Climate Depot, December 5, 2020



https://www.sciencedirect.com/science/article/pii/S0040162520304157



2.3. Drought

It is instructive to look at a few concrete impacts of the most visible issues associated with the portrayal of climate change devastation. President Obama repeatedly emphasized climate change means that we both are seeing and will see “more extreme droughts, floods, wildfires, and hurricanes” (Obama 2013). The UN Secretary-General similarly claims that “climate disruption is happening now, and it is happening to all of us. … Every week brings new climate-related devastation. Floods. Drought. Heat waves. Wildfires. Superstorms” (Guterres 2019). A recent survey found that such extreme events are what make most people change their minds on climate (EPI and AP-NORC 2019).


Yet the data either does not support such claims or supports them only marginally. Moreover, there are almost invariably more effective policies to reduce net impacts.


For drought, the IPCC concludes “there is low confidence in attributing changes in drought over global land areas since the mid-20th century to human influence” (IPCC 2013a, 871). Moreover, it concludes “there is low confidence in a global-scale observed trend in drought” with drought having “likely increased in the Mediterranean and West Africa and likely decreased in central North America and northwest Australia since 1950” (IPCC 2013a, 50). The IPCC repudiated previous findings from 2007, saying our “conclusions regarding global increasing trends in droughts since the 1970s are no longer supported” (IPCC 2013a, 44). This was because new data showed no increased global drought (Sheffield et al., 2012; van der Schrier et al. 2013), and one study even showed a persistent decline since 1982 (Hao et al., 2014), while the number of consecutive dry days has been declining for the last 90 years (Donat et al., 2013, 2112). The new IPCC 1.5°C report concurs, but adds that there is medium confidence that greenhouse gas warming has contributed to increased drying in the Mediterranean region (IPCC 2018, 196).


The World Meteorological Organization has through the Lincoln Declaration on Drought Indices recommended that “the Standardized Precipitation Index (SPI) be used to characterize the meteorological droughts around the world” (Hayes et al., 2010). Fig. 9 shows the global area under severe meteorological drought for 1901–2017, showing no increase over the last 116 years.
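As an aside for readers who want to reproduce this kind of series, the sketch below shows one common way a 6-month SPI can be computed from monthly precipitation totals (fit a gamma distribution to the 6-month accumulations, then map them to standard-normal quantiles). It is a simplified illustration, not the exact procedure behind Fig. 9; the loader is hypothetical, and the gamma fit here ignores the usual per-calendar-month and zero-precipitation refinements.

```python
import numpy as np
from scipy import stats

def spi_6month(monthly_precip):
    """Illustrative 6-month Standardized Precipitation Index.

    monthly_precip: 1-D array of monthly precipitation totals.
    Returns one SPI value per 6-month accumulation window.
    Values below -1.5 are commonly classed as severe drought.
    """
    precip = np.asarray(monthly_precip, dtype=float)
    # Accumulate precipitation over rolling 6-month windows.
    accum = np.convolve(precip, np.ones(6), mode="valid")
    # Fit a gamma distribution to the accumulated totals
    # (location fixed at zero, as is conventional for SPI).
    shape, loc, scale = stats.gamma.fit(accum, floc=0)
    # Map each accumulation to its cumulative probability ...
    cdf = stats.gamma.cdf(accum, shape, loc=loc, scale=scale)
    # ... and transform it to the standard normal quantile.
    return stats.norm.ppf(cdf)

# Hypothetical usage: flag windows in severe drought (SPI < -1.5).
# precip = load_station_precip()   # hypothetical loader
# severe = spi_6month(precip) < -1.5
```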



Fig. 9. Global area in severe meteorological drought, 1901–2017, measured by the Standardized Precipitation Index (SPI) being less than −1.5 over 6 months (Watts et al., 2018). Linear best fit, not significant.


The US Fourth National Climate Assessment reaffirmed the IPCC finding and stated unequivocally that “drought has decreased over much of the continental United States in association with long-term increases in precipitation” (USGCRP 2017, 49–50, 231). Both IPCC and USGCRP find that there is currently no attribution possible for drought (IPCC 2013a, 913; USGCRP 2017, 236). Thus, it is incorrect to say that currently we are seeing the climate impact of drought, either globally or in the US.


However, the IPCC suggests with medium confidence that with extreme emission scenarios (RCP8.5), it is likely that drought risk could increase in currently dry regions towards the end of the century (IPCC 2013a, 1032). Similarly, the USGCRP finds that “under higher scenarios and assuming no change to current water-resources management, chronic, long-duration hydrological drought is increasingly possible by the end of this century” (USGCRP 2017, 240). Thus, it is possible to argue that climate change can make drought worse, but it is important to point out that this is only with high-end scenarios and towards the end of the century. Moreover, the USGCRP makes it clear that this potential worsening requires an assumption of no change to water-management. In reality, such change is not only likely but also much more efficient. A recent study for California showed that during droughts, reservoir operation can reduce the drought deficit by about 50%, whereas extensive water usage (mostly irrigation) can almost double drought duration and deficit (He et al., 2017)––both actions that can be more readily changed than CO₂ levels.


2.4. Flooding

The IPCC cannot even say whether flooding on a global level is increasing or decreasing: “There continues to be a lack of evidence and thus low confidence regarding the sign of trend in the magnitude and/or frequency of floods on a global scale over the instrumental record” (IPCC 2013a, 112, 214). The USGCRP summarizes the IPCC to say they “did not attribute changes in flooding to anthropogenic influence nor report detectable changes in flooding magnitude, duration, or frequency” (USGCRP 2017, 240). Flooding in the US has increased for some areas (the upper Mississippi River valley) and decreased for others (the Northwest). However, “formal attribution approaches have not established a significant connection of increased riverine flooding to human-induced climate change” (USGCRP 2017, 231). The new IPCC 1.5°C report finds that “streamflow trends since 1950 are not statistically significant in most of the world’s largest rivers” and that more streamflows are decreasing than increasing (IPCC 2018, 201).


For the future, USGCRP argues that given we know heavy precipitation will be increasing, it seems likely that this could “contribute to increases in local flooding in some catchments or regions” (USGCRP 2017, 242, Keigo 2018, 146). However, they also acknowledge that we don’t even know when we will be able to detect any impact from climate on flooding (USGCRP 2017, 231). The IPCC similarly concludes that global warming would lead to an expansion of the area with significant increases in runoff, which can increase flood hazards (IPCC 2018, 203), but also emphasizes that “trends in floods are strongly influenced by changes in river management” (IPCC 2013a, 214).


Again, it is simply unwarranted to posit current flooding as an example of impacts from climate change. Even in the future, this is much more strongly influenced by other human impacts like river management and extensive building on floodplains than climate change. A recent study points out that “despite widespread claims by the climate community that if precipitation extremes increase, floods must also,” it actually seems like “flood magnitudes are decreasing” (Sharma et al., 2018).


In this respect, flooding is definitely an example of the Expanding Bull’s-Eye Effect. While we don’t have global analyses, we can look to US data to show the impact of correcting for the Expanding Bull’s-Eye Effect.


In an analysis of vulnerability in Atlanta from 1990–2010, the authors find that the number of exposed housing units on the 100-year floodplain has increased by about 58% in just 20 years (Ferguson and Ashley 2017), although outside the regulatory 100-year flood zone growth was slightly higher at 71%. This means that with the same amount of flooding and all other things equal, Atlanta in 2010 would on average see 58% more housing units flooded than in 1990. We need to also consider that each house has become bigger and more valuable, meaning losses would be even higher.


If we look at the US inflation-adjusted flood costs from 1903–2018 in Fig. 10, it is apparent that costs are now 370% of what they were in 1903, from an expected cost of $3.5 billion in 1903 to $12.9 billion in 2018. This could be used to suggest that flooding is getting worse and climate might be responsible.



Fig. 10. Flood costs for the US, 1903–2018 in $2018 and in percent of GDP. Loss data from National Weather Services (NWS 2015), which discontinued its data from 2015. 2015–18 is from individual year reports, which do not seem entirely consistent with previous years (Statista 2018). GDP from (BEA 2019a; Smits et al., 2009), and when relevant adjusted for inflation by (BEA 2019b). Best estimate with 2nd order polynomial least square.


However, we must first adjust for the many more buildings being built on the floodplains. On a US scale, housing units have doubled from 68 million in 1970 to 137 million in 2017 (Census 2011, 2018a). Moreover, they have become about 50% larger since 1973 (Klotzbach et al., 2018, 1371), and the average price of almost $400,000 today is 280% of the inflation-adjusted cost in 1970 (BEA 2019b; Census 2018b). If the number of houses on floodplains has increased similarly in numbers and value, it would be reasonable to expect an increase in total costs of 280% × 200%, or 560%. Given that the Atlanta study showed that houses on the 100-year floodplain grew at only about 80% of the rate outside it, it would perhaps be more realistic to expect an increase of 280% × 200% × 80%, or 448%.


Unfortunately, we don’t have continent-wide estimates of houses in floodplains, so one simple way to adjust the flood costs is to divide the impact by total US GDP. GDP grew more slowly, with 2017 GDP at 317% of the 1970 GDP, so this is a conservative correction. Yet, the right-hand side of Fig. 10 shows a very different picture, with costs in 1903 at 0.48% of GDP, dropping almost an order of magnitude to 0.055% by 2018.
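For readers who want to trace the arithmetic, the short snippet below reproduces the two adjustments just described, using only the multipliers and GDP shares quoted in the text; it does not recompute anything from the underlying NWS or BEA series.

```python
# Back-of-envelope version of the floodplain-exposure and GDP adjustments
# described above.  All inputs are the figures quoted in the text.

# Expected growth in flood losses from more and pricier housing alone:
house_price_ratio = 2.80        # average house price, 2017 vs 1970 (inflation-adjusted)
housing_stock_ratio = 2.00      # housing units, 2017 vs 1970
floodplain_growth_share = 0.80  # floodplain housing grew at ~80% of the overall rate

naive_expected = house_price_ratio * housing_stock_ratio        # 5.6 -> "560%"
floodplain_expected = naive_expected * floodplain_growth_share  # 4.48 -> "448%"

# GDP-share view of actual flood costs (Fig. 10, right panel):
cost_share_1903 = 0.48 / 100    # flood costs as a share of GDP in 1903
cost_share_2018 = 0.055 / 100   # flood costs as a share of GDP in 2018
relative_decline = 1 - cost_share_2018 / cost_share_1903  # ~0.89

print(naive_expected, floodplain_expected, round(relative_decline, 2))
```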


Importantly, by itself this does not show whether flood events are fewer or vulnerabilities have declined, but it does show that flooding is not getting out-of-hand but rather constitutes an ever-smaller problem for the American economy.


2.5. Wildfire

A recent academic paper on wildfire summarizes:


many consider wildfire as an accelerating problem, with widely held perceptions both in the media and scientific papers of increasing fire occurrence, severity and resulting losses. However, important exceptions aside, the quantitative evidence available does not support these perceived overall trends. (Doerr and Santín 2016)


Sedimentary charcoal records spanning six continents show that global burning has declined sharply since 1870 (Marlon et al., 2008). To a large extent this is because of the so-called pyric transition, where humans stopped burning wood at home and started burning fossil fuels in power plants and cars (NAS 2017, 13). This means that today, fire has all but vanished from houses. As one fire expert points out, it is “possible to live years in a modern house without ever seeing the fires that once, almost by definition, made a house a home” (Pyne 2001, 161). By boxing in fire in engines and power stations, we have been able to reduce its presence in the rest of the world.


There is plenty of evidence for this reduction in fire (Arora and Melton 2018; Li et al., 2018; Yang et al., 2014), with satellites showing a 25% reduction in burnt area over just the past 18 years (Andela et al., 2017). As is evident in Fig. 11, the primary factor in the reduction of global burnt area over the past 110 years is humans: when people settle and plant crops, they want to avoid fire (Knorr et al., 2014), and they achieve this with fire suppression and forest management.



Fig. 11. Changed burnt area from 1901–2007, based on model runs with and without humans, climate, CO₂ and nitrogen deposits. This graph identifies how humans and climate are reducing burnt area, whereas fertilization with nitrogen and CO₂ increases the burnt area. In total, burnt area has declined by more than 1.4 million km², from almost 5 million km² in the 1900s to just above 3.5 million km² in the 2000s (Yang et al., 2014).


While deforestation has reduced the amount of forests, it is likely that fires in forests have declined even in percentage of the remaining forest areas across the past century. A recent simulation shows that the burnt area for crops and pasture has increased globally since 1900, but burnt area in secondary and especially primary land (disturbed but recovering and undisturbed land) has declined more, reducing annual burnt area by a third (Ward et al., 2018, 135). If the fraction of burning in forest and non-forest primary and secondary lands has stayed constant, this means that even forests are now experiencing less burnt area, given that forested areas have declined less at 15% (Hurtt et al., 2011, 137–138).


For nations, we have the longest forest fire data series from the US. The US National Climate Assessment’s main conclusion on wildfire is that “incidence of large forest fires in the western United States and Alaska has increased since the early 1980s” and that these are projected to increase further with higher temperatures (USGCRP 2018, 231). While the factual part of this quote is correct and the projection likely correct, it needs to be seen in context. (Yang et al., 2014, 259–260) finds that fire suppression in the US and elsewhere has about halved burnt area in the northern extra-tropics, and only in the last decades has it picked up by a couple of percent.


If we look at the entire US wildfire data set in Fig. 12, as documented by the US Forest Service, we see that while there has been an increase from 3 million acres burnt annually in the 1980s to 7 million in this decade, it is dwarfed by the 39 million acres burnt annually in the 1930s and likely even higher burn rates before that. Thus, if anything, while climate change might be increasing burn risks, it does so from a very modest level compared to historical data.



Fig. 12. Wildfire burnt area in the US 1926–2018, and estimated decadal burnt area 1900–2000. Annual data from (Census 1975, L48-55; NIFC 2019), decadal data from (Mouillot and Field 2005, 404–5). (Reynolds and Pierson 1941, Table 4) indicates that fire consumed even more of the US forests in the 19th century; see also (Marlon et al., 2012).


As in other areas we have discussed, the conversation on wildfire often leaves out the human component. A study of the relative effects on fire found that when humans are around, they override the effect of climate (Syphard et al., 2017). Specifically, only human variables were significant for fire, such as “distance to road,” “distance to developed,” “population,” and “proportion developed,” whereas “precipitation” and “temperature” were insignificant and explained essentially nothing.


The expanding bull’s-eye effect is clear in Fig. 13, which shows an increased number of people and their possessions being placed in fire’s way. In 2010, the US had 124 million housing units, 700% of the number in 1940 (Strader 2018, 549). For the Western US, 22 million homes in 2010 were 1,250% of the number in 1940.



Fig. 13. Million homes in high wildfire risk zones, 1940–2050. Data for 1940–2010 is from (Strader 2018, 557) and covers homes in medium to very high fire risk zones in the entire Western US. Data for 2010–2050 comes from (Mann et al., 2014, 447) and is a BAU growth projection of homes in California within the very high fire hazard severity zone. Notice the different scales for the Western US on the left and California on the right. The risk zones are not comparable: the Western US zones reflect actual fire risk, and since that risk is much higher around Idaho, even the worst areas in California only rate as “medium.” California zones are based on the wildfire hazard severity zones of the California Fire and Resource Assessment Program.


But what matters for risk increase is the number of houses built where fire happens. Since most of the risk is in the West, the entire US only saw about 6 million houses or 5% of its increase going into these risk zones. But within the Western US, about a third of all new homes were built in medium-to-very high fire risk zones.


Thus, in 2010, the number of houses at risk in the West was 1,150% of what it was in 1940. Even if the fire risk remains the same, we are likely to see many more structures destroyed by fire.


Another study shows the likely risk increase in a subset of the Western US, namely California, up to 2050 (Mann et al., 2014). Using physical hazard zones based on factors like vegetation density and slope severity, they project under Business-As-Usual the number of houses in the highest fire hazard zone, and find the number will likely increase by 50% to 1.7 million homes in 2050.


As Fig. 13 suggests, the two studies probably show about the same trend for California and for the Western US. That means that in 2050, the number of houses at risk of fire will be 1,700% of what it was in 1940, entirely because of more houses built in high-risk zones.


I am not aware of any US estimates for fire costs adjusted for risk increases, but several studies of bushfire (wildfire) in Australia have done exactly that (Crompton et al., 2010; McAneney et al., 2019). They find that while bushfires since 1925 have destroyed more houses and killed more people, this is because of more people and more houses in vulnerable areas. When the number of houses damaged is adjusted for the number of houses at risk, the trend in houses damaged is (insignificantly) decreasing (Crompton et al., 2010, 305). Similarly, normalized damage costs from bushfires declined (insignificantly) from 1966 to 2016 (McAneney et al., 2019, 17).


Future wildfire is estimated to increase with global warming. Globally, compared to the year 2000, a worst-case, high-warming trend will increase global burned area by 8% in 2050 and 33% in 2100 (using RCP8.5 and changes in managed lands) (Kloster and Lasslop 2017, 64). In California, compared to 1961–1990, global warming by itself will increase median burned area by 15–20% in the middle of the century, and 40% towards the end of the century (Bryant and Westerling 2014, Fig. 2). But the 15–20% climate-driven increase for California from 1976–2050 is rather small compared to the almost 300% increase in the number of houses in the highest hazard zone over the same period (Fig. 13). This shows that planning decisions on where to place future housing growth are much more important than the climate impact. That is also the conclusion of a study on future wildfire risk in California: “the effects of growth scenarios tend to dominate those of climate scenarios” (Bryant and Westerling 2014).


Wildfire has declined dramatically, both globally and for the US, over the past century. While it is likely global warming will increase wildfire somewhat in the future, the much larger impact will come from planning decisions of whether to allow much more housing in high-risk areas.


2.6. Hurricanes

Hurricanes, or tropical cyclones, are the costliest catastrophes in the world. The cost of US landfalling hurricanes alone constitutes two-thirds of the entire global catastrophe losses since 1980, according to global reinsurer Munich Re (Weinkle et al., 2018). Hurricanes Katrina, Sandy, Harvey, Irma, and Florence have all been used as examples of how global warming is making extreme weather worse––perhaps most pithily in the Bloomberg Businessweek cover on Hurricane Sandy, with a picture of a blacked-out New York and letters in font size 300: “It’s Global Warming, Stupid” (Barrett 2012).


Yet, this is not what the peer-reviewed literature says. The IPCC concludes that we cannot confidently attribute hurricanes to human influence: “There is low confidence in attribution of changes in tropical cyclone activity to human influence” (IPCC 2013a, 871). Indeed, globally, hurricanes are not getting more frequent: “current data sets indicate no significant observed trends in global tropical cyclone frequency over the past century” (IPCC 2013a, 216). However, they do find that “frequency and intensity of storms in the North Atlantic have increased” but because of particulate air pollution (IPCC 2013a, 50, 7). We cannot blame this storm increase in the Atlantic on climate: “the cause of this increase is debated and there is low confidence in attribution of changes in tropical cyclone activity to human influence” (IPCC 2013a, 113).


The US National Climate Assessment agrees that hurricane activity in the Atlantic has increased, but attribution is not currently possible (USGCRP 2017, 259, 258).


The latest paper confirms this: “currently we cannot attribute changes in North Atlantic hurricane intensity to human-related forcings” (Trenary et al., 2019). The Geophysical Fluid Dynamics Laboratory similarly tells us that not only is attribution not yet attainable, but we will not know for at least a couple of decades (GFDL/NASA 2019). They tellingly conclude: “the historical Atlantic hurricane frequency record does not provide compelling evidence for a substantial greenhouse warming-induced long-term increase.”


Moreover, as Fig. 14 shows, the number of continental US landfalling hurricanes shows no trend in frequency or intensity; in fact, the trend shows a slight, statistically insignificant decline for both all hurricanes and major (category 3 and up) hurricanes (Klotzbach et al., 2018).



Fig. 14. Number of continental United States landfalling hurricanes 1900–2019. Left, all hurricanes, right, major hurricanes (category 3 and above), with (insignificant) regression lines, (Klotzbach et al., 2018), with 2018–19 from personal communication with authors.


So while Bloomberg and many news media sources confidently claim that hurricanes are being exacerbated by global warming, it would be more helpful to look at the bull’s-eye, which is definitely expanding.


As Fig. 15 shows, the US population since 1900 has more than quadrupled. But moving to the coastline has clearly been much more alluring. The population of all the coastal counties from Texas to Virginia on the Gulf and Atlantic coast has seen population increase from less than 2 million to more than 31 million in 2020, 1,640% of the 1900 population. There are now many more people living in Dade and Broward counties in South Florida than the entire coastal populations from Texas to Virginia in 1940. Incredibly, Florida’s 35 coastal counties have increased a phenomenal 67.7 times, from less than a quarter-million to over 16 million in 2020.



Fig. 15. Population index (1900=1) 1900–2020 for the US, the 123 coastal counties on the Gulf and Atlantic coast until and including Virginia, and the 35 coastal counties in Florida. Data 1900–2010 from (Census 1992, 2010, 2012), and 2020 data for the US from the 2017 population prediction (Census 2017) and using linear extrapolation for each county in the two other data sets.


Clearly, when a hurricane hit in the past, it would only affect a much smaller number of people––if a hurricane ripped through Dade and Broward today, it would in some way be the equivalent to a hurricane ripping through the entire Gulf and Atlantic coast in 1940.


Housing units on the coast have similarly seen a spectacular increase (Freeman and Ashley 2017). In 1940, there were 4.4 million units within 50 km of the coast all the way from Texas to Maine. In 2000, the 26.6 million units were 600% of the 1940 number. And almost everyone wants to live close to the coast––the first 50 km have twice as many houses as the next 150 km inland.


That many more people live in the paths of hurricanes with many more (and more expensive) houses goes a long way to explain why the cost of hurricanes keeps going up as seen on the left in Fig. 16. This data is often used to suggest that global warming is making hurricanes worse and more damaging.



Fig. 16. Left side, cost of all landfalling hurricanes in the continental US from 1900–2019 in $2019. Right side, same hurricanes, cost if they had hit the US as it looks today (Pielke and Landsea, 1998, 199; Klotzbach et al., 2018; Weinkle et al., 2018, 2005), with 2018–19 from personal communication with Pielke. Dotted line is linear best fit.


But correcting for the many more people and more expensive houses tells a very different story. Consider the Great Miami Hurricane of 1926, which tore through downtown Miami. Because only about 100,000 people lived there at the time, with less costly houses, the inflation-adjusted damage ran to $1.3 billion. However, modeling the cost of the very same hurricane tearing down the same path today would make it the costliest US catastrophe ever, with damage worth $254 billion. Modeling all 212 US continental hurricane landfalls as if they landed in today’s setting of people and infrastructure corrects for the expanding bull’s-eye and shows that there is no significant increase in hurricane-adjusted costs: this can be seen on the right in Fig. 16. Similar results are found for Australia (McAneney et al., 2019) and China (Chen et al., 2018).
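The normalization behind the right-hand panel of Fig. 16 follows the general Pielke/Weinkle/Klotzbach approach of scaling each historical loss by the growth in prices, coastal population, and inflation-adjusted wealth per capita of the affected area. The sketch below shows the basic form of such a scaling; the index dictionaries and the example call are hypothetical placeholders, not the paper's actual data.

```python
def normalize_loss(reported_loss, year, base_year,
                   price_index, population_index, wealth_per_capita_index):
    """Illustrative Pielke-style normalization of a historical hurricane loss.

    Scales a loss reported in `year` to what the same storm would cost in
    `base_year`, given index series (dicts keyed by year) for prices,
    affected-area population, and real wealth per capita.
    """
    return (reported_loss
            * price_index[base_year] / price_index[year]
            * population_index[base_year] / population_index[year]
            * wealth_per_capita_index[base_year] / wealth_per_capita_index[year])

# Hypothetical usage for the 1926 Great Miami Hurricane discussed above:
# normalize_loss(loss_1926, 1926, 2019, cpi, dade_broward_population, real_wealth_pc)
```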


Looking to the future, the IPCC finds that the best, but weak, evidence suggests that hurricanes will become fewer but more intense, as does USGCRP and GFDL (GFDL/NASA 2019; IPCC 2018, 178; Keigo USGCRP 2018, 257). This will lead to more costly hurricanes. But as population keeps growing and the number of houses 50 km from the coast could more than double this century, these changes will increase damages much more, swamping the climate signal. In one recent model (Gettelman et al., 2018), the researchers first take out social change, so society stays as it is today, and explore what will then happen with hurricane damages from much increased sea temperatures that could take place in 2070–90. They find that total global damages from hurricanes will increase from $67 billion to $97 billion, a 45% increase. However, the impact is much worse if we keep temperatures as they are today but let society grow richer, with more people and goods in harm’s way. This will cause hurricane damage to grow much faster to about 300% of today’s cost.


Simulating stronger adaptation as societies grow richer and expressing the costs in percent of GDP makes this point even clearer. Today, hurricanes cost about 0.04% of global GDP (Bakkensen and Mendelsohn 2016; Mendelsohn et al., 2012). Over the century, society will keep getting richer and able to afford to spend more resources on resilience and adaptation. If we assume hurricanes stay as today (no climate change), global hurricane damages in 2100 will make up a much lower cost share of 0.01% of GDP. However, if we expect stronger but fewer hurricanes, along the lines of IPCC’s projections, the global cost share will increase to 0.02% of GDP. Taking a step back, climate change will make future hurricanes more damaging (0.02% instead of 0.01%), but because the world is getting much richer, hurricanes will have a lower cost share in 2100 than they do today (0.02%, not 0.04%).


Thus, decision-makers should consider how best to reduce hurricane damage: through climate policy that reduces future temperature rises, or through social policies that reduce vulnerability through adaptation or by lifting people out of poverty. As we will see later, even very strong climate policy costs a lot but delivers only a small temperature impact many decades from now. Therefore, it turns out that social policies are typically much more effective––for some interventions, a dollar spent on reducing vulnerability can help 52 times more than one spent on climate policy (Pielke 2007).


2.7. Becoming more resilient: wealth and human ingenuity

When looking to the future, it is easy to foresee problems but harder to envisage solutions. We started this section looking at coastal flooding with and without adaptation (Section 2.2). As sea levels rise, it is easy to think more people and structures will be flooded. The reality is that humanity is ingenious. Richer people will have more options to be resilient, protecting land, valuables, and people, resulting in fewer people flooded and a lower damage fraction of GDP.


We will see this globally documented below, resulting in fewer climate-related deaths and a lower weather-related fraction of GDP costs. But here, it is worth noticing how this resilience is already playing out in a myriad of local and global settings.


Globally, over the past 30 years, rising sea levels have not resulted in more land underwater. Adding up all the coastal land lost and reclaimed, it turns out that the total coastal area has increased by more than 13,000 km² (Donchyts et al., 2016). This is perhaps most visible in the world’s largest coastal reclamation, the 80 km² of Palm Island and adjacent islands along the coast of Dubai, but across the world, many countries have shaped and extended their coastlines by land reclamation. Bangladesh, despite popular understanding, has added a net of about 480 km² of land in the face of sea level rise.


Locally, we see this adaptation most clearly where catastrophes have made sea level rise much faster. The small Ubay island at the center of the Philippines has never been more than a sand bank, at high risk of flooding (Esteban et al., 2019). After a 7.2 magnitude earthquake in 2013, it experienced land subsidence probably in excess of 1 m, so that it is now completely submerged during the highest tides of the year.


Despite attempts to relocate the impoverished community to the mainland, residents have remained, adopting century-old adaptation strategies of elevating the floors of their houses using coral stones, or placing their houses on stilts. At the same time, they elevate their belongings using especially adapted furniture, and use elevated pathways that traverse the island, so they are still mobile during high tides, collecting rainwater in water tanks. They have also adapted their evacuation strategies from being evacuated only in strong typhoons to evacuating to the mainland in weaker storms. One meter of sea level rise leaves communities on low-lying islands worse off, but even if they are very poor, they can adapt and essentially ward off most of the negative impacts.


Resilient adaptation is even achievable for richer but still developing nations. Indonesia’s capital Jakarta, home to more than 12 million people, has for decades seen land subsidence from groundwater extraction and the load of buildings and constructions compressing the soil (Abidin et al., 2011). Since 1925, the coastal half of the city has subsided by 2–4 m (Andreas et al., 2018; Fig. 2). Coastal GPS indicate subsidence rates of about 10 cm per year (Abidin et al., 2011, 1765). The huge coastal development area of Pantai Mutiara has subsided by 1.8 m in the last 19 years, now having a mean elevation of 31 cm below sea level (Park et al., 2016).


Yet, Jakarta has largely tackled this through building dikes and elevating port wharfs, along with elevating new construction areas, while reclaiming land with new luxury projects (Esteban et al., 2019). (Of course, ending groundwater extraction would by far be the most effective way to stop Jakarta from sinking further.)


Stepping back, Jakarta is dealing with a relative sea level rise of 1.8 m in just a few decades, much more than worst-case global projections for the next eight decades. It is doing so with ingenuity and technological capability supported by more financial resources.


A case from a rich country comes from the 2011 Tohoku earthquake and tsunami, which not only devastated Japan but also caused substantial subsidence, lowering the northeastern part of Japan by 78 cm (Esteban et al., 2019). This left much of the coastline barely above sea level and large parts flooded at high tides. The Japanese government responded with a massive program of public works, raising some areas by up to 8 m and whole cities by 3 m, ensuring that no ground was lost to the sea (Esteban et al., 2015; Esteban et al., 2019). Of course, rich world wealth makes such an effort possible, but the fact that 200 km of coastline can be raised in a matter of years shows that adapting to the much slower effective sea level rise expected over a century is possible and feasible.


2.8. Becoming more resilient: fewer deaths

When establishing the seriousness of the impact of a catastrophe, maybe the single most important human measure is the death toll or, more technically, the excess death rate (Ó Gráda 2010, 86–87). Unlike most other measures like “people affected” it is not subject to shifts in social constructs. Nonetheless, evidence shows that demographic calculations typically lead to lower estimations of excess deaths than those provided by journalists and other contemporary observers (WPF 2019). Reaching back in history, more of the estimates are provided by observers rather than based on academic study, which possibly gives earlier data an upward bias. On the other hand, it is also likely that going back in time, the historical record leaves out more catastrophes, and that earlier events are also less likely to be recognized or recorded as catastrophes, conceivably causing a downward bias (Hasell and Roser 2017).


The leading database for all catastrophes is the International Disaster Database, commonly known as the Emergency Events Database (EM-DAT 2019). From 1900 to 2019 it lists 38.6 million deaths from disasters. About 39% are labeled biological (viruses and bacteria) or what the database calls “complex,” which is almost entirely the politically enforced 1932 starvation in the Soviet Union (the Holodomor). The other 23.4 million deaths fall into four main categories: 50% droughts, 30% floods, 11% earthquakes, and 6% storms, with 3% from all other causes (such as avalanches, heat waves, mudslides, etc.).


Splitting these disaster deaths into those that could be affected by climate (that is, weather disasters that a changing climate could influence) and those that could not, and averaging deaths across decades (given the high year-on-year variance), we get the graphs shown in Fig. 17.
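A sketch of that bookkeeping, assuming an EM-DAT-style extract with one row per disaster, is shown below; the column names and category labels are hypothetical, and the real EM-DAT classification is more detailed.

```python
import pandas as pd

# Sketch of the decade-averaging behind Fig. 17, assuming a table with one
# row per disaster (column names here are hypothetical placeholders).
CLIMATE_TYPES = {"Flood", "Drought", "Storm", "Wildfire", "Extreme temperature"}

def decadal_death_averages(df: pd.DataFrame) -> pd.Series:
    """df is assumed to have columns 'year', 'disaster_type', 'deaths'."""
    out = df.assign(
        climate_related=df["disaster_type"].isin(CLIMATE_TYPES),
        decade=(df["year"] // 10) * 10,
    )
    totals = out.groupby(["decade", "climate_related"])["deaths"].sum()
    # Divide by the number of years observed in each decade (approximates
    # calendar years; the final 2010-2018 "decade" has only nine years).
    years_per_decade = out.groupby("decade")["year"].nunique()
    return totals.div(years_per_decade, level="decade")
```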



Fig. 17. Climate and non-climate-related deaths and death risks from disasters 1920–2018, averaged over decades. Data comes from EM-DAT (2019), using floods, droughts, storms, wildfire, and extreme temperatures for climate-related deaths, and earthquakes, tsunamis, and volcanos for non-climate-related deaths. Average per decade 1920–29, 1930–39 up to 2010–2018, with data plotted at midpoint (1924.5, 1934.5, with the last incomplete decade at 2014). For instance, the 2004 tsunami, which killed 227,000 people, shows up as 22,700 people each year for 2000–2009. However, the tsunami “only” contributed about half of all non-climate-related deaths in the 2000s, which totaled 454,000, making the annual non-climate-related deaths for the 2000s 45,400. Population data from (OurWorldInData 2019).


In the right panel, we see that the annual death risk for a single person from both climate-related and non-climate-related disasters has declined, indicating a lower social vulnerability. However, climate-related risks have declined much more: over the past century, the non-climate risk has declined by 85% but the climate risk has declined by an astounding 99%. Had a person lived her entire 70-year life at the climate-related risk of the 1920s, she would have had a 1.7% chance of dying from a climate-related catastrophe. Living at the risk of the 2010s, the lifetime risk of dying from a climate-related disaster was 0.018%.
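The lifetime figures follow from compounding a constant annual risk over a 70-year life. The annual risks in the snippet below are back-solved from the lifetime numbers quoted above, purely as an illustration; they are not the underlying EM-DAT rates.

```python
def lifetime_risk(annual_risk, years=70):
    """Probability of dying from the hazard at least once over `years`,
    assuming a constant, independent annual risk."""
    return 1 - (1 - annual_risk) ** years

# Annual climate-related death risks back-solved from the lifetime
# figures quoted in the text (illustrative values only):
risk_1920s = 2.45e-4   # gives ~1.7% over a 70-year life
risk_2010s = 2.6e-6    # gives ~0.018% over a 70-year life

print(round(lifetime_risk(risk_1920s) * 100, 2))   # ~1.70
print(round(lifetime_risk(risk_2010s) * 100, 3))   # ~0.018
```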


In total numbers, the decline is smaller (as the global population has quadrupled), but still impressive at reducing global deaths from climate-related disaster from almost half a million people each year to less than 20,000 per year in the 2010s––a reduction of 96%. For non-climate-related deaths, the reduction is about 50% from the 1920s to the 2010s, but the trendline is almost flat.


It is to be expected that it is much harder to avoid death from non-climate-related disasters, since these are mostly earthquakes that are hard to predict. Hence, only better building standards can help. However, the large reduction in climate-related deaths from disasters shows a dramatic increase in climate resilience, likely mostly brought about by higher living standards, a reduction in poverty, improvement in warning systems, and an increase in global trade, making especially droughts less likely to turn into widespread famines.


The same declining trend for climate-related mortality rates is found across individual hazards from flood, flash flood, and coastal flood, over heat and cold deaths to drought and wind damage (Formetta and Feyen 2019). A 10-year moving average from 1980 to 2016 shows a 6.5-fold reduction in the mortality rate (ranging from a twofold reduction in floods to a 16-fold reduction in flash floods).


We often forget how much of the world was devastated by famines in previous centuries. Although famine outside of wartime disappeared from the developed world after the mid-nineteenth century (Ó Gráda 2010, 8), large famines continued in poorer countries, with famines in the late 1870s killing more than 7 million in India and 9.5–13 million in China (Ó Gráda 2010, 21). Even the 1928–30 drought was described by the Committee of the China Famine Relief Fund as “one of the most wide-spread and severe famines in many decades,” spreading inland to the upper reaches of the Yellow River, Inner Mongolia, Gansu, and Shaanxi, where “three successive harvest…failed to materialize,” leaving more than 50 million people in total “severely affected” (Fuller 2015, 157–58). In total, the Famine Trends Dataset estimates 5.5–10 million dead (WPF 2019), with EM-DAT conservatively counting 3 million deaths.


Fig. 17 shows that we are now much less vulnerable to climate impacts than at any time in the last 100 years. It is possible that climate change has made impacts worse over the last century (although the discussion on floods, droughts, wildfire, and hurricanes suggests this is not the case), but resiliency from higher living standards has entirely swamped any potential climate impact.


2.9. Becoming more resilient: impact costs

The second-most important impact measure after deaths is the total cost. In the US, one such measurement of costs from climate impacts is the heavily promoted “Billion-Dollar Weather and Climate Disasters” (NCEI 2019). This time series shows how the number of disasters costing an inflation-adjusted billion dollars or more has increased from about three in the early 1980s to about 15 in the late 2010s, and is commonly referenced to show how increasing temperatures cause more climate damage. In early 2019, a Washington Post article was distributed across the US with the telling headline: “More billion-dollar US disasters as world warms” (Dennis and Mooney 2019). In strong language, the journalists outline how the “number of billion-dollar weather disasters in the United States has more than doubled in recent years, as devastating hurricanes and ferocious wildfires that experts suspect are fueled in part by climate change have ravaged swaths of the country,” citing an “alarming trend” which is “fueled, at least in part, by the warming climate.”


In an economic commentary, Zagorsky (2017) critiques the NCEI billion-dollar disaster statistic:


Even with the inflation adjustment, a key reason we have more costly disasters is simply that the economy is much bigger today than it was in the 1980s.


When the economy was smaller, disasters caused less economic damage. There were fewer homes, factories, and office buildings to destroy, so it was harder for a natural disaster to cause a billion dollars’ worth of damage.


Since 1980, the U.S. economy has more than doubled. … In other words, a storm happening today will cause more damage than an identical one occurring decades earlier simply because there is more to destroy.


He suggests a simple adjustment: set a threshold each year equivalent to the same fraction of total GDP that an inflation-adjusted billion dollars of destruction represented in 1980. That means that a billion-dollar disaster in 1980 would have caused $2.3 billion in costs in 2010, in a 2.3-times-bigger US economy. Thus, we should only count disasters with a cost higher than $2.3 billion. This dramatically reduces the number of 1980-equivalent billion-dollar disasters in recent years.


While Zagorsky does not present any statistical test, it is easy to replicate his data, and his adjustment shows that the highly significant increase in billion-dollar disasters disappears. From a linear regression showing a highly significant extra billion-dollar disaster every four years, with an R² of 0.54, we get an insignificant, slight upward slope of one extra billion-dollar disaster every 25 years, with an R² of 0.06.
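A sketch of the recounting procedure described in the previous paragraph is given below, assuming simple year-keyed inputs; it reproduces the logic of the GDP-scaled threshold rather than the NCEI data or Zagorsky's exact replication.

```python
# Zagorsky-style recounting of "billion-dollar" disasters: the $1bn (1980,
# inflation-adjusted) threshold is scaled by real GDP growth, so a qualifying
# disaster must destroy the same *share* of the economy as $1bn did in 1980.
# Inputs are hypothetical dictionaries keyed by year; no NCEI data are
# reproduced here.

def gdp_scaled_threshold(year, real_gdp, base_year=1980, base_threshold=1e9):
    return base_threshold * real_gdp[year] / real_gdp[base_year]

def count_adjusted_disasters(costs_by_year, real_gdp):
    """costs_by_year: {year: [inflation-adjusted disaster costs in dollars]}"""
    return {year: sum(cost >= gdp_scaled_threshold(year, real_gdp)
                      for cost in costs)
            for year, costs in costs_by_year.items()}

# For example, a $2.0bn disaster in 2010 would not count, because the
# threshold is roughly $1bn * 2.3 = $2.3bn in a 2.3-times-larger economy.
```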


Moreover, as shown above with the expanding bull’s-eye effect, GDP is sometimes likely to provide an insufficient adjustment, and at times this will be spectacularly insufficient. While the average GDP per person in the US increased 8.5 times from 1900 to 2016 and population increased 4.25 times, we would expect about 36 times as much damage (8.5 × 4.25) from more people and more expensive property. However, when Florida’s coastal population increased more than 64 times, we should expect 544 times more damage from hurricanes in Florida. Using GDP will under-adjust by 15 times.
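The under-adjustment arithmetic can be checked directly from the multipliers quoted above:

```python
# Multipliers quoted in the text (US 1900-2016; Florida coastal counties).
gdp_per_person_growth = 8.5
us_population_growth = 4.25
florida_coastal_pop_growth = 64

expected_from_gdp = gdp_per_person_growth * us_population_growth           # ~36x
expected_for_florida = gdp_per_person_growth * florida_coastal_pop_growth  # ~544x

print(round(expected_for_florida / expected_from_gdp))  # ~15, the under-adjustment factor
```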


Thus, for the US, it is a better option to use the existing and specific data available with the relevant adjustments, which we saw above shows no significant signs of adjusted increase for hurricanes, floods, and droughts.


However, GDP adjustment is the only option for effectively comparing disaster costs across the world (Pielke 2019). Moreover, it is also how all the UN member nations have decided to measure progress on making cities and human settlements safe and resilient in Goal 11.5: “decrease the direct economic losses relative to global gross domestic product caused by disaster” (SDGs 2015), and in its indicators for reducing vulnerability to climate-related extreme events (IAEG-SDGs 2019, 1.5.2).


Since data before the mid-1990s are not complete (Pielke 2019, 2), we start the analysis in 1990, although that may bias the analysis towards showing escalating disasters over time. However, the analysis, shown in Fig. 18, clearly demonstrates that global weather-related costs have not increased over the past 28 years. They have most likely declined, from 0.26% of global GDP in 1990 to 0.19% in 2018. The other global disaster estimate, from Aon Benfield, is only available from 2000–18 (AonBenfield 2019). While it is generally a third higher than Munich Re, the two series are closely matched (R² = 0.90), and backcasting with the Munich Re data to recover 1990–99 yields an only slightly faster decline, from 0.34% to 0.27%.
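The backcasting step can be sketched as a simple linear fit over the overlapping years, applied to the earlier Munich Re values; the array names below are hypothetical placeholders for the two loss series (in percent of GDP).

```python
import numpy as np

def backcast(target_overlap, reference_overlap, reference_early):
    """Fit target ~ reference over the overlapping years, then use the fit
    to estimate the target series for the earlier reference values."""
    slope, intercept = np.polyfit(reference_overlap, target_overlap, deg=1)
    return intercept + slope * np.asarray(reference_early)

# Hypothetical usage with the two loss series discussed above:
# aon_1990_99_est = backcast(aon_2000_18, munich_2000_18, munich_1990_99)
```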



Fig. 18. Global weather-related disaster cost share of global GDP 1990–2018. Costs from 1990–2017 from Munich Re in (Pielke 2019), 2018 costs from (MunichRe 2019), global GDP from (Worldbank 2019), using the latest World Bank Global Economic Prospects GDP from January 2019 to estimate global GDP for 2017 and 2018. Linear best estimate, decline is not statistically significant.


The same declining trend for climate-related losses in percent of exposed GDP is found across individual hazards using Munich Re data. A 10-year moving average from 1980 to 2016 shows a 4.5-fold reduction averaged across all hazards (Formetta and Feyen 2019, Table B2).


Thus, on the resiliency indicator as agreed by all nations in the SDGs, the cost of weather-related disasters relative to global GDP has not increased, and likely decreased. Again, this does not indicate that there are no relative increases in weather disasters (although the above discussions on droughts, floods, wildfires, and hurricanes also showed little or no increase globally), but only that resiliency has outpaced any such increase.


3. Global warming’s total impact on current and future welfare

There is a literature going back almost 30 years trying to estimate the total costs of the impacts of climate change (Cline 1992; Nordhaus 1991; Nordhaus and Moffat 2017; Tol 2009). These estimates typically try to capture the most important and highest-cost impacts, such as agriculture, sea-level rise, energy, and forestry. Some, such as the FUND model, also include cost impacts from water resources, tropical storms, extratropical storms, biodiversity, cardiovascular and respiratory diseases, vector-borne diseases (such as malaria), diarrhea, and migration. Others, such as the PAGE model, attempt to include the costs of potential discontinuities, such as the Greenland ice sheet melting rapidly (Diaz and Moore 2017).


3.1. The climate damage functions from integrated assessment

The UN Climate Panel did a survey of all the relevant studies estimating the net costs of global warming at different global temperatures (IPCC 2014a, 690), and Fig. 19 shows the updated version. It indicates that now (with about 1°C of global temperature increase), it is not even certain whether the net global impact is positive or negative, but it is certainly not a large negative. The impact of 1.5°C is likely slightly negative: the latest IPCC report found that the cost of 1.5°C was 0.28% of global GDP (IPCC 2018, 256; Watson and Quere 2018, 23).



Fig. 19. Total impact from temperature increase measured in percent of global GDP, based on 38 published estimates in the literature (Nordhaus and Moffat 2017), which is an update of (IPCC 2014a, 690, SM10-4). Size of circles shows the weight of the individual studies (larger circles for the latest estimates using independent and appropriate methods, smaller circles for earlier estimates, secondhand studies, or less appropriate methods). The best regression is estimated using median quadratic weighted regression (quantile regression). The adjusted best regression adds 25 percent to the monetized damages to reflect non-monetized impacts (Nordhaus and Sztorc 2013, 11).


Nordhaus and Moffat (2017) examine a number of different ways to parametrize the data, settling on the median quadratic weighted regression shown in Fig. 19. It estimates the cost of 4°C (which is likely what we will see at the end of the century without any additional climate policies) at a 2.9% GDP loss. They point out that while most studies include key sectors, none include all sectors, with many non-market impacts in particular missing, including losses from biodiversity, ocean acidification, and melting permafrost. While it stands to reason that the most costly sectors would have been modeled, the estimates are likely to understate the true damages. To adjust for that, Nordhaus and Moffat add 25% in damages, which, while consistent with estimates from other studies (Nordhaus and Sztorc 2013, 11), is still somewhat of a judgment call, since it essentially estimates what has not been analyzed. However, this means the best estimate for the damage of 4°C is a GDP reduction of 3.64%. For comparison, the 1.5°C IPCC report finds the cost of unmitigated warming by 2100 to be 2.6% of GDP (at a slightly lower 3.66°C) (IPCC 2018, 256).
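The fitted relationship is roughly quadratic in temperature. The coefficient in the sketch below is back-solved from the 2.9% loss at 4°C quoted above, with the 25% add-on applied on top; it illustrates the shape of the damage function, not the published regression itself.

```python
# Quadratic damage function implied by the quoted figures: a 2.9% GDP loss
# at 4 C gives a coefficient of 2.9/16 percentage points of GDP per degree
# C squared.  Back-solved illustration, not the published fit.

DAMAGE_COEF = 2.9 / 4.0**2   # ~0.18 %GDP per (deg C)^2

def gdp_loss_pct(warming_c, include_unmodeled=True):
    """GDP loss in percent, optionally with the 25% non-monetized add-on."""
    loss = DAMAGE_COEF * warming_c**2
    return loss * 1.25 if include_unmodeled else loss

print(round(gdp_loss_pct(4.0, include_unmodeled=False), 2))  # 2.9
print(round(gdp_loss_pct(4.0), 2))                           # ~3.62, close to the 3.64% above
```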


3.2. Agreement across integrated assessment models

The models that attempt to estimate the climate impacts and monetize them are known as integrated assessment models (IAMs, e.g. IPCC 2014b, 422ff). There are about 20 IAMs (Weyant 2017), but many are more detailed process IAMs; here we will focus on the three best-known and most widely used cost-benefit IAMs, which have also been used by the US government to estimate the Social Cost of Carbon (IWG 2016), namely DICE (Nordhaus and Sztorc 2013), FUND (Tol and Anthoff 2014), and PAGE (Hope 2011).


Because IAMs can estimate the impact costs from unmitigated climate and the policy costs from climate mitigation, they can help identify the optimal climate policy, which we will look at below. But their damage module can also help identify the global costs for different temperatures. It is often pointed out that while DICE and PAGE estimate similar levels of total damages, FUND projects much lower impacts, with global net benefits at lower levels of warming. However, this turns out to mostly rest on the fact that FUND models dynamic vulnerability, expecting richer populations to be less affected by most challenges (Diaz and Moore 2017; Tol 2002). For instance, the load of vector-borne diseases like malaria might increase as temperature increases, but when a population becomes sufficiently rich, it can afford an effective health care that essentially eradicates malaria.


In Fig. 20, we can see that the three IAMs, when they all include dynamic vulnerability, have about the same cost structure. Similarly, leaving out dynamic vulnerability for all three IAMs (the versions in dotted lines) indicates higher costs, but the results are still fairly similar. Notice that PAGE explicitly includes catastrophic impacts after 3°C, which further emphasizes that these cost estimates are reasonable estimates of the full impact of temperature increases, including potential catastrophes. Moreover, PAGE leaves out adaptation, which would again lower the cost estimates slightly.



Fig. 20. Impact from temperature change for three IAMs, measured in percent of global GDP, for both dynamic (solid) and static (dotted) vulnerability (Diaz and Moore 2017, Fig. 2c). FUND originally includes dynamic vulnerability, whereas the solid DICE and PAGE are estimated. DICE and PAGE originally include static vulnerability, and the dotted FUND is estimated. The black curve denoted “Nordhaus” is the best estimate including omitted damages from Fig. 19.


In the following we will use the blue cost estimate from Nordhaus including omitted damages in Fig. 19, based on the literature review of available cost estimates, which is also marked as solid black in Fig. 20. This is, if anything, an overestimate as it both leaves out dynamic vulnerability and some adaptation, both of which would lead to lower estimates of costs.


3.3. Catastrophes, biodiversity, ocean acidification missing from the GDP costs?

While the costs in Figs. 19 and 20 are expressed in percent of GDP, this does not mean they are all monetary costs. Some, like changes in heating and cooling costs or hurricane damages, are clearly monetary costs. But others, such as cost of wetlands loss and biodiversity loss in FUND, are not. They are constructed to be understood as equivalent to an experienced welfare loss — that is, when we talk about a specific climate impact resulting in a 0.1% loss of GDP, it means the impact will have the same disutility as an income reduction of 0.1% of GDP.


Yet, a common objection to Nordhaus’ cost curve is that many important costs such as catastrophes, biodiversity loss, and ocean acidification have been left out (Diaz and Moore 2017; Weyant 2017). This objection is rather weak, for three reasons.


First, many of these problems are actually assessed in some or all the IAMs. For instance, all three models include some estimate of catastrophic impacts. PAGE includes an explicit estimate for the costs of increased risks of tipping points, such as the Greenland ice sheet disintegration and a disruption of the monsoon or of the thermohaline circulation. It is modeled as an increasing probability, starting at 3°C, of a 15% GDP loss. For 4°C it reaches a cost of 0.71% GDP. DICE includes catastrophic impacts in its net damage based on Nordhaus’ survey of catastrophic outcomes (Nordhaus 1994). FUND similarly includes the costs of catastrophic outcome through tails of its parameter distributions.


Second, many of these omitted impacts are rather small. Nordhaus recently estimated the cost of one such catastrophe: the Greenland Ice Sheet entirely disintegrating over the next two thousand years (Nordhaus 2019). It shows that although the costs can mount to hundreds of trillions of dollars by the third millennium, by 2100 it will have a trivial impact. It will, through higher sea level rise, have a cost of $91 billion or 0.012% of GDP (and even by the year 3000 the cost incurred is rather modest at 1.3% of GDP).


Likewise, FUND includes a modeled cost of biodiversity loss of 0.21% GDP by 4°C (Diaz and Moore 2017). Compare this to the current annual global domestic spend on biodiversity at $20 billion (UNEP 2014, 435) or 0.027% of GDP, and the spend on all biodiversity and ecosystem services including development aid and agricultural subsidies at $52 billion (Parker et al., 2012, 29) or 0.077% of GDP. Despite UNEP calling for increasing investments to $150 billion–$440 billion per year (0.18–0.51%) for biodiversity, the spend has stayed almost constant at 0.027% (UNEP 2014, 435), indicating that the current revealed willingness to pay for securing biodiversity is not much higher than the present 0.027%.


The cost of acidification is not included in any of the models, and it has so far only been estimated in one study (Colt and Knapp 2016), which estimates what complete ocean acidification would imply, assuming 2,000 ppm CO₂ and an average ocean pH of 7.5 by the year 2200. The authors model a complete collapse of commercial marine capture fisheries, a complete collapse of recreational and subsistence marine capture fish harvests, and a complete loss of tourism and recreation from coral reefs. This would not spell the end of fish consumption, because even very high acidification would have minimal impact on aquaculture, which already controls all or most of its inputs, such as buffered water, and which produces two-thirds of the total global first-sale value of fish (FAO 2018, 2). Yet the loss of ocean fisheries, recreational and subsistence fisheries, and all coral reef tourism and recreation is not trivial. The researchers estimate the worst-case cost of this complete collapse at $301 billion. Assuming 22nd-century growth similar to the middle-of-the-road SSP2 scenario, global GDP by 2200 would be about $4,040 trillion, meaning that the worst-case cost of a complete marine fisheries and coral reef tourism collapse in 2200 is equivalent to a loss of 0.0075% of GDP.
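The final share follows directly from the two figures, and the $4,040 trillion corresponds to roughly 2% annual real growth sustained to 2200. In the sketch below, the ~$85 trillion present-day GDP and the 180-year horizon are illustrative assumptions, not figures from Colt and Knapp:

```python
import math

# Worst-case collapse cost as a share of projected 2200 GDP (figures from the text).
worst_case_cost = 301e9            # dollars
gdp_2200 = 4_040e12                # dollars
print(f"share of 2200 GDP ≈ {worst_case_cost / gdp_2200:.4%}")   # ≈ 0.0075%

# Rough growth rate implied by $4,040 trillion in 2200, assuming ~$85 trillion today.
gdp_today, years = 85e12, 180
growth = math.exp(math.log(gdp_2200 / gdp_today) / years) - 1
print(f"implied average growth ≈ {growth:.1%} per year")          # ≈ 2.2%
```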


Third, the 25% damage addition from the black to the blue line in Fig. 19 was intended precisely to cover extra, uncounted costs. For the Nordhaus estimate at 4.1°C, it raises the cost from a 3.06% GDP loss to a 3.82% GDP loss, an estimate of unmodeled losses worth 0.77 percentage points. Compare this to the current biodiversity spending of 0.027% of GDP, the collapse of the Greenland Ice Sheet worth 0.012% of GDP, or the worst-case cost of a complete marine fisheries and coral reef tourism collapse of 0.0075% of GDP. The 0.77 percentage points thus leave room for very many left-out costs; indeed, they could accommodate a hundred different impacts, each as damaging as the worst-case complete collapse of marine fisheries and coral reef tourism in 2200.
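The arithmetic of that headroom, using the rounded 3.06% figure quoted above, can be checked directly (a simple sketch of the numbers in the text):

```python
# Headroom for unmodeled losses implied by the 25% uplift at 4.1°C.
base_loss_pct = 3.06                     # % of GDP before the uplift
headroom_pp = base_loss_pct * 0.25       # the uplift, in percentage points
print(f"uplift ≈ {headroom_pp:.2f} pp")  # ≈ 0.77 (0.765 unrounded)

# Number of impacts the size of the worst-case fisheries/coral collapse (0.0075 pp)
# that would fit inside the headroom:
print(round(headroom_pp / 0.0075))       # ≈ 102, i.e. on the order of a hundred
```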


In conclusion, not only are catastrophes and biodiversity not absent from the impact models, but the additional 0.77 percentage points for unmodeled costs can accommodate these and many other such costs, meaning that this cost estimate is likely not an underestimate.


3.4. Unrealistic alternative loss models

In recent years, an alternative approach to the mainstream climate cost estimates has emerged that generates dramatically higher costs (Pretis et al., 2018; Burke et al., 2015; Burke et al., 2018; Hsiang et al., 2017).


Here, let us concentrate on Burke et al. (2015) and its clone (Burke et al. 2018), which both produce a global estimate. The first paper contributes global estimates of the damage impact from global warming, showing that impacts for SSP5 in 2100 will likely reduce global GDP by 23%, which is “many times larger than leading models indicate.” This result stems exclusively from estimating how national growth rates depend on average national temperature. They find that cold countries grow more slowly when temperatures drop in a single year and grow faster when temperatures are slightly higher in a given year. The opposite is true for hot countries, where cold shocks increase growth rates and heat shocks decrease them. They find the optimum at 13°C. If these relationships hold for the rest of the warming century, cold countries will grow faster and hot countries slower than they would otherwise have done. In 2010, the majority of the world’s GDP was created in countries below 14°C (the US is at 13.6°C). But most of the population lives in countries above 14°C, and the expansion over the 21st century in both GDP and population will mostly take place in countries over 14°C. So, in 2100, with SSP5 and a population-weighted temperature rise of 4.3°C (from RCP8.5), more than eight times more GDP will be produced in countries with an average temperature above 14°C. Thus, if the growth rate increases for countries below 13°C but decreases for countries above 13°C, the slowdown will be large and cumulative over the 21st century.
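The scale of the claimed effect comes from compounding: a small but persistent hit to the annual growth rate accumulates into a large GDP shortfall by 2100. A toy sketch with made-up numbers (not Burke et al.'s estimated coefficients) illustrates the mechanism:

```python
# Toy compounding example (hypothetical numbers, not Burke et al.'s coefficients):
# a persistent 0.3 percentage-point growth penalty in a hot country.
baseline_growth = 0.025      # 2.5% per year, hypothetical
growth_penalty = 0.003       # 0.3 pp per year lost to warming, hypothetical
years = 80
ratio = ((1 + baseline_growth - growth_penalty) / (1 + baseline_growth)) ** years
print(f"GDP shortfall after {years} years ≈ {1 - ratio:.0%}")   # ≈ 21%
```

This compounding is also why a growth-effect specification produces far larger damages than the level-effect specification discussed further below.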


These results crucially rely on an absence of adaptation over the 21st century. In their description of the data, they claim that “results using data from 1960 to 1989 and 1990–2010 are nearly identical (Fig. 2c)” and that “substantial observed warming over the period apparently did not induce notable adaptation.” Yet the growth relationship for the two time periods, as shown in their Fig. 2c, actually changes from an optimum of 12.3°C to an optimum of 14°C, a shift of 1.7°C, whereas the average temperature between the two periods changed by just 0.39°C (HadCRUT4 2019). So, surprisingly, on their own evidence the world more than adapted to the 0.39°C temperature increase.


So, instead of extrapolating without adaptation and finding a 23% reduction, one could more reasonably argue that, on their own data, nations actually adapt and even adapt beyond the temperature increase. If the same model is run with this assumption, by 2100 the 4.3°C of warming will have moved the optimum to about 31.5°C (= 13 + 4.3 × 1.7/0.39). This would cause the model to show that global warming increases GDP by almost 1,100% rather than decreasing it by 23%.
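Spelling out that back-calculation with the rounded figures quoted above (the small difference from 31.5°C is just input rounding):

```python
# Adaptation extrapolation using the rounded figures quoted in the text.
optimum_shift = 14.0 - 12.3        # °C shift in the estimated optimum between periods
observed_warming = 0.39            # °C warming between the two periods (HadCRUT4)
adaptation_rate = optimum_shift / observed_warming   # °C of optimum shift per °C warming
projected_warming = 4.3            # °C by 2100, population-weighted (RCP8.5)
new_optimum = 13 + projected_warming * adaptation_rate
print(f"adaptation rate ≈ {adaptation_rate:.2f}, new optimum ≈ {new_optimum:.1f}°C")
# ≈ 4.36 and ≈ 31.7°C (the text's ≈ 31.5°C reflects less-rounded inputs)
```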


Both of these formulations are deeply suspect. Burke et al. (2015) produces absurd GDP results, with Iceland becoming 30 times richer than today and Mongolia 200 times richer than today, four times richer per person than the US.


A new study (Letta and Tol 2019) shows that the extrapolated reduction in GDP growth is empirically unfounded for rich countries, meaning the total impact of the Burke et al. (2015) argument cannot be 23% but at most 3%.


Another paper (Newell et al., 2018), cross-validating Burke et al. (2015) and similar papers, shows that these models are highly vulnerable to mis-specification. A slight change in the GDP-maximizing temperature can change whether the GDP of a few major economies would benefit from or be harmed by projected warming.


Indeed, they find that simply adding a cubic temperature term to Burke et al. (2015) cuts the GDP impact roughly in half, to 11%. Removing the country-specific time trends makes the model predict an increase in GDP of 12%, as does using region-year fixed effects (+10%).


Across all the model estimates, Newell et al. (2018) find that the GDP impacts range from −48% to +157%, and that the weighted average effect is actually a positive 13.5%. They find that estimating the GDP damage as a level effect rather than as a growth effect is much more robust and very likely implies a 1–2% GDP loss by 2100.


While the IPCC (IPCC 2018, 256) references Burke et al. (2018) for cost estimates at 1.5°C and 2°C, it also references Watson and Quere (2018) for the costs of 1.5°C, 2°C, and 3.66°C (no policy), which come to 0.28%, 0.46%, and 2.62% of GDP respectively, very close to the black Nordhaus line in Fig. 20 (not surprising, since those costs were modeled in PAGE).


In conclusion, apart from being simply implausible (with Mongolians becoming the second-richest people in the world by 2100), this alternative approach to cost estimates is ill-founded and vulnerable to mis-specification. It underscores why the well-established, decades-long research behind the cost estimates of Figs. 19 and 20 remains the more reliable basis for cost estimates.
