All posts by Stephan Lewandowsky

AGU Fall Meeting sessions on social media, misinformation and uncertainty

We have proposed several sessions for the AGU Fall Meeting in San Francisco (3-7 December), covering uncertainty, misinformation, and social media. AGU members are invited to submit abstracts for the sessions – the deadline to submit an abstract is August 8. Details of the sessions are:

GC014: Construing Uncertainty in Climate Science

Session Abstract:
Uncertainty forms an integral part of climate science, and it is often cited in connection with political arguments against mitigative action. However, the implications of uncertainty are not always well understood. In particular, uncertainty is often misunderstood to imply that the risk from climate change may be minimal, whereas in fact greater uncertainty translates into greater risk. This session will examine how uncertainty can be misconstrued, how such misconstrual can be avoided, what the implications of uncertainty are for risk management, and why the notion of uncertainty plays such an important role in cognition and decision making as it relates to climate change.

Convenors: John Robert Hunter, Stephan Lewandowsky, James Risbey

Invited Speakers:

  • Naomi Oreskes
  • Gerard Roe
  • Paul Baer
  • Mike Raupach

ED042: Understanding & Responding to Misinformation

Session Abstract:
Scientists face many challenges in effectively communicating science to the public, not the least being the presence of misinformation. However, there are actually positive educational opportunities available in the correction of misinformation. This session will explain the psychology and origins of misinformation, the cognitive processes at play when correcting misconceptions, and recommended approaches to effective myth debunking. The session will be relevant to scientists seeking to explain their science, to communicators wishing to reach out effectively to the public, and to educators who may need to respond to misinformation in the classroom.

Convenors: Stephan Lewandowsky, John Cook

Invited Speakers:

  • Daniel Bedford
  • Max Boykoff

PA013: Social Media and Blogging as a Communication Tool for Scientists

Session Abstract:
Social media and the Internet have become increasingly indispensable tools for scientists and communicators. This session will feature key figures in the climate blogosphere who have adopted novel and effective methods of communicating climate change science on the Internet. They will discuss the risks and rewards of new media, covering issues such as the challenges and advantages of crowd-sourcing, viral marketing, Internet marketing and traffic generation, the use of smartphones, the management of online communities, and social media sites such as Twitter and Facebook.

Convenor: John Cook

Invited Speakers:

  • Michael Mann
  • Michael Tobis
  • Peter Sinclair
  • Zeke Hausfather

AGU members, click here to submit an abstract for any of these sessions.

Climate Uncertainty and Emission Cuts

It is very clear that uncertainty is no one’s friend. We have seen that greater uncertainty about the evolution of the climate should give us even greater cause for concern. We have seen that all other things being equal, greater uncertainty means that things could be worse than we thought. We have also seen that greater uncertainty means that the expected damages from climate change will necessarily be greater than anticipated, and that the allowance we must make for sea level rise will also be greater than anticipated. All of those results arise from simple mathematics, and we do not even have to resort to any economic modelling to understand how greater uncertainty translates into greater risk.

Let us now consider the other side of the risk-management equation: We have so far focused only on the cost of climate change, including the cost of adaptation measures (e.g., to put your beach front home on stilts). It is time to consider the cost of mitigation and how that is affected by uncertainty.

Mitigation means that we seek to reduce the damages from climate change by reducing carbon emissions. This is quite possible but it doesn’t come for free: We need to decarbonize our economy by converting to alternative sources of energy, and this clearly involves an economic cost (in addition to the new job opportunities that come with conversion to a new economy).

So how does this cost relate to uncertainty about the climate? How does greater uncertainty affect the cost of mitigation?

To answer this question, we need to deal with a few preliminary considerations.

Emissions vs. Atmospheric Concentration

First, we need to realize that to stabilize atmospheric concentrations of greenhouse gases, our current emissions have to tend towards zero. Merely cutting emissions by 5% or 10%, as political leaders have (at best) suggested, will not achieve stabilization. This fact is illustrated in the figure below, and the underlying reasons are outlined in detail here—if the figure below does not make sense, then you should revisit that post before reading further.

What this figure shows are the global emission trajectories (in Gigatonnes, Gt, of carbon) that are required to limit humanity’s total cumulative emissions (that is, the sum total of all carbon that we will ever emit) to a certain level. Each line in the figure refers to a specific total “budget” of carbon, shown in the legend in Gt. So for example, the red line will limit total emissions to 1,000 Gt, whereas the black line will let us emit 3,000 Gt, and so on. (This figure, and all others in this post, rely on the equations provided by Raupach et al., 2011.)

It is clear that all lines tend towards zero sooner or later, and it is clear that the more we want to limit the budget, the steeper and the sooner the required emission cuts. Without going into details, it should be self-evident that steeper cuts translate into greater economic cost—cutting emissions by 5% can be achieved by efficiency measures; cutting them by 50% in a year requires that roughly half your factory is turned off.

The Cost of Delay

We can clarify the nature of emission trajectories further by picking a carbon budget and examining the required trajectories as a function of the time when we commence mitigation. The figure below shows those trajectories for a budget of 1,000 Gt (this budget was chosen for a good reason which I turn to below) and for four different times at which we commence mitigation:

The figure makes two points: First, the longer we wait, the steeper the required emission cuts. Second, if we wait until 2041 to commence mitigation, we would have to decarbonize the world economy overnight. Needless to say, this is technologically and politically impossible. This absolute boundary exists for any budget, no matter how large; only the time at which it is hit differs between budgets—in the same way that no matter how much money you have, at some point you will have to stop spending because you are out of cash!
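To get a feel for why delay steepens the required cuts, here is a minimal toy calculation (an illustrative sketch only, not the Raupach et al., 2011, equations behind the figures): assume emissions stay constant at roughly today's rate of 10 Gt of carbon per year until mitigation begins and then decline exponentially. The e-folding time of that decline is simply the remaining budget divided by the emission rate when cuts begin, so every year of delay shortens the time available for the phase-out. In the post's figures, emissions keep growing before cuts begin, which brings the boundary forward even further.

```python
# Toy model: constant emissions of 10 GtC/yr until cuts begin, then an
# exponential decline E(t) = E0 * exp(-t / tau) whose cumulative total
# (E0 * tau) exhausts whatever is left of the budget. Illustrative only;
# the post's figures use the Raupach et al. (2011) equations instead.
def efolding_time(total_budget, already_emitted, rate, delay_years):
    """Years for emissions to fall by a factor of e once mitigation starts."""
    remaining = total_budget - already_emitted - rate * delay_years
    if remaining <= 0:
        return 0.0  # budget exhausted before cuts begin: overnight decarbonization
    return remaining / rate

for delay in (0, 10, 20, 30, 40, 60, 66):
    tau = efolding_time(1000, 346, 10, delay)
    print(f"start cuts in {delay:2d} years -> e-folding time of {tau:5.1f} years")
```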

It is important to establish the existence of this absolute boundary because then the economic discount rate ceases to matter.

What is the discount rate and why would it matter?

The discount rate is used by economists to balance future costs against current expenditures. For example, should I spend $1000 today to fix something that will cost $2000 to fix in 10 years' time? Perhaps, but what if the future cost were only $1250? After all, in 10 years' time we expect to be far wealthier than now, and $1250 then may appear to be pocket money whereas $1000 now feels like a lot of money. In a nutshell, the discount rate determines by what percentage we reduce the cost of future expenditures to make them commensurate with today's dollars.
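To make the repair example concrete: the discounted (present) value of a future cost is the future amount divided by (1 + r)^t, where r is the annual discount rate and t the number of years. The rates in the sketch below are arbitrary illustrations; note that at about 7% per year the future $2000 bill is worth roughly the $1000 repair today, which is why the choice of rate matters so much.

```python
# Present value of a $2000 repair bill due in 10 years, for a few
# arbitrary illustrative discount rates.
def present_value(future_cost, rate, years):
    return future_cost / (1 + rate) ** years

for rate in (0.02, 0.05, 0.07):
    pv = present_value(2000, rate, 10)
    print(f"discount rate {rate:.0%}: $2000 in 10 years is worth ${pv:,.0f} today")
```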

In the context of mitigation, the discount rate matters because it could be argued that even though delayed mitigation is more costly than cutting emissions now (because steeper cuts are required), those delayed (greater) expenses are not “worth as much” as today’s costs because they are discounted. Given a (very) large discount rate, one might therefore be tempted to argue that mitigation should be delayed until 2021, because even though this will increase costs, the increase is compensated by the delay. Although the required discount rate would likely be large, it cannot be dismissed. Indeed, Weitzman (2007, p. 705) noted that “…the biggest uncertainty of all in the economics of climate change is the uncertainty about which interest rate to use for discounting.” It would therefore be preferable to remove this source of uncertainty.

Fortunately, the discount rate, and the uncertainty surrounding it, cease to be relevant when we reach the absolute boundary in the above figure: at that point it no longer matters by how much we discount future costs. Decarbonizing the economy overnight is impossible, and switching the world economy off is inconceivable, and those two facts are not subject to discounting.

Uncertainty and Mitigation

It follows that to examine the impact of climate uncertainty on mitigation, we should focus on an absolute boundary that is not subject to arguments involving a discount rate.

To relate climate uncertainty to mitigation, it is helpful if we first consider the relationship between total carbon emissions and the expected peak temperature increase. This is shown in the figure below, which is produced using the equations provided by Raupach et al. (2011).

The figure shows that to keep warming limited to 2°C—the maximum considered “safe” by scientists—we must limit cumulative emissions to around 1,000 Gt. This explains why I used a budget of 1,000 Gt in the earlier figure: If you look at it again, you can see for yourself how urgently we have to embark on mitigative action.

So what about uncertainty?

I again used the variance in our estimate of climate sensitivity as an indicator of uncertainty—if you are unclear about what that means, refresh your memory here.

I again assumed that the uncertainty surrounding climate sensitivity has a fat-tailed lognormal distribution (cf. Roe & Baker, 2007). I varied the variance—and hence uncertainty—of that distribution from minimal to quite large without changing its mean. At each level of uncertainty, I drew 1000 samples from the sensitivity distribution and computed the carbon budget that would limit warming to 2°C. In other words, I translated the uncertainty in climate sensitivity into uncertainty about the budget that we had left to spend.
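For readers who want to see the mechanics, here is a minimal sketch of how such a simulation could be set up. It does not use the Raupach et al. (2011) carbon-climate relation behind the actual figures; instead it uses a crude stand-in in which the allowable budget is inversely proportional to the sampled sensitivity, calibrated so that the mean sensitivity yields a 1,000 Gt budget, with budgets floored at the 346 Gt already emitted. The uncertainty values and the resulting probabilities are therefore illustrative only, but the qualitative behaviour (mean budget rising with uncertainty while the worst case collapses onto the floor) is the same.

```python
import numpy as np

# Stand-in for the Raupach et al. (2011) relation: budget inversely proportional
# to sensitivity, calibrated so the mean sensitivity (0.741 K W^-1 m^2, ~2.75 C
# per doubling of CO2) corresponds to a 1,000 Gt budget for 2 C of peak warming.
MEAN_S, BUDGET_AT_MEAN, ALREADY_EMITTED = 0.741, 1000.0, 346.0

def lognormal_params(mean, sd):
    """(mu, sigma) of the underlying normal for a given arithmetic mean and SD."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    return np.log(mean) - sigma2 / 2.0, np.sqrt(sigma2)

rng = np.random.default_rng(1)
for sd in (0.05, 0.2, 0.4, 0.8):                  # increasing uncertainty, same mean
    mu, sigma = lognormal_params(MEAN_S, sd)
    sens = rng.lognormal(mu, sigma, size=1000)     # 1,000 sampled sensitivities
    budgets = np.maximum(BUDGET_AT_MEAN * MEAN_S / sens, ALREADY_EMITTED)
    p_floor = np.mean(budgets == ALREADY_EMITTED)
    print(f"sd={sd:.2f}: mean budget {budgets.mean():6.0f} Gt, "
          f"minimum {budgets.min():5.0f} Gt, P(at floor) = {p_floor:.1%}")
```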

The results are shown in the figure below and are worth exploring in detail.

Consider the top panel first. The blue line plots the mean carbon budget (across the 1000 samples) as a function of the uncertainty surrounding climate sensitivity (for this demonstration, climate sensitivity is expressed in units of K W⁻¹ m², with a mean of .741 and variance as shown in the figure above; this value corresponds to a sensitivity of 2.75°C in response to a doubling of CO2, which is right around the best estimate).

When uncertainty is minimal (left-most blue data point), the expected budget is around 1000 GtC, which is precisely the value that in the earlier figure gave us 2°C peak warming—as it should be because we are examining budgets to limit warming to 2°C.  

When uncertainty increases, the mean total budget also increases with uncertainty (other points on the blue line). That is, the greater our uncertainty about climate sensitivity, the greater the expected (average) carbon budget that we have available to limit warming to 2°C. This is good news, at least at first glance, because it means that greater uncertainty might give us greater leeway. However, that is not the complete picture, because what is relevant in this instance is not just the mean (average), but also the remainder of the distribution of carbon budgets: Remember, I sampled 1,000 possible actual sensitivities from a distribution, each of which is associated with an actual carbon budget—we therefore are now dealing with a distribution of possible budgets, and the shape of this distribution describes how likely those various budgets are.

The remaining lines in the top panel of the figure describe the shape of this distribution by plotting its median (i.e. the point that divides the distribution into an equal number of observations, such that 500 are above and another 500 below that point), its 25th percentile or quartile (labelled Q25, referring to the point that has 250 budgets below it and the remaining 750 above), and finally its minimum. The latter is particularly important because it tells us what the least (and hence most constraining) carbon budget is that we may encounter at a given level of uncertainty about the climate.

The figure shows very clearly that the minimum budget decreases as uncertainty increases, before it flattens out as it hits the horizontal dashed line. What does this mean? It means two things: First, as uncertainty increases, the worst-case scenario—namely, the lowest possible carbon budget that we may have available to limit warming to 2°C—becomes worse. Second, and even more worryingly, the worst-case scenario rapidly hits an absolute and inescapable brick wall. That brick wall is the dashed line, which represents the amount of carbon (346 Gt) that humanity has already emitted.

In other words, as uncertainty increases, the worst-case scenario becomes the worst imaginable case—namely the case where we have to switch off our economy not only overnight but to do so now.

Right now.

Overnight.

Needless to say, this is impossible.

Now, one might argue that the worst-case scenario shouldn’t worry us too much, because on average greater uncertainty means that we have a greater carbon budget to play with. This argument, alas, overlooks the bottom panel of the figure.

The bottom panel of the figure plots the probability of the carbon budget hitting that conceptual minimum. No carbon budget can go below that minimum—because we’ve already spent it!—but the number of budgets, out of the 1000 sampled, that hit the minimum varies with uncertainty in climate sensitivity.

And that probability goes up as uncertainty increases.

The number of worst-case budgets increases with uncertainty, from zero (if uncertainty is tiny) to more than 8% (.08), when uncertainty is large. To put this into context, that risk is more than 1,000 times greater than the risk of dying in a car crash in Australia (expressed as fatalities per 100,000 population).

So not only does greater uncertainty put the worst-case scenario into a very bad spot, it also increases the likelihood of that worst-case scenario being true. In a nutshell, increased uncertainty means that the likelihood of us having to turn off the economy overnight, right now, goes from zero to a non-negligible 8%.

Would you still drive your car if the risk of a crash went up from its current level by a factor in excess of 1,000? Hint: That means more than 1,600,000 Australians would die every year in car crashes.

So no, uncertainty is no one’s friend, whether we talk about damages from climate change or the costs of mitigation. There is no escaping those simple mathematical facts.

There is only one way to escape that uncertainty: Mitigation. Now.

 

References

Raupach, M. R.; Canadell, J. G.; Ciais, P.; Friedlingstein, P.; Rayner, P. J. & Trudinger, C. M. (2011). The relationship between peak warming and cumulative CO2 emissions, and its use to quantify vulnerabilities in the carbon–climate–human system. Tellus, 63B, 145-164.

Roe, G. H. & Baker, M. B. (2007). Why is climate sensitivity so unpredictable? Science, 318, 629-632.

Weitzman, M. L. (2007). A Review of The Stern Review on the Economics of Climate Change. Journal of Economic Literature, 45, 703-724.

Do you want some science with your entertainment?

On Thursday next week the ABC (Australia, 26 April, 8:30pm AEST) will be airing the documentary I can change your mind about … climate, which has been attracting quite a bit of media attention already. Its main protagonists are two polar opposites: a conservative politician, former Senator Nick Minchin, and a young climate activist, Anna Rose, founder and chair of the Australian Youth Climate Coalition.

Nick Minchin is well known for his opposition to climate science and also for opposing the notion that second-hand tobacco smoke is detrimental to your health. Anna Rose has been campaigning for climate action in Australia for years and heads a grassroots organization of 57,000 members.

The narrative of the documentary is that Anna seeks to change Nick’s mind about climate change, whereas Nick is trying to change hers. They travel around the world together, meeting up with supporters of their respective positions, discussing the scientific evidence and challenging each other’s positions.

This documentary will be followed by a “Q&A” panel, consisting of the two main protagonists (antagonists?), Nick Minchin and Anna Rose, and also mining magnate Clive Palmer, social researcher and writer Rebecca Huntley, and the Chief Executive of the CSIRO, Dr Megan Clark. This panel may be quite a lively event, given that Clive Palmer has recently alleged that the CIA funded Greenpeace to harm Australian industrial interests.

Notably absent from the panel are, you guessed it, climate scientists. Dr Clark runs the CSIRO but her qualification is in economic geology, not climatology. However, this peculiarity should not be of too much concern because Mr. Palmer will likely ensure that the panel retains its entertainment value.

And those viewers who seek more than entertainment and want some scientific information as well are not left out in the cold either: On the day, during the broadcast of the documentary and the subsequent Q&A panel, Australian climate experts will run a Twitter feed (via The Conversation) and also a live blog (hosted by Crikey). The details will be announced here on Shapingtomorrowsworld and elsewhere on the web closer to the time.

So tune into the ABC for entertainment and grab the science via Twitter and our live blog.

Keeping Dry: Uncertain Sea Level Rise and the Risk of Floods

We have already seen that uncertainty about the future evolution of the climate is not your friend because it means things could be worse than anticipated. And we have shown that as uncertainty grows, it is almost inevitable that the expected damage from climate change will also increase.

The reason that expected damage goes up together with uncertainty is because the overall relationship between global temperature rises and expected damages is non-linearly increasing: That is, a little bit of warming costs very little, but as temperatures continue to increase, the costs grow disproportionately. There is a figure in the previous post which visualizes this relationship.

How do we know that the “damage-function”, which relates temperature increases to damage, is non-linearly increasing in this manner? For one, virtually all economic models have converged on the same conclusion, namely, that increasingly greater warming accelerates the associated cost, thereby guaranteeing that greater uncertainty translates into greater risk, as we saw last time.

In this post, I show why greater uncertainty translates into greater cost in another way: Rather than relying on economic modeling, I use fairly simple mathematics. This has the advantage that we can avoid the complexity of economics and still make the point rather forcefully. So in the remainder of this post I explain recent work by Tasmanian scientist John Hunter that has explored the issue beautifully (Hunter, 2011).

Let us consider sea level rise (SLR) for this example because it has the advantage of being easily and unambiguously interpretable: We can all agree that being dry is good whereas being flooded is bad.

Staying Dry

Suppose you have built your dream home on the coast of Patagonia within easy reach of your favourite beach. How likely is your home to get flooded? On average you will, of course, be dry; that’s the whole point of building a home safely above sea level as dictated by the rules of your local council or shire. Local governments have had plenty of experience to determine where it is safe (enough) to build.

But how did those council engineers make that determination? Roughly speaking, they combined knowledge about the average sea level with an allowance for extreme events. To clarify this distinction consider the top panel in the figure below.

The figure shows current average sea levels (indicated by the vertical blue line) together with the distribution of possible sea levels at any given instant. Due to ocean currents, tides, storms, and so on, there is natural variability in sea level all around the world, even without climate change. Hence you will experience sea levels that vary around the average (i.e. the mean) from day to day, as shown by the distribution in the top panel. It is therefore insufficient to look at the mean alone: We must make an allowance for variation as well.

That allowance is determined by policy makers to guard against all but the most extreme events. Thus, levees are built to withstand storm surges and your beach-front property must be built sufficiently far from the ocean so it does not get inundated easily. That precautionary allowance is labeled by the arrow in the top panel, which extends to the red line representing the maximally extreme events that building codes allow for.

However, very occasionally some outrageously extreme event, such as the conjunction of a high tide and a vicious storm, may cause your property to nonetheless be flooded. This would correspond to any sea level to the right of the red line in the top panel. Anyone who buys property close to the beach knows that such an extreme event might occur every so often, perhaps every 100 years or so.

Now enter climate change and the associated sea level rise (SLR).

The consequences of SLR are shown in the bottom panel of the preceding figure. It is apparent that mean sea level (the solid vertical blue line) will be greater than before. For simplicity, let’s suppose that the average SLR will be 50 cm by century’s end. (There is no need to get bogged down in the details, but that value is not an unreasonable assumption. So let’s use it in the remainder of this post without expressing a firm commitment to that particular value or any other.)

Now here is the interesting part: As a beachfront property owner, you aren’t just concerned about mean SLR by itself: instead, you are primarily concerned with how often your property will be flooded in the future. This is an important realization because it turns out that we need to be concerned with the additional allowance for extreme events on top of mean SLR.

How much do we need to allow for natural variation on top of mean SLR in order to keep the risk of flooding constant, on the order of 1 in a 100 years (or whatever they may have been before climate change came along)?

This total allowance is indicated by the red question mark in the bottom panel of the figure because we don’t know yet what that is.

Let’s find out.

Keeping Floods at Bay

We begin by assuming that we want to keep the flooding risk to your property constant even after the sea has risen by .5 m. Obviously, we need to extend the height of our protective dam or levee (or raise our house onto stilts) by at least .5 m, just to cope with the average increase.

But that’s just the beginning.

Now we need to work out how much extra we have to allow to be prepared for extreme events. In other words, if the sea rises .5 m on average, how much will it rise when there is an extreme event, such as a storm tide? Or equivalently, by how much do I have to extend my levee or dam in order to keep the number of inundations constant? What is the required allowance, on top of mean SLR, to keep my risk unchanged?

Enter uncertainty.

Allowances for Uncertainty

It turns out that this allowance for extreme events is a function of the uncertainty in the estimate of the mean expected SLR. If we take the mean and additionally estimate the uncertainty in that expectation, then we can compute the extra height of dams and levees that we need to build in order to allow for future extremes.

Hunter (2011) provides the mathematics that are necessary to compute these allowances, and I do not present them in detail here because this post is about the role of uncertainty, not the mathematics per se. (If readers are interested in the details, leave a comment and I may add them later.)

The results of his analysis are shown in the figure below.

 

The large panel on the left shows the allowance that is required to cope with an SLR of .5 m as a function of the uncertainty in that estimate.

For now, we can ignore the differences between the three functions and consider them as one family of curves that show the same quantity: How much extra sea level do I have to allow for in order to keep my risk of flooding constant if on average I expect an SLR of .5 m?

Consider first the horizontal dashed line: That line represents the mean expected SLR, which is why it is drawn at .5 m. If there were no uncertainty in that estimate, then that’s all we would have to allow for, which is why all three functions in the panel converge onto the dashed line on the left, as uncertainty (on the X-axis) shrinks towards zero.

If our knowledge were perfect, we’d allow for .5 m of SLR and we could relax (after putting our home on 50 cm tall stilts).

Now consider the implications of greater uncertainty: All three functions rapidly diverge from the dashed horizontal line as uncertainty increases and they do so in an accelerating manner. The greater the uncertainty in our projection of SLR, the greater the allowance we have to make in order to keep our exposure to floods constant.

Note that this is a result of greater uncertainty only: the average expected SLR remains constant; the only thing that changes is the precision of the estimate (expressed here as the standard deviation of the SLR projections).

This result derives directly from the mathematical properties of extreme values and is not in doubt: The greater the uncertainty, the greater the required allowance to protect against extreme events.

Now consider the three thumbnail panels on the right of the figure: They represent three different distributions of the expected SLR, with the familiar normal (bell-shaped) distribution in the top panel, and a raised cosine and a uniform (rectangular) distribution in the center and bottom thumbnails, respectively. The color coding of those distributions conforms to the color of the functions in the left-hand panel.

It can be seen that if uncertainty about SLR is normally distributed, increasing uncertainty has a particularly drastic effect (blue line in the main panel). For example, if uncertainty is .362 m (a value consonant with post-IPCC results; Nicholls et al., 2011), then under this assumption of normality one would have to allow for more than 1 metre of SLR to keep the risk of flooding constant; that’s double the expected SLR of .5 m! With the other two distributions, uncertainty has a lesser but still non-linearly increasing effect.
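For readers who want a feel for these numbers: in the case treated by Hunter (2011), where extreme sea levels follow a Gumbel distribution and the uncertainty in the rise is normally distributed, the allowance works out to the mean rise plus sigma^2/(2*lambda), where sigma is the uncertainty (standard deviation) of the projected rise and lambda is the Gumbel scale parameter of the local sea-level extremes. The sketch below evaluates that expression; the scale parameter of 0.1 m is an assumed, broadly typical tide-gauge value, not one taken from the paper.

```python
# Allowance for normally distributed SLR uncertainty with Gumbel-distributed
# extremes (the case treated by Hunter, 2011): mean rise + sigma^2 / (2 * lam).
# lam = 0.1 m is an assumed, broadly typical Gumbel scale parameter.
def allowance(mean_rise, sigma, lam=0.1):
    return mean_rise + sigma ** 2 / (2.0 * lam)

for sigma in (0.0, 0.1, 0.2, 0.362):
    print(f"sigma = {sigma:.3f} m -> allowance = {allowance(0.5, sigma):.2f} m")
```

Because the allowance grows with the square of sigma, doubling the uncertainty quadruples the extra height required on top of the mean rise, which is the acceleration visible in the blue curve.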

The fact that all three curves are non-linearly accelerating despite widely divergent assumptions about the distribution of uncertainty implies that the main result is extremely well supported: Greater uncertainty translates into a greater need to allow for variation on top of SLR, and that allowance grows in an accelerating fashion.

This result follows from basic mathematics, without any economic modeling, and it reveals particularly starkly why uncertainty is no one’s friend.

This concludes our three-part examination of the role of uncertainty in assessing the damage cost arising from climate change.

The next issue to be considered in future posts is how uncertainty affects the flip side of the equation, namely the cost of mitigation. Irrespective of damage costs, does uncertainty make mitigation more imperative or does it lessen the urgency of action?

Take a guess….

References

Hunter, J. (2011). A simple technique for estimating an allowance for uncertain sea-level rise. Climatic Change, DOI 10.1007/s10584-011-0332-1.

Nicholls, R. J.; Marinova, N.; Lowe, J. A.; Brown, S.; Vellinga, P.; de Gusmão, D.; Hinkel, J. & Tol, R. S. J. (2011). Sea-level rise and its possible impacts given a ‘beyond 4°C’ world in the twenty-first century. Philosophical Transactions of the Royal Society A, 369, 161-181.

The Inescapable Implication of Uncertainty

In a previous post, we saw that uncertainty is not your friend. In a nutshell, if there is uncertainty, things could be worse than anticipated as well as better.

The previous post encapsulated this in a figure, which I show again below because it serves as the departure point for the next installment of our uncertainty analysis:

The figure shows synthetic distributions of climate sensitivity (if you don’t know what that is, begin by reading the previous post). The means of the four distributions in the above figure are identical but their standard deviations differ considerably, from .49 in the top left to 2.6 in the bottom right. The spread of each distribution characterizes the extent of uncertainty surrounding the mean estimate of 3 degrees.

The consequences of this increase in uncertainty are as obvious as they are inescapable: As uncertainty increases, the probability of a truly catastrophic outcome (defined as climate sensitivity exceeding the highly conservative 5°C mark; indicated by red lines in the figure) increases from a small .07% in the top left panel to a concerning 14% in the bottom right. In other words, increasing the standard deviation of our distribution fivefold, from .5 to 2.5, increases the likelihood of catastrophe by a factor of 200. (The word “catastrophe” has been over-used in the climate arena, but a 5°C increase would be truly catastrophic by most accounts, and most climate scientists are very concerned even about the possibility of considerably lower increases.) See update below.

The first conclusion about the climate system therefore has to be that the greater the uncertainty, the greater the potential for catastrophe.

That said, another aspect of the above figure appears to offer comfort at first glance. As uncertainty increases, from the top left to the bottom right panel, the proportion of the distribution towards the lower end of sensitivity also increases as a trivially necessary consequence of keeping the mean constant. The proportion of low estimates (< 2°C) reaches a whopping 42% when the uncertainty is greatest (bottom-right panel in the figure). This seemingly invites an alternative interpretation: With nearly half of all estimates of sensitivity below the ostensibly safe “guardrail” of 2°C, perhaps one could legitimately ignore the upper tail, however fat it gets with increasing uncertainty? See update below.

Whether this gamble is advisable might at first glance appear to be a matter of subjective preference—some people may legitimately think that a 40%+ chance of being safe outweighs a 14% likelihood of catastrophe. As it turns out, however, we can push our analysis further and show that the lower end of the climate sensitivity distribution does not offer the comfort it implies at first glance.

From Sensitivity to Cost

To understand the reasons for this, we must first relate climate sensitivity to the likely damage associated with climate change. Common sense dictates that greater sensitivity translates into greater cost: For example, if sensitivity is low and sea levels rise by only 2 cm, damage will be relatively minimal (i.e., we lose a few meters of beach). If sensitivity turns out to be higher, and sea levels rise by 60 cm, the damage is considerable (i.e., we need to build dams and levees or move people out of harm’s way, all at great cost).

What is even more important than the fact that damage cost increases with climate sensitivity is to ascertain the functional form of that increase: Will a .5°C increase of sensitivity from 2.5°C to 3°C incur the same additional cost as an increase from 3°C to 3.5°C? What about an increase from 4.5°C to 5°C?

It turns out that the precise form of this damage function is subject to debate. However, what does not appear to be subject to debate among economists is the fact that the damage function is convex (e.g., Nordhaus, 2010; Tol, 2011; Weitzman, 2010). “Convex” means that the rate at which damages are increasing with increasing climate sensitivity is itself increasing. This is illustrated in the figure below, using a highly convex (quadratic) function for illustrative purposes.

Consider first the top panel of the figure (Panel A). The panel itself contains three smaller panels, and the largest one in the top-right quadrant displays the cost function used for this example. The horizontal panel at the bottom shows the climate-sensitivity distribution from before, with the mean (3°C) highlighted by a vertical blue line. The left vertical panel shows the distribution of expected damage costs.

The damage-cost distribution was obtained by taking each observation in the sensitivity distribution and “reflecting” it onto the vertical axis using the convex damage-cost function. Units on the damage-cost axis are omitted because the figure is not seeking to convey actual dollar values (although economists believe that they can compute those values for future anticipated warming).

The most important message from the above figure arises from comparison of panels A and B.

The only difference between the two panels is the degree of uncertainty associated with climate sensitivity: The mean sensitivity is identical, but the spread (standard deviation) of the sensitivity distribution is greater in Panel B (standard deviation 2.5) than in Panel A (standard deviation .5).

Now consider the consequences of increasing uncertainty on damage costs: Although mean sensitivity is the same across both panels, the average expected damage increases with uncertainty—the mean damage in Panel A is lower than in Panel B. The comparison is made easy by comparing the lengths of the two vertical double-headed arrows, which point to the mean damage in each panel. It is quite clear that the arrow is longer—representing greater expected cost—in Panel B than in Panel A.

In a nutshell, if we expect X°C warming, the expected damage cost associated with that degree of warming is a function not (only) of X but also of the uncertainty associated with our estimate of X—and the greater that uncertainty, the greater the expected damage.

Not only is uncertainty not your friend, but greater uncertainty translates into greater expected loss.
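The effect in the figure is an instance of a general property of convex functions (Jensen's inequality): the average of the damages exceeds the damage evaluated at the average sensitivity, and the gap widens as the spread of the sensitivity distribution grows. The sketch below illustrates this with a purely illustrative quadratic damage function; the numbers are not those behind the figure.

```python
import numpy as np

def lognormal_params(mean, sd):
    """(mu, sigma) of the underlying normal for a given arithmetic mean and SD."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    return np.log(mean) - sigma2 / 2.0, np.sqrt(sigma2)

damage = lambda t: t ** 2          # illustrative convex (quadratic) damage function
rng = np.random.default_rng(0)

for sd in (0.5, 2.5):              # Panel A vs Panel B: same mean, different spread
    mu, sigma = lognormal_params(3.0, sd)
    sens = rng.lognormal(mu, sigma, size=100_000)
    d = damage(sens)
    print(f"sensitivity SD {sd}: mean damage {d.mean():6.1f}, "
          f"SD of damage {d.std():6.1f} (damage at the mean of 3 C is {damage(3.0):.1f})")
```

With the same mean sensitivity of 3°C, both the mean and (especially) the spread of the damage distribution grow as the sensitivity distribution widens.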

And There is More

And this is just the beginning, because there are a few more points worth making about the above figure: First, not only does increasing uncertainty about climate sensitivity increase the mean expected damage cost (i.e., best mean prediction), but it also increases the uncertainty around that expected damage cost—and that increase in uncertainty is particularly dramatic. Of course, uncertainty surrounding expected damages is highly relevant because it must be taken into account when assessing the total expected risk from climate change. To illustrate, whereas expected (mean) damage increases by “only” around 50% between the two panels, the associated standard deviation (uncertainty) of the damage increases 10-fold.

Another point worth making about the figure is that greater values of climate sensitivity likely translate into quicker evolution of the climate, all other things being equal (e.g., Bahn et al., 2011, Fig. 2). In other words, greater uncertainty about sensitivity (Panel B) not only translates into greater expected damage, but that damage is also likely to arrive sooner rather than later because the rate of temperature increase is greater with greater sensitivity. (This must not be confused with the fact that greater sensitivity may entail a longer asymptotic response time of the climate system; e.g. Hansen et al., 1985. That is, with greater sensitivity the climate system warms more quickly, but because it ultimately reaches a higher temperature, getting to that asymptote may take longer than with lesser sensitivity.)

This is an important point to bear in mind because if the greater damage were delayed, rather than accelerated, economists could claim that its absolute value should be temporally discounted (as all economic quantities typically are; see Anthoff et al., 2009). But if greater damage arrives sooner, then any discounting would only further exacerbate the basic message of the above figure: Greater uncertainty means greater real cost.

To sum up, uncertainty is no one’s friend. Greater uncertainty means things can be worse than you think. And greater uncertainty means you’ll pay more for the damages arising from climate change than if there were less uncertainty. In fact, you may end up paying much more than anticipated.

Uncertainty is no one’s friend.

In the next post in this series we will examine how uncertainty affects the likely cost of mitigation.

Update 28/3/12: It has been drawn to my attention that the 5°C cutoff for an outcome to be labeled “catastrophic” was too conservative; that is, temperature increases considerably less than that would be associated with outcomes that most people would consider catastrophic. Conversely, limiting temperature increases to 2°C may not be “safe.”

I do not necessarily disagree, but those issues are not central to the point made here: Wherever one places a cutoff above or below the mean, the implications of the fat tails are identical, and it does not matter where exactly the “catastrophe” lurks. The crucial fact is that greater uncertainty translates into greater likelihood of catastrophic (or bad or terrible) outcomes, all other things being equal.

References

Anthoff, D.; Tol, R. S. J. & Yohe, G. W. (2009). Discounting for Climate Change. Economics: The Open Access Open Assessment E-Journal, 3.

Bahn, O.; Edwards, N. R.; Knutti, R. & Stocker, T. F. (2011). Energy policies avoiding a tipping point in the climate system. Energy Policy, 39, 334-348.

Hansen, J.; Russell, G.; Lacis, A.; Fung, I.; Rind, D. & Stone, P. (1985). Climate Response Times: Dependence on Climate Sensitivity and Ocean Mixing. Science, 229, 857-859.

Nordhaus, W. D. (2010). Economic aspects of global warming in a post-Copenhagen environment. Proceedings of the National Academy of Sciences, 107, 11721-11726.

Tol, R. S. J. (2011). The Social Cost of Carbon. Annual Review of Resource Economics, 3, 419-443.

Weitzman, M. L. (2010). What is the “Damages Function” for Global Warming – and What Difference Might it Make? Climate Change Economics, 1, 57-69.

 

Uncertainty is not your Friend

The Australian Future Fund is tasked with delivering high risk-adjusted returns on public funds, such as the Australian Government’s budget surpluses, in order to cover the Government’s unfunded superannuation liability arising from entitlements to public servants and defence personnel.

The Chairman of the Future Fund, David Murray, recently suggested on national TV with respect to climate change that “if we’re not certain that the problem’s there, then we don’t – we shouldn’t take actions which have a high severity the other way.”

This attitude towards uncertainty is not atypical: numerous news commentators have cited uncertainty about the severity of climate change in support of their stance against taking any mitigative action.

In a nutshell, the logic of this position can be condensed to “there is so much uncertainty that I am certain there isn’t a problem.” How logical is this position? Can we conclude from the existence of uncertainty that there certainly is no problem?

This conclusion appears inadvisable for a number of reasons that will be examined in this series of three posts. To foreshadow briefly, there are three reasons why uncertainty should not be taken as a reason for inaction on climate change:

  • Uncertainty should make us worry more than certainty, because uncertainty means that things can be worse than our best guess. Today’s post expands on this point below, by showing that in the case of climate change, uncertainty is asymmetrical and things are more likely to be worse, rather than better, than expected.
  • In the second post, I will show that it is a nearly inescapable mathematical constraint that greater uncertainty about the future evolution of the climate necessarily translates into greater expected damage cost.
  • Finally, the presence of uncertainty does not negate the urgency of mitigative action. There may be uncertainty about our carbon budget—that is, the amount of greenhouse gases we can emit before we are likely to exceed temperature increases that are deemed “safe”—but the implication of there being a budget is that delaying mitigative action will necessarily end up being more costly.

Uncertainty means things could be worse than anticipated

What does uncertainty mean in the context of climate change? Although the word uncertainty has been used in many different contexts, a good definition is to equate uncertainty with imprecision in our best estimate of the future evolution of the climate.

In particular, I focus on the likely “sensitivity” of the climate, which refers to the expected increase in global temperatures in response to a doubling of CO2 from pre-industrial levels. According to best current estimates, climate sensitivity is anywhere between 2 degrees (Celsius) and 4.5 degrees, with a best estimate of about 3 degrees. Because an eventual doubling of CO2 from pre-industrial levels (~275 ppm) cannot be ruled out, given today’s concentration of ~392 ppm and the on-going increase of 2 ppm per year, this estimate of climate sensitivity is also a good estimate of the temperature increase from pre-industrial levels that we may experience towards the end of the century.

Without going any further, we can already draw one conclusion from this fact: If our best guess of climate sensitivity is 3 degrees, and the uncertainty range is 2-4.5, then things could be worse than expected. We expect 3 degrees but might get 4.5—of course, we could also get as “little” as 2, but we are ignoring the vast majority of possible outcomes if we assume (or hope) that we will “only” get 2 degrees.

So clearly, uncertainty means that things could be worse than anticipated.

But the problem does not end there. There are two additional aspects of uncertainty that we need to consider.

First, we must consider the distribution of climate sensitivity estimates. We know that there is a “best” (mean) estimate, and we know that there is a range of most likely values. But it turns out that climate scientists can do better than that: they can provide a distribution of climate sensitivity that attaches a probability of occurrence to each possible value.

The figure below shows the distribution of climate sensitivity, taken from a paper by Roe and Baker (2007).

For present purposes, the most crucial aspect of the figure is its asymmetry: It has a “fat” upper tail and a fairly skinny lower tail. That is, there is little likelihood that climate sensitivity will turn out to be more than 1 degree less than the best estimate of around 3-3.5 degrees (i.e. there is little chance of it being below 2), but there is a considerable chance of it being more than 1 degree above that value—note how it is not inconceivable that it might be as high as 5 or even 6 degrees.

There are some very good—and intellectually fascinating—reasons for why the climate sensitivity distribution is asymmetric and has a fat upper tail, but they are beyond the present scope. Interested readers may want to consult Roe and Baker (2007) or Roe and Armour (2011). (For a critique of the work by Roe and colleagues, see Zaliapin & Ghil, 2010, 2011. Their critique does not affect the presence of fat tails, only the generality of the reasons underlying their occurrence.)

So uncertainty doesn’t just mean that things could be worse than anticipated—in the case of climate, chances are that things will be worse rather than better than anticipated.

But remember, the problem does not end there. We have to consider one more aspect of uncertainty in climate sensitivity.

This final consideration concerns the effects of the magnitude of uncertainty. All other things being equal, should we be more worried by greater uncertainty or less worried? If scientists had really down-played uncertainty—as some commentators have insinuated—what would the effects be? What if uncertainty is actually greater than scientists think?

The answer is fairly straightforward, and it is illustrated in the figure below using a statistical Monte Carlo simulation.

The four panels in the figure contain synthetic (i.e., simulated) data that are sampled from a “lognormal” distribution—so if you took the log of the values and then plotted them, they would take on the familiar bell shape. I used a lognormal distribution because it has the fat-tail property that we know is an attribute of climate sensitivity.

To make my point, I ensured that the mean of the four distributions is identical (around 3, with a tiny amount of deviation introduced by the simulation). However, the standard deviations (spread) of the distributions differ considerably, from .49 in the top left to 2.6 in the bottom right. The spread of each distribution characterizes the extent of uncertainty surrounding the mean estimate of 3 degrees.

What are the consequences of increasing uncertainty? The consequences are indicated by the red line at the 5 degree mark in each panel. This represents a temperature increase that would indeed be nothing short of catastrophic in light of suggestions that any temperature rise beyond 2 degrees would be extremely difficult to cope with (Richardson et al., 2009).

It is obvious from the figure that the probability of a truly catastrophic outcome increases with uncertainty; from a small .07% in the top left to a concerning 14% in the bottom right. That’s a 200-fold increase in the likelihood of a catastrophic outcome when uncertainty increases from .49 to 2.6.
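For readers who want to reproduce this for themselves, here is a minimal sketch of such a simulation. The lognormal is parameterized by the desired arithmetic mean and standard deviation; everything else is drawing samples and counting how many exceed the 5 degree mark. (The two intermediate standard deviations are arbitrary illustrative values, and a large sample is used here purely to stabilize the estimates.)

```python
import numpy as np

def lognormal_params(mean, sd):
    """(mu, sigma) of the underlying normal for a given arithmetic mean and SD."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    return np.log(mean) - sigma2 / 2.0, np.sqrt(sigma2)

rng = np.random.default_rng(42)
for sd in (0.49, 1.0, 1.7, 2.6):                  # increasing uncertainty, same mean
    mu, sigma = lognormal_params(3.0, sd)          # mean sensitivity of 3 degrees C
    samples = rng.lognormal(mu, sigma, size=1_000_000)
    p_catastrophe = (samples > 5.0).mean()         # fraction beyond the 5 degree mark
    print(f"SD = {sd:.2f}: P(sensitivity > 5 C) = {p_catastrophe:.2%}")
```

With the mean held at 3 degrees, the exceedance probability climbs from under 0.1% at a standard deviation of .49 to roughly 14% at 2.6, the 200-fold increase described above.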

So uncertainty means things could be worse than anticipated.

Uncertainty in climate evolution means things are likely to be worse, rather than better, than anticipated.

And the greater the uncertainty, the more likely we could be faced with some very serious consequences.

Unlike the Chairman of the Australian Future Fund, the real uncertainty professionals know this. Which is why Munich Re, a reinsurance behemoth, has identified climate change as one of the greatest risks facing mankind.

References

Richardson, K. et al., (2009). Climate Change: Global Risks, Challenges & Decisions. Synthesis Report of the Copenhagen Climate Congress. (University of Copenhagen).

Roe, G. H. & Armour, K. C. (2011). How sensitive is climate sensitivity? Geophysical Research Letters, 38, L14708.

Roe, G. H. & Baker, M. B. (2007). Why Is Climate Sensitivity So Unpredictable? Science, 318, 629-632.

Zaliapin, I. & Ghil, M. (2010). Another look at climate sensitivity. Nonlinear Processes in Geophysics, 17, 113-122.

Zaliapin, I. & Ghil, M. (2011). Reply to Roe and Baker’s comment on “Another look at climate sensitivity” by Zaliapin and Ghil (2010). Nonlinear Processes in Geophysics, 18, 129-131.

From the Debunking Handbook to “Widerlegen: aber richtig!”: The German Translation

The “Debunking Handbook,” which John Cook and Stephan Lewandowsky produced in English a few months ago, has since been downloaded more than 465,000 times. Among others, Richard Dawkins and Al Gore have recommended the Handbook on their websites.

The Handbook has now been translated into German, and we are making it available for download here. The German title is “Widerlegen: aber richtig!”

We are very grateful to our volunteer translators, Bärbel Winkler and Oliver Marchand, for their detailed and thorough work.

We are pleased that the German version attracted attention in the German-language media even before its release. For example, the Süddeutsche Zeitung reported on our work in detail on 1 February.

We quote the opening of that article because it makes clear why debunking is so important today: “Some errors end fatally. The misconception to which former South African president Thabo Mbeki fell victim cost, according to several studies, around 365,000 people their lives prematurely. His error meant that children, women and men were advised to treat their HIV infection with garlic, beetroot and vitamins. Mbeki and his health minister Manto Tshabalala-Msimang rejected effective antiretroviral drugs. Those medicines could have extended many lives and prevented tens of thousands of pregnant women from passing the HI virus on to their babies. But Mbeki had put his faith in the websites of so-called AIDS denialists, whose followers claim that AIDS is not caused by the HI virus. Criticism rained down. Scientists around the world were up in arms. At conferences and in publications Mbeki was showered with arguments and facts. He would not be swayed. Every argument bounced off him.”

That is precisely the point: clinging to nonsensical claims can have weighty consequences. From climate change to AIDS, it is extremely risky to ignore the overwhelming scientific evidence.

We hope that “Widerlegen: aber richtig!” can help put the public debate on a more fact-based footing.

An English post about the German Debunking Handbook has been posted on Skeptical Science.

Disasters that Come and Go—But They Will be Back

Who hasn’t heard the phrase “in today’s dollars”? We all know that this refers to the price of goods being adjusted to reflect the passage of time.

For example, if we want to know whether cars have become more expensive since the 1950s, it would make very little sense to compare the sticker price of Holden’s classic HR in 1966 ($2,167) to today’s Commodore (approximately $36,000) without adjusting for inflation. In fact, the 1966 HR would have cost $24,091 in 2010 dollars (2010 being the latest year that the Reserve Bank Inflation Calculator will handle).
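As a quick check on the arithmetic, those two figures imply an average inflation rate of a little over 5.5% per year across the 44 years in between; the snippet below simply back-derives that rate from the prices quoted above.

```python
# Average annual inflation rate implied by the two quoted prices:
# $2,167 in 1966 is equivalent to $24,091 in 2010 dollars.
price_1966, price_2010, years = 2167, 24091, 2010 - 1966

implied_rate = (price_2010 / price_1966) ** (1 / years) - 1
print(f"implied average inflation: {implied_rate:.1%} per year")   # about 5.6%
```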

Still, that’s about $10,000 less than today’s Commodore, so have cars become more expensive?

Comparing prices and costs across time is therefore not entirely trivial and it is easy to be led astray, in particular because any such adjustment is affected by a number of considerations that are not always spelled out. The preceding example adjusted for inflation but it did not adjust for “how much car” you are getting now, as opposed to 45 years ago. And when you consider the evolution of the quality of products over time, even a low-end Barina at $13,990 probably offers more value now (USB and iPod compatibility, wow!) than an HR did in 1966.

But how would one quantify the value of that USB and iPod compatibility? How much “more car” can you purchase now compared to 1966? This is a tricky issue and there are some ingenious solutions but I do not believe anyone would claim that it has been resolved altogether.

And that was the easy part.

Now let’s consider an issue that has caused considerable public and scientific debate, namely whether there is a “signal” from climate change in the recent acceleration of extreme weather events.

Are we experiencing climate change and is the number of recent freak weather events a sign of this increasing climate instability?

This question has no easy answer at the moment.

Let’s begin by counting the number of weather-related disasters around the globe during the last 30 years: They are represented by the increasing function in the figure below (taken from Neumayer & Barthel, 2011). There is no doubt that this number is increasing—in fact, the frequency of weather-related disasters has nearly tripled in the last 30 years.

The figure also contains a (virtually) flat line, which represents the number of geophysical disasters—that is, things like earthquakes and volcanic eruptions that are not affected by a changing climate. The fact that this line is nearly flat tells us that the increasing disaster count isn’t an uninteresting consequence of better accounting or reporting. If geophysical disasters had also been increasing dramatically, then that would suggest that the trend was likely due to factors other than climate change, such as better detection of events as technology improved or better reporting as more developing countries had the funds to keep accurate records.

Case closed? Climate change obviously led to an increasing number of weather disasters around the globe?

It is tempting to draw this conclusion but that would be premature. There is another way of looking at these data.

Partialing out Wealth: The Intriguing World of Normalization

Rather than merely counting events, insurance companies and other professional risk analysts are interested in finding out whether insured losses from disasters have increased over time. Considering losses has the additional advantage that, unlike counting disasters, there is no argument over what constitutes a “disaster”. If you count events, you need to decide what counts as an event: how many uprooted trees does it take for an event to be a bona fide disaster?

However, when we turn to measuring losses, this is where the problem of “today’s dollars” and how to make comparisons between the qualities of a 1966 HR and today’s Commodore rears its ugly head in full complexity.

Not only do we have to use today’s dollars to look at disaster-related damage over time, but we have to account for other factors that might increase insurance losses for reasons unrelated to climate change. For example, more people may move into attractive areas—such as floodplains or coastal strips—that are also disaster prone. Property values may rise faster than inflation, and as people get wealthier, more precious and costly objects are put in harm’s way. Insured disaster losses may increase over time for those reasons alone and without any additional contribution from climate change.

The challenge is to account for those other factors by statistical means, thereby removing them from the analysis to permit identification of the net effect of climate change on insured losses. This process is known as normalization.
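To make the idea concrete, one common normalization recipe (the general structure used in much of this literature, for example by Pielke and colleagues for hurricane losses; the exact factors vary between studies) scales each historical loss by how much prices, exposed population and wealth per capita have grown since the year of the event. The sketch below shows that structure with made-up illustrative numbers.

```python
# Illustrative normalization of a historical disaster loss to present-day terms.
# The adjustment ratios below are made-up placeholders; only the structure
# (price level x exposed population x wealth per capita) is the point.
def normalize_loss(nominal_loss, price_ratio, population_ratio, wealth_pc_ratio):
    """Scale a past loss by growth in prices, exposed population and per-capita wealth."""
    return nominal_loss * price_ratio * population_ratio * wealth_pc_ratio

# e.g. a $1.0bn loss from 1980, with prices x2.8, exposed population x1.6,
# and real wealth per person x1.9 since then:
print(f"normalized loss: ${normalize_loss(1.0, 2.8, 1.6, 1.9):.1f}bn")
```

Note that every factor in this recipe pushes older losses up; nothing in it pushes them down, which is the asymmetry taken up below.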

It turns out that after normalization, most of the time—but not always; see Barthel and Neumayer (2012)—there is no longer an increase in disaster-associated losses over time. If we adjust the earlier losses to today’s figures, similar to the way in which we expressed the price of a 1966 Holden HR in today’s dollars, then most researchers have thus far failed to find a climate-associated trend in the data (see Bouwer, 2011, for a review).

This is illustrated in the figure below, also taken from Neumayer and Barthel (2011), which shows temporal trends for global losses from non-geophysical disasters after normalization, using two different normalization techniques. Neither trend line is significantly different from zero.

This is good news, of course, because it means that thus far the insurance industry does not have to cope with additional losses from climate change. (And we don’t have to cope with higher premiums.)

But is it the whole story?

No. And this is where things get to be particularly interesting.

We Don’t Just Get Wealthier—We Also Advance Technologically

Because the moment one begins to “normalize” temporal trends by adjusting for relevant variables, intellectual rigor demands that one include all such relevant variables. That is, it is insufficient to include only variables related to increasing risk exposure (e.g., growing wealth, people moving into disaster-prone areas): One must also consider variables that decrease exposure to risk over time.

For example, today’s building standards render homes far more resistant to wind damage: Your grandfather’s fibro shed may have easily collapsed under a hefty breeze but your beach-front double brick structure is likely to withstand anything short of Hurricane Rambo. Likewise, your great-aunt couldn’t call in the cavalry when her farm was threatened by bush fire, whereas today you might get assistance from a pair of helicopters shortly after reporting a fire.

Most normalization research to date has not accounted for those variables because they are extremely difficult to quantify. (And most researchers have been at pains to point that out; e.g., Neumayer & Barthel, 2011, pp. 23-24.)

This failure to include relevant variables has drastic implications, because as noted by Nicholls (2011), it means that the absence of a loss trend rests on the assumption that there have been no improvements—zero, none, zip—in our ability to issue advanced warning of weather hazards, in our building codes, in our fire fighting abilities and so on. (Anthes et al., 2006, and Trenberth, 2010, have made similar points; though see Pielke et al., 2006).

In effect, normalization research to date largely rests on the oddly inconsistent pair of assumptions that (a) we have built up enormous wealth during the 20th century but (b) did so without any technological advance whatsoever.

So where does this leave us?

The issue of normalization of disaster losses is tricky and cannot be considered resolved. We urgently need research to quantify and account for the effects of better mitigation technology that has—so far—kept losses from disasters manageable compared to historical precedents.

Absent such quantification we can be sure of only one thing: If those variables had been accounted for during normalization, the observed loss trends would have been greater—how much greater is unknown at the present time, but greater for sure.

We also know that the raw numbers show a dramatic increase over the last few decades, and although it would be inadvisable to attribute all of that to climate change (after all, a hidden variable such as the number of leprechauns that are born on 17 March every year might also be responsible), it seems equally inadvisable to dismiss that trend altogether.

Evaluating Risk and Thinking About the Future

As usual, therefore, we are left with some uncertainty. The normalized trend of disaster losses is certainly underestimating the true state of affairs, and equally, the raw number of disasters comes with its own set of problems that may, if anything, overestimate the trend.

In light of that uncertainty, what could (and should) we conclude?

First, we must acknowledge that risk judgments and tradeoffs are inherently subjective and subject to preferences. Some people don’t get on planes because the risk is too great (for them) while others not only get on planes but jump out of them because (for them) that’s a risk worth taking.

However, that doesn’t mean that evaluating risks is a free-for-all game without any rules: For example, the mere existence of uncertainty does not entitle one to be certain that there isn’t a problem.

This is a crucial point, so let’s revisit it without the double negatives: uncertainty cannot imply that a problem is certainly absent. So appealing to uncertainty about trends in disasters to conclude that we don’t have a climate problem constitutes a fallacy of reasoning.

And in the context of weather-related disasters, a balanced assessment of risk would probably consider the following additional facts:

  • meteorological records have identified an increasing number of “fire risk” days in Australia. This trend is not subject to normalization because it simply looks at the conditions that might give rise to further disaster losses.
  • the most recent IPCC assessment regarding extreme events has judged it virtually certain that heat extremes will increase, and there is every expectation that heavy precipitation will occur more often in the future as well.
  • the same report predicts with high confidence that there will be further coastal inundation, and instability in mountain regions including landslides.
  • oh, and did I mention that the number of weather-related disasters has tripled in the last 30 years?

It would be unwise to conclude that we definitely don’t have a problem.

References

Anthes, R. B.; Corell, R. W.; Holland, G.; Hurrell, J. W.; MacCracken, M. C. & Trenberth, K. E. (2006). Hurricanes and Global Warming—Potential Linkages and Consequences. Bulletin of the American Meteorological Society, 87, 623-628.

Barthel, F. & Neumayer, E. (2012, in press). A trend analysis of normalized insured damage from natural disasters. Climatic Change.

Bouwer, L. M. (2011). Have Disaster Losses Increased Due To Anthropogenic Climate Change? Bulletin of the American Meteorological Society, 92, 39-46.

Neumayer, E. & Barthel, F. (2011). Normalizing economic loss from natural disasters: A global analysis Global Environmental Change, 21, 13-24.

Nicholls, N. (2011). Comments on “Have disaster losses increased due to anthropogenic climate change?” Bulletin of the American Meteorological Society, 92, 791-793.

Pielke, R. A. J.; Landsea, C.; Mayfield, M.; Laver, J. & Pasch, R. (2006). Reply to “Hurricanes and Global Warming—Potential Linkages and Consequences.” Bulletin of the American Meteorological Society, 87, 628-631

Trenberth, K. E. (2010). Fixing the Planet? Science, 330, 1178-1179.

 


Estivation Renovations

Whereas most species of bears hibernate, Australians indulge in estivation instead, preferring slightly cooler beaches to their offices, especially when university air conditioning is turned down or off.

We at Shaping Tomorrow’s World have therefore not posted anything for about 3 weeks, having instead focused on an upgrade of the site, which is now ready for 2012.

Here is the list of new features:

  • The login field on the left has been moved up to facilitate leaving comments.
  • Authors’ names now link to their bio pages.
  • All posts by the same author are now accessible from their bio page.
  • The Our Authors button at the top (in the yellow ribbon) now allows instant access to a database of all our contributors, and from there to each author’s posts. We find this new feature particularly useful.

We look forward to resuming regular posts shortly. For overseas visitors, January 26 is Australia Day and it marks the end of the semi-official estivation period. After that the air conditioning is back on and academics gradually ramp up to their ~60-hour work weeks.

We wish all our readers a happy 2012.