To make mistakes is human. To make them repeatedly is careless and short-sighted. To do so in pursuit of profit and at the expense of the natural systems on which we all depend is unforgivable.
For a species so undeniably intelligent and technologically advanced, humans are remarkably bad at learning from past mistakes. The last century has provided all too many examples of industrial developments with widespread and serious consequences for health and the environment. Some of these consequences were predictable from the outset (as with leaded petrol and asbestos), while others were arguably harder to foresee (for example, TBT antifouling paints and ozone-depleting substances). And yet we still struggle to act on early warnings or to design systems that steer us towards sustainability.
In its first review, Late Lessons from Early Warnings, published in 2001, the European Environment Agency (EEA) broke new ground in analysing the use and misuse of scientific evidence in guiding public policy, how early warnings of harm had been missed or ignored, and how greater understanding and acceptance of the complexity of the world around us could yield better decisions. Among its 12 lessons were the need to recognise uncertainties and ignorance, the value of long-term monitoring, the importance of being explicit about assumptions, of evaluating claims of technological benefits as critically as claims of risks, and of acting with precaution to reduce harm where there are reasonable grounds for concern.
Twelve years on, the EEA’s second volume of Late Lessons confirms, with a series of new case studies, that progress has been painfully slow, or even imperceptible. Scientific uncertainty is still over-used as a justification for delay or inaction, and lack of evidence is still portrayed as evidence to justify business as usual. Too great a focus on technologies rather than needs (in the case of agriculture, for example) still commonly favours high-productivity, resource-intensive “top down” systems over what may well be more relevant, socially and environmentally sustainable alternatives that involve local people and draw upon local knowledge.
There are many other insights. The case of booster biocide antifoulants for ships highlights the need for greater foresight and lateral thinking about solutions. That of neonicotinoid insecticides in seed dressings and their possible role in honeybee decline reveals fundamental weaknesses in risk assessment and authorisation procedures, as well as the pressures on science when commercial stakes are high and debates polarised. Past and recent nuclear accidents have shown that ‘low probabilities’ are not ‘no probabilities’ and that, once one threshold is crossed, the improbable can become the unavoidable. And the case of endocrine disrupting substances in Europe’s drinking water shows that, even when the scientific case is accepted, action can be blocked as simply too expensive.
Costs, of course, are always relative. The costs of inaction may be harder to quantify, but they will inevitably rise if and when early warnings prove right. At the same time, the financial risk of acting on false alarms, so often invoked as an argument against precaution, has so far proven relatively small.
In so many cases, it is not humanity at large that is to blame, but the irresponsible actions of a small number of corporations which perceive that they have too much to lose. And yet it is not enough to apportion blame – we have to find new ways and renewed courage to act if we are to avoid repeating past mistakes. Enabling greater public participation in decision-making is undoubtedly part of the battle.
As we face unprecedented global change from climate perturbations, ocean acidification and habitat destruction and the emergence of new threats from nanomaterials, biotechnologies and the prospect of climate engineering, there remains no shortage of warnings. However late the lessons, we owe it to our children and to this incredible planet on which we live to learn them.
David Santillo, Senior Scientist at Greenpeace International