Please note that this is BBC copyright and may not be reproduced or copied for any other purpose.

RADIO 4 CURRENT AFFAIRS
ANALYSIS
RISKY BUSINESS
TRANSCRIPT OF A RECORDED DOCUMENTARY

Presenter: Diane Coyle
Producer: Innes Bowen
Editor: Nicola Meyrick

BBC White City
201 Wood Lane
London W12 7TS
020 8752 6252

Broadcast Date: 11.12.03 2030-2100
Repeat Date: 14.12.03 2130-2200
Tape Number:
Duration:

Taking part in order of appearance:

Michael Meacher
Former Environment Minister

Dr John Graham
Administrator, Office of Information and Regulatory Affairs, US Government

Dr Andy Stirling
Senior Lecturer, Science Policy Research Unit, Sussex University

Dr Caroline Lucas
Green MEP for South East England

Dr Paul Drayson
Former Chief Executive of Powderject
Science Entrepreneur in Residence, Saïd Business School, University of Oxford

Dr Neil A. Manson
Philosopher, McDonnell-Barksdale Honors College, The University of Mississippi

Adam Burgess
Lecturer in Sociology, University of Bath
Author of Cellular Phones, Public Fears and a Culture of Precaution, published by Cambridge University Press, 2003

Prof William J. Baumol
Professor of Economics, New York University

COYLE: From designer babies to Frankenstein foods, from mobile phone radiation to global warming, our technological age holds many new fears. Does this mean our approach to technology should be better safe than sorry?

MEACHER: I think it should be more precautionary. I don’t think massively more precautionary, but I am very worried that in the next five, ten years, we may come to see that we’ve all been imbibing chemicals gradually and slowly, but cumulatively, which do produce an epidemic of cancer and other serious conditions and that we should have been trying to find less toxic, less risky substitutes.

SEGUE

GRAHAM: There are people who are suggesting that a chemical before it should be used in the economy must be proven conclusively safe for long term, as well as near term, before it’s introduced into the market.
Now what does that mean? What would have happened to pharmaceuticals? What would have happened to plastics? What would have happened to a lot of the basic consumer products that we use on a day-to-day basis? Some of this rhetoric simply isn’t practical.

COYLE: Former Environment Minister Michael Meacher, and the US government’s chief regulator John Graham. They’re on opposite sides of the debate about an approach to the regulation of technology known as the precautionary principle. Watch out for this term. You’re likely to hear it a lot more in the next few years. For people concerned with environmental and consumer safety, the precautionary principle is already a buzz phrase. And as a philosophy, it’s influencing decision-making in some key areas of public concern such as genetic modification and mobile phone safety. Essentially it involves taking a far more cautious approach to all kinds of innovations than we have in the past, introducing extra testing for example, or even preventing applications of controversial technologies. So what are the arguments for introducing a new precautionary principle into the regulation of science and technology? Andy Stirling is a senior lecturer at the Science Policy Research Unit at Sussex University.

STIRLING: The real reason that we need a precautionary approach generally is because there are real limits to what scientific risk assessment can actually do for us. There are plenty of situations where risk assessment is a very powerful tool, but it’s limited - it can’t address surprises, it doesn’t address properly where there are different interpretations of the science. So in situations like that, the precautionary approach gives us a clearer guide as to what to do. It doesn’t tell us exactly what to do, but it just flags up a situation where we should take more care.
COYLE: But doesn’t it raise the question that if you don’t have scientific evidence on which to proceed, you need some other criteria for making your judgments and what are they?

STIRLING: It means you should be a bit more rigorous and a bit more careful about the evidence that you use, so rather than just saying this looks safe enough, you might compare it with other options. Or you might look at the benefits to see if they outweigh the uncertainties, if it’s still worth proceeding. Or you might make a point of involving a lot more scientific disciplines in the examination of the case, whatever it may be. Or you may go to different interest groups such as the workers or the local communities to find out if your assumptions and the questions that you’re asking of the science are actually sensible.

COYLE: These arguments have been gaining ground amongst policymakers. For example, a new European Union regulation due to come into force the year after next means companies will have to prove the safety of about a hundred thousand chemicals currently in use, whether or not there’s evidence they’re harmful. This could include even everyday items such as the ink in ballpoint pens. Advocates of the precautionary principle argue that it’s not good enough to wait until there’s scientific proof that a product or technology causes harm. They believe the public often realises there’s something wrong long before the scientists and big business are prepared to admit it. Caroline Lucas is the Green MEP for South East England.

LUCAS: When it comes to mobile phone masts, the vast majority of the scientific community will tell you there’s no problem and they’re saying basically that risk is negligible. But you’ve got some … you know whole streets in London whereby they’re being called Cancer Street or something because there are much higher levels of cancers and leukaemias in that street than would be expected according to the national average.
Also in that street you’ve got a far, far higher concentration of mobile phone masts than you have elsewhere. I think when you’ve got a combination of something like that and when you’ve got some evidence in the scientific community to suggest that there could be problems, then that seems to me to be a good example of whereby a proportionate response based on the precautionary principle will tell you not to put those masts in areas where people are most vulnerable; and that tends to be around schools where children’s bone on their skulls and so forth hasn’t developed fully and where they’re going to be more vulnerable.

COYLE: What kind of evidence would change your mind about it? I mean suppose there were over time lots of independent studies showing that in the case of the masts there’s no particular higher prevalence of various types of illness or cancers or whatever. What would be enough evidence to change your mind about that?

LUCAS: To my mind, already there is evidence that for some people, some people are negatively affected by the radiation from mobile phone masts. And so I suppose what you’re asking me is if someone could just wave that away with a magic wand and say well actually it doesn’t exist and it’s all been some terrible mistake, then yes that would change my mind. But that seems pretty unlikely.

COYLE: Magic wands being rare, this means that advocates of the precautionary principle place little reliance on the conventional approach to assessing risks, which is to presume a product or technology is safe unless mainstream scientific opinion believes there’s evidence to the contrary. The precautionary principle’s supporters think there are too many unknowns to rely on the traditional method. Andy Stirling of the Science Policy Research Unit.

STIRLING: The precautionary principle itself is simply a response to scientific uncertainty and the fact that risk assessment doesn’t address uncertainty in the strict sense.
In a way, you could say the precautionary principle is an application of science. It’s actually a more scientific approach than risk assessment because it acknowledges that there are things beyond risks – there are surprises, there are ambiguities, there are divergent opinions, which risk assessment is not geared to address and precaution is simply acknowledging that. And so if we don’t acknowledge that, we’re actually being unscientific. The essence of precaution is simply about being rigorous about uncertainty.

COYLE: The appeal of playing safe with science was spectacularly reinforced by one especially alarming example of scientists getting their predictions badly wrong. The BSE crisis of the 1990s, the epidemic of mad cow disease affecting British beef, severely diminished many people’s trust in the scientific establishment. Although the number of people affected by variant CJD is much lower than many pessimistic forecasts made at the time, the experts were too slow to conclude that the disease could infect humans despite widespread concern about the risk. Many of us, including politicians, drew the conclusion that science did not have all the answers.

MEACHER: I think people who say we need to go by the science, and people don’t understand the science, it’s too complicated, and if only they knew as much as we did then they’d take the same decision – I think that is deeply cynical, anti-democratic and unacceptable.

COYLE: Former Environment Minister Michael Meacher.

MEACHER: My preference is ordinary people, but I do realize you must qualify that a bit. It is perfectly true that many ordinary people are not as well informed as I would like them to be and I think the way out of that is not to dismiss them, but to try and educate and inform them better so that they are able to express an informed opinion. But the scientific elite, I must say – and I say this with sadness really – I think has become too closely involved with industry.
That isn’t to say that there are not many eminent scientists whose integrity cannot be impugned and who are absolutely committed to their science. I am not making a generalized comment at all, but there are a number of individuals who I think are so closely involved in industry in terms of research grants, in terms of funding, in terms of employment that, with the best will in the world, your judgement is affected. And I think people are right not simply to take the scientific elite at face value.

COYLE: The trouble with distrusting the scientific assessment of risks because of doubts about its objectivity is that it’s not clear exactly what the alternative basis for regulatory judgements should be. Not only are supporters of the precautionary approach hostile to scientific evidence which happens to conflict with public fears, vulnerable as these are to misunderstanding and emotion, but they also seem to want a degree of certainty science could never provide. Dr Paul Drayson was formerly chief executive of Powderject, a biotech company, and is now science entrepreneur in residence at the Saïd Business School in Oxford.

DRAYSON: The precautionary principle is based upon people having to prove a negative. And you just can’t do that in science, you just can’t prove the absence of something, so it is going to lead to innovations not being brought to people’s use that should have been on the basis of rational argument. And I think it’s where this regulation enforces the precautionary principle to the extent that it becomes impossible for entrepreneurs like myself to raise money to develop new products, where it actually damages the morale of people who are working in these areas, then it is going to have an effect. And of course these effects are invisible - you don’t see the lack of a positive benefit of a new medicine which never got to market, but you do see the effect of something which has caused harm and so it’s important for people to remind themselves.
It’s like for all of us, when we’re in the dark and we’re feeling scared, we have to tell ourselves that actually there’s no reason, it’s just an irrational fear, and most of us have been afraid of the dark at some point. But I do think you have to overcome that. You have to tell yourself that if you’re going to make a decision about something, it has to be based on evidence.

COYLE: In practice, the precautionary principle seems to be applied to anything that seems hi-tech and unnatural, especially if it looks like having commercial potential for big business. But according to philosopher Neil Manson of the University of Mississippi, there isn’t any logic in favouring the traditional and the natural over the new and the technological without having evidence of potential harm. After all, any course of action - either adopting a new technology or restricting it - could in theory be harmful.

MANSON: You could very easily have a defender of the biotech industry or genetic modification or whatever construct a scenario whereby if they don’t genetically modify organisms or if we don’t continue using carbon fuels, then that could lead also to some sort of catastrophic outcome. I could, for example, argue that if we tried to drastically reduce the emissions of carbon dioxide into the atmosphere, that this would lead to the destruction of the global economy and that that would lead to a rise in militarism and nuclear war, which would bring about a nuclear winter, which is another scenario that’s out there. Now I don’t of course think that that’s likely to happen, but the point is if you’re just looking at bare possibilities, it’s a possibility of some things happening. And so arguing simply from possibilities without any regard to the probabilities of their happening, which is I think what’s going on with this catastrophe idea, is not a good way to reason.
And yet I think, from what I read in the literature on the precautionary principle, that that’s in fact what a lot of people have in mind.

COYLE: So let me see if I follow this. It sounds very reasonable that you want to avoid catastrophic events, but it’s only rational to take account of unlikely but catastrophic risks if you know that you’re right in the first place, because there’s a whole universe of possible catastrophes out there?

MANSON: Yes, just mere prospects are insufficient to justify taking a course of action. You need more than that; you need evidence and probabilities. But if you’re doing that, it seems to me you’re getting … getting back to that thing that I think advocates of the precautionary principle are trying to avoid, which is risk analysis, cost benefit analysis, where you assign numbers to outcomes and also weights to the value or disvalue of that outcome, and I think that’s what the precautionary principle tries to avoid. I’m saying I think it’s unavoidable.

COYLE: So evidence on the degree of risk in each case must make a difference to which ones we pick out to treat more cautiously. After all, we can’t be certain that anything is safe. So does that mean that we must subject everything to a more rigorous and comprehensive assessment than the tests already undertaken by scientists and businesses? Obviously not. It’s just not practical. The recent farm scale evaluations of GM crops cost £5 million while a research programme into mobile phone safety is costing over £7 million. So how can politicians and regulators decide which products and technologies should be subject in the first place to a strictly precautionary assessment? John Graham of the White House Office of Management and Budget, and also one of America’s leading academic experts on public health.

GRAHAM: When we do risk benefit analysis, part of the contribution is from science, from engineering, from economics.
But there are also value judgements that are made in weighing how important one risk is compared to another benefit, and so there’s an important aspect of policy judgement that goes into risk benefit analysis. You need to have policy judgements being made – for example on whether to do more scientific research before you regulate a technology or whether to regulate it based upon current maybe imperfect scientific information. You can’t make that decision without resorting to some kind of weighing of value judgements.

COYLE: Cost-benefit analysis, comparing the likely pros and cons, is exactly what companies and regulators have always done. But the importance of value judgements has become more apparent as public trust in scientific experts and big business has declined. Yet modern technologies are so complex that few of us understand them. This poses a sharp dilemma, because public opinion doesn’t necessarily get it right either. For example, American consumers have been eating GM foods without any concerns for a decade now, but few Europeans are buying them. There’s no way of knowing which group of consumers will prove correct, at least until many more years have passed. So if reluctance to leave decisions to the experts means that we should give greater weight to public opinion, as politicians like Caroline Lucas and Michael Meacher suggest, we still won’t necessarily get clear guidelines for the application of the precautionary principle.

MEACHER: I can remember arguing into the night on many occasions about the multilateral environmental agreements – biosafety protocol, persistent organic pollutants, prior informed consent, all to do with chemicals – very serious issues about what amount of evidence was necessary. The Americans would always say they wanted more.
I was trying to pin them down to say that we should try and decide between ourselves how you operationalise the precautionary principle: how much evidence short of certainty is it reasonable to take action on? Now you can’t make that a subjective judgement. We would have to set down the criteria which justify the choice of that threshold. I believe that that could be attempted, but there’s very little stomach to do it.

COYLE: There’s self-interest in the American stance, as precautionary regulations like the future EU chemicals regime or restrictions on GM crops will hit US exporters. Nevertheless, John Graham, speaking for the Americans, is diplomatic in his assessment of this current transatlantic policy difference.

GRAHAM: Public trust in science and engineering, it is like all aspects of public dialogue – it has swings up and down of more and less confidence over time. In the United States right now, for example, we have a very high degree of confidence in our food safety system and the specialists who are responsible for weighing our food safety system. But we understand in Europe right now, for example, there’s a lot of concern and mistrust that has gone on about the food safety system. But these things go in cycles and we will have, I’m sure, an occasion in the US where we suffer that same kind of confidence problem.

COYLE: Certainly in this country, even aside from legislation passed at the European level, there are more and more examples of successful campaigns to enforce a precautionary approach. They’re often a popular cause with politicians and the media. Technology scare stories have become a staple of our daily news. Take mobile phones for example. If you believe all the headlines, they can fry your brains, double the risk of cancer and affect your memory. The Stewart Inquiry, set up by the government three years ago, found no evidence that use of mobiles caused harm but even so recommended that children should use them only when essential.
Adam Burgess is a lecturer in sociology at the University of Bath and has recently written a book about the application of the precautionary principle to the mobile phone industry. Why does he think policy makers sometimes decide they must err on the side of caution?

BURGESS: I think you have to look at it on a case by case basis to see the process of politicization – why and how it is that certain concerns become elevated over others – and I think in most cases you will find it can’t really be understood in terms of the particular dangers posed by the particular technologies; it’s, you know, what happens to be in the wrong place at the wrong time, and I think mobile phones is a case in point there. I think the mobile phone experience has come after the politicization of consumer safety in Europe, it’s come after the BSE experience and, therefore, it was essentially politically decided that we’re going to apply a precautionary approach in this as a sort of model experiment, regardless of the fact that there’s no evidence to suggest that radiation at this level poses any substantial dangers to human health. So I think it’s the politics of precaution is what we’re talking about here.

COYLE: So you see it as a kind of populism, do you?

BURGESS: A kind of populism, I think. Certainly I think at the European level, it seems to me that consumer safety has come along as a very attractive identity issue for the European Union. I think they’re drawn to it like a magnet because of the image it gives out. And I think that’s, to an extent, based upon the early American experience where consumer safety arose in the context of a legitimacy crisis in American society and presented itself as a means of reconnecting with society or an attempt to do so – attempting to reconnect with society at its lowest common denominator, which is concerns about safety.
COYLE: The desire to respond to concerns close to the heart of consumers, such as the safety of their children, can explain why precautionary campaigns might appeal to politicians, at a time when many people are at least as disillusioned about the political process as they are about scientific expertise. And while she wouldn’t agree about the evidence on mobile phone radiation, Green MEP Caroline Lucas does explain her position as an explicitly political response to distrust of the commercial power of big business.

LUCAS: I would argue that when you’ve got just a handful of very powerful corporations in charge of one particular technology, it’s not surprising that they are the ones who are really more or less pushing this one down our throats and are really the ones that are taking the lead on this. If a fraction of the research that’s currently going into GM went into more environmentally benign forms of agricultural research, then I think we’d have a very different picture in front of us and very different choices to make.

COYLE: Increasingly as consumers we are rejecting some of the new technologies that large corporations are trying to market. But how much does it matter if we become suspicious of big business? And if governments respond to our fears with tough regulation of commercial innovations, will scientific progress grind to a halt? Supporters of the precautionary principle suggest it will bring its own benefits, such as the stimulus to alternative, more benign kinds of innovation. Andy Stirling of the Science Policy Research Unit.

STIRLING: Typically you’ll find where a particular technology – maybe it’s fossil fuels with climate change – is finding itself impeded by a desire to take a more precautionary approach, then you’ll have other technologies that leap in and so you find innovation, yes, maybe is from time to time slowed down or suppressed in one area, but technology in another area will come leaping in, like renewable energy.
Few things have been more catalytic to the intense innovation and development of technology we see in the renewable energy sector than the position that fossil fuels and indeed nuclear power find themselves in.

SEGUE

DRAYSON: That’s a reasonable argument but I do think it misses a key point about the way in which technological innovation happens, in that often technologies are bridges to other technologies.

COYLE: Entrepreneur and scientist Paul Drayson.

DRAYSON: So, therefore, if under the precautionary principle a whole area of research is stopped because of a perceived risk which cannot be proven – remember this is stopping something because of a concern about something where there is no evidence for that concern – what you can miss from that is the technology which would have come out of the second level of development of that research. And you never know how innovation is going to happen and so, again, you can miss out a whole area of new innovation and science by stopping research in one field.

COYLE: So in fact you’re saying the uncertainty is actually a source of the creativity?

DRAYSON: It is. And I think ironically if you applied the precautionary principle to the precautionary principle itself and you ask yourself well what is the risk of the application of the precautionary principle, well the risk of the application of the precautionary principle is the likelihood that we would miss out on some very important medical innovations. Under that basis, the precautionary principle itself tells you you shouldn’t use the precautionary principle.

COYLE: And if you’re still not convinced about the dangers of being too cautious, White House official John Graham invites you to try the following thought experiment:

GRAHAM: Suppose it was 1850 and we insisted as the United States government that any new technology that was introduced had to be proven safe in long-term studies over the full life of the product.
What would have happened to electricity, to the internal combustion engine, to the computer, to the Internet? Many of these technologies even in widespread use today could not satisfy the tests that they have been proven safe by long-term epidemiology studies. Technological innovation by its very nature involves unknown risks that are not always fully anticipated and, as a consequence, we need to make sure when we go into these decisions that we don’t have a simplistic view of what precaution is all about.

COYLE: As well as the question of safety, longer and more thorough testing under a stricter regulatory approach would also make new technologies more expensive to introduce. Even supporters of the precautionary principle like Andy Stirling admit it will impose costs on industry and consumers.

STIRLING: It is true that being more rigorous and systematic and comprehensive in the way I’ve suggested, which is what precaution tries to be, does place more burdens on the regulatory appraisal process. Whether that’s borne by public sector regulators or by the industry concerned will vary from case to case – but, yes, I think it’s fair to say it’s a more resource intensive process than it otherwise would be. So it may place costs in the short-term, but in the long run remember you’re only doing this to forestall much greater costs in the long run from environmental impacts or human health impacts or the need to get out of a technology you’ve become locked into. For instance, with nuclear power, one argument is that if we do find that we’re going to use different energy technologies in the future, we will have spent an awful lot of money in developing a technology that in the end we have not carried through to fruition. So that’s placed huge costs on us. So it’s a trade off between short-term and long-term and it’s far from clear how that will play out in general.
There may be instances where in retrospect we wish we had just let something happen, but by and large precaution is based on the idea that in many cases we can look back on, we would have been much better off if we had been more precautionary at the beginning, looked more carefully at the product or technology right at the outset and saved ourselves an awful lot of environmental damage, human health damage and – yes - cost.

COYLE: The potential costs are not just economic or technological, according to Adam Burgess. There’s a social price to pay for stalling technologies and mounting expensive public inquiries.

BURGESS: The very fact of holding an inquiry in a context of politicised consumer safety concerns tends to confirm, on the basis of no smoke without fire, that there must be a danger here. Speaking as a sociologist, I mean my concern is not so much with the consequences in terms of resources being diverted, spending money on things which simply money shouldn’t be spent on in my opinion. I think that what I’m most concerned about is its impact on the fabric of life, so to speak, and the sense in which everyday culture is dominated by an anxiety about the worst possible outcomes and that becomes the way in which we determine how we act. That seems to me an enormously, potentially destructive thing about social interaction and about the way we live our lives.

COYLE: Whether or not the precautionary principle is turning us into a nation of worriers inclined to snuggle under the duvet for safety, it would be an exaggeration to claim it’s going to stop technological progress in its tracks. From a longer perspective, the torrent of innovation will find an outlet.
Will Baumol, Professor of Economics at New York University and one of the world’s leading experts on innovation and economic growth, was born in the 1920s and saw the introduction of many of the technologies we now take for granted, from major developments like air travel and electricity in our homes through to apparently more trivial innovations such as zippers – or radio.

BAUMOL: By and large, the forces of growth are just so powerful and so pervasive, the outpouring of innovations is so enormous that we will hardly notice the difference. We are living in an age in which the influx of innovations, of new products, of new processes is beyond anything that could previously have been imagined. I mean my characterisation of the age is that its astonishing influx of unbelievable innovations is so great that it has become a banality and we can forego innovations that would have been a very much greater loss to some of our ancestors. We can afford to forego more.

COYLE: There will indeed be missed opportunities with unknown benefits. But is he right to suggest they don’t matter too much now we’re so affluent? While none of us can know what the path not taken would be like, we consumers do place a high value on even frivolous new products. Would we really want to be without all the spin-offs from science and business, such as ready meals or video games? Still, perhaps the precautionary principle is itself the innovation of an affluent age. Would it be inconceivable in a society where technological progress was anything but banal? Paul Drayson.

DRAYSON: In parts of the world which are less developed, they would laugh at us because they would say well this is a luxury you can afford to be thinking like this. And I think we should recognize that possibly – I don’t know this but I have a sense – that possibly our aversion to risk that we have in society, not just about science but just generally that we have in western society, is because our lives are so safe.
COYLE: In rich western countries we can afford to take a more precautionary approach if we want to. We could choose to place more emphasis on the potential risks of some innovations. Logically it would be a refinement of the kind of risk assessments that earned us our position of affluence in the first place. But we shouldn’t expect people in other countries to agree with us – whether that’s Americans preferring to take their chances with GM food, or poor people in developing countries who fear our precaution might hamper their future growth. The precautionary principle would also mean our putting less weight on the potential technological benefits we’re bound to miss. There’s a high cost to always being better safe than sorry.