
Physiology News Magazine
Bioresearchers: the end of innocence or insouciance?
Features
© Bill Parry 2002
Bill Parry is a freelance writer and also works at the Institute of Biology
https://doi.org/10.36866/pn.47.14
The scientific community was stunned last year when a team of Australian researchers announced that they had inadvertently developed a highly pathogenic strain of mousepox while trying to create a contraceptive vaccine for mice. The same manipulation attempted in smallpox could, theoretically, create a similarly lethal strain of the disease that could be used by states or terrorists. After consulting widely with colleagues and the Australian government and military, they went public because they felt they had ‘a duty’ to be upfront about their discovery and to alert the world that simple genetic manipulations could produce unexpected and potentially devastating effects. That was months before the anthrax attacks in the United States. Would they have gone public if the discovery had been made after the attacks? Alistair Ramsay, now Professor of Medicine at Louisiana State University, was part of that team. He says that, although the balance of opinion might have advised against publication, he feels that ‘it still would have been right to publicise our findings to alert society to the very real possibility that genetic engineering can produce these potentially nasty outcomes’.
Physicists and chemists have already had to grapple with this issue, and now, more than ever before, many bioscientists must do so as well. Although biomedical research will continue to make impressive and far-reaching beneficial strides in the coming decades, like all technology this progress could be (and, if history is anything to go by, will be) misapplied, with potentially calamitous effects. So, should bioscientists show more ethical responsibility for the potential ramifications of their research? And if so, how?
The issue attracts a spectrum of opinions. Many, such as Sir Joseph Rotblat, winner of the 1995 Nobel Peace Prize, believe that science, at least temporarily in some instances, should ‘not have a completely free run’. On the other hand, Lewis Wolpert, Professor of Anatomy and Developmental Biology at University College London, cherishes ‘the openness of scientific investigation too much’ to regard any research as too dangerous to approach. Between these poles fall those who see the need for some degree of regulation and monitoring, but with a minimum of intrusion.
Rotblat says in his essay ‘Science and Humanity in the Twenty-First Century’ that ‘the need for such responsibility [for one’s actions] is particularly imperative for scientists, if only because scientists understand the technical problems better than the average citizen or politician. And knowledge brings responsibility’. He suggests that scientists adopt, like medical doctors, a type of Hippocratic Oath, and believes that ethical review committees should review research projects, particularly ones that have ‘a direct impact on the health of the population’, such as genetic engineering.
This latter idea is supported by many, including Malcolm Dando, a biological weapons expert and Professor of International Security at the University of Bradford. Given the uncertain future of the Biological and Toxin Weapons Convention (BTWC), Dando proposes an internal ethical review committee system, internationally binding and nationally implemented, that includes academia, industry and government. Under this system, there would be four levels of regulation, graded by the danger of the research involved, from potentially highly dangerous techniques (e.g. viral vectors) and lethal pathogens (e.g. smallpox) down to pathogens deemed dangerous but less of a public threat. The levels of regulation and monitoring, covering both research and publications, would be adjusted to match the potential threat. The fourth level would deal with odd and unexpected findings, such as the super-virulent mousepox virus.
Dando emphasises that this is just one possible system and he encourages bioscientists to think of other possibilities. A similar call was made at a recent summit in New York, called ‘Preserving an open society in an age of terrorism’. Charles Curtis, President of the Nuclear Threat Initiative, told bioresearchers there that they must be ‘the authors, the implementers and the enforcers’ of common sense safety procedures, and that they should do so promptly and publicly.
Whatever system is considered, says Dando, it has to be ‘the least onerous and minimally intrusive’ so that ‘we can preserve most of the free exchange of science. We’re trying to find things that we have to put fences around; we want the number of fences kept as low as possible; but we want to design the best fences we can’.
Many, though, feel that safety and ethical checks already exist, at least within academia. Alistair Ramsay says that for most publicly funded research, the peer-review process already weeds out most potentially dangerous or ethically dubious proposals. His experience spans research in New Zealand, Australia and now the USA. He adds: ‘Within institutions, there are safety committees, which are often quite rigorous in terms of what you’re doing. There is a significant level of control – I wouldn’t call it “censorship”. It’s not infallible, and these committees are not constituted to address direct issues of research ethics, but certainly any work that was overtly threatening would not get through these systems, I don’t believe.’
Wolpert believes that, while scientists have the same moral duties and social obligations as any other citizen, censoring or restricting scientific understanding should not be countenanced, no matter how noble the intention:
The main reason is that the better understanding we have of the world, the better chance we have of making a just society, the better chance we have of improving living conditions. One should not abandon the possibility of doing good by applying some scientific idea because one can also use it to do bad. All techniques can be abused and there is no knowledge or information that is not susceptible to manipulation for evil purposes…. Once one begins to censor the acquisition of reliable scientific knowledge, one is on the most slippery of slippery slopes.
Dr Annabelle Duncan is chief of molecular science at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) and a former deputy leader of a UN team that investigated bioweapons in Iraq. She has reservations about the internal ethical review system: ‘To apply this on a very wide scale may be problematic; I fear that they could easily become politicised… I favour widespread debate around issues. The open debate ensures that all views are brought to the fore and also has, in the longer term, an important educational element.’ She concurs that censoring potentially dangerous findings is not the answer either, but told me that there is certainly a need for more control and transparency: ‘We need to know who’s working with dangerous pathogens and doing what with them.’ She says that while the modified mousepox paper drew attention to the fact that unexpected results could be modified and misused for maleficent purposes, the results could also be used – and were more likely to be used – for beneficial purposes.
Several governments, however, are less confident in present safety checks. Dando has repeatedly warned academics that if they do not proactively design and implement some form of an ethical review committee system, governments will: ‘Those people who say that biology is neutral, that there is nothing we can do, and that research has been and is likely to be misused in the future – I just don’t think that will work. Legislators will constrain them, and I don’t think that biologists are going to get away with that argument for very much longer. In fact, they aren’t.’
They aren’t, because legislation has already been drafted and passed in the UK in the wake of last year’s terrorist attacks in the US. Academic and industry concerns about the Anti-Terrorist Act were raised in this magazine (‘Defusing the bioweapons time bomb’, Spring 2002). Many British academics have since expressed alarm over the Export Control Bill, presently making its way through Parliament, warning that it could restrict academic freedom. A recent article in the Economist (‘Secrets and lives’, 9th March 2002) examined measures the US government is considering to balance academic enquiry against national security. It states:
It certainly seems that federal agencies will—quite reasonably—place more emphasis on assessing the risks of a piece of research before agreeing to fund it. More worrying, from the scientists’ point of view, is that new areas of bioscience may become classified, that the government is considering reviewing work prior to publication (with an option on refusing permission to publish) and that it might insist that the methods section of some research papers are removed.
The US government has already recalled nearly 7,000 technical documents pertaining to chemical and biological weapons production. The White House has also asked the American Society for Microbiology (ASM), which has over 44,000 members, to limit information in the 11 journals it publishes that could be misapplied to biological weapons. The society’s president, Dr Abigail Salyers, recently wrote: ‘Terrorism feeds on fear, and fear feeds on ignorance,’ adding that information that can improve public safety is the best defence against bioterrorist threats.
Alistair Hay, Professor of Environmental Toxicology at the University of Leeds, emphasises the need for academics to shape and implement a scheme, and soon: ‘Academics should begin to think about what they would like to see, what they would hope to avoid in any legislation, and to shape it from the outset. We need to run an initial scheme past the government before something [they devise] gets too far down the road.’
Alistair Ramsay reflects that scientists, on the whole, act in an ethically responsible way. He notes, however, two gaps in the present peer-review system: dealing with scientists who set out to do harmful research, and dealing with serendipitous findings. Greater transparency could help colleagues spot and question suspect research or behaviour, discouraging the proverbial ‘mad scientist’. Concerning serendipitous findings, he says: ‘I’m fully prepared that we shouldn’t “publish and be damned”, that we have a responsibility to discuss our findings where we sense there’s a potential for misuse. I don’t know the best paradigm for that. What we need now is to put in a mechanism, which scientists will have some input into shaping, that will deal with unexpected findings and how best to potentially control their misuse.’
And many, including Ramsay, add that any review system should involve the public in some capacity. Wolpert states his view unequivocally: ‘I do not believe that scientists, or any other group of experts, should have the right to take ethical decisions on their own that affect the lives of the public. Their ethical beliefs may not reflect the public view and that is why I have always argued that their responsibility is to put their knowledge, and its possible applications, in the public domain.’
The recent anthrax attacks, the ongoing war on terrorism and the discovery of chilling Al-Qaeda documents revealing a willingness to use biological weapons have put bioscientists, and the dark potential of their research, in the public spotlight. In response, governments have warned that some freedom will have to be sacrificed for greater security against these threats. While many bioscientists may feel that current systems keep an adequate check on research and its potential misapplications, governments and the public seem less comfortable with that claim: risks over which individuals have little personal control are far less readily accepted, especially something as terrifying as a biological weapons attack. Moreover, recent events show a clear need to deal with unexpected results, terrorist attacks and, if possible, aberrant scientists.
While I found little consensus on the specific form an internal ethical review committee system should take, there was overwhelming support for a strengthened BTWC. This internationally binding agreement prohibits the development, testing, production and stockpiling of biological weapons, but it currently lacks a monitoring mechanism to promote transparency and confidence building. Negotiations to give the convention such teeth will resume in November, after talks broke down last year.
In the meantime, legislation is being considered or drafted that could have a significant bearing on many bioscientists’ research and careers. Like it or not, many will have to become more ethically and professionally active in containing the risk of misuse. Bioscientists can influence that process now or wait for it to influence them.