Research ethics in nanotechnology

IEET Fellow and nanotech researcher Sascha Vonger has dropped a bombshell about unethical practices in nanotechnology. According to him, rigorous research is being avoided in favor of flashy 'experiments' that are essentially non-scientific.

"Publish-or-perish culture turned science into an endeavor where deception is vital to get ahead, and nanotechnology ranks as one of the worst. A scientific field that has evolved this far into being a structure wherein deception is basically systemic cannot be trusted to self-regulate."

So what's the harm here? Many nanoethicists focus on existential risk, classic scenarios like grey goo replicators, nano-augmented superhumans, and other far-out ideas. Some more conservative thinkers worry about inequality: whether nanotech will merely be a toy for those who are already rich and powerful, or whether nanomaterials can be used to improve the quality of life in the developing world. And of course, the safety of nanoparticles in the environment has yet to be conclusively established, and there is some evidence that carbon nanotubes can cause cancer.

Vonger proposes another risk, that nanotechnology is failing to be a rigorous science, and that this is unethical. In the classic CUDOS framework, nanotech does not have organized skepticism. Instead of a rigorous examination of an article, the community is relying on various signals (the authors are PhDs, the article is in a respected journal) to verify the integrity of the science. This is far from ideal, but realistically, an individual researcher can't check the fundamentals of every fact or article he uses. An organized, trusted community standard makes science more efficient.

The problem (if Vonger is correct) is that nanotechnology is refusing to accept internal criticism of its technical methods, and is therefore producing bad knowledge. Bad knowledge can be fatal for a discipline in several ways: heralded results can be publicly overturned in an embarrassing way (see arsenic lifeforms), bad policy decisions can be made on the basis of bad science (vaccines and autism), or, finally, a series of individually harmless exaggerations can leave a field with no solid conceptual underpinnings. It is this last danger that nanotech is most vulnerable to. As a field essentially born on great expectations, one gravely wounded in the Drexler-Smalley assembler wars, and under an immense burden of popular futurist pressure, its social structure isn't capable of dealing with criticism.

So what's the solution? There isn't an easy one, and it has to be implemented across many disciplines. No one ever became famous for disproving a scientific theory; we're intrinsically biased to favor positive results, and ongoing culture wars over creationism and global warming (to give two examples) have made scientists wary of being proved wrong in public. Asking scientists to have the moral fortitude not to engage in 'cargo cult science' is one solution, but in the face of incentives that reward rapid publishing, it will not work. Rather, science as a whole should favor more things like the Journal of Negative Results, and recognize that research is an inherently uncertain process. Fewer papers, better papers, and perhaps even a tiered system distinguishing proven results from speculation, as opposed to the informal system of credibility we have now.
