Journal: September 3, 2013


Yucca Mountain: Science, as I define it, cannot say what is safe, or any other moral thing. It can say whether this or that potential action is more or less likely to cause this or that consequence. That’s all, but that’s a lot. I worked at Yucca Mountain, where many often said, “Decisions about the safety of the site should be based on science, not politics.” (reminds me of the claims about safety during the coronavirus pandemic, added 20200717)

As good opportunists, we scientists took four or five billion dollars, saying, “Yes, we can tell you whether the site is safe or not.” No wonder the project “failed”. By that “Yes”, scientists mean they can tell you the relative likelihood of meeting some safety standard set by some other means or agency. I thought at the time, and still do, that the safety standard for nuclear waste was (and is) set ridiculously low, especially compared to alternatives for energy generation.

The standard is irrational, based on fear. I pointed this out to my boss, a chemical engineer, soon after arriving at Sandia; he said, “We live in an irrational world.” I worked for a laboratory that prided itself on rationality, mathematics, and logic, with the best computer tools in the world. Sandia’s main mission was to make nuclear bombs go off as planned --- but --- “no civilian deaths due to peaceful uses”.

We seem to “strain at a gnat, and swallow a camel”. 1000 deaths in a million years is allowable under the nuclear waste regulations, but Sandia makes bombs intended to kill millions of civilians. My many science colleagues, in their defense, claim the standards don’t really matter; we can still meet them, based in part on my calculations.

“So, why not bury the waste now?” I asked.

“Well, some people with whom we have considerable sympathy as honest scientists (the National Academy of Sciences) point out that we are not certain in our predictions, at least not certain enough. There is residual uncertainty that we, with more money, can address, though perhaps not reduce; it is “research” after all; we can’t predict the results of research. We don’t know the answer before the results.”

I then asked, “What is the question you are answering?”

And the answer from manager scientists and opposition scientists alike was, “Is the site safe?” (see first sentence)

So, I continued, “What are you studying to make sure the site is safe?” The Tower of Babel now appears:

“Spores from sands in the Kalahari desert; experimental radar and x-ray scanning to try to observe fluid flowing through 1 cm cylinders; spending habits of Las Vegas and Nevada citizens and their associated feelings about our project; seed counts in crystallized urine of pack-rats. Who better than scientists to identify potential effects on site safety?” All elegantly described as shedding light on remaining uncertainty about something or other that could, not would, affect the behavior of the site over the next million years or so, and thereby affect human safety. The focus shifted from “safety” to “uncertainty” that could, however slightly, affect site safety.

I then asked them, encouraged by a decision-analyst, “Yes, potentially influential! But how much influence?”

Almost all research scientists responded, “Catastrophically, because uncertainty in my area of research could undermine the decision process, so my research is needed, so please fund it.”

"So you say", despite the opinion of system-analysts and now decision-analysts, showing your much exaggerated assessments of importance of your field of research to decisions about site safety. Such analyses showed time and again the “insensitivity” of possible safety outcomes to possible ranges of input variables after research is done, provided by the researchers themselves. All this goes on with the assumption that the ridiculous standard of safety is “reasonable” or at least “tolerable” if not “laudable” as either (1) a moral expression of love for people, or (2) a sure way to assure uncertainty and therefore more research money, always from someone else to support “needed” research.