As the severity rating of the Fukushima accident is raised to the most severe level (the "experts" said this would not happen at the start of the crisis), the truth is beginning to emerge about the radiation danger and the awful treatment of workers at the nuclear plant, in an excellent investigative piece in the New York Times.
Take this for example:
"Current and former workers said, radiation levels would be so high that workers would take turns approaching a valve just to open it, turning it for a few seconds before a supervisor with a stopwatch ordered the job to be handed off to the next person. Similar work would be required at the Fukushima Daiichi plant now, where the three reactors in operation at the time of the earthquake shut down automatically, workers say."
If radiation exposure were safe, as some are even now trying to claim, would this be happening?
The recent spat between George Monbiot and Helen Caldicott about the dangers of radiation is pitched at a relatively trivial level: just because nuclear power has yielded fewer deaths than coal, it does not follow that we should support nuclear.
If we burned live human bodies in incinerators as a biomass fuel - for the sake of argument - we could claim that coal was safer than this. It wouldn't entitle us to say that we should burn coal for power if there are safer alternatives!
The real question is this: given that a Fukushima or Chernobyl incident - however awful - is local in its effects, while global warming is by definition global, does that difference justify using nuclear power?
In other words is it right to accept a smaller harm to stave off a larger one?
What makes this difficult to evaluate is the relative immaturity of the alternative renewable technologies, and the mental roadblock the political majority seems to have about the true potential of energy efficiency.
I've used my blog recently to illustrate some examples of this - there are many more.
At a climate change networking event last night I talked to Peter Davies, head of the Climate Change Commission for Wales. He pointed out that in 1992 the first wind farms appeared on Welsh hills. Now, twenty years later, the Chinese are churning out hundreds of turbines a week, each ten times their size.
It's taken twenty years for the technology to reach mass, mainstream application. If there had been a real, wartime sense of urgency, this time could have been halved.
Marine current turbines and other marine technologies, anaerobic digestion, short-rotation coppice biomass, algal biofuel production, dye-sensitised PV, solar thermal electric plants, mass rollout of energy-efficient cars and products, hydrogen storage, phase change materials, smart meters and the smart grid, eco-retrofitting buildings, solar water heating, heat pumps, offshore wind and so on - the list is growing all the time - could collectively produce or save all the energy we need. Several studies have illustrated the appropriate scenarios.
Currently, official thinking says this can't happen until around 2060 - too late, without nuclear power and carbon capture and storage, to save us from a rise of at least 3 degrees.
My belief - and I am far from alone in it - is that both the new generation of nuclear power stations and CCS are currently unproven, technically and economically, at mass scale: they're at about the same level of development as solar thermal electric plants and marine current turbines respectively.
If the R&D money that is going, or would go, into these technologies were instead channeled into the safer renewable technologies and into energy efficiency, we could accelerate development and bring that 2060 date forward by twenty years.
This world would be far safer - no oil and gas disasters, no coal and uranium mines, no more Fukushimas - and would create far more employment, not to mention that it could begin to reduce atmospheric concentrations of greenhouse gases.
Germany, Europe's most successful economy, has just announced it is going to do just this. Why can't every other nation on earth?