By BRITTA LAM
With climate change looming, it is paramount that we fully understand technologies before we judge them. With energy consumption projected to rise, part of our future lies in decarbonizing the grid. Currently, such efforts are propelled largely by the global shift toward renewables, which are branded as low-risk, natural, and cheap.
Fukushima and Chernobyl, on the other hand, have painted nuclear fission as harmful, destructive, and expensive. These descriptions, however, omit the fact that, through the mining of rare earth metals, solar energy emits 40 times more radiation per unit of energy than nuclear energy. Also left out is that renewables are cheap only because they are heavily subsidized; with comparable subsidies, the same could be said of nuclear energy.
Nuclear fission generates energy through neutron bombardment of heavy elements, such as enriched uranium and plutonium, releasing thermal energy. As in fossil fuel plants, that heat is used to boil water, producing steam that spins a turbine to drive a generator.
The reaction yields thermal energy and nuclear waste, which is contained and can be recycled and reused. The uranium, held in fuel rods, is considered spent after approximately two years of undergoing fission. It is then moved to pools for six to nine months of short-term storage to cool off before being transferred to long-term storage in dry casks.
Unlike most renewables, nuclear energy is reliable: it can run safely 24 hours a day while meeting peak energy demand. Nuclear energy can replace our dependence on fossil fuels while also dramatically reducing carbon emissions.
Regardless of efforts to increase efficiency and reduce energy usage in our industrial world, we will always need energy. Thus, with carbon reduction central to mitigating climate change, our energy policies should be based not on perception but on knowledge.