In recent years, there has been much discussion about global warming and the need to cut down on the use of fossil fuels in generating power.
In the long term, nuclear fusion, in which power is generated by fusing hydrogen (H) isotopes, especially deuterium (D) and tritium (T), at temperatures of about 100 million degrees, is considered the most promising and sustainable future source of energy.
In the 1970s, nuclear fusion among H isotopes was projected to be the most sustainable and abundant future energy source, especially considering the vast abundance of deuterium in the oceans (about 1 in 5,000 molecules in the sea is heavy water, D2O) and the fact that oceans cover about two-thirds of the Earth's surface. As Homi Jehangir Bhabha, the father of India's atomic energy programme, put it at an international conference, nuclear fusion power was as abundant as the deuterium in the seven seas.
Nuclear reactors generate power through nuclear fission, in which heavier nuclei like uranium break up (fission) into lighter nuclei. Currently, only about 10% of the world's electricity is generated by nuclear fission. The isotope that undergoes the fission reaction, U-235, makes up less than 1% of natural uranium.
This low abundance of the fissionable isotope means that a nuclear reactor generating 3 gigawatts (GW), or 3,000 megawatts (MW), of power requires about 1,000 tonnes of natural uranium. So, to generate 300 GW, you would need 100 such reactors, and that is still only a tiny fraction of the world's power requirement.
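The reactor arithmetic above can be checked in a few lines; this is a rough sketch using the round figures quoted in the text, not precise plant data:

```python
# Back-of-envelope check of the fission reactor arithmetic.
# The inputs are the approximate round numbers quoted in the article.

uranium_per_3gw_reactor_t = 1_000   # tonnes of natural uranium per 3-GW reactor
reactors_for_300gw = 300 // 3       # number of 3-GW reactors for 300 GW
uranium_for_300gw_t = reactors_for_300gw * uranium_per_3gw_reactor_t

print(reactors_for_300gw)      # 100 reactors
print(uranium_for_300gw_t)     # 100,000 tonnes of natural uranium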
A 3-GW coal power plant would need 10 million tonnes of coal and generate 40 million tonnes of carbon dioxide yearly. So, with about 70% of the world's power coming from fossil fuels, at least 40 billion tonnes of CO2 are emitted into the atmosphere every year.
What about nuclear fusion reactions? Unfortunately, triggering sustained fusion reactions between deuterium nuclei requires heating the gas to at least a billion degrees. However, the fusion reaction between deuterium (D) and tritium (T) requires only about a hundred million degrees, 10 times lower than that required for D-D fusion.
In an explosive release of energy, as in a hydrogen (H) bomb, the high temperatures required are achieved using a uranium or plutonium fission bomb, which heats the surrounding D-T mixture to the required temperature. Note that the so-called H bomb actually uses the D and T isotopes of hydrogen.
Laser fusion
The past several decades have seen considerable efforts to heat D-T plasma to at least 100 million degrees. Recently, in an inertial confinement fusion (ICF) experiment at the National Ignition Facility (NIF) in the United States, 192 powerful lasers delivering short, intense pulses compressed and heated a D-T fuel capsule, which released more fusion energy than the laser energy supplied to it. This milestone is known as fusion ignition, or scientific breakeven. The process is called laser fusion.
The plasma must be heated and confined for a sufficient time to enable the D-T nuclei to interact. The Lawson criterion states that the product of the particle density and the confinement time must exceed a certain value for sustained reactions.
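As a rough numerical illustration of the criterion (the threshold used below, about 1.5 × 10^20 s/m³ near the optimum D-T temperature, is a commonly quoted approximate value):

```python
# Illustrative check of the Lawson criterion for D-T fusion.
# The threshold is the commonly quoted approximate minimum of the
# density-confinement product near the optimum temperature (~25 keV).

LAWSON_DT = 1.5e20  # s / m^3, approximate minimum of n * tau_E

def meets_lawson(density_m3: float, confinement_s: float) -> bool:
    """True if the density-confinement product exceeds the D-T threshold."""
    return density_m3 * confinement_s >= LAWSON_DT

# A tokamak-like plasma: n ~ 1e20 per m^3 held for 2 seconds
print(meets_lawson(1e20, 2.0))   # True
# The same density held for only 0.1 s falls short
print(meets_lawson(1e20, 0.1))   # False
```

In practice the temperature must also be right, which is why the criterion is often stated as a "triple product" of density, temperature and confinement time.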
Tokamak fusion
This is now being pursued in several devices. Tokamaks, magnetic confinement devices that heat plasma to temperatures over 150 million degrees Celsius to enable fusion, are operating in several countries. The Chinese EAST (Experimental Advanced Superconducting Tokamak) has confined plasma for over 1,000 seconds.
The International Thermonuclear Experimental Reactor (ITER), the world's most ambitious energy project, which aims to create a self-sustaining nuclear fusion reaction (like the Sun's) using the hydrogen isotopes deuterium and tritium, is now being built in Cadarache, France. It is a collaboration between the European Union (EU), India, China, Japan, Russia, South Korea and the United States. The project hopes to have a possible fusion reactor by 2035, although commercial plants will take longer. Tokamaks are used here as well.
Nuclear fusion generates about four times more energy per unit mass of fuel consumed than nuclear fission. Thus, a 3-GW fusion reactor would require about 200 kg of tritium annually, equivalent to the tritium output of several hundred CANDU-type reactors.
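The 200-kg figure can be roughly reproduced from the energetics of the D-T reaction (each reaction releases 17.6 MeV and consumes one tritium nucleus); the sketch below assumes 3 GW of continuous thermal fusion power, so the electrical output of a real plant would require correspondingly more fuel:

```python
# Rough estimate of annual tritium consumption for a 3-GW fusion plant,
# assuming 3 GW of continuous *thermal* fusion power.

MEV_TO_J = 1.602e-13                 # joules per MeV
E_DT_MEV = 17.6                      # energy released per D-T reaction
TRITIUM_NUCLEUS_KG = 3.016 * 1.661e-27  # mass of one tritium nucleus
SECONDS_PER_YEAR = 3.156e7

power_w = 3e9
energy_per_year_j = power_w * SECONDS_PER_YEAR
reactions = energy_per_year_j / (E_DT_MEV * MEV_TO_J)
tritium_kg = reactions * TRITIUM_NUCLEUS_KG

print(round(tritium_kg))   # ~168 kg, consistent with the ~200 kg quoted
```

The slightly lower result reflects the idealised assumptions; real plants burn only a fraction of the injected fuel and must hold a working inventory.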
The tritium conundrum
However, there is a real catch in all this! The crucial element, tritium, is radioactive, with a half-life of about 12 years, and unlike deuterium, it does not occur naturally. The world's civilian stock of tritium in 2024 was estimated at about 25 kg. Tritium costs about 30,000 dollars per gram, so the roughly 200 kg required for a 3-GW reactor would cost about six billion dollars annually. ITER alone could use up much of the available tritium.
Even if only 10% of the world's power requirement were to be met by nuclear fusion, we would require a few hundred tonnes of tritium per year. At present prices, that would cost several trillion dollars per year, exceeding the GDP of most developed countries!
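At the quoted price of $30,000 per gram, the multiplication is straightforward; note that the totals depend sensitively on the assumed price and tonnage:

```python
# Tritium cost arithmetic at the ~$30,000-per-gram price quoted above.

price_per_g = 30_000                 # US dollars per gram of tritium

per_reactor_g = 200 * 1_000          # 200 kg for one 3-GW plant, in grams
per_reactor_cost = per_reactor_g * price_per_g
print(per_reactor_cost)              # 6,000,000,000 -> about $6 billion a year

# Scaling to a few hundred tonnes a year for ~10% of world power
# (200 tonnes taken as a representative figure):
world_demand_g = 200 * 1_000_000
world_demand_cost = world_demand_g * price_per_g
print(world_demand_cost)             # 6,000,000,000,000 -> trillions a year
```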
Experiments are being conducted to make fusion reactors (like ITER) self-sustaining in tritium by breeding it in real time. A thick lithium blanket surrounding the reactor would generate the tritium in a deuterium-tritium (D-T) reactor: neutrons produced in the D-T reaction interact with lithium nuclei to produce tritium. Lithium, again, is scarce and much in demand for batteries, phones, cars and more.
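The scale of the lithium requirement can be sketched from the main breeding reaction, Li-6 + n -> He-4 + T, in which one lithium-6 nucleus yields one tritium nucleus; the figures below are rough, idealised estimates:

```python
# Rough lithium requirement to breed ~200 kg of tritium per year
# via Li-6 + n -> He-4 + T (one Li-6 nucleus per tritium nucleus).

tritium_kg = 200
moles_t = tritium_kg * 1000 / 3.016   # molar mass of tritium ~ 3.016 g/mol
li6_kg = moles_t * 6.015 / 1000       # molar mass of Li-6 ~ 6.015 g/mol
print(round(li6_kg))                  # ~399 kg of Li-6 per plant per year

# Natural lithium is only ~7.6% Li-6 (atom fraction, treated here as a
# mass fraction for a rough figure):
li_natural_kg = li6_kg / 0.0759
print(round(li_natural_kg))           # ~5,255 kg of natural lithium
```

A few tonnes of lithium per plant per year is modest next to battery demand; the harder problem is capturing enough neutrons to breed slightly more tritium than the reactor burns.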
Fusion using boron-11 is also being explored, but the results are still far from satisfactory. Given these practical hurdles, producing power through fusion will remain a pipe dream unless technological advances can find a solution.
(The author is an astrophysicist)