There are two ways to generate electricity from nuclear power. The first is nuclear fission, which is the primary source of power in the world today, accounting for around 10% of total global electricity production.
Then there’s the concept of fusion. Whereas fission splits atoms, fusion joins them. The fuel, deuterium and tritium (hydrogen-2 and hydrogen-3), is at most mildly radioactive, but the neutrons released by the fusion reaction irradiate the containment structures. The resulting waste is less radioactive and shorter-lived than fission waste, but it is still radioactive waste for those who are concerned.
The flagship fusion project on the globe, the ITER tokamak, has been in the works for decades and is set to begin full operation around 2040, when they’ll put in 50 MW of heating power and bring out 500 MW of fusion heat, a tenfold gain.
However, I recently read an article by Steven B. Krivit arguing that ITER will not generate more energy than is put in, a point he says ITER eventually confirmed to a news outlet.
Really? A project expected to cost between $18 and $45 billion is not meant to produce net energy? That didn’t seem plausible.
After some research, I found that ITER will need around 200 MW of electrical input to operate as a whole plant, while it generates 500 MW of heat. And because of the exergy of heat, converting that heat to electricity would yield only roughly 200 MW of electrical power.
So in effect it would merely break even: a machine that could accomplish nothing beyond keeping its own lights on, for as long as it was fed tritium at a cost of roughly $140 million each year.
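The arithmetic behind that conclusion can be sketched in a few lines. This is a back-of-envelope calculation using the figures quoted above; the 40% heat-to-electricity conversion efficiency is my own assumption (a typical steam-cycle figure), since ITER itself will never actually generate electricity.

```python
# Back-of-envelope ITER energy balance, using the numbers in the text.
heating_input_mw = 50          # plasma heating power in (thermal)
fusion_output_mw = 500         # fusion power out (thermal)
plant_electric_input_mw = 200  # electricity needed to run the whole plant
conversion_efficiency = 0.40   # assumed thermal-to-electric efficiency

# The advertised tenfold gain applies only to the plasma itself.
q_plasma = fusion_output_mw / heating_input_mw

# What a hypothetical turbine could extract from the 500 MW of heat.
electric_output_mw = fusion_output_mw * conversion_efficiency

# Net electricity after powering the plant: essentially zero.
net_mw = electric_output_mw - plant_electric_input_mw

print(f"Plasma gain Q: {q_plasma:.0f}")            # → 10
print(f"Hypothetical electric output: {electric_output_mw:.0f} MW")  # → 200 MW
print(f"Net electricity: {net_mw:.0f} MW")          # → 0 MW
```

The gap between "Q = 10" and "zero net electricity" is exactly the distinction Krivit draws: the tenfold gain compares fusion heat to plasma heating power, not plant output to plant input.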
Putting all of this together, it looks as though I will have to agree with Steven B. Krivit: fusion-powered electricity appears to be as far away as it has ever been.
References: New Energy Times, Forbes, CleanTechnica