A major issue facing ITER, the international tokamak under construction in France that will be the first magnetic fusion device to produce net energy, is whether the crucial divertor plates that will exhaust waste heat from the device can withstand the high heat flux, or load, that will strike them. Alarming projections extrapolated from existing tokamaks suggest that the heat flux could be so narrow and concentrated as to damage the tungsten divertor plates in the seven-story, 23,000-ton tokamak and require frequent and costly repairs. This flux could be comparable to the heat load experienced by spacecraft re-entering Earth's atmosphere.
New findings of an international team led by physicist C.S. Chang of the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) paint a more positive picture. Results of the collaboration, which has spent two years simulating the heat flux, indicate that the width could be well within the capacity of the divertor plates to tolerate.
Good news for ITER
"This could be very good news for ITER," Chang said of the findings, published in August in the journal Nuclear Fusion. "This indicates that ITER can produce 10 times more power than it consumes, as planned, without damaging the divertor plates prematurely."
ITER spokesperson Laban Coblentz said the simulations were of great interest and highly relevant to the ITER project. He said ITER would be keen to see experimental benchmarking, performed for example by the Joint European Torus (JET) at the Culham Centre for Fusion Energy in the United Kingdom, to strengthen confidence in the simulation results.
Chang's team used the highly sophisticated XGC1 plasma turbulence computer simulation code developed at PPPL to create the new estimate. The simulation projected a width of 6 millimeters for the heat flux in ITER when measured in a standardized way among tokamaks, far greater than the less-than-1-millimeter width projected from experimental data.
The narrow-width projections were derived from experimental data by researchers at major facilities worldwide. In the United States, these tokamaks were the National Spherical Torus Experiment before its upgrade at PPPL; the Alcator C-Mod facility at MIT, which ceased operations at the end of 2016; and the DIII-D National Fusion Facility that General Atomics operates for the DOE in San Diego.
Widely different conditions
The discrepancy between the experimental projections and simulation predictions, said Chang, stems from the fact that conditions inside ITER will be too different from those in existing tokamaks for the empirical predictions to be valid. Key differences include the behavior of plasma particles within today's machines compared with the expected behavior of particles in ITER. For example, while ions contribute significantly to the heat width in the three U.S. machines, turbulent electrons will play a greater role in ITER, rendering extrapolations unreliable.
Chang's team derived its wider prediction from basic physics principles, rather than from empirical projections based on data from existing machines. The team first tested whether the code could predict the heat flux width produced in experiments on the U.S. tokamaks, and found the predictions to be valid.
Researchers then used the code to project the width of the heat flux in an estimated model of the ITER edge plasma. The simulation predicted a greater heat-flux width, one that the current ITER design should be able to sustain.
Supercomputers enabled simulation
Supercomputers made this simulation possible. Validating the code on the existing tokamaks and producing the findings took some 300 million core hours on Titan and Cori, two of the most powerful U.S. supercomputers, housed at the DOE's Oak Ridge Leadership Computing Facility and the National Energy Research Scientific Computing Center, respectively. A core hour is one processor, or core, running for one hour.
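To give a sense of that scale, the core-hour arithmetic can be sketched in a few lines. This is a back-of-the-envelope illustration only: the 300 million core-hour figure comes from the article, but the number of cores running simultaneously is a hypothetical assumption, not a detail reported by the team.

```python
# Back-of-the-envelope conversion of a core-hour budget to wall-clock time.
# TOTAL_CORE_HOURS is the figure reported in the article; the concurrent
# core count below is a hypothetical assumption for illustration.

TOTAL_CORE_HOURS = 300_000_000  # reported compute budget for the project


def wall_clock_hours(core_hours: float, concurrent_cores: int) -> float:
    """Wall-clock hours needed if `concurrent_cores` cores run in parallel.

    One core hour = one core running for one hour, so dividing the total
    budget by the number of cores working at once gives elapsed time.
    """
    return core_hours / concurrent_cores


# Hypothetical example: if 100,000 cores ran at once, the full budget
# would correspond to 3,000 hours, or about 125 days of continuous running.
hours = wall_clock_hours(TOTAL_CORE_HOURS, 100_000)
print(f"{hours:.0f} hours (~{hours / 24:.0f} days)")
```

In practice the work was spread across many separate runs on Titan and Cori rather than one continuous job, so the elapsed calendar time differs from this single-job estimate.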