All-in-one simulation tool enables faster, smarter fusion reactor design

The road ahead

FREDA is currently one year into an ongoing five-year project, with much of 2024 spent tackling two of the biggest obstacles to creating a unified modeling framework: integration and uncertainty quantification.

Coupling plasma and engineering simulations is no easy feat, as these individual modeling tools were developed over the course of decades by different teams at institutions all over the world. There are differences in complexity, fidelity and physics across models, like pieces cobbled together from different puzzles. Rick Archibald, leader of ORNL’s Data Analysis and Machine Learning group and data analytics lead for the FASTMath Institute, is one of the computational scientists on the FREDA project responsible for making sure these pieces fit together and the models are consistent across the framework.

When a model-builder wants to examine the activity on a surface or in a volume of plasma, they must generate a mesh, a grid of shapes – thousands to millions of squares, triangles or hexagons – that conforms to the geometry of the subject, and then simulate the physics that occur within each of those shapes. There are always trade-offs when constructing a mesh, though: a finer mesh conforms more accurately to the geometry and provides more granularity, but it is more complex and requires more time and computational power to run.

“The simplest way to generate a mesh would just be to make it a uniform mesh and make it as dense as possible, but that would take a lot of computer time,” Archibald said. “You want to be a little more clever than that, so you can design your mesh around some properties you know about your simulation, so you have better resolution in areas where you want it and a little bit less resolution in places you don’t.”
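Archibald's point can be sketched in a few lines. The toy below is not FREDA code; the function name and the growth factor are illustrative. It builds a one-dimensional mesh whose cell widths grow geometrically, packing resolution near one end instead of spending it uniformly:

```python
import numpy as np

def graded_mesh(n_cells, length=1.0, growth=1.2):
    """Return n_cells+1 node positions on [0, length], with the
    smallest cells near x = 0, where extra resolution is wanted."""
    widths = growth ** np.arange(n_cells)   # each cell 20% wider than the last
    widths *= length / widths.sum()         # rescale so the cells fill [0, length]
    return np.concatenate(([0.0], np.cumsum(widths)))

# Same span, same cell count -- but the graded mesh concentrates its
# resolution near x = 0, while the uniform mesh spreads it evenly.
uniform = np.linspace(0.0, 1.0, 21)
graded = graded_mesh(20)
```

A real mesh generator grades resolution in two or three dimensions and around arbitrary geometry, but the trade-off it manages is the same one shown here.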

This problem is compounded when you try to combine the models and merge their meshes, as many of these models were not developed to be combined with models of different resolutions. Typically, these efforts live in silos: a plasma expert designs a model assuming they know how a wall is going to behave, while a materials scientist modeling a wall assumes how the plasma will behave, Archibald said.

“The trick here is what happens when you run them together and replace those assumptions with the actual behaviors,” he added. “All of a sudden, those nice solid assumptions get mixed up, and the errors in one infect the errors in the other, and you get all sorts of problems. When you talk about building a whole device, connecting multiple components together and having them all talk to each other, that little problem gets magnified a lot and becomes the key focus of this project.”
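What "running them together" means in practice can be sketched with a fixed-point (Picard) coupling, where each model's output replaces the other's fixed assumption until the pair agrees. Both toy models and their coefficients below are invented for illustration; they are not FREDA's physics:

```python
def plasma_model(wall_temp):
    """Toy stand-in: heat flux (MW/m^2) from the plasma, which
    drops slightly as the wall heats up.  Coefficients are made up."""
    return 10.0 - 0.004 * wall_temp

def wall_model(heat_flux):
    """Toy stand-in: equilibrium wall temperature (K) for a given flux."""
    return 300.0 + 60.0 * heat_flux

def couple(tol=1e-6, max_iters=100):
    """Iterate the two models, feeding each one's output to the other,
    until the coupled pair stops changing (a self-consistent state)."""
    wall_temp = 300.0                      # initial guess -- the "assumption"
    for _ in range(max_iters):
        flux = plasma_model(wall_temp)
        new_temp = wall_model(flux)
        if abs(new_temp - wall_temp) < tol:
            return flux, new_temp
        wall_temp = new_temp
    raise RuntimeError("coupling did not converge")
```

In this toy case the iteration settles quickly because the feedback is weak; with strong feedback between components, convergence is not guaranteed, which is exactly the coupling problem Archibald describes.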

Grid generation is also a time-intensive process that takes a lot of manpower and expertise in both modeling and physics to perform. FREDA will accelerate and automate the entire process using high-performance computing, machine learning and the wealth of resources provided by decades of fusion modeling research. The tool will use machine learning methods to analyze the meshes that have already been made for different reactor designs and generate new meshes, optimized around a desired design point, without human intervention.

“Machine learning is a really fast-paced field, and the new tools that are coming out of that field have helped us do things that we may not have been able to do in the past,” Archibald said. “Machine learning methods will look at what scientists in the community have generated in these situations and pull them together so that everything you need to run a simulation can be generated and given to you automatically with no person in the loop.”

The other main hurdle to clear is uncertainty quantification. There are many sources of error and uncertainty, depending on the model. Without a full fusion device to validate against, a model can’t fully reflect reality, but if you are able to quantify just how much your model is off by, you can design with a certain degree of confidence.

“There are a lot of assumptions we make within the analysis, but if you want to build a device, you have to have a starting point,” Borowiec said. “Even though the results you get might be uncertain, you can still rely on them more than just hoping for the best.”

When modeling a design space, you want to avoid the “cliff’s edge” of optimizing a device as far as possible, because if one of the parameters is off, it will no longer work. Instead, you want to use “design under uncertainty,” Borowiec said, in which the uncertainties of a model, once quantified, are incorporated into a less optimized but safer design. That way, even if the assumptions are off by 5 to 10%, the device will still work.

“If you have uncertainty in the heat fluxes on the first wall of your device and the melting point of that material is 1,000 Kelvin, you don’t want to design for that temperature because the heat may exceed that,” she said. “Instead, you design it for 800 Kelvin so you have some wiggle room, and it will still be okay if the actual temperature exceeds the one you had planned.”
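Borowiec's margin argument amounts to a small Monte Carlo calculation. The sketch below is illustrative only: the ±10% Gaussian spread is an assumed uncertainty, not a FREDA result. It estimates how often the wall exceeds its 1,000 Kelvin limit for two design temperatures:

```python
import random

def exceedance_probability(design_temp, limit=1000.0, n=100_000, seed=1):
    """Estimate the chance the realized wall temperature exceeds the
    material limit, assuming a +/-10% (one-sigma) Gaussian uncertainty
    in the heating.  All numbers are illustrative, not FREDA output."""
    rng = random.Random(seed)
    hits = sum(rng.gauss(design_temp, 0.10 * design_temp) > limit
               for _ in range(n))
    return hits / n

# Designing right at the 1,000 K limit fails about half the time;
# backing off to 800 K leaves margin, so exceedances become rare.
risky = exceedance_probability(1000.0)
safe = exceedance_probability(800.0)
```

The "wiggle room" in the quote is exactly this: moving the design point far enough below the limit that the uncertainty distribution rarely crosses it.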

The uncertainty quantification process is ongoing, and as new fusion devices and test stands are built, the experimental data produced will help further improve the models and help FREDA be as accurate and reliable as possible. 

“It takes a village.” 

Given the scale of what FREDA is trying to accomplish, it comes as no surprise that the team behind it is equally large and diverse. Researchers from ORNL’s Fusion Energy, Nuclear Energy and Fuel Cycle, and Computer Science and Mathematics divisions are all contributors on the team, as well as personnel from Lawrence Livermore National Laboratory, General Atomics, Sandia National Laboratories and the University of California San Diego. 

ORNL is especially well suited to lead such an endeavor, though, given its breadth of expertise and unique capabilities in fusion, its strong history in fission research, its HPC capabilities and its advanced materials program. The cross-cutting nature of the lab was one of the reasons Collins came to ORNL in the first place.

“It really takes a village. The nature of this is much different than a lot of other projects that are more focused on just the plasma or subcomponents,” she said. “It brings together these different communities of people, generates diverse thought and helps you to better focus on how you communicate your goals because we have all these tasks running in parallel.”

Park echoed the statement, noting the unusual nature of the collaboration and the mingling of the plasma and engineering modeling communities.

“Usually, the people in plasma modeling have the same background, but the engineering and computational sides come from lots of different disciplines, so developing the environment and combining the knowledge bases is a really important aspect,” he said. “I don’t think any other lab in the world can do this.”

A tool such as FREDA is also vital if the United States wants to accelerate the timelines that have been set to deliver fusion power and meet the growing demands for green energy. Given the time, expense and effort it takes to build a system from scratch, it is not feasible to build and iterate on new facilities until they work. Instead, FREDA aims to combine the hard work done by the community with the best tools available to rapidly design, iterate, and automate the creation of the next generation of fusion devices.

“This is an extremely important project because if we want to make progress, we have to work really quickly,” Borowiec said. “This software is attempting to do that, and if we can get everything to work, we will have a fully integrated and optimized design assessment tool ready for the public and private sectors to utilize.”   

FREDA is funded under the Department of Energy’s Scientific Discovery through Advanced Computing, or SciDAC, Fusion Energy Sciences Partnerships program. SciDAC partners all six Office of Science programs — Advanced Scientific Computing Research, Basic Energy Sciences, Biological and Environmental Research, Fusion Energy Sciences, High Energy Physics and Nuclear Physics — as well as the Office of Nuclear Energy to dramatically accelerate progress in scientific computing that delivers breakthrough scientific results.

The ORNL team includes Cami Collins, Rhea Barnett, Mark Cianciosa, Yashika Ghai, Ehab Hassan, JM Park, Phil Snyder and Gary Staebler from the Fusion Energy Division; Vittorio Badalassi, Jin Whan Bae, Kate Borowiec, Robert Lefebvre and Arpan Sircar from the Nuclear Energy and Fuel Cycle Division; and Rick Archibald, David Bernholdt, Wael Elwasif and Ana Gainaru from the Computer Science and Mathematics Division. Other institutional team members are Benjamin Dudson and Jerome Solberg from Lawrence Livermore National Laboratory, Jeff Candy and Orso Meneghini from General Atomics, Michael Eldred from Sandia National Laboratories, and Christopher Holland from the University of California San Diego.

UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.  –– Sean Simoneau
