Instead, De’s team created a small simulated population of digital representations of people, called agents, that would move through a virtual nuclear reactor. Agents established a pattern of life that included finding their workstations, visiting the lunchroom, and walking through the building. Within a few days of real-world time, a simulated population could amass a dataset covering years, even decades, of behavior.
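To illustrate the agent-based approach described above, here is a minimal Python sketch, not ORNL's actual model: toy agents repeat a daily routine with occasional random wandering, so a short run of compute can log years of simulated behavior. All location names and parameters are invented for the example.

```python
import random

# Hypothetical locations an agent visits; names are illustrative only.
LOCATIONS = ["entrance", "workstation", "lunchroom", "control_room", "exit"]

class Agent:
    """A digital stand-in for one worker with a simple daily routine."""

    def __init__(self, agent_id, rng):
        self.agent_id = agent_id
        self.rng = rng
        # A fixed routine establishes the agent's pattern of life.
        self.routine = ["entrance", "workstation", "lunchroom", "workstation", "exit"]

    def simulate_day(self, day):
        """Log (agent, day, step, location) records for one simulated day."""
        log = []
        for step, place in enumerate(self.routine):
            log.append((self.agent_id, day, step, place))
            # Small chance of an extra wander through the building.
            if self.rng.random() < 0.2:
                log.append((self.agent_id, day, step, self.rng.choice(LOCATIONS)))
        return log

def simulate_population(num_agents, num_days, seed=0):
    """Accumulate behavior records for a whole population over many days."""
    rng = random.Random(seed)
    agents = [Agent(i, rng) for i in range(num_agents)]
    records = []
    for day in range(num_days):
        for agent in agents:
            records.extend(agent.simulate_day(day))
    return records

# Two simulated years of behavior for five agents.
records = simulate_population(num_agents=5, num_days=365 * 2)
```

Because the simulation clock is decoupled from wall-clock time, scaling `num_days` up is cheap, which is what lets a few days of real-world computation stand in for decades of observed behavior.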
The researchers then created a virtual simulation of a nuclear reactor facility. Since floor plans of commercial nuclear power reactors are not publicly available, De’s team created a digital twin of ORNL’s High Flux Isotope Reactor, or HFIR, a DOE Office of Science user facility. The simulated population data was merged with the HFIR digital twin to create a virtual reality simulation of people working inside a nuclear reactor.
Using VR goggles, a user maneuvers through the virtual HFIR, leaving a trackable trail of activities that establishes a pattern of life. The research team gathered data about the user’s behavior, looking to see whether the person followed the rules or deviated from the task. The final dataset is used to train a machine learning algorithm to detect anomalous human behavior, identifying threats to reactor operations before a situation unfolds.
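The detection step described above, learning what normal movement looks like and then flagging deviations from it, can be sketched with a toy model. The following is not ORNL's algorithm; it is a simple transition-frequency approach in Python, with all location names invented for the example: trails that follow the learned routine score low, while trails full of rarely seen transitions score high.

```python
import math
from collections import Counter

def train_transition_model(trails):
    """Estimate (from, to) location transition probabilities from normal trails."""
    counts = Counter()
    total = 0
    for trail in trails:
        for here, there in zip(trail, trail[1:]):
            counts[(here, there)] += 1
            total += 1
    return {pair: n / total for pair, n in counts.items()}

def anomaly_score(model, trail, floor=1e-6):
    """Average negative log-likelihood of a trail; higher means more anomalous."""
    nll = 0.0
    for here, there in zip(trail, trail[1:]):
        # Unseen transitions get a tiny floor probability, so they score high.
        nll += -math.log(model.get((here, there), floor))
    return nll / max(len(trail) - 1, 1)

# Train on repeated copies of a typical work-day trail (illustrative only).
normal_trail = ["entrance", "workstation", "lunchroom", "workstation", "exit"]
model = train_transition_model([normal_trail] * 100)

# A trail that skips the routine and heads straight for the control room
# deviates from the established pattern of life and scores much higher.
deviant_trail = ["entrance", "control_room", "exit"]
```

A production system would use far richer features (timing, dwell times, sensor context) and a trained classifier, but the principle is the same: the simulated years of normal behavior define a baseline, and departures from that baseline raise an alert.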
“We gathered human movement data for non-playable characters (NPCs) under sensor deployment restrictions at HFIR and built an agent-based model based on anecdotal evidence gathered during tours and discussions with facility managers,” said Gunaratne, an ORNL artificial intelligence research scientist. Gunaratne brought to the project his experience modeling human movement in urban spaces, working through different scenarios to address crowd control challenges.
An end user wearing the VR headset sees the facility and equipment around them, NPCs walking around, and a clipboard in their hand with a checklist of tasks similar to those a worker would perform throughout the day. Gunaratne said the user proceeds through the tasks and responds to pop-up instructions based on where they are in the simulation.
A part of this work recently won the Best Demo Paper award at the 25th IEEE International Conference on Mobile Data Management (MDM 2024) in Brussels, Belgium. Additional HFIR simulation findings, to be published in the Proceedings of the 2024 Interservice/Industry Training, Simulation and Education Conference and the Proceedings of the 2024 Winter Simulation Conference, describe how non-player characters create a realistic picture that makes the VR feel real.
“In this situation with HFIR, we could try different what-if scenarios based on cost and safety risk, such as insider threat situations, to find out what shifts in normal behavior are exhibited during a non-typical event,” Gunaratne said. “We could even observe whether the cascading effects of one agent’s anomalous behavior impact other agents.”
The team said this simulation can also be used to train emergency response personnel on what-if scenarios in restricted or secure campuses.
This project is funded by ORNL’s Laboratory Directed Research & Development Program.
UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science. — Liz Neunsinger
This Oak Ridge National Laboratory news article "Virtual reality gives new vision to nuclear reactor security" was originally found on https://www.ornl.gov/news