Written by: Sarah Pinkelman
“Instead of blowing up rockets before finishing a design, what if we can do it on a computer?” asks professor Karthik Duraisamy. As director of the Air Force Center of Excellence (CoE), Duraisamy and his team, which spans five universities, are doing just that. Working within the Center for Data-Driven Computational Physics at the University of Michigan Aerospace Engineering program, the CoE has reached year five of a six-year project in partnership with the Air Force. Thus far, the CoE has produced significant innovations in computational simulation techniques, enabling engineers to produce complex and reliable designs. Meanwhile, these methods save copious amounts of time and money that might otherwise be spent running complex simulations on supercomputers.
Ramakanth Munipalli, a senior aerospace research engineer with the Air Force Research Laboratory, says that rocket engine development traditionally involves costly variables, high-stakes safety factors, and decades of effort: “Previous generations of rocket designers did not have access to realistic computer-based simulations of key physical challenges such as combustion instability. . . . Design insights were derived from arduous and expensive testing . . . [and] simple models were developed for areas where a high risk element was perceived.”
The effects of fuel combustion and oscillation within the confined space of a rocket’s combustion chamber typically stretch design time frames beyond a decade. F-1 liquid rocket engines burn about 5,000 gallons of fuel per second and generate about 10 million pounds of thrust. “No matter how well you design your engine, there is always going to be some [oscillation]. You just want to keep it manageable,” adds Duraisamy. The CoE has advanced Reduced Order Models (ROMs) to match high-fidelity simulations at the required accuracy levels in roughly a thousandth of the time, producing far more capable digital twins.
Munipalli notes that while “state of the art computational tools are able to compute rocket combustor physics at a high degree of fidelity . . . their cost makes them prohibitive for routine usage and one is limited to modeling smaller sub-systems . . . Reduced order modeling tools that the COE is developing, provide perhaps the only way out of this tight corner. If we can train reduced order models on simpler problems using advanced simulation tools, we could network them to form a ‘system of systems.’”
Applicable to Multiple Systems
Furthermore, such methods can be applied to other complex systems, from predicting storm progressions to improving power grids to enabling load following in nuclear power plants. The CoE continues to develop ROMs robust enough for efficient design of existing systems and for digital twin simulations to run in real time. Among other useful applications, this means repairs can be proactive: engineers can anticipate the need to replace a certain panel or gauge before it degrades.
Ultimately, these tools for creating better ROMs and digital twins will be transferable to interested parties. The CoE, for example, will transition these tools to the Air Force so they can design their rockets more efficiently, but similar tools can also be transferred to systems at other levels of complexity. Munipalli adds that “if we can figure out how to make such a composite model work effectively, we can use it to study system performance, its behavior in a range of conditions and understand how to optimize and control it. This leads directly to the idea of a digital twin.”
Defining Leaps in Research
Duraisamy and his team describe four significant leaps in research thus far: in accuracy, in stability, in scalability, and in the ability to bring all the parts together into a new paradigm for simulating systems too expensive for high-fidelity simulation. These new techniques “can speed up the simulation by a factor of 1000,” says Duraisamy, while also improving accuracy and stability and shrinking the hardware requirement from a supercomputer to a typical engineer’s workstation.
A reduced order model takes very complex simulations and data and reduces them to simpler forms, which in turn decreases the computing power and time required. Complex systems, such as rocket engines, are typically modeled using differential equations that have many unknowns. For instance, high fidelity simulations of a rocket combustor may involve solving for billions of unknowns. ROMs reduce this complexity by finding a set of equations with far fewer unknowns—say a few hundred instead of a billion. However, ROMs traditionally are not very adaptable to other situations and problems; the CoE has combined the best of data- and model-driven ROMs to change that. Duraisamy and collaborators introduced a hardware and software ecosystem called ConFlux, which uses this mix of physics simulation and big data modeling to make digital twins—digital counterparts of physical systems that have taken the technology world by storm in the last five years.
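The projection idea behind such models can be illustrated with proper orthogonal decomposition (POD), one standard way of building ROMs. The NumPy sketch below is illustrative only, not the CoE's code: it collects snapshots from a "full" 500-unknown linear simulation, extracts a 10-dimensional basis with an SVD, and then evolves 10 unknowns instead of 500. All dimensions and operators are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-order linear system dx/dt = A x with n unknowns (hypothetical).
n, r = 500, 10                       # full and reduced dimensions
A = -np.eye(n) + 0.01 * rng.standard_normal((n, n))

# "Expensive" full simulation (forward Euler) generates training snapshots.
dt, steps = 1e-3, 200
x = rng.standard_normal(n)
snapshots = []
for _ in range(steps):
    x = x + dt * (A @ x)
    snapshots.append(x.copy())
X = np.column_stack(snapshots)       # n x steps snapshot matrix

# POD: the leading left singular vectors of X form the reduced basis.
U, s, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                         # n x r basis

# Galerkin projection gives an r x r reduced operator.
A_r = V.T @ A @ V

# Reduced simulation: evolve r unknowns instead of n.
q = V.T @ snapshots[0]
for _ in range(steps - 1):
    q = q + dt * (A_r @ q)
x_rom = V @ q                        # lift back to the full space

err = np.linalg.norm(x_rom - snapshots[-1]) / np.linalg.norm(snapshots[-1])
print(f"relative error of ROM reconstruction: {err:.2e}")
```

For a linear system like this the reduced trajectory stays very close to the full one; the hard problems the CoE tackles are nonlinear and reactive, where plain POD-Galerkin often fails, which is exactly why their blend of data- and model-driven techniques matters.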
How do they get there? Duraisamy and his team start with component-wise modeling. They break down the complex system (e.g., a rocket engine) into small components (e.g., injectors in the rocket engine) for which they develop ROMs. Then they integrate the component ROMs to model the complex system. “When we develop the component models . . . they are trained using data from an expensive simulation for a small component.” They are able to do this through an important synthesis of tools: “Our novelty here is we combine physical models, machine learning and mathematics in a unique way.”
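A toy version of the component-wise idea, assuming nothing about the CoE's actual framework: split a 1D heat equation into two halves, train a small POD basis on each half's data separately, then couple the two component ROMs through the interface terms of the full operator. All parameters are illustrative.

```python
import numpy as np

# Full system: 1D heat equation, dx/dt = A x, on n interior grid points.
n = 100
nu = 1e-4                                  # diffusivity (illustrative)
dx = 1.0 / (n + 1)
A = (np.eye(n, k=1) - 2 * np.eye(n) + np.eye(n, k=-1)) * nu / dx**2

# "Expensive" full-order simulation generates training snapshots.
s = np.arange(1, n + 1) * dx
x = np.sin(np.pi * s) + 0.3 * np.sin(3 * np.pi * s)
dt, steps = 1e-3, 300
snaps = []
for _ in range(steps):
    x = x + dt * (A @ x)
    snaps.append(x.copy())
X = np.column_stack(snaps)

# Component-wise training: one small POD basis per half of the domain,
# mimicking per-component ROMs trained on component-level data.
half, r = n // 2, 6
Vl = np.linalg.svd(X[:half], full_matrices=False)[0][:, :r]
Vr = np.linalg.svd(X[half:], full_matrices=False)[0][:, :r]

# Integrate the component ROMs: a block-diagonal basis projects the full
# operator, and the interface entries of A couple the two reduced models.
V = np.block([[Vl, np.zeros((half, r))],
              [np.zeros((n - half, r)), Vr]])
A_c = V.T @ A @ V                          # 2r x 2r coupled operator

q = V.T @ snaps[0]
for _ in range(steps - 1):
    q = q + dt * (A_c @ q)

err = np.linalg.norm(V @ q - snaps[-1]) / np.linalg.norm(snaps[-1])
print(f"coupled component-ROM relative error: {err:.2e}")
```

The key design point is that each component basis is trained independently, yet the assembled reduced operator still carries the physical coupling between components, which is the "system of systems" idea Munipalli describes.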
Researchers Cheng Huang and Chris Wentland are part of that process. In a study that appeared in the Journal of Computational Physics, they present how they switched the variable groupings to dramatically improve accuracy: “It’s one of the breakthroughs we had,” says Wentland. “Chuck Merkle from Purdue introduced this idea originally in the context of high-fidelity simulations… Normally computational fluid dynamics (CFD) folks like to work with conservative variables—density, momentum, total energy. . . . Engineers like to work with things like pressure, temperature, and velocity. . . . These ROMs now target these engineering quantities that are the primitive variables. . . . ROMs that target those primitive variables are way more stable, accurate, and provide greater cost reduction.” This means they can eliminate problems like spurious burning, “enabling the ROMs to provide accurate representations of the heat release rate and flame propagation speed,” write Huang et al. in the study’s abstract.
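The distinction between the two variable sets is the standard one from compressible flow. As an illustration (not the study's code), here is the textbook ideal-gas map from conservative variables (density, momentum, total energy) to the primitive variables (velocity, pressure, temperature) that these ROMs evolve directly; the gas constants are usual values for air and the state values are hypothetical.

```python
# Ideal-gas constants for air; illustrative values only.
gamma = 1.4          # ratio of specific heats
R = 287.0            # specific gas constant, J/(kg*K)

def conservative_to_primitive(rho, rho_u, rho_E):
    """Map a 1D conservative state (density, momentum per volume, total
    energy per volume) to the primitive variables (velocity, pressure,
    temperature) that primitive-variable ROMs target."""
    u = rho_u / rho                          # velocity
    e = rho_E / rho - 0.5 * u**2             # specific internal energy
    p = (gamma - 1.0) * rho * e              # ideal-gas pressure
    T = p / (rho * R)                        # ideal-gas temperature
    return u, p, T

# Hypothetical state: rho = 1.2 kg/m^3, u = 100 m/s, p = 101325 Pa.
rho = 1.2
u_true, p_true = 100.0, 101325.0
rho_u = rho * u_true
rho_E = p_true / (gamma - 1.0) + 0.5 * rho * u_true**2

u, p, T = conservative_to_primitive(rho, rho_u, rho_E)
print(u, p, T)   # recovers the velocity and pressure used above
```

Both descriptions carry the same information; the point of the breakthrough Wentland describes is that projecting and truncating in the primitive variables behaves far better numerically for reacting flows.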
The digital twins integrate the data from the ROMs, learn and adjust, and make real-time maintenance and repairs possible. Huang says that their models “have great potential with digital twins. . . . We are definitely pioneers—I haven’t seen other people do this. You can interact with the rocket engine to fix itself, report issues, and during long-term space exploration, you may find your engine has some issues because it’s unpredictable in space. . . . You can ask the digital twin to make adjustments. This is our long-term goal.”
Researcher Elnaz Rezaian works with a data-driven version of balanced truncation called the eigensystem realization algorithm (ERA). Because it doesn’t use the operators of the high-fidelity system, it still functions under stiff conditions—when a problem is difficult to handle numerically. The CoE is applying her method to a simple reactive flow problem. “Next,” she says, “we will apply the method to higher dimensions and extend the method to non-linear systems. Ultimately we want to apply the method to rocket combustors.” She adds that “our application is pretty unique. . . . We have moved further by investigating the predictive performance of ROMs, which means their performance beyond the training regimes. . . . Our specific application is rocket combustor, but these methods can be applied to the most fundamental issues that ROMs have been facing through the past decades.”
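ERA itself is a classical system-identification algorithm, and a minimal single-input, single-output sketch shows why it needs no access to the full-order operators: it builds a small state-space model entirely from impulse-response data via an SVD of a Hankel matrix. The 50-state system below is hypothetical and linear; Rezaian's work extends these ideas toward stiff, reactive, and nonlinear problems.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 50-state discrete-time system whose input-output behavior
# is governed by only 6 controllable/observable modes.
n, r = 50, 6
A = np.diag(rng.uniform(0.3, 0.9, n))
B = np.zeros(n)
B[:r] = rng.standard_normal(r)
C = np.zeros(n)
C[:r] = rng.standard_normal(r)

# The impulse response (Markov parameters C A^k B) is the only "data"
# ERA sees -- no access to A, B, C themselves.
markov, x = [], B.copy()
for _ in range(41):
    markov.append(C @ x)
    x = A @ x

# ERA: SVD of the Hankel matrix of Markov parameters yields a reduced
# realization (Ar, Br, Cr).
m = (len(markov) - 1) // 2
H  = np.array([[markov[i + j]     for j in range(m)] for i in range(m)])
H2 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
U, s, Vt = np.linalg.svd(H)
Sr = np.diag(np.sqrt(s[:r]))
Ur, Vr = U[:, :r], Vt[:r, :].T
Ar = np.linalg.inv(Sr) @ Ur.T @ H2 @ Vr @ np.linalg.inv(Sr)
Br = (Sr @ Vr.T)[:, 0]
Cr = (Ur @ Sr)[0, :]

# The 6-state ROM reproduces the impulse response of the 50-state system.
x_r, markov_rom = Br.copy(), []
for _ in range(41):
    markov_rom.append(Cr @ x_r)
    x_r = Ar @ x_r

err = max(abs(a - b) for a, b in zip(markov, markov_rom))
print("max impulse-response error:", err)
```

Because the reduced model is assembled purely from measured (or simulated) responses, the same recipe applies even when the full-order operators are too stiff or too large to manipulate directly, which is the property Rezaian exploits.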
Then, along with accuracy, stability, and scalability, as described in a newly published study, Huang has determined how to replicate and connect the models into one full-system model. As Huang et al. write in the abstract, the “trained ROMs are then coupled and integrated into the framework to model the full large-scale system.” This new paradigm brings the models into concert such that “the framework is demonstrated to accurately predict local pressure oscillations, time-averaged and RMS fields of target state variables, even with geometric changes,” bringing the center even closer to its goals.
Meanwhile, for most of the applications, Chris Wentland and fellow PhD student Nicholas Arnold-Medabalimi, who works in the Computational Aerosciences Lab, have spent years making sure everything is scalable. Arnold-Medabalimi is not directly in the CoE, but he and the center exchange information to the benefit of both. That exchange is one of the keys to the CoE’s success. Huang says that “we have a collaborative pattern that’s crucial to our success. There aren’t a lot of opportunities to have rocket engineers and scientists working with computational science experts and mathematicians. This center brings together a group of experts from totally diverse and different backgrounds [and] it has worked out great.”
Duraisamy adds that “collaboration with experts, such as Profs. Karen Willcox (UT Austin), Benjamin Peherstorfer (NYU), William Anderson (Purdue), and their research groups, has been critical to the success of the center. The Air Force has been very supportive of our work, and has given us a guiding hand throughout the duration of the project.” In addition to achieving their research goals, Duraisamy says of his team, “good things are happening. Students are graduating and establishing their careers.”