Vincent Perrier (Nextflow Software CEO) discusses the future of Computational Fluid Dynamics

Computational Fluid Dynamics (CFD) simulation is typically performed using the industry-proven, body-fitted Finite-Volume (FV) method. Particle-based methods, and especially Smoothed Particle Hydrodynamics (SPH), have now reached industrial maturity. SPH can complement established CFD solutions where FV shows its limitations, either in its ability to simulate certain fluid flows correctly or in its lengthy computation times.
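
For readers less familiar with particle-based CFD, the short Python sketch below illustrates the core idea behind SPH: the fluid is represented by particles rather than a mesh, and field quantities such as density are estimated at each particle by summing smoothed contributions from its neighbors. This is a generic, textbook-style illustration only, not code from Nextflow's products; the cubic-spline kernel, the uniform particle mass, and the brute-force neighbor search are simplifications chosen for brevity.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard cubic-spline SPH smoothing kernel in 3D, compact support radius h."""
    q = r / h
    sigma = 8.0 / (np.pi * h**3)                  # 3D normalization constant
    w = np.zeros_like(q)
    near = q <= 0.5
    far = (q > 0.5) & (q <= 1.0)
    w[near] = 6.0 * (q[near]**3 - q[near]**2) + 1.0
    w[far] = 2.0 * (1.0 - q[far])**3
    return sigma * w

def sph_density(positions, mass, h):
    """Estimate density at every particle: rho_i = sum_j m_j * W(|x_i - x_j|, h).

    positions: (N, 3) array of particle coordinates
    mass:      scalar particle mass (uniform, for simplicity)
    h:         smoothing length (kernel support radius)
    """
    diffs = positions[:, None, :] - positions[None, :, :]   # pairwise vectors
    dists = np.linalg.norm(diffs, axis=-1)                  # pairwise distances
    return mass * cubic_spline_kernel(dists, h).sum(axis=1)

# Tiny usage example: a block of particles on a regular lattice.
if __name__ == "__main__":
    spacing = 0.1
    grid = np.arange(0.0, 1.0, spacing)
    positions = np.array([[x, y, z] for x in grid for y in grid for z in grid])
    rho0 = 1000.0                      # target density, e.g. water (kg/m^3)
    mass = rho0 * spacing**3           # particle mass consistent with that density
    rho = sph_density(positions, mass, h=2.0 * spacing)
    print("density range:", rho.min(), "-", rho.max())
```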

Vincent Perrier (Nextflow Software CEO) recently spoke with Brian Albright (the editorial director of Digital Engineering magazine) to discuss next-generation Computational Fluid Dynamics.

Discover his vision by listening to the podcast (the transcript is also available below).

Brian Albright

Hello, this is Brian Albright. Welcome to the DE 24/7 podcast.
I am here today with Vincent Perrier, CEO of Nextflow Software, and we are going to be discussing next-generation CFD. Welcome to the program, Vincent!

Vincent Perrier

Thank you, Brian. Thank you for hosting me and welcome to all the listeners to this podcast. 

Brian Albright

To kind of start this discussion about where CFD is going, could you talk a little bit about, you know, this idea that there is really more to it than just one kind of CFD? There is not one type of CFD that is suitable for every engineer or that is going to fit every scenario.

Vincent Perrier

Yes, absolutely!
Today, we can see that there is a variety of different methods, approaches, and tools for CFD. Of course, there are some dominant methods and tools. They are industry-proven, they have been used for decades, and the tools are really mature and really good at what they do.

But the problems get more and more complex, more and more specific, and the combinations of physics that we want to simulate also become more and more complex.
Sometimes, I would say, the general-purpose CFD solution that engineers use every day, the one that can address maybe 80% of their problems very well, will hit some limits. They will then need to look for an alternative CFD.

In that sense, we really think, at Nextflow Software, that there is no “one CFD fits all”. Depending on the problem at hand, there is certainly a best-suited CFD method out there, providing the right combination of characteristics to deliver the best results for what the engineer is studying.

Brian Albright

What should engineers really be looking for in this sort of Next-Gen CFD platform?

Vincent Perrier

Engineers need to find and use the CFD method that is best adapted to the problem at hand. Potentially, this may require using multiple methods at once, either with tightly coupled simulation, in some cases co-simulation, or in other cases loosely coupled simulation.

A CFD method is really the result of multiple choices of physical assumptions and numerical models. Today, we can see probably three main families of methods on the market.

  1. You have the well-known Finite-Volume (FV) family, within which you can distinguish subsets: the traditional body-fitted approach, and the newer immersed-boundary, octree-based Cartesian Finite-Volume.
  2. Then you have the family of so-called particle-based methods.
  3. Then the third family would be the Lattice-Boltzmann (LBM) methods.
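
To contrast with the particle-based SPH sketch shown earlier in this article, here is an equally minimal sketch of what a mesh-based Finite-Volume update looks like: cell-averaged values change only through fluxes exchanged across cell faces. The example applies a first-order upwind finite-volume scheme to 1D linear advection on a uniform periodic grid; it is a deliberately simplified teaching example (constant wave speed, periodic boundaries are assumptions), not a depiction of any commercial body-fitted or Cartesian FV solver.

```python
import numpy as np

def fv_upwind_advection(u, a, dx, dt, steps):
    """First-order upwind finite-volume update for du/dt + a * du/dx = 0.

    u:  array of cell-averaged values on a uniform, periodic 1D grid
    a:  constant advection speed (assumed positive here)
    dx: cell size, dt: time step (keep a*dt/dx <= 1 for stability)
    """
    u = u.copy()
    for _ in range(steps):
        flux = a * u                                # upwind flux at face i+1/2 is a*u_i for a > 0
        u -= (dt / dx) * (flux - np.roll(flux, 1))  # net flux in/out of each cell
    return u

# Usage: advect a square pulse once around a periodic domain.
if __name__ == "__main__":
    n, a = 200, 1.0
    dx = 1.0 / n
    dt = 0.5 * dx / a                               # CFL number 0.5
    x = (np.arange(n) + 0.5) * dx
    u0 = np.where((x > 0.25) & (x < 0.5), 1.0, 0.0)
    u1 = fv_upwind_advection(u0, a, dx, dt, steps=int(1.0 / (a * dt)))
    print("mass is conserved:", np.isclose(u0.sum(), u1.sum()))
```
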

Depending on various criteria, engineers can have a look at the strengths and weaknesses of each of those methods. Some criteria, just as examples, could be:

  • How easy is it to do the meshing and the preprocessing?
  • How long does it take to do the simulation setup with those methods?
  • What are the capabilities of those methods?
  • What if body motions or contact deformations are involved?
  • Do they need to simulate turbulent boundary layers?
  • Are there any multiphase or free-surface considerations that they need to pay attention to?
  • What are the complex physics involved: compressible flows, supersonic flows, thermal analysis, high-density flows?

There are a lot of criteria and some methods are better for some criteria than others.

Brian Albright

When we are talking about this Next-Gen platform, could you discuss the tradeoffs between accuracy or fidelity and computation time? How is that changing, given some of the advancements we have seen in compute platforms?

Vincent Perrier

When we look at next-generation CFD, I believe it really needs to provide the optimal ratio between accuracy and computation time, depending on the engineer’s objectives. 

As you know, CFD is more and more “shifting left” into the design cycle. This means that, while initially used at the detailed design stage, CFD is now also moving up the design cycle into concept or early exploration stages. Depending on where you are in that cycle, maybe the need for accuracy is not that important.

When you are at a detailed design stage, you want as much accuracy as possible. When you are early in the concept stage, maybe you just want to have a global overview. What really matters is to get:

  • the fastest computation possible
  • and just a relative comparison of designs, not absolute detailed values with regard to the physics calculation.

In theory, some methods could be expected to be more accurate than others, but it is really use-case dependent. Depending on the criteria I was explaining before, even if you try to use the method that is in theory the most accurate and that you think would work best, maybe the use-case will make this particular method not really applicable, and you will not get meaningful results.

There are also multiple ways of implementing a method and optimizing it for accuracy or computation time. You also want to have a look at the different numerical models, discretization choices, and other decisions made by the developer or vendor of the CFD tool, and how they support the simulation you want to achieve.

Vincent Perrier

Your question was also about computation platforms. Indeed, I think we see two very important trends in computation platforms. 

  1. First of all, GPUs are becoming more and more important. 
  2. A second trend could also be virtualization.

There is a really large choice of computation platforms available to users. I believe that CFD solutions need to adapt and support the best platforms that the customer has chosen for their particular design stage and their particular use-case.

For example, maybe the user wants to use laptops and workstations at earlier design stages, for running quick simulations that are as accurate as possible but with no hard requirement on accuracy, and maybe they want to benefit from the GPU accelerator on their laptop or workstation. That could be one possibility.

Maybe, for detailed simulations at later design stages, they want to use HPC clusters. In that case, depending on their strategic choice, maybe they have CPU clusters, maybe they have GPU clusters, maybe they have mixed CPU/GPU clusters.

So the goal for the CFD simulation is really to make the most of the computation platform and benefit from all the computation resources available to them.

The ultimate goal is always to reduce the computation time, because this is time that costs money. This is also time that delays design choices and, ultimately, the time-to-market for the product being designed.

Brian Albright

You mentioned that we are seeing simulation move further and further back into the design process. Can you talk a little bit about the role of automation in that type of environment? Where you are seeing more simulations, and they are being conducted by designers who are not necessarily experts in those particular types of simulation?

Vincent Perrier

That’s right! Today, simulation is here to answer the questions or solve the problems that design engineers have. For many years, design engineers have been relying on expert simulation analysis engineers:

  • to run simulations,
  • to be able to understand the physics at play,
  • and also to be able to analyze and understand the results that they observe.

I think, to some extent, that this will always be true in very detailed phases and later design stages. But, more and more, when shifting left and addressing simulation in early stages, you want simulation to be a kind of push-button experience, providing accurate results that the engineer can trust, very easily.

So today, if you look at where engineers spend time in simulation, they spend time in preprocessing. This means setting up their simulation, preparing all the files, all the parameters, all the scripts, and launching the simulation.

Then, obviously, there is the computation time and, as we talked about before, the goal is always to reduce it as much as possible.

Then, after that, engineers retrieve the computation results. They do all the post-processing and the analysis: first of all, retrieving and organizing the data, then presenting the data in a way that can be analyzed by a human, then doing some 3D rendering and analyzing the results, to be able to make informed decisions on the design and to answer the initial design question they had.

Vincent Perrier

When we look at the productivity of engineers using simulation tools, the goal for any provider of simulation solutions is to reduce all this time that engineers spend and to automate as many tasks as possible.

In preprocessing, this means that, ideally, the preprocessing time has to be as short as possible and, ultimately, as easy as a push-button. Engineers should be able to start from their CAD files and launch the simulation right away. All the CAD cleaning or CAD optimization phases, or any manual phase, should be automated with some advanced tools. So you do not need an expert who knows the ins and outs and the in-depth details of how the solver works in order to adjust the CAD or the mesh to the needs of the solver.

This has to be automated by advanced tools.

In the same way, if you have thousands of parameters to tune before you are able to launch your simulation, that's just too complex. Simulation parameters need to be limited to a few fine-tuning parameters. Ideally, you should rely on predefined settings, for identified use-cases if possible, to make the life of engineers easy.

And then, with regard to post-processing: sometimes post-processing also involves some computation, for rendering or just for processing the data. All of that should also be made automatic.

I think the key for making post-processing really effective is to have dedicated tools that address the application domain of the engineers. If design engineers are used:

  • to viewing some designs in a particular way,
  • or to looking at some particular figures,

then the post-processing tools should present the data in the way that design engineers are used to seeing it.

Brian Albright

Alright, well, thank you very much for your time today Vincent, I enjoyed talking to you. 

Vincent Perrier

Thank you, Brian, and thank you to everyone who was listening to me.