When simulating the physics of fluids, there’s a lot we need to consider. The SPH method we use (see our previous blogpost on the subject) takes care of simulating the actual movements and concentrations of the fluids on the mesoscale, but the forces pushing around our fluids on the mesoscale actually originate from what the molecules are doing — i.e. on the microscale. The microphysics we feed to the mesoscale simulation does not come from the SPH method itself, but rather from microscale methods that can account for the statistical mechanics of fluid mixtures.
Currently, the microscale is taken care of in RheoCube by using something called lattice fluid theory (LFT, an extension of the classic Flory-Huggins theory). LFT is a bit crude, as it basically assumes that molecules can only move around on a lattice, but it gets most trends right. Nevertheless, in order to get the quantitative information that we need for the mesoscale, we have to tune a few parameters. This is, of course, far from ideal; we would much rather start from a clean slate, build in the correct physics from the ground up, and extract the relevant numbers. This means introducing molecular dynamics (MD) simulations.
Molecular dynamics is the method whereby we put in the forces between molecules (which is itself pretty complicated), and then just let normal classical mechanics do the rest. It turns out that just letting molecules interact through Newton's laws can give us pretty good results (although occasionally we find that quantum corrections can have small effects). Pretty much anything we might want to know about how liquids behave on the microscale can come out of MD simulations.
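To make this concrete, here is a minimal sketch of what an MD step looks like, not RheoCube's actual implementation: two atoms interacting through a Lennard-Jones pair potential (a standard stand-in for intermolecular forces), integrated with the velocity Verlet scheme in reduced units with unit mass.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces between all atoms (reduced units)."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r)
            inv6 = (sigma**2 / d2) ** 3
            # F = 24*eps*(2/r^12 - 1/r^6) * r_vec / r^2 (repulsive at short range)
            f = 24 * eps * (2 * inv6**2 - inv6) / d2 * r
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet(pos, vel, dt=1e-3, steps=100):
    """Integrate Newton's equations of motion with velocity Verlet."""
    f = lj_forces(pos)
    for _ in range(steps):
        vel += 0.5 * dt * f   # half kick (unit mass)
        pos += dt * vel       # drift
        f = lj_forces(pos)
        vel += 0.5 * dt * f   # half kick
    return pos, vel

# Two atoms displaced slightly from the potential minimum oscillate about it.
pos = np.array([[0.0, 0.0, 0.0], [1.2, 0.0, 0.0]])
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
```

A real MD engine adds many refinements on top of this (thermostats, neighbour lists, periodic boundaries, far better force fields), but the core loop of "compute forces, apply Newton's laws, repeat" is exactly this.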
Can we find a way to only do a few MD calculations, but retain their accurate predictions over all the many points that we’ll need for the full mesoscale?
Molecular dynamics simulations, such as this snapshot of polymers and organic solvents, are an accurate way of modelling the complicated molecular physics on the microscale. They will become available in RheoCube in the near future.
However, we all know that nothing comes for free. MD may be a beautiful method for getting some impressively accurate results, but the calculations take a long time. We cannot really afford to have every single time step in an SPH simulation require hundreds of MD simulations, each of which will take multiple hours to run. We'd be waiting for centuries. But perhaps doing a simulation at every single possible point is overkill.
The answer involves a technique that is growing wildly in popularity: Gaussian process regression (GPR; see this link for a very nice, more technical introduction). Nowadays, GPR is commonly included in the broad class of methods bearing the label "machine learning", but the method has its roots in the earth sciences (where it is better known as kriging). GPR is an interpolation method, meaning that it predicts the value of a function between known points. Although many different methods exist for interpolating a function, GPR is one of the best for functions that depend on many parameters. In our case, the microscopic properties that we want to predict will depend on things like volume fractions, pressure and temperature, so we have the potential for many parameters, depending on the number of components we have in our fluid.
Gaussian process regression is based on random walks: each constraint we add narrows the range of values that are commonly sampled between the known points. The mean value of the constrained walks gives the GPR prediction.
As the name suggests, GPR is based on Gaussian processes, which are a type of random walk (a function or process where each new step is taken in a random direction; think of a small toddler stumbling around as they learn to walk). Interestingly, the maths behind these processes is very similar to that of diffusion, and belongs to a branch of mathematics known as stochastic calculus. The idea with GPR is that we add constraints to our random walks, so that every walk has to pass through certain points (our measurements from MD, in this case). We then ask the question: what is the probability of a random walk going through any given point between our measurements? Once we know the probability associated with every possible value, we get a picture of where our true function is most likely to be found. The so-called "expectation value", i.e. the value you expect to find based on the average of all the points weighted by their probability, gives us the best guess for our function! As a bonus, we also get an uncertainty, based on the standard deviation of the random walks we used to get our expectation value. If we find that this uncertainty is too large somewhere, we can always launch a new MD simulation to improve the predictions in the neighbourhood of that point, making the predictions a self-improving algorithm over time.
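The procedure above can be sketched in a few lines of NumPy. This is a bare-bones GPR with a squared-exponential covariance, and the "training data" here is just a toy function standing in for expensive MD results; the function and variable names are illustrative, not part of any real API.

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

def gpr_predict(x_train, y_train, x_query, noise=1e-8):
    """Posterior mean and standard deviation of a GP conditioned on the data."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_query, x_train)
    K_ss = rbf_kernel(x_query, x_query)
    mean = K_s @ np.linalg.solve(K, y_train)       # expectation value
    cov = K_ss - K_s @ np.linalg.solve(K, K_s.T)   # remaining uncertainty
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Pretend each training point is one expensive MD result (here: a toy function).
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = np.sin(x_train)
x_query = np.linspace(0.0, 3.0, 31)
mean, std = gpr_predict(x_train, y_train, x_query)

# The uncertainty vanishes at the training points and grows between them,
# so the location of maximum std tells us where to run the next simulation.
next_point = x_query[np.argmax(std)]
```

Note how the self-improving loop falls out for free: rerun the expensive calculation at `next_point`, append it to the training set, and the predictions tighten exactly where they were weakest.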
GPR is, of course, not perfect. One of the main limitations we will face is the need for a lot of data. This means several hundred MD simulations at least, which means we face several days of pre-processing. The less data we have, the less accurate our values will be. Perhaps a better way to do this would be to have a mathematical model that is based directly on the physics underlying the system. To do this, we would have to find equations that model the physics to a high accuracy for a huge range of parameters; as far as we know, these do not exist yet.
Eventually, we’re hoping to apply GPR to get out all the values needed by the SPH simulation in RheoCube, so that we can always use the most accurate MD predictions for any given point. However, there is a lot of development we need to do before that. The MD simulations that all this will rely on are already highly complex simulations, and will take some time to fully implement. The first versions of this approach in RheoCube will therefore be limited to predicting things like viscosity for a range of parameters.
Multiscale problems are amongst the most complicated problems out there, and are often far beyond the reach of any single simulation technique. Bringing in this type of simulation will therefore not only deliver highly accurate simulations to our clients, but also push the boundaries of the state of the art in computational fluid dynamics.