Differential equations are used for modeling throughout the sciences, from astrophysical calculations to simulations of biochemical interactions. These models usually have to be simulated numerically because the resulting equations are too complex to solve analytically. However, numerically solving differential equations presents interesting software engineering challenges.
On one hand, speed is of utmost importance, and the standard answer is to implement the core solvers in a compiled language and expose them through scripting-language interfaces. While this strategy works, efficiency in this field is not just a matter of engineering practice. There is no perfect method across the whole range of differential equations: each type of differential equation has different qualities that need to be taken advantage of in order to have an efficient solver. There are hundreds of methods to choose from, many of which do not have a canonical implementation, or any implementation at all! Additionally, recent scientific models have been growing in realism, meaning that the ability to efficiently handle problems like stochastic differential equations, delay differential equations, and differential-algebraic equations is growing in importance.
Thus the ability to build software in a productive manner is also required, a task that is usually relegated to high-level scripting languages! This leaves three questions: (1) how do we give domain experts efficient, automated tooling without requiring them to become experts in numerical methods; (2) how do we give specialists full control over the details of the solvers; and (3) how do we gather the methods of the field so that usable and precise benchmarks can guide the theoretical development?
As both the efficiency and the complexity of the methods grow, it is important that we address (1) and allow people who specialize in scientific models to solve their problems without having to become experts in numerical methods and software development. This requires that new methods, like adaptivity for stochastic differential equations, proliferate without increasing the burden on users.
But at the same time, the computational physicists and engineers who specialize in solving difficult PDEs fall under (2) and need access to every little detail, down to the choice of Jacobian linear solver, across the vast field of methods. But which methods do you even develop or use? It was interesting to find out that the vast majority of methods have never been benchmarked against one another, nor is there an easy way to do so! This means that (3) is of utmost importance for guiding the field.
To tackle this problem, we developed a confederated package ecosystem in Julia which has become known as DifferentialEquations.jl. The architecture of the software is described here; it allows any developer to add dispatches to a common solve function. This is required since the researchers developing the methods need the research outputs to grow their own careers, but at the same time it allows us to make use of their newest methods in more complex software. By doing so, we strike the balance between the user and research communities that is required to keep iterating the software of the field.
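To make the architecture concrete, here is a minimal sketch of the dispatch pattern. The type and function names below are hypothetical stand-ins for illustration, not the actual DiffEqBase.jl internals:

```julia
# Hypothetical stand-ins illustrating the confederated dispatch pattern:
# one shared `solve` entry point that each solver package extends.
abstract type AbstractDEAlgorithm end

struct SimpleProblem{F,U,T}
    f::F               # du/dt = f(u, t)
    u0::U              # initial condition
    tspan::Tuple{T,T}  # integration interval
end

struct SimpleEuler <: AbstractDEAlgorithm end  # one package's algorithm type

# The solver package owns this method; users only ever call `solve`.
function solve(prob::SimpleProblem, ::SimpleEuler; dt = 0.01)
    t, tend = prob.tspan
    u = prob.u0
    while t < tend
        u += dt * prob.f(u, t)  # explicit Euler step
        t += dt
    end
    return u
end

# Swapping algorithms, even across packages, is just a different second argument.
solve(SimpleProblem((u, t) -> -u, 1.0, (0.0, 1.0)), SimpleEuler())  # ≈ exp(-1)
```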
At this point we can say this methodology has been successful. We have a large, active community, JuliaDiffEq, which is devoted to solving the user-facing and theoretical problems of the field through software development. The organization continually fields a large number of student researchers and Google Summer of Code developers.
We have covered areas which have traditionally not had adequate open-source software with high-performance implementations, such as symplectic integrators, exponential and IMEX integrators, and high-order adaptive Runge-Kutta methods for SDEs. These solvers then automatically compose with higher-level tools for tasks like parameter estimation and chemical reaction modeling. Now that we have a successful and growing software ecosystem, it makes sense to cast our nets as wide as possible and distribute our methods to all of the users who need them.
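For instance, the high-order adaptive SDE solvers are used through the same interface as everything else; a minimal sketch, where SOSRI is one of the adaptive strong order 1.5 stochastic Runge-Kutta methods from the StochasticDiffEq.jl docs:

```julia
using DifferentialEquations

f(u, p, t) = 1.01u  # drift term
g(u, p, t) = 0.87u  # diffusion term
prob = SDEProblem(f, g, 0.5, (0.0, 1.0))

# SOSRI is an adaptive, strong order 1.5 stochastic Runge-Kutta method
# for diagonal-noise SDEs; the time stepping adapts automatically.
sol = solve(prob, SOSRI())
```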
The software was developed in Julia because of its unique ability to meet both our efficiency and productivity demands. However, we recognize that we live in a poly-language world, and so in order to truly meet the goals of our organization we need to become multi-language compatible. To meet these demands, we have produced the diffeqpy and diffeqr packages for Python and R respectively.
Our focus was on addressing (1) and (2): giving domain experts high-efficiency automated tooling, and giving the bindings the flexibility required to be tuned for efficiency. We are not all the way there yet, as this is still very new, but we have made major headway.
Differential-algebraic equations are interesting because they allow constraints to be incorporated as part of the equation. The classic example is the Robertson equation:

$$
\begin{aligned}
\frac{dy_1}{dt} &= -0.04 y_1 + 10^4 y_2 y_3 \\
\frac{dy_2}{dt} &= 0.04 y_1 - 10^4 y_2 y_3 - 3 \times 10^7 y_2^2 \\
1 &= y_1 + y_2 + y_3
\end{aligned}
$$

In DifferentialEquations.jl, the algebraic constraint is expressed by writing the system in mass-matrix form.
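A sketch of how this looks, following the mass-matrix DAE usage documented for the stiff solvers (a zero row in the mass matrix turns the corresponding equation into an algebraic constraint):

```julia
using DifferentialEquations

function rober!(du, u, p, t)
    y1, y2, y3 = u
    du[1] = -0.04y1 + 1e4 * y2 * y3
    du[2] =  0.04y1 - 1e4 * y2 * y3 - 3e7 * y2^2
    du[3] =  y1 + y2 + y3 - 1.0  # algebraic constraint: y1 + y2 + y3 = 1
end

M = [1.0 0.0 0.0
     0.0 1.0 0.0
     0.0 0.0 0.0]  # the zero row makes the third equation algebraic

f = ODEFunction(rober!, mass_matrix = M)
prob = ODEProblem(f, [1.0, 0.0, 0.0], (0.0, 1e5))
sol = solve(prob, Rodas5())  # a stiff solver that supports mass matrices
```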
The R interface is a bit more constrained, but it has specific solve functions for the different types of differential equations, and the diffeqr code is a near line-for-line translation of the Julia version. Notice that the interfaces are virtually identical. Even key features of the solution are preserved: for example, the returned sol is a continuous function sol(t) in Julia, and this is true in Python as well. This means that the in-depth documentation carries across languages.
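A small sketch of this continuous-solution behavior on the Julia side:

```julia
using DifferentialEquations

prob = ODEProblem((u, p, t) -> 1.01u, 0.5, (0.0, 1.0))
sol = solve(prob, Tsit5())

sol(0.45)  # interpolated value at t = 0.45; no extra solver run needed
```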
Even the common solver options carry across languages. Okay, time for DifferentialEquations.jl itself. I left it for last because it is by far the most complex of the solver suites, and it pulls ideas from many of them.
While most of the other suites offer no more than about 15 methods on the high end (with most offering about 8 or fewer), DifferentialEquations.jl offers well over a hundred, including wrappers for the standard suites, so all of the standard methods mentioned before are available here as well. But then there are the native Julia methods. In each category it has a huge amount of variety, offering pretty much every method from the other suites along with some unique ones. Some unique methods to point out: it has the only 5th-order Rosenbrock method, it has the efficient Verner methods discussed in the Mathematica part, and it has newer 5th-order Runge-Kutta methods (e.g. Tsit5, which generally outperforms the classic Dormand-Prince pair).
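Since every one of these methods hangs off the same solve call, trying a different integrator is a one-argument change; for example:

```julia
using DifferentialEquations

prob = ODEProblem((u, p, t) -> -u, 1.0, (0.0, 10.0))

sol_tsit = solve(prob, Tsit5())  # newer, efficient 5th order Runge-Kutta
sol_vern = solve(prob, Vern7())  # 7th order Verner method for tighter tolerances
```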
It has methods specialized to reduce interpolation error (the OwrenZen methods), and methods which are strong-stability-preserving for hyperbolic PDEs. By default it returns solutions as continuous functions via a high-order interpolation, though this can be turned off to make saving more efficient. Each of these implementations employs a lot of extra tricks for efficiency. For example, the interpolation is "lazy": if the method requires extra function evaluations for the continuous form, it will only do those extra calculations when the continuous function is actually used (when you ask for a value directly or when you plot).
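The saving behavior is controlled through keyword arguments to solve; a minimal sketch:

```julia
using DifferentialEquations

prob = ODEProblem((u, p, t) -> -u, 1.0, (0.0, 10.0))

# Turn off the continuous (dense) output and only save every 0.5 time units,
# trading the interpolant for lower memory use.
sol = solve(prob, Vern7(); dense = false, saveat = 0.5)
```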
This is just a peek at the special things the library does to gain an edge. The native Julia methods also benchmark very well, and all of the benchmarks are openly available. They even let you tweak a lot of the internals and swap out the linear algebra routines, for example to use parallel solvers like PETSc.
Its event handling is the most advanced out of all of the offerings. You can change just about anything.
You can make your ODE change its size during the solve if you want, and you can make the event handling adjust and adapt internal solver parameters. It's not hyperbole to say that the user is given "full control", since the differential equation solvers themselves are essentially written as a method on the event handling interface, meaning that anything the solver can do internally, you can do too. The variety of methods is also much more diverse than the other offerings. It has symplectic integrators like Hairer's suite, but with more high- and low-order methods. It has DDE solvers for constant-lag and state-dependent delays, and it has stiff solvers for each of these cases.
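The canonical small example of this event-handling interface is a bouncing ball handled with a ContinuousCallback, following the documented callback usage:

```julia
using DifferentialEquations

function ball!(du, u, p, t)
    du[1] = u[2]  # height changes with velocity
    du[2] = -p    # constant gravitational acceleration
end

condition(u, t, integrator) = u[1]  # event triggers when the height hits zero

# At the event, flip the velocity and lose some energy.
affect!(integrator) = (integrator.u[2] = -0.9 * integrator.u[2])

cb = ContinuousCallback(condition, affect!)
prob = ODEProblem(ball!, [10.0, 0.0], (0.0, 15.0), 9.81)
sol = solve(prob, Tsit5(), callback = cb)
```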
The stiff solvers all allow for solving DAEs in mass-matrix form (fully implicit ODEs are possible too, but can only be solved using a few methods, like a wrapper to Sundials' IDA, and don't include event handling quite yet). It allows arbitrary number and array types, like Boost's odeint. This means you can use arbitrary-precision numbers in the native Julia methods, or you can use "array-like" objects to model multiscale biological organisms instead of always having to use simple contiguous arrays.
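For example, switching the native methods to arbitrary precision is just a matter of the number types you pass in; a minimal sketch:

```julia
using DifferentialEquations

# BigFloat initial condition and timespan make the whole solve run in
# arbitrary precision; the tolerances can go far below Float64 limits.
prob = ODEProblem((u, p, t) -> -u, big"1.0", (big"0.0", big"1.0"))
sol = solve(prob, Vern9(); abstol = big"1e-30", reltol = big"1e-30")
```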
It has add-ons for things like sensitivity analysis and parameter estimation. It hits the strong points of each of the previously mentioned suites because it was designed from the get-go to do so, and it benchmarks extremely well. The only downside is that, because it is so feature- and performance-focused, its documentation is heavy. The beginning tutorial will give you the basics (making it hopefully as easy as MATLAB to pick up), but pretty much every control knob is available, making the extended portion of the documentation a long read.
Let's take a step back and summarize this information. Since it also features solvers for the non-ordinary differential equations, and since its unique native Julia methods also benchmark well, I think it's clear that DifferentialEquations.jl is the most feature-complete choice. As for the other choices among the scripting languages, MATLAB wasn't designed to have all of the most efficient methods, but it'll handle basic equations with delays and events and output good plots.
Mathematica and Maple will do symbolic pre-calculations to speed things up and can JIT compile functions, along with offering pretty good event handling, and thus their wrappers are more like DifferentialEquations.jl in spirit. So in a pinch, when not all of the bells and whistles are necessary, each of these scripting-language suites will get you by. Behind DifferentialEquations.jl, Sundials is a strong choice. Still, you're going to have to write a lot of stuff yourself to turn its rootfinding into an event handling interface, but if you put the work in, it will do you well.
Any of these will do you well if you want to really get down and dirty in a compiled language and write a lot of the interfaces yourself, but they will cost you productivity with no clear performance gain over the scripting-language suites, which also include some form of JIT compilation. Shortly after being posted, this post received quite a bit of buzz. I am thankful so many people are interested!
A few things were pointed out to me that I have since corrected. There are probably a few other corrections needed. For one, I may have rated PyDSTool too high in some areas, like state-dependent delay solvers (I had notes saying it can't do them, but I can't find that in the docs anymore). More edits today, this time for Maple. Some more refinements, mostly adding extra capabilities to the Netlib part of the chart (there's a lot of interesting stuff in there). While this is not part of any of the mentioned libraries and thus doesn't change the chart, it is something SciPy users might want to know about.
Additionally, its interface doesn't give you control over all of the fine details of the solvers that it wraps. In a few days I will be posting about the release of version 3. And if you have an asymptotically large problem or very expensive function evaluations, this will be the most efficient as well. Sundials was the first package to put all of that together nicely. The routines solve both stiff and non-stiff systems, and include many options.
This has become quite a living document. I got some questions about development goals, which was actually the purpose of this post in the first place, so let me share some concise recommendations. These are recommendations for the next developments that would be most beneficial for each suite.
I'll omit the tools which are essentially "done," like Netlib. Their goal was more flexibility, and they did well.
Event handling now exists, and it handles more than just vectors now. However, the developer chat does mention that this degrades efficiency quite a bit (links in the discussion above) compared to just wrapping the Fortran solvers for the Runge-Kutta and BDF methods, so that is something to keep in mind.