Roughly speaking, there are two ways to study the behavior of a theoretical model: with math, or with a computer. You can use math to derive properties of the model. Or you can use a computer to simulate the model in order to discover its properties.

You can simulate any model, but you can’t do math on any model. Depending on the complexity of the model, it might be impossible to derive some or even any of the properties of interest. For instance, for deterministic ordinary differential equation models, an analytical solution* can only be derived for the very simplest models, such as the logistic model. Models with more state variables and/or more parameters won’t have an analytical solution. They might still have one or more equilibria for which you can solve using algebra, but most ecological models with more than, oh, three or so state variables won’t have algebraically tractable equilibria. And so on. It’s desirable to do math if you can, because math provides generalizable insight and understanding that’s difficult or impossible to obtain with simulations. But if you can’t do the math, well, that’s what simulations are for.

Question: has ecological theory more or less exhausted the space of analytically tractable models? That is, have we already learned as much as we can learn about ecology using math? So that future models are going to be analyzed mostly or entirely via simulations? For instance, in food web ecology, I think every possible mathematically tractable food web “module” (food web model containing just a few species) was analyzed years ago.

And if we have exhausted the space of mathematically tractable models, does it matter? In particular, does it mean that we already have all the general theory we’ll ever have? So that in future there’s nothing left to do but simulate more complicated and possibly system-specific models, to answer questions that either have complicated or system-specific answers? (see this old post for discussion)

Looking forward to your comments, as always.

*A formula giving the value(s) of the state variable(s) at any point in time, based only on knowledge of the initial conditions, the model parameters, and the time. For the logistic equation, the analytical solution is a formula giving the population size *N* at any time *t*, based only on knowledge of the initial population size (*N* at time *t*=0), the parameters *r* and *K*, and the time *t*.
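To make the footnote concrete, here is a short Python sketch (standard library only) that evaluates the logistic equation’s analytical solution and checks it against a brute-force numerical integration of the same ODE. The function names and parameter values are my own illustrative choices, not anything from the post:

```python
import math

def logistic_analytic(n0, r, k, t):
    """Closed-form solution of dN/dt = r*N*(1 - N/K)."""
    return k / (1.0 + (k - n0) / n0 * math.exp(-r * t))

def logistic_rk4(n0, r, k, t, steps=10000):
    """Numerically integrate the same ODE with classical RK4."""
    f = lambda n: r * n * (1.0 - n / k)  # the logistic growth rate
    dt = t / steps
    n = n0
    for _ in range(steps):
        k1 = f(n)
        k2 = f(n + 0.5 * dt * k1)
        k3 = f(n + 0.5 * dt * k2)
        k4 = f(n + dt * k3)
        n += dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return n

# The formula and the simulation agree to high precision:
print(logistic_analytic(5, 0.5, 100, 10))
print(logistic_rk4(5, 0.5, 100, 10))
```

The point of the comparison: for the logistic model you never *need* the simulation, because the formula already tells you *N* at any *t* directly. For most ecological models, only the simulation route is available.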


Via Twitter, a succinct-to-the-point-of-uninformative answer to the questions in the post title:

Because they are logically derived, theories and models are by nature mathematically tractable, and we have quite a number of theories and models in ecology that have not been expressed in mathematical form. If we start from there, the answer is obviously NO. However, math is only just becoming popular in ecology, and its application often necessitates a reductionist approach, which is a limitation in itself. Currently, it doesn’t seem like one needs to be math savvy to do important work. Many ecologists (e.g. theoretical ecologists) that use math extensively are those that have had above-average training in math or come from a background (e.g. physics) where math is their basic tool. So, for those people, it’s hard to tell whether it’s a case of having a hammer and seeing everything as a nail.

Math is a valuable tool, but a tool nonetheless. Its current use, or lack of it, can’t tell us much about the future of ecology. Right now, Bayesian modelling is trending. Before that, it was multivariate analyses. I don’t know where we’d go from here. But one thing is clear: ecology has made quite a leap in a short period of time by capitalizing on technological advancements such as the use of supercomputing and machine learning in dynamic vegetation modeling, molecular ecology, etc. Ecology is complex, and these technologies do not carry the same limitations as mathematically explicit models. In my view, math will continue to play an important role in conceptualizing ecological theories, but the acceptability and generalizability of any theory will be contingent on scalability with technology. It is then that the field will have fully matured.

“models by nature are mathematically tractable”

You mean something different by “mathematically tractable” than I do. For instance, the Lotka-Volterra competition equations don’t have an analytical solution, at least not that anyone’s been able to discover.

There’s certainly an over-generalization on my part. Point taken.

How many of the models that we commonly use in population / community ecology do have analytical solutions? Because it seems like it’s not many.

Oh, very few! The logistic equation, the Levins metapopulation model, and…uh, is that it?

As the post apparently didn’t make clear enough, “analytical solution” is just one of many examples of “a property of a model that you can derive with math”. Other properties of models that you can (sometimes!) derive with math include equilibria (set your differential equations equal to zero, use algebra to solve for the value(s) of the state variables), the local stability of equilibria, the stationary distribution of a stochastic model, etc. etc.
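To illustrate the “set the equations to zero and check stability” recipe with the simplest possible case, here is a sketch for the logistic model (parameter values arbitrary). Algebra gives the equilibria; the sign of the growth function’s derivative at each equilibrium gives local stability:

```python
# dN/dt = f(N) = r*N*(1 - N/K). Setting f(N) = 0 and solving with
# algebra gives two equilibria: N* = 0 and N* = K.
# Local stability criterion: an equilibrium is stable if f'(N*) < 0.
r, K = 1.0, 50.0

def f(n):
    return r * n * (1.0 - n / K)

def fprime(n, h=1e-6):
    # central finite difference standing in for taking the derivative by hand
    return (f(n + h) - f(n - h)) / (2 * h)

for eq in (0.0, K):
    print(eq, "stable" if fprime(eq) < 0 else "unstable")
# N* = 0 is unstable (f'(0) = r > 0); N* = K is stable (f'(K) = -r < 0)
```

For the logistic model the derivative is easy by hand (f'(N) = r(1 − 2N/K)); the numerical derivative is just there so the check runs on any f you swap in.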

In general, the more complicated your model, the harder it is to derive anything about its behavior.

Has ecological theory exhausted the space of mathematically tractable models?

Maybe. It seems like a good deal of recent theory deals with new variations on or applications of a handful of well-established modeling frameworks, which to me suggests yes. But what I’m less sure of is to what degree those handful of frameworks are predominant because they’re already famous and well-established, versus whether they’re predominant because they’re tried-and-true best general representations of particular phenomena and other potential mathematically tractable models just don’t work as well, at least for the kinds of problems ecologists have been interested in. If it’s more the former, then I’d judge the odds to be fairly high of some creative theoretician developing some new analytically tractable model, that’s useful enough to gain some reasonable level of use in the community.

Kinda sad this post isn’t getting more discussion, because I think it’s an interesting question, but it is a bit niche.

“Kinda sad this post isn’t getting more discussion, because I think it’s an interesting question, but it is a bit niche.”

Welcome to my world. 🙂 😦

“But what I’m less sure of is to what degree those handful of frameworks are predominant because they’re already famous and well-established, versus whether they’re predominant because they’re tried-and-true best general representations of particular phenomena and other potential mathematically tractable models just don’t work as well, at least for the kinds of problems ecologists have been interested in.”

Personally, I suspect that if the space of mathematically tractable ecological models has yet to be exhausted, it’s because there are questions ecologists haven’t yet asked that have mathematically tractable answers. Not because, e.g., there’s some totally new mathematically-tractable description of, say, predator-prey cycles lurking out there undiscovered. Just a guess on my part, obviously.

No; some emerging areas:

–Probability approaches: given the reality of “noise”, analytically solving for the mean (which can be very different from the equilibria of the corresponding ODEs) and variance of the dynamics. Lots of work here on persistence; more is needed for multispecies systems and for informing management.

–Transient dynamics (these can be long or indefinite)

–Complexity: expanding network theory to incorporate dynamics. In physics for instance, lots of valuable insight from analytical solutions to things like Ising (phase transitions ~ “regime shifts”) and Kuramoto (coupled oscillators ~ “synchrony”) models.
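To make the coupled-oscillator point above tangible, here is a minimal pure-Python sketch of the Kuramoto model (mean-field form, Euler time-stepping; oscillator counts, coupling values, and step sizes are arbitrary illustrative choices). The order parameter r measures synchrony: near 0 means incoherence, near 1 means the oscillators have locked together:

```python
import math
import random

def kuramoto_order(n_osc=200, coupling=0.0, dt=0.05, steps=1200, seed=1):
    """Simulate the Kuramoto model and return the final order
    parameter r in [0, 1]; r near 1 indicates synchrony."""
    rng = random.Random(seed)
    omega = [rng.gauss(0.0, 1.0) for _ in range(n_osc)]        # natural frequencies
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n_osc)]
    for _ in range(steps):
        # mean field: r * e^{i*psi} = (1/N) * sum_j e^{i*theta_j}
        cx = sum(math.cos(t) for t in theta) / n_osc
        sx = sum(math.sin(t) for t in theta) / n_osc
        r = math.hypot(cx, sx)
        psi = math.atan2(sx, cx)
        # each oscillator is pulled toward the mean phase
        theta = [t + dt * (w + coupling * r * math.sin(psi - t))
                 for t, w in zip(theta, omega)]
    cx = sum(math.cos(t) for t in theta) / n_osc
    sx = sum(math.sin(t) for t in theta) / n_osc
    return math.hypot(cx, sx)

# Weak coupling stays incoherent; strong coupling synchronizes.
print(kuramoto_order(coupling=0.5))   # small r
print(kuramoto_order(coupling=4.0))   # r close to 1
```

The analytical payoff, which this brute-force version only hints at, is that the critical coupling for the onset of synchrony can be solved for exactly in the mean-field model, which is precisely the kind of “regime shift” insight the comment is pointing at.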

Also, in my experience, when you scale up analytically tractable models to model heterogeneity across time, space, or species you often get patterns very similar to what you can do analytically due to scale separation (also why simple models can outperform complex ones). So fluency in analytical work – the building blocks representing edge cases – will always be essential.

As for generality I’d refer to physics: we’ve lots of general concepts to learn about finer-scale processes and how/when they translate to larger scales.

Very interesting remarks, thanks Vadim for this overview of where ecological theory is going.

This is precisely the comment I was going to make. I think there is a lot of work done by us (mathematicians and theorists more broadly) which may give analytical insight into classes or aspects of models which can apply more broadly. For instance, my main area of technical expertise is in the qualitative theories of ODE and PDE; we can rarely analytically solve things, but often we can conjecture or prove theorems along the lines of “all models of such-and-such form give rise to oscillations, or synchrony,” etc. Novel approaches in these areas turn previously intractable models into analytically understandable ones. Progress is slow, but it is happening, I think, and ideally the methods developed can eventually be used outside of mathematics itself.

I should also mention that there is a growing body of work on computational methods which give analytical insight into large sets of parameters for specific models, or model structures. One can numerically generate the phase space for a given system of ODEs, for instance, and perform continuation in dozens of parameters in order to essentially understand the influence of all parameters. Of course this can fail for a variety of reasons, and isn’t feasible for many kinds of models, but in principle such methods can be made formal such that even pure mathematicians will accept the answers. If I recall correctly, one of the earliest rigorous proofs that the Lorenz equations give rise to deterministic chaos used an essentially computationally-inspired analytic approach called interval arithmetic, and I think such blended methods are becoming more common.
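For readers unfamiliar with interval arithmetic, here is a toy sketch of the idea (a real implementation, such as those used in computer-assisted proofs, would also use directed rounding so that floating-point error itself is enclosed; this minimal class ignores that):

```python
class Interval:
    """Minimal interval arithmetic: each operation returns an interval
    guaranteed to contain the true result of the operation applied to
    any pair of points drawn from the operand intervals."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        # the extremes of a product occur at the endpoint combinations
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.9, 2.1)           # "x is near 2, with uncertainty"
y = x * x - Interval(4.0, 4.0)   # rigorous enclosure of x^2 - 4
print(y)                          # an interval straddling 0
```

Because every computed interval provably contains the exact answer, a chain of such operations can turn a numerical computation into a rigorous proof, which is the trick behind the computer-assisted Lorenz chaos result mentioned above.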

I’m kind of confused by the framing here, because you’re going straight from “analytically tractable” to “simulate more complicated models”, but there’s a whole range of theoretical approaches between them. Generally, when I’m working on a theoretical problem, simulation is one of the last tools I use. I generally apply at least two main approaches before even starting large-scale simulations:

1. find approximate versions of a given model that we can reason about analytically

2. use bifurcation analysis to determine how the qualitative behaviour of the model changes through parameter space.

Only after that do I start doing targeted simulations to understand more specific details of a given model. As Vadim mentions above, there’s also a whole host of methods based around treating different parts of the model as stochastic, that don’t rely on doing a bunch of simulations.
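As a rough, brute-force illustration of what a bifurcation scan looks like (real bifurcation analysis would use continuation software rather than simulation), here is a pure-Python sketch using the standard nondimensionalized Rosenzweig-MacArthur model, mentioned later in this thread. Raising the carrying capacity K past a Hopf bifurcation turns the stable equilibrium into limit cycles (the “paradox of enrichment”); the parameter values are arbitrary illustrative choices:

```python
def rm_amplitude(K, e=0.5, m=0.2, dt=0.02, t_end=400.0):
    """Integrate the Rosenzweig-MacArthur model (logistic prey,
    Holling type II predation) with RK4 and return the post-transient
    amplitude of prey oscillations (near zero = stable equilibrium)."""
    def deriv(n, p):
        fr = n / (1.0 + n)                       # type II functional response
        return (n * (1.0 - n / K) - fr * p,      # prey
                p * (e * fr - m))                # predator
    n, p = 0.8, 0.2
    steps = int(t_end / dt)
    tail = []
    for i in range(steps):
        k1 = deriv(n, p)
        k2 = deriv(n + 0.5 * dt * k1[0], p + 0.5 * dt * k1[1])
        k3 = deriv(n + 0.5 * dt * k2[0], p + 0.5 * dt * k2[1])
        k4 = deriv(n + dt * k3[0], p + dt * k3[1])
        n += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0
        p += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0
        if i > steps * 3 // 4:                   # keep the last quarter only
            tail.append(n)
    return max(tail) - min(tail)

# Below the Hopf threshold the equilibrium is stable (tiny amplitude);
# enriching the system (raising K) produces limit cycles.
print(rm_amplitude(K=1.8))
print(rm_amplitude(K=4.0))
```

The analytical version of the same exercise locates the Hopf point exactly (for this parameterization, where the prey equilibrium crosses the hump of the prey nullcline), which is the kind of qualitative, parameter-space-wide statement that simulations alone can only approximate.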

I would say that our fundamental toolbox of model sub-components hasn’t grown much over the last half-century (we’re still using Rosenzweig-MacArthur as the workhorse of trophic ecology, for instance), but that’s not due to whether more complex models are analytical or not (even Rosenzweig-MacArthur isn’t analytically tractable!), as much as that these simpler building blocks are well-understood, and we don’t generally have enough ecological information to determine if new potential functions might be useful.

Just as an example that we aren’t at risk of running out of theoretical material: I found a bunch of theory papers in the last couple years that weren’t “analyzed mostly or entirely via simulations” just from a quick skim of my reference manager:

Rossberg and Barabas 2019 “How carefully executed network theory informs invasion ecology”

Tekwa et al. 2019. “Path-dependent institutions drive alternative stable states in conservation”

Walsh et al. 2018 “Detecting species at low densities: a new theoretical framework and an empirical test on an invasive zooplankton”

Ward and McCann 2017 “A mechanistic theory for aquatic food chain length”

“I’m kind of confused by the framing here, because you’re going straight from “analytically tractable” to “simulate more complicated models”, but there’s a whole range of theoretical approaches between them.”

I agree. Was trying to keep the post brief and so didn’t spell out all the ways that you can use math rather than numerical simulations to understand model behavior. For instance, if you can use algebra to solve for the equilibria, it’s very useful to do so. But my sense is that ecological theoreticians are running out of deterministic models for which we can use algebra to solve for the equilibria.

“use bifurcation analysis to determine how the qualitative behaviour of the model changes through parameter space.”

Do you think bifurcation analysis has become a workhorse tool in theoretical ecology more broadly?

“I would say that our fundamental toolbox of model sub-components hasn’t grown much over the last half-century…but that’s not due to whether more complex models are analytical or not…as much as that these simpler building blocks are well-understood, and we don’t generally have enough ecological information to determine if new potential functions might be useful.”

Very interesting remark, I need to think more about that. Hopefully others will chime in.

“Just as an example that we aren’t at risk of running out of theoretical material: I found a bunch of theory papers in the last couple years that weren’t “analyzed mostly or entirely via simulations” just from a quick skim of my reference manager:”

I was hoping/dreading that somebody would prove me wrong by just listing a whole bunch of recent theory papers that don’t use any numerical simulations!

I would say one big change is that it’s more common for theory papers to require some observational or experimental evidence that they make reasonable predictions for at least some system… three papers of the four I have do that. That I think is a big change from the early days of theory.

As for whether people are using bifurcation broadly… It’s hard for me to say. I see it a fair bit, but that might be because I use it a lot in my own work, and so am prone to notice it more.

“I would say one big change is that it’s more common for theory papers to require some observational or experimental evidence that they make reasonable predictions for at least some system…”

You are far from alone in that impression:

https://dynamicecology.wordpress.com/2015/03/19/should-theory-published-in-general-ecology-journals-have-to-be-realistic/

https://dynamicecology.wordpress.com/2015/03/25/ecologists-think-general-ecology-journals-only-want-realistic-theory-and-they-dont-like-that/

“I would say that our fundamental toolbox of model sub-components hasn’t grown much … and we don’t generally have enough ecological information to determine if new potential functions might be useful.”

Definitely, and it’s great to have a base set of intuitive building blocks for which we have some good understanding of (1) empirical support and (2) at least near-equilibrium dynamics. I think the #2 fact might be what Jeremy was getting at here. So perhaps the answer is that the next (and obvious) direction is/has been investigating these basic models in new combinations and away from equilibria. But analytical tools remain a huge part of understanding (vs. just describing) these dynamics.

Missed a couple of those papers, thanks!!

Sigh. I’m old enough to remember when Matthew Holden would comment here rather than on Twitter (i.e. I can remember, like, last month). I miss those days.

I note with interest that Matthew’s reasons for answering “no” to the questions posed in the post title seem different to me than other commenters’ reasons. But what do others think?

I’ll add that often I start backwards these days. I build some complicated simulation (complicated for my taste), and see an interesting unexpected result. Then I’m like, that has to be a bug – I check the code and convince myself that perhaps it isn’t a bug. Then I build simpler models to test a few possible hypotheses.

So I often start complicated then move to the simple analytic models (the opposite of what I think most folks would recommend, but I enjoy it)

Whether to start simple and then make the model more complex, or start complicated and then try to simplify, is an interesting question that probably deserves a post of its own. There are pluses and minuses to both approaches, and it may be a case of different strokes for different folks.

Yes – although obviously “analytically tractable” is fuzzy and pivotal to the answer, tractability is tied to model complexity. For example, we can handle 2-species quadratic differential equations. Anything more starts to get hairy. Much more (e.g. 3 species with higher-order or non-polynomial terms) is almost certainly simulation-only. Thus tractable models are a finite and well-explored space. We may come up with an interesting new question that can be mapped into that space (say, 2-species quadratic differential equations), but in terms of analysis and solutions it’s still going to map onto a model already analyzed.

Ok, so that’s two votes for yes (you and me), one vote for maybe (Jonathan Walter), and several votes for no (various commenters here and on Twitter). Should’ve put a poll at the end of the post!

Brian’s argument (new models are variants that map onto already-analyzed models) is pushing me to get off the fence and change my vote to Yes.

Is this a problem? No, I don’t think so. Ecologists are still going to find new ways to apply these models and new questions to ask, and in some sense it’s an asset (especially for those like me who like theory but are much better coders than mathematicians) that these models have already been analyzed.

I’m confused. You must be talking about analytic solutions of the state variables as explicit functions in time/space, and not an analytic exploration of long term dynamics because there are several 2D ODEs (not quadratic polynomials) in ecology that have analytic equilibria that one can prove stability properties for. I’m working on one right now, one that I don’t think has been analysed before.

Yes, you can get a little bit beyond 2D 2nd-order polynomial models (certainly, e.g., analysis of Rosenzweig-MacArthur), but you do start to lose breadth in what you can do analytically (e.g. solve for equilibria), and pretty shortly after that any analytical information usually disappears. Not a mathematical law as far as I know (and I’m sure there are special-case exceptions), but it seems to be a fairly general rule in my experience.

It’s also worth noting that there are open questions about quadratic 2D ODEs that mathematicians thought were too hard to be on the Millennium Prize problem list (ok, the full problem is about general polynomials but hopefully you see my point).

https://en.m.wikipedia.org/wiki/Hilbert%27s_sixteenth_problem

As mentioned below, it all depends upon what one means by “simple” and what one considers analytical insight. I teach about certain infinite dimensional differential equations with analytical solutions, but contrasting these with 2D ODE we can’t solve by hand is hard. Contrasting these with probabilistic, game theoretic, or other kinds of models is even more difficult.

Hello? Is this the analogy police? I’d like to report a bad analogy:

Plus, even on its own terms, this analogy is debatable. I’ve seen a case made that various genres of music, including rock, are played out. Doesn’t mean that literally all possible rock songs have been written–let’s not be silly and indulge in strawman-bashing. But it does mean that any new rock music is going to strike listeners as a minor variant on previous rock music. I don’t know that I buy that “rock is dead” case myself–I don’t listen to nearly enough music to have a useful opinion. But I don’t think it’s *obviously* an incorrect case.

Via Twitter, some interesting and thoughtful remarks from Paul Hurtado:

I am a regular reader of the blog but rarely comment because I am an outsider (a physicist) working on ecological questions. However, I felt compelled to answer because we have been thinking about these ideas very hard in our group (sorry for the length of the comment).

There has been real interest in thinking about these questions in the statistical physics community in recent years. I think we have made quite a bit of progress that is just starting to make its way into the ecological community. Many of our models are also analytically tractable and involve adapting really sophisticated and fun techniques from statistical physics.

Right now, ecology has the equivalent of a classical mechanics (a few interacting particles), but there is no equivalent of statistical mechanics. In my opinion, as suggested above, there needs to be more thought about how to do this in ecology (ecological phases, phase transitions), and I think we are just starting to understand how this can be done.

I feel like there are three major philosophical reasons that the answer is no:

1. First, most ecological models (niche theories) have worked in the limit of a few species and few resources/limiting factors. However, we know that real ecosystems have large numbers of organisms and resources. As argued in Phil Anderson’s 1972 seminal essay “More is Different”:

“The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behavior of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of the simple extrapolation of the properties of a few particles…”

2. In many models, the environment and species are treated separately. This is especially misleading in the microbial world, where microbes constantly reengineer their environments through cross-feeding and syntrophy. We are just beginning to explore the effect of this in models of the microbial world, and even in the more macroscopic world. It in fact changes behaviors completely and, we have found, is essential for explaining and predicting experiments. This is really a surprisingly under-explored class of models (see below).

3. There are still lots of unexplored connections between ecological models and other areas of “theory” (computation, optimization, statistical physics). This kind of exchange has the potential to lead to new techniques and new classes of models that can be solved but couldn’t be previously. For example, we recently figured out that many consumer-resource models are equivalent to constrained optimization problems (arXiv:1901.09673). I am sure there are many, many more connections to be had that will open up new models we can analyze.

Technical Level.

I think on a technical level the models have ignored the role of heterogeneities in species preferences and environments. For one- or two-species models, this becomes difficult because of the proliferation of parameters. However, for large ecosystems (as is well known in ecology, May’s work being the earliest version of this), we can think about drawing parameters from a distribution. This makes it easier to parameterize large models. However, solving these models technically requires new techniques from the statistical physics of disordered systems (replicas, cavity methods, mean-field theories), and this is starting to be done. Even simulating these large systems gets difficult. However, there are unexpected connections with constrained optimization and other fun things in machine learning.
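To show what “drawing parameters from a distribution” looks like in the simplest possible setting, here is a pure-Python sketch of a competitive generalized Lotka-Volterra community whose pairwise interaction strengths are sampled randomly (all names and parameter values here are my own illustrative choices, not from any of the papers cited below). Homogeneous interactions let everyone coexist; broadly distributed interaction strengths typically exclude some species:

```python
import random

def glv_survivors(S=20, mu=1.0, sigma=0.0, dt=0.05, steps=3000, seed=0):
    """Euler-integrate a competitive generalized Lotka-Volterra system
    with interaction coefficients drawn from a distribution, and count
    the species persisting above a small abundance threshold."""
    rng = random.Random(seed)
    # pairwise competition strengths: |Normal(mu, sigma)|, scaled by 1/S
    a = [[abs(rng.gauss(mu, sigma)) / S if i != j else 0.0
          for j in range(S)] for i in range(S)]
    x = [rng.uniform(0.5, 1.0) for _ in range(S)]
    for _ in range(steps):
        comp = [sum(a[i][j] * x[j] for j in range(S)) for i in range(S)]
        # dx_i/dt = x_i * (1 - x_i - sum_j a_ij x_j); clip at zero
        x = [max(0.0, xi + dt * xi * (1.0 - xi - comp[i]))
             for i, xi in enumerate(x)]
    return sum(1 for xi in x if xi > 1e-3)

# Identical interactions (sigma = 0): all S species coexist.
print(glv_survivors(sigma=0.0))
# Heterogeneous interactions: typically fewer species persist.
print(glv_survivors(sigma=3.0))
```

The statistical-physics program described above replaces this kind of brute-force simulation with analytical calculations (cavity methods, etc.) of the *typical* number of survivors and their abundance distribution as functions of mu and sigma alone.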

There have been a number of interesting papers, which I would call “stat mech of ecological systems,” that have come out of our physics community and play with these ideas. There is also a growing literature on how things change in the microbial world:

Stat Mech of classic models:

Fisher, Charles K., and Pankaj Mehta. “The transition between the niche and neutral regimes in ecology.” Proceedings of the National Academy of Sciences 111.36 (2014): 13111-13116.

Bunin, Guy. “Ecological communities with Lotka-Volterra dynamics.” Physical Review E 95.4 (2017): 042414.

Tikhonov, Mikhail, and Remi Monasson. “Collective phase in resource competition in a highly diverse ecosystem.” Physical Review Letters 118.4 (2017): 048103.

Advani, Madhu, Guy Bunin, and Pankaj Mehta. “Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model.” Journal of Statistical Mechanics: Theory and Experiment 2018.3 (2018): 033406.

Biroli, Giulio, Guy Bunin, and Chiara Cammarota. “Marginally stable equilibria in critical ecosystems.” New Journal of Physics 20.8 (2018): 083051.

Cui, Wenping, Robert Marsland III, and Pankaj Mehta. “Diverse communities behave like typical random ecosystems.” arXiv preprint arXiv:1904.02610 (2019).

Roy, Felix, et al. “Numerical implementation of dynamical mean field theory for disordered systems: application to the Lotka-Volterra model of ecosystems.” arXiv preprint arXiv:1901.10036 (2019).

Microbial resource models (which we have been working on a lot):

Goldford, Joshua E., et al. “Emergent simplicity in microbial community assembly.” Science 361.6401 (2018): 469-474.

Marsland III, Robert, et al. “The Community Simulator: A Python package for microbial ecology.” arXiv preprint arXiv:1904.09367 (2019).

Marsland III, Robert, et al. “Available energy fluxes drive a transition in the diversity, stability, and functional structure of microbial communities.” PLoS computational biology 15.2 (2019): e1006793.

Muscarella, Mario E., and James P. O’Dwyer. “Species dynamics and interactions via metabolically informed consumer-resource models.” BioRxiv (2019): 518449.

Right now these are mostly simulations, but we have unpublished analytic results that use many of the same techniques as above.