Bayesian inference of minimally complex models with interactions of arbitrary order
Phys. Rev. E 111, 054307 – Published 15 May 2025
DOI: https://doi.org/10.1103/PhysRevE.111.054307
Abstract
Finding the model that best describes a high-dimensional dataset is a daunting task, even more so if one aims to consider all possible high-order patterns of the data (i.e., correlation patterns between three or more variables), going beyond pairwise models. For binary data, we show that this task becomes feasible when restricting the search to a family of simple models, which we call minimally complex models (MCMs). MCMs are maximum entropy models that have interactions of arbitrarily high order grouped into independent components of minimal complexity. They are simple in information-theoretic terms, which means they can fit well only certain types of data patterns and are therefore easy to falsify. We show that Bayesian model selection restricted to these models is computationally feasible and has many advantages. First, the model evidence, which balances goodness of fit against complexity, can be computed efficiently without any parameter fitting, enabling very fast explorations of the space of MCMs. Second, the family of MCMs is invariant under gauge transformations, which can be used to develop a representation-independent approach to statistical modeling. For small systems (up to 15 variables), combining these two results allows us to select the best MCM among all, even though the number of models is already extremely large (
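As a rough illustration of how the evidence of an MCM could be evaluated in closed form, without fitting any parameters, the sketch below assumes that the evidence factorizes over the model's independent components and that each complete component of r variables contributes a Dirichlet-multinomial marginal likelihood over its 2^r projected states with a Jeffreys (Dirichlet-1/2) prior. The function names, the prior choice, and the partition-based representation of an MCM are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from collections import Counter
from scipy.special import gammaln

def component_log_evidence(data, component):
    """Log-evidence of one independent complete component.

    data:      (N, n) array of binary samples (0/1).
    component: list of column indices belonging to the component.

    Assumption: the component is a complete model over its r variables,
    so its evidence is the Dirichlet-multinomial marginal likelihood over
    2**r states with a Jeffreys prior (closed form, no parameter fitting).
    """
    N = data.shape[0]
    r = len(component)
    q = 2 ** r                                   # number of component states
    # Project each sample onto the component and count each observed state.
    weights = 2 ** np.arange(r)
    states = data[:, component] @ weights
    counts = Counter(states.tolist())
    # Closed-form marginal likelihood with concentration 1/2 per state.
    log_ev = gammaln(q / 2.0) - gammaln(N + q / 2.0)
    log_ev += sum(gammaln(k + 0.5) - gammaln(0.5) for k in counts.values())
    return log_ev

def mcm_log_evidence(data, partition):
    """Log-evidence of an MCM described as a partition of the variables;
    under the factorization assumption it is a sum over components."""
    return sum(component_log_evidence(data, part) for part in partition)

# Toy comparison of two candidate MCMs on 15 binary variables.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(1000, 15))
m_blocks = [list(range(0, 5)), list(range(5, 10)), list(range(10, 15))]
m_indep = [[i] for i in range(15)]               # fully independent model
print(mcm_log_evidence(data, m_blocks), mcm_log_evidence(data, m_indep))
```

Because each term depends only on the counts of projected states, comparing many candidate partitions reduces to simple bookkeeping, which is consistent with the abstract's claim that the evidence can be computed efficiently and that the space of MCMs can be explored very fast.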