Physics > Data Analysis, Statistics and Probability

Title: Rational Ignorance: Simpler Models Learn More Information from Finite Data

Abstract: We use the language of uninformative Bayesian prior choice to study the selection of appropriately simple effective models. We advocate for the prior which maximizes the mutual information between parameters and predictions, learning as much as possible from limited data. When many parameters are poorly constrained by the available data, we find that this prior puts weight only on boundaries of the parameter manifold. Thus it selects a lower-dimensional effective theory in a principled way, ignoring irrelevant parameter directions. In the limit where there is sufficient data to tightly constrain any number of parameters, this reduces to Jeffreys prior. But we argue that this limit is pathological, as it leads to dramatic dependence on effects invisible to experiment.
Comments: 8 pages, 6 figures. v2 with expanded discussion and more figures
Subjects: Data Analysis, Statistics and Probability (physics.data-an); Statistical Mechanics (cond-mat.stat-mech); Information Theory (cs.IT)
Cite as: arXiv:1705.01166 [physics.data-an]
  (or arXiv:1705.01166v2 [physics.data-an] for this version)

Submission history

From: Michael Abbott
[v1] Tue, 2 May 2017 20:27:14 GMT (1530kb,D)
[v2] Fri, 1 Sep 2017 13:55:26 GMT (3194kb,D)
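A minimal numerical sketch of the idea in the abstract (not the authors' code): treating the parameter-to-data map as a noisy channel, the mutual-information-maximizing prior is that channel's capacity-achieving input distribution, which can be computed with the Blahut-Arimoto algorithm. The toy Gaussian model, grids, and noise levels below are illustrative assumptions. With large noise (a poorly constrained parameter) the optimal prior puts its weight on the boundary of the parameter range; with small noise it spreads out toward Jeffreys prior.

import numpy as np

def optimal_prior(lik, iters=2000):
    """Blahut-Arimoto: find the prior p(theta) maximizing I(theta; data).

    lik[i, j] = p(x_j | theta_i), each row normalized over the data grid.
    """
    p = np.full(lik.shape[0], 1.0 / lik.shape[0])  # start from a uniform prior
    for _ in range(iters):
        evidence = p @ lik                         # p(x) under the current prior
        # KL divergence of each likelihood row from the evidence distribution
        kl = np.sum(lik * np.log(lik / evidence), axis=1)
        p = p * np.exp(kl)                         # multiplicative Blahut-Arimoto update
        p /= p.sum()
    return p

# Toy model (assumed for illustration): one datum x ~ Normal(theta, sigma),
# with the parameter theta confined to [0, 1].
theta = np.linspace(0.0, 1.0, 101)
x = np.linspace(-1.0, 2.0, 601)

for sigma in (2.0, 0.02):                          # poorly vs. tightly constrained
    lik = np.exp(-0.5 * ((x[None, :] - theta[:, None]) / sigma) ** 2) + 1e-300
    lik /= lik.sum(axis=1, keepdims=True)          # discrete proxy for p(x | theta)
    p = optimal_prior(lik)
    print(f"sigma={sigma}: prior mass on the two endpoints = {p[0] + p[-1]:.2f}")

In the large-noise run the endpoint mass should come out close to 1, i.e. the prior effectively selects a boundary (lower-dimensional) model, while in the small-noise run most weight lies in the interior, approaching the flat Jeffreys prior for this location parameter.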