Computer Science > Machine Learning

[Submitted on 15 Nov 2023]

Title: A Unified Approach to Learning Ising Models: Beyond Independence and Bounded Width

Authors: Jason Gaitonde, Elchanan Mossel
Abstract: We revisit the problem of efficiently learning the underlying parameters of Ising models from data. Current algorithmic approaches achieve essentially optimal sample complexity when given i.i.d. samples from the stationary measure and the underlying model satisfies "width" bounds on the total ℓ1 interaction involving each node. We show that a simple existing approach based on node-wise logistic regression provably succeeds at recovering the underlying model in several new settings where these assumptions are violated:
(1) Given dynamically generated data from a wide variety of local Markov chains, like block or round-robin dynamics, logistic regression recovers the parameters with optimal sample complexity up to log log n factors. This generalizes the specialized algorithm of Bresler, Gamarnik, and Shah [IEEE Trans. Inf. Theory'18] for structure recovery in bounded degree graphs from Glauber dynamics.
(2) For the Sherrington-Kirkpatrick model of spin glasses, given poly(n) independent samples, logistic regression recovers the parameters in most of the known high-temperature regime via a simple reduction to weaker structural properties of the measure. This improves on recent work of Anari, Jain, Koehler, Pham, and Vuong [ArXiv'23] which gives distribution learning at higher temperature.
(3) As a simple byproduct of our techniques, logistic regression achieves an exponential improvement in learning from samples in the M-regime of data considered by Dutt, Lokhov, Vuffray, and Misra [ICML'21] as well as novel guarantees for learning from the adversarial Glauber dynamics of Chin, Moitra, Mossel, and Sandon [ArXiv'23].
Our approach thus significantly generalizes the elegant analysis of Wu, Sanghavi, and Dimakis [NeurIPS'19] without any algorithmic modification.
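The node-wise estimator the abstract refers to is simple enough to sketch in full: regress each spin on all the others with logistic loss, and halve the fitted weights to estimate that row of the interaction matrix. The sketch below is a minimal toy illustration (the couplings, sample sizes, and step sizes are arbitrary choices for this example, not from the paper), using Glauber dynamics to generate approximately stationary samples and plain gradient descent for the regression:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Ising model on n = 5 spins with zero external field.
# The two nonzero couplings below are arbitrary illustrative values.
n = 5
J = np.zeros((n, n))
J[0, 1] = J[1, 0] = 0.3
J[2, 3] = J[3, 2] = -0.4

def glauber_samples(J, n_samples, burn=500, thin=10):
    """Approximate samples from the Ising measure via Glauber dynamics."""
    n = J.shape[0]
    x = rng.choice([-1.0, 1.0], size=n)
    out = []
    for t in range(burn + n_samples * thin):
        i = rng.integers(n)
        # Conditional law: P(x_i = +1 | rest) = sigmoid(2 * sum_j J_ij x_j).
        # J has zero diagonal, so J[i] @ x contains no self-interaction term.
        p = 1.0 / (1.0 + np.exp(-2.0 * (J[i] @ x)))
        x[i] = 1.0 if rng.random() < p else -1.0
        if t >= burn and (t - burn) % thin == 0:
            out.append(x.copy())
    return np.array(out)

def fit_node(X, i, lr=0.1, iters=2000):
    """Logistic regression of spin i on the remaining spins.

    The conditional law of x_i is exactly logistic with weight vector
    2 * J[i], so halving the fitted weights estimates row i of J.
    """
    y = X[:, i]            # target spin in {-1, +1}
    Z = X.copy()
    Z[:, i] = 0.0          # predictors: all other spins
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        margins = y * (Z @ w)
        # Gradient of the mean logistic loss  log(1 + exp(-y * <z, w>)).
        grad = -(Z * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        w -= lr * grad
    return w / 2.0

X = glauber_samples(J, n_samples=2000)
J_hat = np.array([fit_node(X, i) for i in range(n)])
```

On this toy instance, J_hat recovers the two nonzero couplings to within sampling error. The paper's contribution is showing that this same estimator, with no algorithmic modification, remains consistent well beyond this i.i.d.-style setting: on dynamically generated (non-stationary, correlated) trajectories and on models violating the usual width bounds.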
Comments: 51 pages
Subjects: Machine Learning (cs.LG); Data Structures and Algorithms (cs.DS); Machine Learning (stat.ML)
Cite as: arXiv:2311.09197 [cs.LG]
  (or arXiv:2311.09197v1 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2311.09197

Submission history

From: Jason Gaitonde
[v1] Wed, 15 Nov 2023 18:41:19 UTC (68 KB)