Computer Science > Machine Learning
[Submitted on 28 Oct 2018 (v1), last revised 18 Jun 2019 (this version, v3)]
Title: Sparse Logistic Regression Learns All Discrete Pairwise Graphical Models
Abstract: We characterize the effectiveness of a classical algorithm for recovering the Markov graph of a general discrete pairwise graphical model from i.i.d. samples. The algorithm is (appropriately regularized) maximum conditional log-likelihood, which involves solving a convex program for each node; for Ising models this is ℓ1-constrained logistic regression, while for more general alphabets an ℓ2,1 group-norm constraint needs to be used. We show that this algorithm can recover any arbitrary discrete pairwise graphical model, and also characterize its sample complexity as a function of model width, alphabet size, edge parameter accuracy, and the number of variables. We show that along every one of these axes, it matches or improves on all existing results and algorithms for this problem. Our analysis applies a sharp generalization error bound for logistic regression when the weight vector has an ℓ1 constraint (or ℓ2,1 constraint) and the sample vector has an ℓ∞ constraint (or ℓ2,∞ constraint). We also show that the proposed convex programs can be efficiently solved in Õ(n^2) running time (where n is the number of variables) under the same statistical guarantees. We provide experimental results to support our analysis.
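To make the Ising-model case concrete, here is a minimal sketch of node-wise ℓ1-regularized logistic regression on a toy 4-node Ising chain. All specifics (the chain graph, edge weight 0.8, the ISTA solver, step size, and penalty level) are illustrative assumptions, not the paper's exact algorithm or constants; the paper analyzes the ℓ1-constrained formulation, while this sketch uses the closely related ℓ1-penalized form for simplicity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Hypothetical toy model: an Ising chain 0-1-2-3 with edge weight 0.8.
W = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    W[i, j] = W[j, i] = 0.8

# Enumerate all 2^n spin configurations and their Ising probabilities,
# then draw i.i.d. samples from the exact distribution.
states = np.array([[1 if (s >> k) & 1 else -1 for k in range(n)]
                   for s in range(2 ** n)], dtype=float)
energy = 0.5 * np.einsum('si,ij,sj->s', states, W, states)
p = np.exp(energy)
p /= p.sum()
X = states[rng.choice(len(states), size=2000, p=p)]

# Node-wise convex program: predict node 0 from the remaining nodes.
y = X[:, 0]   # labels in {-1, +1}
Z = X[:, 1:]  # features: nodes 1, 2, 3

def fit_l1_logreg(Z, y, lam=0.05, step=0.1, iters=500):
    """ISTA (proximal gradient) on the logistic loss with an l1 penalty."""
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        margin = y * (Z @ w)
        # Gradient of mean logistic loss: -mean(y * z / (1 + exp(margin)))
        grad = -(Z * (y / (1 + np.exp(margin)))[:, None]).mean(axis=0)
        w = w - step * grad
        # Soft-thresholding prox for the l1 penalty
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w

w = fit_l1_logreg(Z, y)
# Node 0's only true neighbor is node 1 (first feature column); the l1
# penalty should drive the weights on non-neighbors 2 and 3 toward zero.
print(np.round(w, 2))
```

Running each node's regression and thresholding the recovered weights yields the estimated Markov graph; here, only the first coordinate of `w` should be clearly nonzero.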
Submission history
From: Shanshan Wu
[v1] Sun, 28 Oct 2018 23:40:42 UTC (272 KB)
[v2] Sun, 3 Feb 2019 03:40:47 UTC (300 KB)
[v3] Tue, 18 Jun 2019 20:53:05 UTC (301 KB)