On Finding the Natural Number of Topics with Latent Dirichlet Allocation: Some Observations

  • R. Arun (1)
  • V. Suresh (1)
  • C. E. Veni Madhavan (1)
  • M. N. Narasimha Murthy (1)

  1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore, India
Conference paper

DOI: 10.1007/978-3-642-13657-3_43

Part of the Lecture Notes in Computer Science book series (LNCS, volume 6118)
Cite this paper as:
Arun R., Suresh V., Veni Madhavan C.E., Narasimha Murthy M.N. (2010) On Finding the Natural Number of Topics with Latent Dirichlet Allocation: Some Observations. In: Zaki M.J., Yu J.X., Ravindran B., Pudi V. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2010. Lecture Notes in Computer Science, vol 6118. Springer, Berlin, Heidelberg

Abstract

It is important to identify the “correct” number of topics in mechanisms like Latent Dirichlet Allocation (LDA), as it determines the quality of the features presented to classifiers such as SVM. In this work we propose a measure to identify the correct number of topics and offer empirical evidence in its favor, in terms of classification accuracy and the number of topics naturally present in the corpus. We show the merit of the measure by applying it to real-world as well as synthetic data sets (both text and images). In proposing this measure, we view LDA as a matrix factorization mechanism, wherein a given corpus C is split into two matrix factors M1 and M2 as given by C_{d×w} = M1_{d×t} × M2_{t×w}, where d is the number of documents in the corpus and w is the size of the vocabulary. The quality of the split depends on “t”, the right number of topics chosen. The measure is computed as the symmetric KL-divergence between salient distributions derived from these matrix factors. We observe that the divergence values are higher for non-optimal numbers of topics, which shows up as a ’dip’ at the right value of ’t’.
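The dip-based selection criterion can be sketched in a few lines of code. The following is a minimal illustration, not the authors' reference implementation: it assumes M1 is the d×t document-topic factor and M2 is the t×w topic-word factor (as in the factorization above), and it follows the commonly cited reading of the measure, in which one "salient distribution" is the normalized singular-value spectrum of M2 and the other is the length-weighted topic distribution obtained from M1. The exact distributions are defined in the body of the paper; the function name and argument layout here are hypothetical.

```python
import numpy as np

def arun_2010_divergence(m1, m2, doc_lengths, eps=1e-12):
    """Symmetric KL-divergence between two distributions derived from the
    LDA matrix factors, used to locate the 'dip' over candidate topic counts.

    m1          : (d, t) document-topic matrix
    m2          : (t, w) topic-word matrix
    doc_lengths : (d,)   token count of each document
    """
    # Distribution 1: singular values of the topic-word factor, normalized.
    cm1 = np.linalg.svd(m2, compute_uv=False)
    cm1 = cm1 / cm1.sum()

    # Distribution 2: topic weights induced by weighting documents by length,
    # sorted in decreasing order to align with the singular-value spectrum.
    cm2 = doc_lengths @ m1
    cm2 = np.sort(cm2)[::-1]
    cm2 = cm2 / cm2.sum()

    # Symmetric KL-divergence: KL(cm1 || cm2) + KL(cm2 || cm1).
    cm1, cm2 = cm1 + eps, cm2 + eps
    return float(np.sum(cm1 * np.log(cm1 / cm2)) + np.sum(cm2 * np.log(cm2 / cm1)))
```

In use, one would fit LDA for a range of candidate values of t, evaluate this divergence for each fit, and choose the t at which the curve dips to its minimum, mirroring the behavior described in the abstract.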

Keywords

LDA · Topic · SVD · KL-Divergence
Copyright information

© Springer-Verlag Berlin Heidelberg 2010