Optimization of topic models based on the deformed entropy

Sergei Koltcov

Entropy models are widely used in various scientific fields, such as statistical physics, biology, economics, and machine learning. However, while the models developed in statistical physics have mostly been based on deformed entropies, including the entropies of Renyi, Tsallis, and Sharma-Mittal, machine learning has mainly relied on Boltzmann-Gibbs-Shannon entropy. This type of entropy is often used as a regularizer of topic models (TM) or for their diagnostics. Topic modeling is a class of algorithms based on the procedure of restoring a multidimensional distribution as a mixture of hidden distributions. One of the unsolved problems in TM is the choice of the number of distributions in the mixture; another is semantic stability. In my talk I will present an entropy approach based on the deformed entropies of Renyi, Tsallis, and Sharma-Mittal, and explain how it can be used for analyzing the dependence of a TM's information utility on the number of topics and its semantic stability. In this approach, I regard a topic model as an information system comparable to a mesoscopic thermodynamic system, whose behavior is governed by the factor 'number of topics'. I will show that in this case, the problem of choosing the optimal number of topics (taking semantic stability into account) can be reduced to the problem of finding the minimum of the free energy or the minimum of the nonequilibrium Renyi, Tsallis, or Sharma-Mittal entropy. In addition, I will present the results of a numerical simulation of five models (PLSA, VLDA (variational), LDA (Gibbs sampling), GLDA (Gibbs sampling), and BigARTM), run on two datasets: one in Russian and the other in English.
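
For orientation: for a distribution p and deformation parameter q != 1, the Renyi entropy is S_q^R = ln(sum_i p_i^q) / (1 - q) and the Tsallis entropy is S_q^T = (1 - sum_i p_i^q) / (q - 1). The Python sketch below is not the speaker's implementation; it only illustrates, under stated assumptions, the kind of procedure the abstract describes: fit a topic model for a range of topic counts (scikit-learn's LDA is used here as a stand-in for the five models above), compute the Renyi entropy of the fitted topic-word distribution, and select the topic count at which it is minimal. The synthetic data, the scanned range, and the choice q = 1/T with T equal to the number of topics are illustrative assumptions.

    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    def renyi_entropy(p, q):
        # Textbook Renyi entropy S_q = ln(sum_i p_i^q) / (1 - q), for q != 1.
        p = p[p > 0]
        return np.log(np.sum(p ** q)) / (1.0 - q)

    # Synthetic document-term matrix so the sketch runs end to end;
    # replace with a real corpus.
    rng = np.random.default_rng(0)
    X = rng.poisson(0.5, size=(200, 1000))

    entropies = {}
    for k in range(2, 21):                    # candidate numbers of topics (assumed range)
        lda = LatentDirichletAllocation(n_components=k, random_state=0).fit(X)
        phi = lda.components_ / lda.components_.sum()   # joint topic-word distribution
        # Assumption: the deformation parameter is tied to the number of topics
        # (q = 1/T), echoing the mesoscopic analogy in the abstract.
        entropies[k] = renyi_entropy(phi.ravel(), q=1.0 / k)

    best_k = min(entropies, key=entropies.get)   # topic count minimizing the Renyi entropy
    print("optimal number of topics (by this criterion):", best_k)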

26.11.2018 - 16:15
D 239