Academic Article

The brevity law as a scaling law, and a possible origin of Zipf's law for word frequencies
Document Type
Working Paper
Source
Entropy 2020, 22(2), 224
Subject
Physics - Physics and Society
Nonlinear Sciences - Adaptation and Self-Organizing Systems
Physics - Data Analysis, Statistics and Probability
Language
English
Abstract
An important body of quantitative linguistics is constituted by a series of statistical laws about language usage. Despite the importance of these linguistic laws, some of them are poorly formulated, and, more importantly, there is no unified framework that encompasses all of them. This paper presents a new perspective to establish a connection between different statistical linguistic laws. Characterizing each word type by two random variables, length (in number of characters) and absolute frequency, we show that the corresponding bivariate joint probability distribution displays a rich and precise phenomenology, with the type-length and the type-frequency distributions as its two marginals, and the conditional distribution of frequency at fixed length providing a clear formulation for the brevity-frequency phenomenon. The type-length distribution turns out to be well fitted by a gamma distribution (much better than by the previously proposed lognormal), and the conditional frequency distributions at fixed length display power-law-decay behavior with a fixed exponent $\alpha\simeq 1.4$ and a characteristic-frequency crossover that scales as an inverse power $\delta\simeq 2.8$ of length, which implies the fulfilment of a scaling law analogous to those found in the thermodynamics of critical phenomena. As a by-product, we find a possible model-free explanation for the origin of Zipf's law, which should arise as a mixture of conditional frequency distributions governed by the length-dependent crossover frequency.
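As a rough illustration of the scaling behavior summarized above (a sketch in assumed notation, not the authors' exact formulas), the conditional frequency distribution at fixed length $\ell$ might be written in the scaling form
$$ D(n \mid \ell) \simeq \frac{1}{n_c(\ell)}\, G\!\left(\frac{n}{n_c(\ell)}\right), \qquad n_c(\ell) \propto \ell^{-\delta}, \qquad G(x) \sim x^{-\alpha}, $$
with $\alpha \simeq 1.4$ and $\delta \simeq 2.8$ as quoted in the abstract. Under this reading, the marginal (Zipf-like) frequency distribution would follow as the mixture
$$ D(n) = \sum_{\ell} D(n \mid \ell)\, P(\ell), $$
where $P(\ell)$ denotes the (gamma-distributed) type-length distribution.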
Comment: Submitted to the special issue on Information Theory and Language