Stochastic model for the vocabulary growth in natural languages
We propose a stochastic model for the number of different words in a given database which incorporates the dependence on the database size and historical changes. The main feature of our model is the existence of two different classes of words: (i) a finite number of core words, which have higher frequency and do not affect the probability that a new word is used; and (ii) the remaining, virtually infinite, number of noncore words, which have lower frequency and, once used, reduce the probability that a new word is used in the future. Our model relies on a careful analysis of the Google Ngram database of books published in the last centuries, and its main consequence is the generalization of Zipf's and Heaps' laws to two scaling regimes. We confirm that these generalizations yield the best simple description of the data among generic descriptive models and that the two free parameters depend only on the language, not on the database. From the point of view of our model, the main change on historical time scales is in the composition of the specific words included in the finite list of core words, which we observe to decay exponentially in time at a rate of approximately 30 words per year for English.
💡 Research Summary
The paper introduces a stochastic framework that captures how the number of distinct words in a text corpus evolves with corpus size and over historical time. The central innovation is the explicit separation of the vocabulary into two classes: (i) a finite set of “core‑words” that are high‑frequency, essentially immutable in their contribution to the probability of encountering a new word, and (ii) an effectively infinite pool of “non‑core‑words” that are low‑frequency and whose usage reduces the chance of future novel word occurrences.
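The quantity the model describes, the number of distinct words as a function of corpus size (the Heaps curve), is straightforward to measure on any token stream. A minimal sketch (the function name and toy sentence are illustrative, not from the paper):

```python
def heaps_curve(tokens):
    """Return V(N): the number of distinct words seen after each of the
    first N tokens. This is the empirical vocabulary-growth curve whose
    two scaling regimes the paper's model aims to explain."""
    seen = set()
    curve = []
    for tok in tokens:
        seen.add(tok)
        curve.append(len(seen))
    return curve

tokens = "the cat sat on the mat the cat ran".split()
print(heaps_curve(tokens))  # [1, 2, 3, 4, 4, 5, 5, 5, 6]
```

On real corpora this curve initially grows almost linearly (new words dominate) and then bends over into sublinear, power-law growth, the behavior the two-class model reproduces.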
Mathematically, the model assumes a fixed number (N_c) of core‑words. At each token insertion the probability that a new non‑core word appears is given by
$$
p_{\mathrm{new}}(V) \;=\;
\begin{cases}
\alpha, & V \le N_c,\\[4pt]
\alpha \left( \dfrac{V}{N_c} \right)^{-\nu}, & V > N_c,
\end{cases}
$$

where $V$ is the current vocabulary size, $\alpha$ is the baseline innovation probability, and $\nu > 0$ sets how strongly each additional non-core word suppresses future novelty. The constant regime ($V \le N_c$) yields linear vocabulary growth, while the decaying regime yields the sublinear, generalized Heaps scaling.
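The generative process described above can be sketched as a Simon-type simulation. This is a minimal illustration under the stated assumptions (the piecewise innovation probability and repetition by copying a uniformly random earlier token); the function and parameter defaults are hypothetical, not taken from the paper:

```python
import random

def simulate_vocab_growth(T, alpha=0.1, N_c=50, nu=1.0, seed=0):
    """Sketch of a two-class Simon-type process.

    At each step, a brand-new word is introduced with probability
    p_new(V): constant alpha while the vocabulary V is within the core
    size N_c, then decaying as alpha * (V / N_c) ** -nu. Otherwise an
    existing token is repeated by copying a uniformly random earlier
    token (frequency-proportional reuse). Returns V after each step.
    """
    rng = random.Random(seed)
    text = [0]          # token stream; words are labeled 0, 1, 2, ...
    V = 1               # current vocabulary size
    curve = []
    for _ in range(T):
        p_new = alpha if V <= N_c else alpha * (V / N_c) ** -nu
        if rng.random() < p_new:
            text.append(V)   # introduce a new word label
            V += 1
        else:
            text.append(rng.choice(text))  # reuse an existing word
        curve.append(V)
    return curve

curve = simulate_vocab_growth(20_000)
```

Plotting `curve` on log-log axes shows the crossover from linear growth (while $V \le N_c$) to the sublinear Heaps regime.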