Probabilities
Probabilities is the English translation of the book Probabilités (Tome 1 and Tome 2). The mathematical content is authored by Prof. Jean-Yves Ouvrard; the English version was prepared by his eldest son, Dr. Xavier Ouvrard. In this first release, only the first part is available. Part 1 contains seven chapters and corresponds to bachelor level. It introduces the fundamentals of probability theory, including event algebras, random variables, independence, conditional probabilities, moments of discrete and continuous random variables, generating functions, and limit theorems. Disclaimer: the second part is provided as-is and still needs a full review. Corrected versions will be made available during the coming months of 2026, so stay tuned!
💡 Research Summary
“Probabilities” is the English translation of the French textbook “Probabilités” (Volumes 1 and 2) authored by Prof. Jean‑Yves Ouvrard and translated by his son Dr. Xavier Ouvrard. The work is released under a CC‑BY‑NC‑SA 4.0 license and is currently available only in its first part, which is aimed at the bachelor level. The manuscript is organized into two major sections: Part 1 (already published) and Part 2 (still under review, with corrected versions promised in 2026).
Part 1 consists of seven chapters that systematically develop probability theory from elementary concepts to more sophisticated topics, following a traditional yet modern pedagogical approach.
Chapter 1 introduces random experiments, the algebra of events, σ‑algebras, and the Kolmogorov axioms. It then moves to discrete probability spaces, presenting concrete models such as the uniform law, geometric law (on ℕ and ℕ*), binomial, Poisson, and hypergeometric distributions. The notion of a “germ of a probability law” is used to illustrate how a law originates from elementary constructions.
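As a quick illustration of the discrete laws listed above (a minimal sketch, not taken from the book), the binomial law can be built directly from its elementary formula and checked to be a genuine probability law:

```python
from math import comb

def binomial_pmf(n: int, p: float, k: int) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The values over k = 0..n form a probability law: they sum to 1.
probs = [binomial_pmf(10, 0.3, k) for k in range(11)]
assert abs(sum(probs) - 1.0) < 1e-12
```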
Chapter 2 deals with summable families of real numbers. It defines the sum of a non‑negative family, extends the concept to families of arbitrary sign, and connects these ideas with series convergence tests. This groundwork is essential for later treatment of infinite expectations and integrals.
Chapter 3 focuses on independence. It treats independent events, complements, independent random variables, functions of independent variables, and the law of the sum of independent variables (via convolution). The presentation is rigorous, with clear proofs of the convolution formula and its consequences.
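The convolution formula for the law of a sum of independent discrete variables can be sketched in a few lines (an illustrative example, not the book's notation), here applied to two fair dice with exact rational arithmetic:

```python
from fractions import Fraction

def convolve(p, q):
    """Law of X + Y for independent X, Y with pmfs given as dicts value -> prob."""
    out = {}
    for x, px in p.items():
        for y, qy in q.items():
            out[x + y] = out.get(x + y, 0) + px * qy
    return out

die = {k: Fraction(1, 6) for k in range(1, 7)}
two_dice = convolve(die, die)
assert two_dice[7] == Fraction(1, 6)    # 7 is the most likely sum
assert sum(two_dice.values()) == 1      # the result is again a probability law
```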
Chapter 4 covers conditional probability and Bayes’ theorem. It introduces the “probability of causes” perspective, derives the total probability formula, and discusses conditional laws. An “evolutive phenomenon modelling” subsection shows how to embed conditional structures into stochastic processes.
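The "probability of causes" viewpoint is easy to make concrete. Below is a small worked example (the numbers are hypothetical, not from the book) combining the total probability formula and Bayes' theorem for a diagnostic test:

```python
# Hypothetical setup: 99% sensitivity, 95% specificity, 1% prevalence.
prior = 0.01
p_pos_given_cond = 0.99
p_pos_given_not = 0.05

# Total probability formula, then Bayes' theorem ("probability of causes").
p_pos = p_pos_given_cond * prior + p_pos_given_not * (1 - prior)
posterior = p_pos_given_cond * prior / p_pos

# Despite a positive test, the posterior is only 1/6: the rare cause stays unlikely.
assert abs(posterior - 1 / 6) < 1e-9
```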
Chapter 5 is devoted to moments of discrete random variables. After defining expectation, the chapter explores linearity, expectations of functions, and specific expectations for classical discrete laws. Higher‑order moments, variance, covariance, Markov and Chebyshev inequalities, and the correlation coefficient are presented. Generating functions are introduced, with explicit formulas for binomial, Poisson, geometric (both on ℕ and ℕ*), and negative binomial laws, and the link between generating functions and moments is explained.
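The link between generating functions and moments mentioned above can be checked numerically. For the Poisson law, G(s) = exp(λ(s − 1)) and E[X] = G′(1); a finite difference recovers the mean (a small sketch, not the book's derivation):

```python
from math import exp, isclose

lam = 2.5
G = lambda s: exp(lam * (s - 1))   # generating function of the Poisson(lam) law

# E[X] = G'(1), estimated here by a central finite difference.
h = 1e-6
mean_est = (G(1 + h) - G(1 - h)) / (2 * h)
assert isclose(mean_est, lam, rel_tol=1e-6)
```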
Chapter 6 moves to continuous random variables and densities. It defines probability measures on ℝⁿ, provides examples (uniform, exponential, Cauchy, Gaussian, chi‑square), and extends to ℝ² (uniform on rectangles and disks, bivariate Gaussian). Marginal and conditional densities are treated, together with expectations, variances, and covariance in the continuous setting. An appendix on the Riemann integral in ℝⁿ supplies the necessary measure‑theoretic background, including Fubini’s theorem and change‑of‑variables formula.
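The Riemann-integral viewpoint of this chapter lends itself to direct numerical checks. As a sketch (illustrative, not from the book), a midpoint Riemann sum verifies that the exponential density integrates to 1 and that E[X] = 1/λ:

```python
from math import exp

lam = 2.0
f = lambda x: lam * exp(-lam * x)   # density of the exponential law with rate lam

# Midpoint Riemann sums on [0, T]; the tail beyond T = 40 is negligible here.
T, n = 40.0, 100000
dx = T / n
total = mean = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    total += f(x) * dx
    mean += x * f(x) * dx

assert abs(total - 1.0) < 1e-6       # the density integrates to 1
assert abs(mean - 1 / lam) < 1e-6    # E[X] = 1/lam for the exponential law
```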
Chapter 7 addresses approximation of laws and limit theorems. It presents Poisson approximation, the normal approximation of the binomial, hypergeometric approximation by a binomial, the Central Limit Theorem, and the Weak Law of Large Numbers. Each approximation is accompanied by error bounds and conditions for validity, giving readers a practical sense of when asymptotic results can be applied.
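The Poisson approximation of the binomial, together with an explicit error bound, can be tested directly. The sketch below (not from the book) compares Binomial(n, p) with Poisson(np) in total variation distance and checks it against Le Cam's bound np²:

```python
from math import comb, exp, factorial

n, p = 100, 0.02            # small p, moderate n: Binomial(n, p) ≈ Poisson(np)
lam = n * p
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
poiss = [exp(-lam) * lam**k / factorial(k) for k in range(n + 1)]

# Total variation distance; Le Cam's bound guarantees it is at most n * p**2.
tv = 0.5 * sum(abs(b - q) for b, q in zip(binom, poiss))
assert 0 < tv <= n * p * p   # bound here: 0.04
```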
Part 2 (still under review) expands the text into measure‑theoretic probability and advanced stochastic analysis. It begins with a concise review of measure theory, σ‑algebras, and integration, followed by the three fundamental convergence theorems (monotone, dominated, Fatou). The product measure and Fubini’s theorem are revisited in a more abstract setting.
Chapter 9 re‑examines random variables and moments from a measure‑theoretic viewpoint, introducing conditional expectations as orthogonal projections in L²(Ω, 𝔄, P) and extending the definition to L¹ and to the space of non‑negative measurable functions. Jensen’s inequality, convergence theorems for conditional expectations, and computational techniques are discussed.
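On a finite probability space, the projection picture of conditional expectation becomes very concrete: conditioning on the σ-algebra generated by a partition simply averages the variable within each block. A minimal sketch (illustrative numbers, not from the book):

```python
from fractions import Fraction

# Omega = {0,...,5} with the uniform law; the sigma-algebra is generated by
# the partition {0,1}, {2,3}, {4,5}. E[X | A] averages X within each block,
# i.e. it is the orthogonal projection onto block-constant functions.
blocks = [(0, 1), (2, 3), (4, 5)]
X = [1, 3, 2, 6, 5, 5]
p = Fraction(1, 6)

cond = [Fraction(0)] * 6
for block in blocks:
    avg = Fraction(sum(X[w] for w in block), len(block))
    for w in block:
        cond[w] = avg

# Tower property: E[E[X | A]] = E[X].
assert sum(cond) * p == sum(X) * p
assert cond[0] == 2 and cond[2] == 4 and cond[4] == 5
```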
Chapter 10 introduces Fourier transforms and characteristic functions, proving injectivity, exploring their behavior under independence, and showing how moments can be recovered from derivatives at zero.
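The recovery of moments from derivatives at zero can be illustrated numerically: since φ′(0) = i·E[X], a finite difference on the characteristic function of a fair die returns its mean 3.5 (a small sketch, not the book's proof):

```python
import cmath

# Characteristic function of a fair die: phi(t) = E[exp(i t X)].
vals = range(1, 7)
phi = lambda t: sum(cmath.exp(1j * t * x) for x in vals) / 6

# phi'(0) = i * E[X]; a central finite difference recovers E[X] = 3.5.
h = 1e-5
mean = ((phi(h) - phi(-h)) / (2 * h) / 1j).real
assert abs(mean - 3.5) < 1e-6
```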
Chapter 11 is devoted to Gaussian random variables. It defines the Gaussian measure, discusses existence and absolute continuity conditions, derives marginal distributions, and develops the Gaussian linear model, including parameter estimation, hypothesis testing, confidence intervals, and prediction.
Chapter 12 treats convergence of measures, weak convergence, and the Central Limit Theorem in a rigorous framework, followed by a brief section on statistical estimation.
Chapter 13 introduces stochastic processes and martingales. It defines filtrations, stopping times, the optional stopping theorem, Doob’s decomposition, L² maximal inequalities, and convergence theorems for martingales. The material culminates in the second optional stopping theorem and convergence results for sub‑martingales and super‑martingales.
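The optional stopping theorem has a classic concrete consequence: for a simple symmetric random walk (a martingale) stopped at two barriers, the hitting probabilities are fixed by the starting point. A simulation sketch (illustrative parameters, not from the book):

```python
import random

random.seed(0)

def hit_prob(start=3, lo=0, hi=10, trials=20000):
    """Simple symmetric random walk stopped at lo or hi. Optional stopping
    applied to the martingale X_n gives P(hit hi) = (start - lo) / (hi - lo)."""
    wins = 0
    for _ in range(trials):
        x = start
        while lo < x < hi:
            x += random.choice((-1, 1))
        wins += (x == hi)
    return wins / trials

# Theory predicts 3/10; the Monte Carlo estimate agrees within sampling error.
assert abs(hit_prob() - 0.3) < 0.03
```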
Chapter 14 presents Markov chains. After an introductory section, it defines conditional independence, transition matrices, and fundamental properties such as irreducibility, periodicity, stationary distributions, and convergence to equilibrium.
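Convergence to equilibrium is easy to observe numerically: iterating a transition matrix on any initial distribution drives it to the stationary distribution π solving πP = π. A minimal sketch with a hypothetical two-state chain (not from the book):

```python
def step(dist, P):
    """One Markov transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# A two-state irreducible, aperiodic chain (hypothetical transition matrix).
P = [[0.9, 0.1],
     [0.5, 0.5]]
dist = [1.0, 0.0]
for _ in range(200):
    dist = step(dist, P)

# The iterates converge to the stationary distribution pi = (5/6, 1/6).
assert abs(dist[0] - 5 / 6) < 1e-9 and abs(dist[1] - 1 / 6) < 1e-9
```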
Throughout the manuscript, each definition is followed by a theorem, a proof, and a set of exercises with solutions, encouraging self‑assessment. The style is rigorous yet accessible, making the book suitable for advanced undergraduates, graduate students, and researchers who need a comprehensive reference that bridges elementary probability and modern stochastic analysis.
Strengths of the work include its systematic progression from elementary to advanced topics, the inclusion of both discrete and continuous frameworks, and the thorough treatment of generating functions, characteristic functions, martingales, and Markov chains. The measure‑theoretic foundation is well integrated, and the numerous exercises reinforce learning.
Potential drawbacks are the occasional omission of detailed proof steps (which may challenge readers new to measure theory), and a limited number of real‑world applications (e.g., statistical inference, machine learning, finance). Adding case studies or computational examples could broaden its appeal.
In summary, “Probabilities” offers a complete, logically ordered exposition of probability theory, covering foundational concepts, moment analysis, limit theorems, conditional expectations, Fourier analysis, Gaussian processes, convergence of measures, martingale theory, and Markov chains. Its breadth makes it a valuable textbook for anyone seeking a deep, mathematically rigorous understanding of probability and its modern extensions.