George Forsythe's last paper

Reading time: 5 minutes

📝 Original Info

  • Title: George Forsythe's last paper
  • ArXiv ID: 1005.0909
  • Date: 2010-05-10
  • Authors: Richard P. Brent

📝 Abstract

We describe von Neumann's elegant idea for sampling from the exponential distribution, Forsythe's generalization for sampling from a probability distribution whose density has the form exp(-G(x)), where G(x) is easy to compute (e.g. a polynomial), and my refinement of these ideas to give an efficient algorithm for generating pseudo-random numbers with a normal distribution. Later developments are also mentioned.


📄 Full Content

Roger Hockney and I are the only people who were lucky enough to have both George Forsythe and Gene Golub as PhD advisors (see the Mathematics Genealogy Project).

In my case this came about because Gene went on sabbatical to the UK, and George took over while he was away. However, I managed to finish before Gene returned to Stanford. That was in the days before email, and there was a mail strike in UK, so communication with Gene was difficult. Perhaps that helped me to finish quickly, because if Gene had been at Stanford he probably would have asked me to do more work on the last chapter! Most of you here today know Gene, but only the older ones will remember George and Sandra Forsythe, so today I will talk about George Forsythe and an interesting link back to John von Neumann and the early days of computers.

In summer 1949 Forsythe attended some lectures at UCLA by John von Neumann on the topic of random number generation. The lectures were part of a Symposium on the (then new) Monte Carlo method [15, p. 236]. It seems that von Neumann never wrote up the lectures, but a fascinating 3-page summary was written by Forsythe and published in 1951.

Forsythe must have continued to think about the topic because, shortly before he died, he wrote a Stanford report, Von Neumann's comparison method for random sampling from the normal and other distributions (STAN-CS-72-254, dated February 9, 1972). This expanded on a brief comment by von Neumann that his method [for the exponential distribution] "can be modified to yield a distribution satisfying any first-order differential equation" (Collected Works 5, p. 770) [4].

That was in the days before TOMS, when interesting algorithms were still published in Communications.

Suppose we want to sample a probability distribution with density

f(x) = c exp(-G(x)),

where G(x) is some simple function, e.g. a polynomial. Von Neumann illustrated his idea for the exponential distribution

f(x) = exp(-x), x ≥ 0.

The function f(x) satisfies a first-order linear differential equation

f′(x) + G′(x) f(x) = 0,

and conversely. That is why von Neumann made the remark about first-order differential equations.

The obvious way to generate a sample from the exponential distribution is to generate a sample u ∈ (0, 1] from the uniform distribution and then take x = -ln(u) .
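
As a quick illustration (my sketch, not code from the paper), the inverse-transform method looks like this in Python:

```python
import math
import random

def exponential_inverse_transform(rng=random.random):
    """Sample Exp(1) by inverting the CDF: x = -ln(u) for u uniform on (0, 1]."""
    u = rng()
    while u == 0.0:   # random.random() yields [0, 1); avoid log(0)
        u = rng()
    return -math.log(u)
```

The sample mean over many draws should be close to 1, the mean of the Exp(1) distribution.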

However, the evaluation of ln(u) is expensive (relative to the cost of generating u by an efficient uniform random number generator). Also, this method does not generalize well to the normal distribution, where we would need to evaluate the inverse of the normal distribution function

Φ(x) = (1/√(2π)) ∫₋∞ˣ exp(-t²/2) dt.

Von Neumann’s insight was that we can generate a random sample using a small number (on average) of samples from a uniform distribution, and evaluation of G(x) at a small number of points. There is no need to compute any expensive special functions!

Suppose for the moment that 0 ≤ u₁ = G ≤ 1. Generate samples u₂, u₃, … from the uniform distribution so long as the numbers are decreasing, and then stop. In other words, find n ≥ 1 such that

u₁ ≥ u₂ ≥ ⋯ ≥ uₙ < uₙ₊₁.    (1)

The probability that u₁ ≥ u₂ ≥ ⋯ ≥ uₙ holds is Gⁿ⁻¹/(n−1)!, so the probability of (1) is

pₙ = Gⁿ⁻¹/(n−1)! − Gⁿ/n!.

Check: p₁ + p₂ + ⋯ = 1 by telescoping series, so the algorithm terminates with probability 1.

Exercise: show that the expected value of n is exp(G).

What is the probability that our final n is odd? It is just

p₁ + p₃ + ⋯ = (1 − G) + (G²/2! − G³/3!) + ⋯ = exp(-G).
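
Von Neumann's procedure is easy to simulate. The following Python sketch (my illustration; the function names are mine) estimates the probability that n is odd, which should approach exp(-G):

```python
import random

def comparison_run_length(G, rng=random.random):
    """Von Neumann's comparison test: set u1 = G, then draw u2, u3, ...
    while the sequence keeps decreasing. Return n such that
    u1 >= u2 >= ... >= u_n < u_{n+1}."""
    assert 0.0 <= G <= 1.0
    prev = G
    n = 1
    while True:
        u = rng()
        if u > prev:          # run stopped decreasing
            return n
        prev = u
        n += 1

def estimate_prob_odd(G, trials=200_000):
    """Estimate P(n odd); the text shows this equals exp(-G)."""
    odd = sum(comparison_run_length(G) % 2 for _ in range(trials))
    return odd / trials
```

Note that only comparisons and uniform draws are used; G is evaluated once per run.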

This suggests a rejection method for generating a sample from the distribution with density exp(-G(x)) on some interval [a, b]:

  1. Generate a sample w uniformly from [a, b].
  2. Take u₁ = G(w) and generate u₂, u₃, … as above to find n.
  3. If n is odd, accept w (i.e. return w as the sample).
  4. If n is even, return to step 1 (i.e. reject w).

This works because the probability that w is accepted, i.e. the probability that n is odd at step 3, is exactly exp(-G(w)).

The algorithm only works correctly if 0 ≤ G(w) ≤ 1 on the interval w ∈ [a, b].
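
Putting the steps together, here is a Python sketch of the rejection method (my illustration, not the paper's code), using the hypothetical example G(x) = x²/2 on [0, 1], which satisfies 0 ≤ G ≤ 1:

```python
import random

def forsythe_sample(G, a, b, rng=random.random):
    """Sample from the density proportional to exp(-G(x)) on [a, b],
    assuming 0 <= G(x) <= 1 there. Uses only uniform samples and
    evaluations of G -- no logarithms or exponentials."""
    while True:
        w = a + (b - a) * rng()   # step 1: uniform candidate
        prev = G(w)               # step 2: u1 = G(w), extend decreasing run
        n = 1
        while True:
            u = rng()
            if u > prev:
                break
            prev = u
            n += 1
        if n % 2 == 1:            # step 3: odd n, accept w
            return w
        # step 4: even n, reject w and try again
```

With G(x) = x²/2 this draws from the standard normal density truncated to [0, 1].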

To apply the idea to the exponential or normal distributions we have to split the infinite interval [0, +∞) or (-∞, +∞) into a union of finite intervals Iₖ. Provided the intervals Iₖ are small enough, we can use the algorithm to generate samples from each Iₖ. Thus, first select k with the correct probability

P(x ∈ Iₖ) = ∫_{Iₖ} f(x) dx.

For the exponential distribution, consider the intervals

Iₖ = ((k−1) ln 2, k ln 2], k = 1, 2, …

For convenience on a binary computer, our sample should lie in Iₖ with probability 2⁻ᵏ, k = 1, 2, …, and these intervals give exactly that, since exp(-(k−1) ln 2) − exp(-k ln 2) = 2⁻ᵏ. We can select k by counting the leading zero bits in a uniform random number (giving k − 1). Then we can apply a rejection method to get a sample with the correct distribution from Iₖ.
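
The bit-counting selection can be sketched as follows (my Python illustration, assuming a w-bit uniform random word):

```python
import random

def select_interval_index(word_length=32, rng=random.getrandbits):
    """Pick k with probability 2^-k by counting the leading zero bits of a
    uniform word: k-1 leading zeros then a one bit has probability 2^-k."""
    bits = rng(word_length)
    leading_zeros = word_length - bits.bit_length()
    # An all-zero word (probability 2^-w) is lumped into the last index.
    return min(leading_zeros + 1, word_length)
```

About half the returned indices should be 1, a quarter should be 2, and so on.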

For the normal distribution, it is convenient to randomly generate the sign, then consider the interval [0, ∞). We subdivide this interval into intervals

Iₖ = [aₖ₋₁, aₖ), k = 1, 2, …, with a₀ = 0,

where the aₖ are chosen so that the sample lies in Iₖ with probability 2⁻ᵏ.

It is easy to precompute a table of the constants aₖ. The table is small, since we can neglect probabilities 2⁻ᵏ for k greater than the wordlength w of the computer.
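
Assuming the aₖ are defined, as in the exponential case, so that each interval has probability 2⁻ᵏ, the table can be precomputed from the normal distribution function; a Python sketch (my illustration):

```python
from statistics import NormalDist

def half_normal_breakpoints(word_length=32):
    """Table a_0 = 0 < a_1 < a_2 < ... with P(|X| in [a_{k-1}, a_k)) = 2^-k
    for X standard normal. Since P(|X| <= t) = 2*Phi(t) - 1, we need
    Phi(a_k) = 1 - 2^-(k+1)."""
    nd = NormalDist()
    return [0.0] + [nd.inv_cdf(1.0 - 2.0 ** -(k + 1))
                    for k in range(1, word_length + 1)]
```

For example, a₁ = Φ⁻¹(0.75) ≈ 0.674, the quartile of the standard normal, so half of all |X| samples land in I₁ = [0, a₁).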

For the exponential distribution, von Neumann took intervals Iₖ = [k−1, k), so the probability of sampling from Iₖ is (e−1)/eᵏ. He did this because he had a trick for combining the trials with the selection of intervals. However, his trick does not generalize to other distributions.

For the normal distribution, Forsythe used intervals defined by a₀

…(Full text truncated)…

