---------------------------------------
Random numbers and other random objects
---------------------------------------
In probability theory, a random number is just a random variable x,
i.e., a measurable function on the set Omega of possible experiments,
that assigns to each experiment omega in Omega the value x(omega)
of x in this experiment. Here Omega comes equipped with a normalized
positive measure mu that assigns to every measurable subset S of Omega
the probability mu(S) that a randomly chosen experiment belongs to S.
In the important, 'noninformative' case where the measure is invariant
under a group transitive on Omega, so that all experiments are
identical copies of one another, physicists refer to this set Omega
as a (classical) 'ensemble', although they are usually too vague to
express this in formal terms.
The terminology easily extends to the inhomogeneous case if one
allows each realization in the ensemble to occur with a different
frequency.
Mathematicians prefer to leave the set Omega (which they call the
'sample space') unspecified and talk about 'realizations' in place of
'experiments'. Thus, for each experiment omega in Omega, x(omega) is
a realization of x, i.e., what physicists would call the value found
in this particular experiment.
By giving a specific definition of the sigma algebra of interest
(defining which subsets of Omega are measurable), the measure assigning
probabilities, and a specific recipe defining the x(omega), one has a
model world in which realizations make perfect sense.
A difficulty is, of course, that we do not have such a model for the
real world, and hence must resort to empirical approximations when
treating real-life problems. (This places physicists at a slight
disadvantage; however, there is the compensating advantage that their
results apply to real life instead of only satisfying one's sense of
beauty and precision....)
The only thing not specified in probability theory (unless one specifies
a particular model as indicated above) is the mechanism that draws
the number, and hence there is no way to know which experiment omega
has been realized. Therefore, probability theory makes only statements
about _all_ realizations simultaneously.
Example. Given the axioms of probability theory, a random number
uniformly distributed between zero and one is defined as a random
variable x such that the expectation
   E(f(x)) = integral_0^1 f(s) ds
for all Lebesgue-integrable functions f on [0,1], and any x(omega) is a
realization of it, i.e., an actual number in [0,1]. (In particular,
random numbers are _not_ numbers; only their realizations are!)
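The defining property E(f(x)) = integral_0^1 f(s) ds can be illustrated
numerically: averaging f over many (pseudo-random) realizations x(omega)
should approximate the integral. A minimal sketch in Python, using the
standard library only (the function name `expectation_estimate` is our
own, chosen for this illustration):

```python
import random

def expectation_estimate(f, n=100_000, seed=0):
    """Estimate E(f(x)) for x uniform on [0,1] by averaging f over
    n pseudo-random realizations x(omega)."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

# E(f(x)) should match integral_0^1 f(s) ds for integrable f.
# For f(s) = s^2 the exact integral is 1/3.
est = expectation_estimate(lambda s: s * s)
print(abs(est - 1/3) < 0.01)
```

Of course this only tests a substitute for randomness (see below), not
randomness itself; the Monte Carlo error decreases like 1/sqrt(n).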
Mechanisms to draw numbers that may be used as approximations to a
sequence of independent realizations x(omega) are called random number
generators. They do not produce random numbers (since random numbers
are not numbers but measurable functions). Instead, they produce
sequences that look like typical realizations of sequences of
independent, uniformly distributed random numbers - in the sense that
they usually pass with high confidence level certain statistical tests
valid for such random sequences (this is discussed in any textbook on
statistics).
Therefore, the numbers they generate are used in practice as (often
completely adequate) substitutes for random numbers.
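One of the simplest such statistical tests is a chi-square test of
uniformity: bin the generated numbers and compare the observed bin
counts with the counts expected for genuinely uniform random numbers.
A sketch (the helper name `chi_square_uniform` and the choice of 10
bins are ours, not a standard):

```python
import random

def chi_square_uniform(samples, bins=10):
    """Chi-square statistic for uniformity of samples in [0,1):
    compares observed bin counts with the expected count n/bins."""
    counts = [0] * bins
    for s in samples:
        counts[min(int(s * bins), bins - 1)] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

rng = random.Random(42)
stat = chi_square_uniform([rng.random() for _ in range(10_000)])
# With 10 bins there are 9 degrees of freedom, so for a good
# generator the statistic should typically be of order 9; a value
# far above ~21.7 (the 99% quantile) would cast doubt on uniformity.
print(stat)
```

Passing this one test is necessary but far from sufficient; serious
test batteries combine many such tests on different statistics.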
(On the other hand, there is no uniformly distributed random natural
number since the uniform measure on natural numbers,
mu(f) = sum_{k>=0} f(k) is not normalizable.)
Random numbers are comparably simple objects. More complicated random
objects need more sophisticated ensembles, but otherwise everything
remains analogous.
Let us consider the physically important example of Brownian motion.
Brownian motion (the random walk in space) is modelled by an ensemble
whose realizations (members) are the H"older continuous paths
t -> x(t) in R^3 with exponent below 1/2. The probability of any particular
realization of a random walk is exactly zero, and statements with
positive probability must hold in uncountably many realizations.
Nevertheless, the ensemble is precisely the set Omega composed of all
such realizations. And the appropriate sigma algebra carrying the
Wiener measure needed to describe the random walk is indeed an algebra
of subsets of Omega.
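Individual realizations from (approximately) the Wiener measure are easy
to sample: over a time step of length dt, the increment of Brownian
motion is Gaussian with standard deviation sqrt(dt). A sketch for one
spatial dimension (the function name `brownian_path` is ours; the real
ensemble lives in R^3, but each coordinate is an independent copy of
this one-dimensional process):

```python
import math
import random

def brownian_path(n_steps=10_000, t_max=1.0, seed=1):
    """Sample one realization of one-dimensional Brownian motion on
    [0, t_max] by summing independent Gaussian increments; the
    increment over a step of length dt has standard deviation
    sqrt(dt)."""
    rng = random.Random(seed)
    dt = t_max / n_steps
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return path

path = brownian_path()
# The typical size of the displacement over time t grows like
# sqrt(t), consistent with the Hoelder exponent 1/2 mentioned above;
# the endpoint path[-1] is (approximately) standard normal.
```

Each call with a different seed yields a different realization omega;
the Wiener measure is the distribution of all of them together.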
Repeatedly tossing a fair coin is also a (kind of trivial) stochastic
process. A fair coin that can be thrown an unlimited number of times
with independent outcomes (sampling with replacement) cannot be
modelled by the sigma algebra 2^{0,1} over Omega_1 = {0,1}, since
this does not even contain two independent bits. Its sigma algebra is based
on the infinite ensemble Omega_inf consisting of all possible
sequences of outcomes, and is the tensor product of infinitely many
copies of 2^{0,1}. This setting is necessary in order to provide
meaning to the concept of 'independent trial' which
underlies most of statistical reasoning.
Because of the assumed independence of the trials, one can reduce all
computations to computations within 2^{0,1}. This is generally done
in elementary probability theory, to simplify the presentation.
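The reduction can be made concrete: the probability of k heads in n
independent fair tosses follows from the single-trial probabilities
alone (the binomial formula), and agrees with a direct computation over
the full ensemble {0,1}^n of outcome sequences. A sketch:

```python
from itertools import product
from math import comb

n, k = 6, 2

# Reduction via independence: probability of k heads in n fair
# tosses from the single-trial probabilities alone.
p_reduced = comb(n, k) * 0.5 ** n

# Direct computation over the full ensemble {0,1}^n of outcome
# sequences, each sequence having probability 2^{-n}.
p_full = sum(1 for seq in product((0, 1), repeat=n)
             if sum(seq) == k) / 2 ** n

print(p_reduced == p_full)  # -> True; both give 15/64
```

The reduced computation costs O(n) arithmetic, while the direct one
enumerates 2^n sequences; this is exactly why elementary treatments
work with 2^{0,1}.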
But once one looks at binary processes which are even slightly
correlated (history-dependent), one needs the full sigma algebra
over Omega_inf.
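A minimal example of such a correlated binary process is a two-state
Markov chain in which each bit repeats the previous one with probability
p_stay. Each single trial still has uniform marginal distribution, yet
the joint distribution on Omega_inf is far from the product of the
marginals, so no computation within 2^{0,1} alone can capture it. A
sketch (the name `markov_sequence` and the value p_stay = 0.9 are ours):

```python
import random

def markov_sequence(n, p_stay=0.9, seed=7):
    """Sample a history-dependent binary sequence: each bit repeats
    the previous one with probability p_stay (a two-state Markov
    chain with uniform marginals but correlated consecutive trials)."""
    rng = random.Random(seed)
    bits = [rng.randrange(2)]
    for _ in range(n - 1):
        prev = bits[-1]
        bits.append(prev if rng.random() < p_stay else 1 - prev)
    return bits

bits = markov_sequence(100_000)
repeats = sum(b1 == b2 for b1, b2 in zip(bits, bits[1:])) / (len(bits) - 1)
# For independent fair bits the repeat frequency would be near 0.5;
# here it comes out near p_stay = 0.9, exposing the correlation.
```

Statements about such a process (e.g. about long runs of equal bits)
are statements about the measure on Omega_inf, not about any single
trial.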