
Let $X$ be a random variable\index{random variable}. A stochastic process (associated with $X$) is a function of $X$ and $t$: $X_t$ is called a stochastic function of $X$, or a stochastic process. In general, the probability depends on the history of the values taken by $X_t$ before $t_i$. One defines the conditional probability as the probability for $X_t$ to take a value between $x$ and $x + dx$ at time $t_i$, knowing the values of $X_t$ at times anterior to $t_i$ (the ``history'' of $X_t$). A Markov process is a stochastic process with the property that, for any set of successive times, one has:
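In the notation of what follows, this property reads (stated here in its standard form; the time ordering $t_1 < \dots < t_n$ is assumed):

```latex
% Markov property: conditioning on the whole history reduces to
% conditioning on the most recent value alone.
P_{1 \mid n-1}(x_n, t_n \mid x_1, t_1; \ldots; x_{n-1}, t_{n-1})
  = P_{1 \mid 1}(x_n, t_n \mid x_{n-1}, t_{n-1}),
\qquad t_1 < t_2 < \cdots < t_n .
```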
$P_{i|j}$ denotes the probability for $i$ conditions to be satisfied, knowing $j$ anterior events. In other words, the expected value of $X_t$ at time $t_n$ depends only on the value of $X_t$ at the previous time $t_{n-1}$. It is entirely defined by $P_1$ and by the transition matrix $P_{1|1}$ (or, equivalently, by the transition density functions $f_1(x,t)$ and $f_{1|1}(x_2, t_2 \mid x_1, t_1)$)
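As an illustration, a minimal sketch in Python of the discrete-state analogue, where $P_1$ becomes an initial distribution vector and $P_{1|1}$ a transition matrix (the two-state chain used here is an arbitrary example, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete analogue: P_1 is the initial distribution, P_{1|1} the
# transition matrix (rows = current state, columns = next state).
p1 = np.array([0.5, 0.5])          # initial distribution over states {0, 1}
p11 = np.array([[0.9, 0.1],
                [0.2, 0.8]])       # transition probabilities

def sample_path(n_steps):
    """Sample a trajectory: each step depends only on the current state."""
    x = rng.choice(2, p=p1)
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(2, p=p11[x])   # Markov property: only x matters
        path.append(x)
    return path

# The distribution at time n follows from repeated application of p11:
dist = p1 @ np.linalg.matrix_power(p11, 50)
```

After many steps `dist` approaches the stationary distribution of `p11`, here $[2/3,\,1/3]$.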
. It can be seen that two functions $f_1$ and $f_{1|1}$ define a Markov process\index{Markov process} if and only if they verify:
\begin{itemize}
\item the Chapman--Kolmogorov equation\index{Chapman-Kolmogorov equation}:
\[
f_{1|1}(x_3, t_3 \mid x_1, t_1)
  = \int f_{1|1}(x_3, t_3 \mid x_2, t_2)\, f_{1|1}(x_2, t_2 \mid x_1, t_1)\, dx_2 ,
\qquad t_1 < t_2 < t_3 ;
\]
\item the compatibility relation:
\begin{equation}
f_1(x_2, t_2) = \int f_{1|1}(x_2, t_2 \mid x_1, t_1)\, f_1(x_1, t_1)\, dx_1 .
\label{eqnecmar}
\end{equation}
\end{itemize}
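These relations can be checked numerically for a concrete kernel; a minimal sketch, assuming a Gaussian transition density (that of a standard Wiener process, anticipating the next paragraph):

```python
import numpy as np

def f11(x2, t2, x1, t1):
    """Assumed transition density: Gaussian with variance t2 - t1."""
    dt = t2 - t1
    return np.exp(-(x2 - x1) ** 2 / (2.0 * dt)) / np.sqrt(2.0 * np.pi * dt)

# Chapman-Kolmogorov: integrating out the intermediate point (x2, t2)
# must reproduce the direct transition from (x1, t1) to (x3, t3).
x1, x3 = 0.0, 1.0
t1, t2, t3 = 0.0, 1.0, 2.0
x2 = np.linspace(-10.0, 10.0, 4001)
dx = x2[1] - x2[0]

lhs = f11(x3, t3, x1, t1)
rhs = np.sum(f11(x3, t3, x2, t2) * f11(x2, t2, x1, t1)) * dx
```

Here `lhs` and `rhs` agree to numerical-integration accuracy.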
A Wiener process\index{Wiener process}\index{Brownian motion} (or Brownian motion) is a Markov process for which:
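The defining condition is usually a Gaussian transition density whose variance grows linearly with the elapsed time; in the notation above (standard unit-diffusion form, written here as an assumption about the intended definition):

```latex
f_{1 \mid 1}(x_2, t_2 \mid x_1, t_1)
  = \frac{1}{\sqrt{2\pi\,(t_2 - t_1)}}
    \exp\!\left(-\,\frac{(x_2 - x_1)^2}{2\,(t_2 - t_1)}\right),
\qquad t_2 > t_1 .
```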
Using equation (\ref{eqnecmar}), one gets:
As stochastic processes were defined as a function of a random variable and time, a large
class\footnote{This definition excludes, however, discontinuous cases such as Poisson