Markoff Process
Alternate name
Markov process
Definition
A random process whose future probabilities are determined only by its most recent values. A stochastic process x(t) is called Markov if, for every n and every sequence of times t_1 < t_2 < ... < t_n, the conditional distribution of x(t_n) given the entire past depends only on the most recent value: P(x(t_n) <= x_n | x(t_{n-1}), ..., x(t_1)) = P(x(t_n) <= x_n | x(t_{n-1})).
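The defining property can be checked empirically. The sketch below (not from the source; the two-state transition matrix P is a made-up example) simulates a simple Markov chain and estimates the probability of moving from state 0 to state 1, both unconditionally and conditioned additionally on the previous state. By the Markov property, the two estimates should agree.

```python
import random

# Hypothetical two-state transition matrix: row i gives the
# probabilities of the next state when the current state is i.
P = {0: [0.9, 0.1],   # from state 0: stay with prob 0.9, move with prob 0.1
     1: [0.4, 0.6]}   # from state 1: move with prob 0.4, stay with prob 0.6

def step(state, rng):
    # The next state is sampled from the current state alone:
    # this is exactly the Markov property.
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(42)
path = [0]
for _ in range(100_000):
    path.append(step(path[-1], rng))

# Estimate P(x_{n+1} = 1 | x_n = 0) ...
from_0 = [path[i + 1] for i in range(1, len(path) - 1) if path[i] == 0]
# ... and P(x_{n+1} = 1 | x_n = 0, x_{n-1} = 1); extra history should not matter.
from_0_after_1 = [path[i + 1] for i in range(1, len(path) - 1)
                  if path[i] == 0 and path[i - 1] == 1]

print(sum(from_0) / len(from_0))                    # ≈ 0.1
print(sum(from_0_after_1) / len(from_0_after_1))    # ≈ 0.1
```

Both empirical frequencies converge to the transition probability 0.1, illustrating that conditioning on earlier states adds no information once the current state is known.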
Related term
Associated person
Andrey Markov