
    Markov Sequence

    Definition

    A sequence $X_1, X_2, \ldots$ of random variates is called Markov (or Markoff) if, for any $n$, $F(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1) = F(X_n \mid X_{n-1})$, i.e., if the conditional distribution $F$ of $X_n$ given $X_{n-1}, X_{n-2}, \ldots, X_1$ equals the conditional distribution $F$ of $X_n$ given only $X_{n-1}$. The transition densities of a Markov sequence satisfy the Chapman-Kolmogorov equation.
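    The following is a minimal sketch (not part of the original definition) of a finite-state Markov sequence in Python. The 3-state transition matrix P is an arbitrary example; the simulation draws each state from the distribution conditioned only on the previous state, and the last lines illustrate the Chapman-Kolmogorov equation in its matrix form, where the two-step transition probabilities are obtained by summing over the intermediate state, i.e. P^(2) = P @ P.

    import numpy as np

    rng = np.random.default_rng(0)

    # Row-stochastic transition matrix: P[i, j] = Pr(X_{n+1} = j | X_n = i).
    # The values are illustrative only.
    P = np.array([
        [0.7, 0.2, 0.1],
        [0.3, 0.4, 0.3],
        [0.2, 0.3, 0.5],
    ])

    def simulate(P, x0, steps):
        """Generate a Markov sequence: each state depends only on the previous one."""
        states = [x0]
        for _ in range(steps):
            states.append(rng.choice(len(P), p=P[states[-1]]))
        return states

    print(simulate(P, x0=0, steps=10))

    # Chapman-Kolmogorov (matrix form): two-step transition probabilities are
    # the sum over the intermediate state, i.e. P2[i, k] = sum_j P[i, j] * P[j, k].
    P2 = P @ P
    assert np.allclose(P2.sum(axis=1), 1.0)  # still row-stochastic
    print(P2)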

    Associated person

    Andrey Markov
