word definition
Markoff chain
(gcide)
Markov chain \Mark"ov chain\, n. [after A. A. Markov, Russian
mathematician, b. 1856, d. 1922.] (Statistics)
A random process (Markov process) in which the probabilities
of discrete states in a series depend only on the properties
of the immediately preceding state or the next preceding
state, independent of the path by which the preceding state
was reached. It differs from the more general Markov process
in that the states of a Markov chain are discrete rather than
continuous. Certain physical processes, such as diffusion of
a molecule in a fluid, are modelled as a Markov chain. See
also random walk. [Also spelled Markoff chain.]
[PJC]
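The definition above notes that a random walk is a standard example of a Markov chain with discrete states. As an illustrative sketch (the function name and parameters below are my own, not from the dictionary entry), a symmetric one-dimensional random walk can be simulated so that each step depends only on the current position, never on the path taken to reach it:

```python
import random

def random_walk(steps, seed=0):
    """Simulate a symmetric 1-D random walk starting at position 0.

    At each step the walker moves -1 or +1 with equal probability.
    The next state depends only on the current position, which is
    the defining (Markov) property of the chain.
    """
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(steps):
        position += rng.choice([-1, 1])  # transition uses only the current state
        path.append(position)
    return path

print(random_walk(10, seed=42))
```

The discrete positions of the walker are the states of the chain; a diffusion process, by contrast, would let the position vary continuously, which is the distinction the entry draws between a Markov chain and the more general Markov process.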
markoff chain
(wn)
Markoff chain
n 1: a Markov process for which the parameter is discrete time
values [syn: Markov chain, Markoff chain]

This website was built using a dictd server with data from sk-spell.sk.cx and from other freely available dictd databases. If you have a dictd protocol client (for example kdict), use the source slovnik.iz.sk and port 2628.