Find link

Find link is a tool written by Edward Betts.

Searching for "Continuous-time Markov chain": 9 found (83 total)

alternate case: continuous-time Markov chain

Empirical process (895 words): no match in snippet
In probability theory, an empirical process is a stochastic process that characterizes the deviation of the empirical distribution function from its expectation
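
In standard notation (supplied here for orientation, not quoted from the article), if F_n is the empirical distribution function of n i.i.d. samples drawn from a distribution with CDF F, the empirical process is often written as

    G_n(x) = \sqrt{n}\,\bigl(F_n(x) - F(x)\bigr)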

Kolmogorov equations (1,405 words): no match in snippet
In probability theory, Kolmogorov equations, including Kolmogorov forward equations and Kolmogorov backward equations, characterize continuous-time Markov
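
For orientation, in standard textbook notation (not taken from the linked article), a continuous-time Markov chain with transition-rate matrix Q and transition-probability matrices P(t) satisfies

    \frac{d}{dt}P(t) = P(t)\,Q \qquad \text{(Kolmogorov forward equation)}
    \frac{d}{dt}P(t) = Q\,P(t) \qquad \text{(Kolmogorov backward equation)}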

Birth process (1,329 words): no match in snippet
In probability theory, a birth process or a pure birth process is a special case of a continuous-time Markov process and a generalisation of a Poisson
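
As a sketch of the relationship mentioned in the snippet (standard notation, not quoted from the article), a pure birth process with birth rates \lambda_n satisfies

    P\bigl(X(t+h) = n+1 \mid X(t) = n\bigr) = \lambda_n h + o(h),

and it reduces to a Poisson process when \lambda_n = \lambda for every n.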

Marc A. Suchard (282 words): exact match in snippet
Weiss, R. E., & Sinsheimer, J. S. (2001). Bayesian selection of continuous-time Markov chain evolutionary models. Molecular Biology and Evolution, 18(6),

Models of DNA evolution (6,312 words): no match in snippet
A number of different Markov models of DNA sequence evolution have been proposed. These substitution models differ in terms of the parameters used to describe

Markov decision process (5,086 words): exact match in snippet
model, which means our continuous-time MDP becomes an ergodic continuous-time Markov chain under a stationary policy. Under this assumption, although the

Stochastic simulation (3,715 words): no match in snippet
Hu, Michele Joyner, Kathryn Link, Simulation Algorithms for Continuous Time Markov Chain Models, [online] available at http://www.ncsu
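
The reference above concerns simulation algorithms for CTMC models; a minimal Python sketch of the usual approach (sample an exponential holding time, then jump in proportion to the off-diagonal rates) follows. The two-state rate matrix Q at the end is a made-up illustration, not taken from the cited material.

    import random

    def simulate_ctmc(Q, state, t_end):
        """Simulate a continuous-time Markov chain with rate matrix Q (list of lists).

        Q[i][j] is the jump rate from state i to state j (i != j);
        Q[i][i] is minus the sum of the off-diagonal entries in row i.
        Returns the list of (time, state) pairs visited up to t_end.
        """
        t, path = 0.0, [(0.0, state)]
        while True:
            rate_out = -Q[state][state]          # total rate of leaving the current state
            if rate_out <= 0:                    # absorbing state: nothing more happens
                break
            t += random.expovariate(rate_out)    # exponential holding time
            if t >= t_end:
                break
            # choose the next state in proportion to the off-diagonal rates
            r, acc = random.uniform(0.0, rate_out), 0.0
            for j, q in enumerate(Q[state]):
                if j == state:
                    continue
                acc += q
                if r <= acc:
                    state = j
                    break
            path.append((t, state))
        return path

    # Hypothetical two-state example (rates chosen arbitrarily for illustration)
    Q = [[-1.0, 1.0],
         [ 2.0, -2.0]]
    print(simulate_ctmc(Q, state=0, t_end=10.0))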

Entropy production (4,239 words): exact match in snippet
production can be defined mathematically in such processes. For a continuous-time Markov chain with instantaneous probability distribution p_i(t)
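
For reference, a standard expression (not quoted from the article) for the entropy production rate of a continuous-time Markov chain with jump rates q_{ij} and distribution p_i(t) is

    \dot{S}(t) = \frac{1}{2} \sum_{i \neq j} \bigl( p_i(t)\, q_{ij} - p_j(t)\, q_{ji} \bigr) \ln \frac{p_i(t)\, q_{ij}}{p_j(t)\, q_{ji}} \;\ge\; 0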

Petri net (7,240 words): no match in snippet
In this case, the nets' reachability graph can be used as a continuous time Markov chain (CTMC). Dualistic Petri Nets (dP-Nets) is a Petri Net extension