Markov chain
A stochastic process with a finite number of states in which the probability of occurrence of a future state is conditional only upon the current state; past states are inconsequential. In meteorology, Markov chains have been used to describe a raindrop size distribution in which the state at time step n + 1 is determined only by collisions between pairs of drops comprising the size distribution at time step n.
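The Markov property described above can be illustrated with a short sketch: the next state is sampled from probabilities that depend only on the current state, never on earlier history. The two-state wet/dry daily weather model below is a common textbook illustration; the state names and transition probabilities are assumptions for demonstration, not part of this glossary entry.

```python
import random

# Illustrative transition probabilities for a two-state (wet/dry)
# weather chain; values are assumed, not taken from the entry above.
TRANSITIONS = {
    "dry": {"dry": 0.8, "wet": 0.2},
    "wet": {"dry": 0.4, "wet": 0.6},
}

def next_state(current):
    """Sample the next state; it depends only on the current state."""
    r = random.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # fallback guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Generate a realization of the chain: start state plus `steps` moves."""
    random.seed(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))
    return chain
```

Because each row of `TRANSITIONS` sums to 1, the table is a valid stochastic matrix, and the simulation visits only the finite set of declared states.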
- Part of speech: noun
- Domain / field: weather
- Category: meteorology
- Company: AMS
Creator
- Kevin Bowles