Markoff chain n : a Markov process whose time parameter takes discrete values [syn: {Markov chain}]
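
The definition can be made concrete with a short simulation. Below is a minimal Python sketch of a discrete-time Markov chain; the two weather states and their transition probabilities are hypothetical, chosen only for illustration. Time advances in integer steps n = 0, 1, 2, ..., and the next state depends only on the current state.

    import random

    # Hypothetical two-state chain: rows are current states,
    # entries are probabilities of moving to the next state.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Draw the next state from the current state's transition row."""
        r = random.random()
        cumulative = 0.0
        for nxt, p in TRANSITIONS[state].items():
            cumulative += p
            if r < cumulative:
                return nxt
        return nxt  # guard against floating-point rounding

    state = "sunny"
    for n in range(5):  # the parameter n takes discrete time values 0..4
        print(n, state)
        state = step(state)

Each iteration of the loop is one discrete time step, which is exactly what distinguishes a Markov chain from a general (continuous-time) Markov process.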