1.

In Markov analysis, state probabilities must:

A. Sum to one
B. Be less than one
C. Be greater than one
D. None of the above
Answer» A. Sum to one
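A quick sketch of why the answer is "sum to one": the state probabilities of a Markov chain form a probability distribution over the states, so they sum to one initially and remain summing to one after every transition (because each row of the transition matrix sums to one). The 2-state chain below is a hypothetical example, not taken from the question.

```python
import numpy as np

# Hypothetical 2-state Markov chain: each row of the transition
# matrix is a probability distribution, so it sums to one.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

pi = np.array([0.5, 0.5])  # initial state probabilities (sum to one)

for _ in range(5):
    pi = pi @ P  # one Markov transition step
    # State probabilities still sum to one after each step.
    assert abs(pi.sum() - 1.0) < 1e-12
```

Options B and C are impossible for the full set of state probabilities: individual probabilities can be less than one, but their total is always exactly one.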
