Multinomial Distribution

Multinomial Model

We will now extend the method of maximum likelihood estimation and testing to a generalisation of the binomial model.

Suppose we have a series of $n$ trials, where each trial can result in one of $k$ outcomes $a_1, \dots, a_k$, and the probability of each outcome is the same in every trial, i.e.

$$ p(a_1) = \theta_1, \;\dots,\; p(a_k) = \theta_k, \qquad \sum_{i=1}^{k} \theta_i = 1 . $$

If $r_i$ denotes the number of times the outcome $a_i$ has occurred in the $n$ trials, what is the probability of getting $a_1$ $r_1$ times, $a_2$ $r_2$ times, ..., $a_k$ $r_k$ times? Note that we have the restriction $\sum_{i=1}^{k} r_i = n$ (as $a_1, \dots, a_k$ are the only possible outcomes). Then

$$ p(r_1, \dots, r_k) \;=\; \binom{n}{r_1, \dots, r_k}\, \theta_1^{r_1} \theta_2^{r_2} \cdots \theta_k^{r_k} $$

where

$$ \binom{n}{r_1, \dots, r_k} \;=\; \frac{n!}{r_1! \, r_2! \cdots r_k!} $$

This is called the **multinomial distribution** with parameter $\boldsymbol{\theta} = ( \theta_1, \theta_2, \dots, \theta_k )$.
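To make the formula concrete, here is a minimal Python sketch that evaluates this probability directly from the factorial formula (the function name `multinomial_pmf` and the dice example are illustrative, not part of the text above):

```python
from math import factorial, prod

def multinomial_pmf(r, theta):
    """Probability of the counts r = (r_1, ..., r_k) in n = sum(r) independent
    trials with outcome probabilities theta = (theta_1, ..., theta_k)."""
    assert len(r) == len(theta)
    assert abs(sum(theta) - 1.0) < 1e-12                       # the theta_i must sum to 1
    n = sum(r)
    coeff = factorial(n) // prod(factorial(ri) for ri in r)    # n! / (r_1! r_2! ... r_k!)
    return coeff * prod(t ** ri for t, ri in zip(theta, r))    # times theta_1^r_1 ... theta_k^r_k

# Example: a fair six-sided die rolled n = 6 times, each face appearing exactly once.
print(multinomial_pmf([1, 1, 1, 1, 1, 1], [1/6] * 6))          # 6!/6^6, approximately 0.0154
```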

Multinomial Parameters

Note that $\boldsymbol{\theta} \in [0,1] \times [0,1] \times \cdots \times [0,1] = [0,1]^k$, but there is also the further restriction that $\sum_{i=1}^{k} \theta_i = 1$, so that any one element of $\boldsymbol{\theta}$ can be expressed in terms of the other $k - 1$, for example

$$ \theta_k \;=\; 1 - \sum_{i=1}^{k-1} \theta_i $$

So to define the parameter it is sufficient to take $\boldsymbol{\theta}' = ( \theta_1, \dots, \theta_{k-1} ) \in [0,1]^{k-1}$. The model is therefore said to have dimension $k - 1$.
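As a quick sketch of this lower-dimensional parametrisation (the names `theta_prime` and `full_theta` are made up for illustration), the full parameter vector is recovered from its first $k - 1$ components:

```python
# Recover the full parameter vector from theta' = (theta_1, ..., theta_{k-1}).
theta_prime = [0.2, 0.5]                            # k - 1 = 2 free components, so k = 3
full_theta = theta_prime + [1 - sum(theta_prime)]   # theta_k = 1 - (theta_1 + ... + theta_{k-1})
print(full_theta)                                   # [0.2, 0.5, 0.3], which sums to 1
```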