How to build a generative model when we have more than one variable
I have a data frame that looks similar to this:
A B C
1 2 2
2 4 3
4 8 5
9 16 7
16 32 11
22 43 14
28 55 17
34 67 20
40 79 23
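For reproducibility, the data above can be built as a data frame like this (a minimal sketch, assuming pandas is available):

```python
import pandas as pd

# The data frame from the question: three numeric features A, B, C.
df = pd.DataFrame({
    "A": [1, 2, 4, 9, 16, 22, 28, 34, 40],
    "B": [2, 4, 8, 16, 32, 43, 55, 67, 79],
    "C": [2, 3, 5, 7, 11, 14, 17, 20, 23],
})

print(df.shape)  # 9 rows, 3 columns
```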
A, B and C can be assumed to be the features, in the machine learning sense.
I have read about maximum likelihood estimation for one variable assuming a Gaussian distribution. The equation is something like this, where the $x_i$'s are the data points:

$$L(\theta) = \prod_{i=1}^{n} p(x_i) \tag{1}$$

where $x_1, x_2, \ldots, x_n$ are $n$ data points, each having dimension 3. If we assume $p(x)$ to be Gaussian, then we can use the normal distribution density:

$$p(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$
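For the one-variable case, the Gaussian MLE has a closed form: $\hat\mu$ is the sample mean and $\hat\sigma^2$ is the biased sample variance. A minimal sketch in plain Python, using column A as the single feature:

```python
import math

# One feature, e.g. column A from the data frame in the question.
x = [1, 2, 4, 9, 16, 22, 28, 34, 40]
n = len(x)

# Closed-form Gaussian MLE: mu = sample mean, sigma^2 = biased variance.
mu = sum(x) / n
var = sum((xi - mu) ** 2 for xi in x) / n

# Log-likelihood of the data under the fitted Gaussian (log of equation 1).
log_lik = sum(
    -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
    for xi in x
)

print(mu, math.sqrt(var), log_lik)
```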
This is easy to understand when we have only one feature.
How can I generalise the above normal distribution equation when we have more than one feature (here, three features)?
Can someone help me write the maximum likelihood for the above data frame?
Do we learn a $\mu$ and $\sigma$ for each of the features A, B and C, i.e. a total of 6 learnable parameters?
If we have 3 different distributions, say normal, exponential and so on, for columns A, B and C, then what does the MLE equation look like over the entire data frame?
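Under the same independence assumption, mixing families is straightforward: the log-likelihood of the whole data frame is the sum over columns of each column's own log-likelihood, with each family fitted by its own MLE (mean/variance for a Gaussian, rate $\hat\lambda = 1/\bar{x}$ for an exponential). A sketch with a hypothetical assignment of families to columns:

```python
import math

A = [1, 2, 4, 9, 16, 22, 28, 34, 40]   # modelled as Gaussian (illustrative choice)
C = [2, 3, 5, 7, 11, 14, 17, 20, 23]   # modelled as exponential (illustrative choice)

def gaussian_loglik(x):
    # Gaussian MLE: mu = mean, sigma^2 = biased variance; then sum of log N(xi).
    n = len(x)
    mu = sum(x) / n
    var = sum((xi - mu) ** 2 for xi in x) / n
    return sum(
        -0.5 * math.log(2 * math.pi * var) - (xi - mu) ** 2 / (2 * var)
        for xi in x
    )

def exponential_loglik(x):
    # Exponential MLE: rate lambda = 1 / sample mean; density lambda * exp(-lambda x).
    lam = len(x) / sum(x)
    return sum(math.log(lam) - lam * xi for xi in x)

# Under independence, the data-frame log-likelihood is the sum over columns,
# each column contributing the log-likelihood of its own distribution family.
total = gaussian_loglik(A) + exponential_loglik(C)
print(total)
```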
If we take the argmax of equation 1, we don't require any ground truth for it, right? We are just maximising the equation?
Topic generative-models machine-learning
Category Data Science