Asymmetry of the graph of information vs probability
The formula for the information conveyed by an event occurring with probability p is: I = -log2(p)
This formula gives the number of bits of information needed to know the outcome of the event.
It captures the intuition that the information needed to know the outcome of an event with probability 1 is 0, since we already know the outcome of that event.
So shouldn't the formula also give 0 information for an event with probability 0, since we also know the outcome of that event (it never occurs)?
And shouldn't the graph of I vs p therefore be symmetric, since values of p close to 0 should give information similar to values of p close to 1?
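For concreteness, here is a minimal Python sketch (the helper name `information_bits` is just for illustration) that evaluates I = -log2(p) at a few probabilities near both ends of the range:

```python
import math

def information_bits(p):
    """Information content of an event with probability p, in bits: I = -log2(p)."""
    return -math.log2(p)

# Evaluate the formula near both ends of the probability range.
for p in [0.999, 0.5, 0.001]:
    print(f"p = {p:>6}: I = {information_bits(p):.3f} bits")
```

Running this shows I close to 0 bits for p near 1 but a large value (about 10 bits) for p = 0.001, which is the behavior my question is about.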