What is perplexity?

I came across the term perplexity, which refers to the log-averaged inverse probability on unseen data. The Wikipedia article on perplexity does not give an intuitive meaning for it.

First of all, perplexity has nothing to do with characterizing how often you guess something right. It has more to do with characterizing the complexity of a stochastic sequence. Perplexity represents the number of sides of a fair die that, when rolled, produces a sequence with the same entropy as your given probability distribution. So as you make rolling one side of the die increasingly unlikely, the perplexity ends up looking as though that side doesn't exist. The same quantity can be arrived at in a variety of other ways, and this should make it even clearer where «log-average inverse probability» comes from.

It's worth pointing out that perplexity is invariant with the base you use to define entropy. So in this sense, perplexity is far more unique/less arbitrary than entropy as a measurement.
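The die analogy and the base-invariance claim can both be checked numerically. Below is a minimal sketch (the function name `perplexity` is my own, not from the source): perplexity of a fair k-sided die is k, driving one side's probability toward zero makes the perplexity approach k − 1, and the result is the same whatever base you compute the entropy in.

```python
import math

def perplexity(p, base=2):
    """Perplexity = base ** (entropy of p computed in that base)."""
    entropy = -sum(pi * math.log(pi, base) for pi in p if pi > 0)
    return base ** entropy

# Fair 6-sided die: perplexity equals the number of sides.
fair = [1 / 6] * 6
print(perplexity(fair))  # ≈ 6.0

# Make one side nearly impossible: perplexity approaches 5,
# as though that side didn't exist.
skewed = [1e-9] + [(1 - 1e-9) / 5] * 5
print(perplexity(skewed))  # ≈ 5.0

# Base-invariance: entropy depends on the log base, perplexity does not.
print(perplexity(fair, base=2), perplexity(fair, base=math.e))
```

The invariance is just algebra: if H_b is the entropy in base b, then b ** H_b = exp(H_e) for every base b, so the choice of base cancels out of the perplexity even though it rescales the entropy.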
