(information theory) A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained, on average, per character in a stream of characters.
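The "average bits per character" sense of the definition can be sketched as a short computation over the empirical character frequencies of a stream. The function name `shannon_entropy` is chosen here for illustration; it is not part of any standard library.

```python
from collections import Counter
from math import log2

def shannon_entropy(stream: str) -> float:
    """Average information content, in bits, per character of the stream."""
    counts = Counter(stream)
    n = len(stream)
    # H = -sum(p * log2(p)) over the empirical character probabilities
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

For example, a stream drawn evenly from two characters (such as `"aabb"`) carries 1 bit per character, while a stream of a single repeated character carries 0 bits, since its next character is never uncertain.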
Words Near Information-entropy in the Dictionary
- information and communication science and technology
- information assurance
- information communication technology
- information desk
- information engine
- information float
- information hazard
- information hiding
- information impactedness