Entropy is a measure of disorder or uncertainty in a system, a concept central to both thermodynamics and information theory. In thermodynamics, it quantifies the portion of a system's energy that is unavailable to perform useful work. In information theory, entropy quantifies the average uncertainty in a random variable's outcomes, which is equivalently the average amount of information gained by observing one of those outcomes.
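As a minimal sketch of the information-theoretic definition, Shannon entropy for a discrete distribution is H = -Σ pᵢ log₂ pᵢ, measured in bits. The function name and example distributions below are illustrative, not from any particular library:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The comparison between the two coins shows the core intuition: the more predictable a system is, the lower its entropy, and a certain outcome (probability 1) carries zero entropy.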