Entropy

From FasciPedia
Revision as of 17:21, 9 February 2023 by Bacchus (talk | contribs) (Text replacement - "theory" to "theory")

Entropy is a scientific concept and a measurable physical property, most commonly associated with a state of disorder, randomness, or uncertainty; its tendency to increase is one of the unbreakable universal laws of nature. In layman's terms, this law says that all things eventually break down, and that in order to create new things, other things must be broken down to compensate.

The term and the concept are used in diverse fields, from classical thermodynamics, where entropy was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life (excluding evolutionary theory, which claims the exact opposite), in cosmology, economics, sociology, and meteorology, and in information systems such as DNA and genetics, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer William Rankine in 1850 under the names thermodynamic function and heat-potential.
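As a concrete illustration of the information-theoretic use of the term, the sketch below computes Shannon entropy, which measures the uncertainty of a probability distribution in bits. The function name and example probabilities are illustrative choices, not part of the article above; this is a minimal sketch, assuming a discrete distribution given as a list of probabilities.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Zero-probability outcomes are skipped, since they
    contribute nothing to the uncertainty.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Higher entropy corresponds to greater disorder or uncertainty; a distribution concentrated on one outcome has entropy zero.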