Find link

Find link is a tool written by Edward Betts.

searching for Entropy (astrophysics): 47 found (3490 total)

alternate case: entropy (astrophysics)

Standard molar entropy (433 words)
In chemistry, the standard molar entropy is the entropy content of one mole of substance, under standard conditions (not standard temperature and pressure
Entropy encoding (317 words)
In information theory, an entropy encoding is a lossless data compression scheme that is independent of the specific characteristics of the medium. One
Entropy (information theory) (6,386 words)
In information theory, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message received
Entropy (10,175 words)
This article is about entropy in thermodynamics. For other uses, see Entropy (disambiguation). For a more accessible and less technical introduction
Measure-preserving dynamical system (866 words)
the measure-theoretic entropy of a dynamical system. The entropy of a partition Q is defined as The measure-theoretic entropy of a dynamical system
Free entropy (432 words)
A thermodynamic free entropy is an entropic thermodynamic potential analogous to the free energy. Also known as a Massieu, Planck, or Massieu–Planck potential
Entropy of vaporization (168 words)
to be confused with Enthalpy of vaporization. The entropy of vaporization is the increase in entropy upon vaporization of a liquid. This is always positive
Entropy of fusion (237 words)
The entropy of fusion is the increase in entropy when melting a substance. This is almost always positive since the degree of disorder increases in the
Negentropy (1,095 words)
also negative entropy, syntropy, extropy, ectropy or entaxy, of a living system is the entropy that it exports to keep its own entropy low; it lies at
Orders of magnitude (entropy) (121 words)
magnitude of entropy. Orders of magnitude (data) Order of magnitude (terminology) Jean-Bernard Brissaud (14 February 2005). "The Meaning of Entropy" (PDF)
Principle of maximum entropy (2,741 words)
learning, see maximum entropy classifier. For other uses, see maximum entropy (disambiguation). The principle of maximum entropy states that, subject
Entropy monitoring (440 words)
Entropy monitoring is a method of assessing anaesthetic depth. It was commercially developed by Datex-Ohmeda, now part of GE Healthcare. It relies on
The Entropy Tango (66 words)
The Entropy Tango is a novel by British fantasy and science fiction writer Michael Moorcock. It is part of his long-running Jerry Cornelius series.
Boltzmann's entropy formula (948 words)
mechanics, Boltzmann's equation is a probability equation relating the entropy S of an ideal gas to the quantity W, which is the number of microstates
The Entropy Plague (111 words)
The Entropy Plague is a Big Finish Productions audio drama based on the long-running British science fiction television series Doctor Who. It concludes
Maximum entropy probability distribution (1,827 words)
statistics and information theory, a maximum entropy probability distribution is a probability distribution whose entropy is at least as great as that of all other
Differential entropy (1,363 words)
Differential entropy (also referred to as continuous entropy) is a concept in information theory that extends the idea of (Shannon) entropy, a measure of
Rényi entropy (1,504 words)
theory, the Rényi entropy generalizes the Hartley entropy, the Shannon entropy, the collision entropy and the min-entropy. Entropies quantify the diversity
Nonextensive entropy (90 words)
has proposed a nonextensive entropy (Tsallis entropy), which is a generalization of the traditional Boltzmann–Gibbs entropy. The rationale behind the theory
Joint entropy (178 words)
information theory, joint entropy is a measure of the uncertainty associated with a set of variables. The joint Shannon entropy of two variables X and Y
Entropy (statistical thermodynamics) (2,293 words)
mechanics, the entropy function earlier introduced by Clausius is interpreted as statistical entropy using probability theory. The statistical entropy perspective
Conditional quantum entropy (352 words)
conditional quantum entropy is an entropy measure used in quantum information theory. It is a generalization of the conditional entropy of classical information
Conformational entropy (458 words)
Not to be confused with configurational entropy. Conformational entropy is the entropy associated with the number of conformations of a molecule. The concept
Von Neumann entropy (1,867 words)
statistical mechanics, the von Neumann entropy, named after John von Neumann, is the extension of classical Gibbs entropy concepts to the field of quantum mechanics
Conditional entropy (343 words)
In information theory, the conditional entropy (or equivocation) quantifies the amount of information needed to describe the outcome of a random variable
Binary entropy function (394 words)
In information theory, the binary entropy function, denoted H(p) or H_b(p), is defined as the entropy of a Bernoulli process with probability of success p. Mathematically
Introduction to entropy (2,989 words)
the main encyclopedia article, see Entropy. The idea of "irreversibility" is central to the understanding of entropy. Everyone has an intuitive understanding
Cross entropy (784 words)
In information theory, the cross entropy between two probability distributions over the same underlying set of events measures the average number of bits
Entropy (Buffy the Vampire Slayer) (1,601 words)
"Entropy" is the 18th episode of season 6 of the television series Buffy the Vampire Slayer. The Trio, riding ATVs, pursue two vampires through a cemetery;
Configuration entropy (383 words)
In statistical mechanics, configuration entropy is the portion of a system's entropy that is related to the position of its constituent particles rather
Entropy (energy dispersal) (2,354 words)
The description of entropy as energy dispersal provides an introductory method of teaching the thermodynamic concept of entropy. In physics and physical
Entropy rate (242 words)
In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the
Q-exponential distribution (362 words)
probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints, including constraining the domain to be
Maximum entropy spectral estimation (385 words)
Maximum entropy spectral estimation is a method of spectral density estimation. The goal is to improve the spectral quality based on the principle of
Entropy (comics) (373 words)
Entropy is a Cosmic Entity in the Marvel Comics Universe who possesses Nigh-Omnipotence. A representation of Eternity formed at the beginning of time
Entropy and life (3,215 words)
Research concerning the relationship between the thermodynamic quantity entropy and the evolution of life began around the turn of the 20th century. In
Temperature–entropy diagram (134 words)
diagram. A temperature–entropy diagram, or T–s diagram, is used in thermodynamics to visualize changes to temperature and specific entropy during a thermodynamic
Information theory (4,681 words)
information is entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies
Entropy (film) (47 words)
Entropy is a 1999 film directed by Phil Joanou, starring Stephen Dorff and featuring the Irish rock band U2. The film is largely autobiographical, covering
Joint quantum entropy (575 words)
The joint quantum entropy generalizes the classical joint entropy to the context of quantum information theory. Intuitively, given two quantum states
Entropy (arrow of time) (4,829 words)
Entropy is the only quantity in the physical sciences (apart from certain rare interactions in particle physics; see below) that requires a particular
Cross-entropy method (704 words)
The cross-entropy (CE) method attributed to Reuven Rubinstein is a general Monte Carlo approach to combinatorial and continuous multi-extremal optimization
Topological entropy (1,063 words)
article is about entropy in geometry and topology. For other uses, see Entropy (disambiguation). In mathematics, the topological entropy of a topological
Software entropy (260 words)
be confused with information entropy. A work on software engineering by Ivar Jacobson et al. describes software entropy as follows: The second law of
History of entropy (2,297 words)
The concept of entropy developed in response to the observation that a certain amount of functional energy released from combustion reactions is always
Black hole thermodynamics (1,846 words)
(its) entropy as it falls in, giving a decrease in entropy. Generalized second law introduced as total entropy = black hole entropy + outside entropy. Extremal
Entropy / Send Them (867 words)
"Send Them/Entropy (Hip Hop Reconstruction from the Ground Up)" is a double A-side EP by Asia Born (now known as Lyrics Born) and DJ Shadow and the