A Maximum Entropy Method for Expert System Construction
ALAN LIPPMAN
Division of Applied Mathematics, Brown University, Providence, R.I.

By Boltzmann's relation from statistical mechanics, entropy measures the number of accessible configurations: S = k_B log W. By the second law, the entropy of a system increases or attains a maximum, i.e., ΔS ≥ 0. This means that once S reaches a stationary point (maximum entropy), the system is in equilibrium.

The principle of maximum entropy (PME), as expounded by Jaynes, is based on the maximization of the Boltzmann-Gibbs-Shannon (BGS) entropy subject to linear constraints. Jaynes developed the maximum entropy formalism, stressing its consistency and inter-derivability with the other principles of probability theory. The resulting probability distributions are of canonical (exponential) form. However, the rationale for linear constraints is nebulous, and probability distributions are not always canonical.

Here we show that the correct noncanonical distribution for a system in equilibrium with a finite heat bath is implied by the unconstrained maximization of the total BGS entropy of the system and bath together. This procedure is shown to be equivalent to maximizing the BGS entropy of the system alone subject to a contrived nonlinear constraint which reduces to (a) the usual linear constraint for an infinite heat bath, and (b) a previously enigmatic logarithmic constraint which implies a power-law distribution for a large but finite heat bath. This procedure eliminates the uncertainty as to the proper constraints, and easily generalizes to arbitrary composite systems, for which it provides a simpler alternative to the Jaynes PME.
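The canonical form mentioned above can be illustrated numerically. The sketch below (with illustrative energy levels and a hypothetical target mean energy, not taken from the text) maximizes the BGS entropy of a discrete system subject to a single linear constraint on the mean energy; the maximizer is the Boltzmann distribution p_i ∝ exp(-β E_i), where β is the Lagrange multiplier fixed by the constraint.

```python
# Sketch: Jaynes's PME with one linear (mean-energy) constraint.
# The constrained maximum of the Shannon/BGS entropy is the canonical
# distribution p_i ∝ exp(-beta * E_i); beta is found by solving
# <E>(beta) = E_mean. Energy levels and E_mean are illustrative.
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])   # energy levels (arbitrary units)
E_mean = 1.2                          # prescribed mean energy <E>

def mean_energy(beta):
    """Mean energy under the canonical distribution at inverse temperature beta."""
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

# Solve <E>(beta) = E_mean for the Lagrange multiplier beta.
beta = brentq(lambda b: mean_energy(b) - E_mean, -50.0, 50.0)

p = np.exp(-beta * E)
p /= p.sum()                          # normalized canonical distribution
S = -np.sum(p * np.log(p))            # BGS entropy of the maximizer

print("beta =", beta)
print("p    =", p)
print("S    =", S)
```

Because log p_i is affine in E_i, the result is exponential in the constrained quantity; with a nonlinear (e.g. logarithmic) constraint of the kind discussed in the abstract, the same Lagrange-multiplier machinery instead yields a power-law distribution.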