A Farewell to Entropy: Statistical Thermodynamics Based on Information
✍ Author: Arieh Ben-Naim
- Publisher
- World Scientific
- Year
- 2008
- Language
- English
- Pages
- 412
- Category
- Library
Free of charge; no registration required. For personal study only.
✦ Synopsis
The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term "entropy" with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the driving force of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy. It has been 140 years since Clausius coined the term entropy, and almost 50 years since Shannon developed the mathematical theory of information, subsequently renamed entropy. In this book, the author advocates replacing entropy with information, a term that has become widely used in many branches of science. The author also takes a new and bold approach to thermodynamics and statistical mechanics: information is used not only as a tool for predicting distributions but as the fundamental cornerstone concept of thermodynamics, a role held until now by the term entropy.

The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur–Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose driving force is analyzed in terms of information.

Contents: Elements of Probability Theory; Elements of Information Theory; Transition from the General MI to the Thermodynamic MI; The Structure of the Foundations of Statistical Thermodynamics; Some Simple Applications.
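The book's central quantity — Shannon's missing information (MI) — can be sketched numerically. The following is a minimal illustration, not the author's own code or notation; the function name is ours. It computes MI in bits and shows that, for a fixed number of outcomes, the uniform distribution maximizes it, which is the essence of the maximum-uncertainty principle discussed in Chapter 3:

```python
import math

def missing_information(probs):
    """Shannon's missing information, H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of missing information.
print(missing_information([0.5, 0.5]))    # 1.0

# A biased coin carries less: we are less uncertain about the outcome.
print(missing_information([0.9, 0.1]))    # ~0.47

# Among all distributions over four outcomes, the uniform one
# maximizes the missing information (2 bits here).
print(missing_information([0.25] * 4))    # 2.0
```

Replacing the probabilities with, say, a Boltzmann distribution over energy levels connects this same formula to the thermodynamic MI treated in Chapters 4 and 5.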
✦ Table of Contents
Cover......Page 1
Title Page......Page 5
Contents......Page 9
List of Abbreviations......Page 15
Preface......Page 17
1.1 A Brief History of Temperature and Entropy......Page 29
1.2 The Association of Entropy with Disorder......Page 37
1.3 The Association of Entropy with Missing Information......Page 47
2.1 Introduction......Page 61
2.2.1 The sample space, denoted Ω......Page 64
2.2.2 The field of events......Page 65
2.2.3 The probability function......Page 67
2.3 The Classical Definition......Page 71
2.4 The Relative Frequency Definition......Page 73
2.5 Independent Events and Conditional Probability......Page 78
2.5.1 Conditional probability and subjective probability......Page 86
2.5.2 Conditional probability and cause and effect......Page 90
2.5.3 Conditional probability and probability of joint events......Page 92
2.6 Bayes’ Theorem......Page 93
2.6.1 A challenging problem......Page 100
2.6.2 A more challenging problem: The three prisoners’ problem......Page 102
2.7 Random Variables, Average, Variance and Correlation......Page 104
2.8.1 The binomial distribution......Page 114
2.8.2 The normal distribution......Page 118
2.8.3 The Poisson distribution......Page 121
2.9 Generating Functions......Page 122
2.10 The Law of Large Numbers......Page 128
3 Elements of Information Theory......Page 131
3.1 A Qualitative Introduction to Information Theory......Page 132
3.2 Definition of Shannon’s Information and Its Properties......Page 138
3.2.1 Properties of the function for the simplest case of two outcomes......Page 140
3.2.2 Properties of the function for the general case of n outcomes......Page 142
3.2.3 The consistency property of the missing information (MI)......Page 153
3.2.4 The case of an infinite number of outcomes......Page 158
3.2.4.1 The uniform distribution of locations......Page 160
3.2.4.2 The normal distribution of velocities or momenta......Page 162
3.2.4.3 The Boltzmann distribution......Page 165
3.3 The Various Interpretations of the Quantity MI......Page 166
3.4 The Assignment of Probabilities by the Maximum Uncertainty Principle......Page 172
3.5 The Missing Information and the Average Number of Binary Questions Needed to Acquire It......Page 177
3.6 The False Positive Problem, Revisited......Page 198
3.7 The Urn Problem, Revisited......Page 200
4 Transition from the General MI to the Thermodynamic MI......Page 205
4.1 MI in Binding Systems: One Kind of Information......Page 206
4.1.2 Two different ligands on sites......Page 207
4.1.3 Two identical ligands on sites......Page 210
4.1.4 Generalization to ligands on sites......Page 211
4.2 Some Simple Processes in Binding Systems......Page 214
4.2.1 The analog of the expansion process......Page 215
4.2.2 A pure deassimilation process......Page 218
4.2.3 Mixing process in a binding system......Page 222
4.2.4 The dependence of MI on the characterization of the system......Page 224
4.3.1 The locational MI......Page 229
4.3.2 The momentum MI......Page 232
4.3.3 Combining the locational and the momentum MI......Page 233
4.4 Comments......Page 235
5 The Structure of the Foundations of Statistical Thermodynamics......Page 239
5.1 The Isolated System; The Micro-Canonical Ensemble......Page 241
5.2 System in a Constant Temperature; The Canonical Ensemble......Page 248
5.3 The Classical Analog of the Canonical Partition Function......Page 256
5.4 The Re-interpretation of the Sackur–Tetrode Expression from Informational Considerations......Page 260
5.5 Identifying the Parameter β for an Ideal Gas......Page 263
5.6 Systems at Constant Temperature and Chemical Potential; The Grand Canonical Ensemble......Page 264
5.7 Systems at Constant Temperature and Pressure; The Isothermal Isobaric Ensemble......Page 270
5.8 The Mutual Information due to Intermolecular Interactions......Page 272
6 Some Simple Applications......Page 279
6.1 Expansion of an Ideal Gas......Page 280
6.2 Pure, Reversible Mixing; The First Illusion......Page 283
6.3 Pure Assimilation Process; The Second Illusion......Page 285
6.3.1 Fermi–Dirac (FD) statistics; Fermions......Page 287
6.3.2 Bose–Einstein (BE) statistics; Bosons......Page 288
6.3.3 Maxwell–Boltzmann (MB) statistics......Page 289
6.4 Irreversible Process of Mixing Coupled with Expansion......Page 293
6.5 Irreversible Process of Demixing Coupled with Expansion......Page 296
6.6 Reversible Assimilation Coupled with Expansion......Page 298
6.7 Reflections on the Processes of Mixing and Assimilation......Page 300
6.8 A Pure Spontaneous Deassimilation Process......Page 312
6.9 A Process Involving only Change in the Momentum Distribution......Page 315
6.10 A Process Involving Change in the Intermolecular Interaction Energy......Page 318
6.11 Some Baffling Experiments......Page 321
6.12 The Second Law of Thermodynamics......Page 326
A Newton’s binomial theorem and some useful identities involving binomial coefficients......Page 345
B The total number of states in the Fermi–Dirac and the Bose–Einstein statistics......Page 347
C Pair and triplet independence between events......Page 349
D Proof of the inequality |ρ(X, Y)| ≤ 1 for the correlation coefficient......Page 350
E The Stirling approximation......Page 354
F Proof of the form of the function H......Page 355
G The method of Lagrange undetermined multipliers......Page 359
H Some inequalities for concave functions......Page 362
I The MI for the continuous case......Page 368
J Identical and indistinguishable (ID) particles......Page 371
K The equivalence of the Boltzmann’s and Jaynes’ procedures to obtain the fundamental distribution of the canonical ensemble......Page 378
L An alternative derivation of the Sackur–Tetrode equation......Page 380
M Labeling and un-labeling of particles......Page 383
N Replacing a sum by its maximal term......Page 384
O The Gibbs paradox (GP)......Page 388
P The solution to the three prisoners’ problem......Page 391
1. Thermodynamics and Statistical Thermodynamics......Page 401
2. Probability and Information Theory......Page 404
4. Cited References......Page 406
Index......Page 409