Entropy (Princeton Series in Applied Mathematics)

Publication subtitle: Entropy

Publication series: Princeton Series in Applied Mathematics

Author: Greven, Andreas; Keller, Gerhard; Warnecke, Gerald

Publisher: Princeton University Press

Publication year: 2014

E-ISBN: 9781400865222

P-ISBN (Paperback): 9780691113388

Subject: O551 Thermology

Keyword: Mathematical sciences and chemistry

Language: ENG

Description

The concept of entropy arose in the physical sciences during the nineteenth century, particularly in thermodynamics and statistical physics, as a measure of the equilibria and evolution of thermodynamic systems. Two main views developed: the macroscopic view formulated originally by Carnot, Clausius, Gibbs, Planck, and Carathéodory, and the microscopic approach associated with Boltzmann and Maxwell. Since then, both approaches have yielded deep insights into the nature and behavior of thermodynamic and other microscopically unpredictable processes. However, the mathematical tools involved have since developed independently of their original physical background, leading to a plethora of methods and differing conventions.


The aim of this book is to identify the unifying threads by providing surveys of the uses and concepts of entropy in diverse areas of mathematics and the physical sciences. Two major threads, emphasized throughout the book, are variational principles and Ljapunov functionals. The book starts by providing basic concepts and terminology, illustrated by examples from both the macroscopic and microscopic lines of thought. In-depth surveys covering the macroscopic, microscopic, and probabilistic approaches follow. Part I gives a basic introduction from the views of thermodynamics and probability theory. Part II collects surveys that look at the macroscopic approach of continuum mechanics and physics. Part III deals with the microscopic approach as exemplified by stochastic processes, and Part IV surveys entropy in information theory and dynamical systems.

Chapters

3.3 Relative Entropy as a Measure of Discrimination

3.4 Entropy Maximization under Constraints

3.5 Asymptotics Governed by Entropy

3.6 Entropy Density of Stationary Processes and Fields

References

PART 2. ENTROPY IN THERMODYNAMICS

Chapter 4. Phenomenological Thermodynamics and Entropy Principles

4.1 Introduction

4.2 A Simple Classification of Theories of Continuum Thermodynamics

4.3 Comparison of Two Entropy Principles

4.3.1 Basic Equations

4.3.2 Generalized Coleman–Noll Evaluation of the Clausius–Duhem Inequality

4.3.3 Müller–Liu's Entropy Principle

4.4 Concluding Remarks

References

Chapter 5. Entropy in Nonequilibrium

5.1 Thermodynamics of Irreversible Processes and Rational Thermodynamics for Viscous, Heat-Conducting Fluids

5.2 Kinetic Theory of Gases, the Motivation for Extended Thermodynamics

5.2.1 A Remark on Temperature

5.2.2 Entropy Density and Entropy Flux

5.2.3 13-Moment Distribution. Maximization of Nonequilibrium Entropy

5.2.4 Balance Equations for Moments

5.2.5 Moment Equations for 13 Moments. Stationary Heat Conduction

5.2.6 Kinetic and Thermodynamic Temperatures

5.2.7 Moment Equations for 14 Moments. Minimum Entropy Production

5.3 Extended Thermodynamics

5.3.1 Paradoxes

5.3.2 Formal Structure

5.3.3 Pulse Speeds

5.3.4 Light Scattering

5.4 A Remark on Alternatives

References

Chapter 6. Entropy for Hyperbolic Conservation Laws

6.1 Introduction

6.2 Isothermal Thermoelasticity

6.3 Hyperbolic Systems of Conservation Laws

6.4 Entropy

6.5 Quenching of Oscillations

References

Chapter 7. Irreversibility and the Second Law of Thermodynamics

7.1 Three Concepts of (Ir)reversibility

7.2 Early Formulations of the Second Law

7.3 Planck

7.4 Gibbs

7.5 Carathéodory

7.6 Lieb and Yngvason

7.7 Discussion

References

Chapter 8. The Entropy of Classical Thermodynamics

8.1 A Guide to Entropy and the Second Law of Thermodynamics

8.2 Some Speculations and Open Problems

8.3 Some Remarks about Statistical Mechanics

References

PART 3. ENTROPY IN STOCHASTIC PROCESSES

Chapter 9. Large Deviations and Entropy

9.1 Where Does Entropy Come From?

9.2 Sanov's Theorem

9.3 What about Markov Chains?

9.4 Gibbs Measures and Large Deviations

9.5 Ventcel–Freidlin Theory

9.6 Entropy and Large Deviations

9.7 Entropy and Analysis

9.8 Hydrodynamic Scaling: An Example

References

Chapter 10. Relative Entropy for Random Motion in a Random Medium

10.1 Introduction

10.1.1 Motivation

10.1.2 A Branching Random Walk in a Random Environment

10.1.3 Particle Densities and Growth Rates

10.1.4 Interpretation of the Main Theorems

10.1.5 Solution of the Variational Problems

10.1.6 Phase Transitions

10.1.7 Outline

10.2 Two Extensions

10.3 Conclusion

10.4 Appendix: Sketch of the Derivation of the Main Theorems

10.4.1 Local Times of Random Walk

10.4.2 Large Deviations and Growth Rates

10.4.3 Relation between the Global and the Local Growth Rate

References

Chapter 11. Metastability and Entropy

11.1 Introduction

11.2 van der Waals Theory

11.3 Curie–Weiss Theory

11.4 Comparison between Mean-Field and Short-Range Models

11.5 The 'Restricted Ensemble'

11.6 The Pathwise Approach

11.7 Stochastic Ising Model. Metastability and Nucleation

11.8 First-Exit Problem for General Markov Chains

11.9 The First Descent Tube of Trajectories

11.10 Concluding Remarks

References

Chapter 12. Entropy Production in Driven Spatially Extended Systems

12.1 Introduction

12.2 Approach to Equilibrium

12.2.1 Boltzmann Entropy

12.2.2 Initial Conditions

12.3 Phenomenology of Steady-State Entropy Production

12.4 Multiplicity under Constraints

12.5 Gibbs Measures with an Involution

12.6 The Gibbs Hypothesis

12.6.1 Pathspace Measure Construction

12.6.2 Space-Time Equilibrium

12.7 Asymmetric Exclusion Processes

12.7.1 MEP for ASEP

12.7.2 LFT for ASEP

References

Chapter 13. Entropy: A Dialogue

References

PART 4. ENTROPY AND INFORMATION

Chapter 14. Classical and Quantum Entropies: Dynamics and Information

14.1 Introduction

14.2 Shannon and von Neumann Entropy

14.2.1 Coding for Classical Memoryless Sources

14.2.2 Coding for Quantum Memoryless Sources

14.3 Kolmogorov–Sinai Entropy

14.3.1 KS Entropy and Classical Chaos

14.3.2 KS Entropy and Classical Coding

14.3.3 KS Entropy and Algorithmic Complexity

14.4 Quantum Dynamical Entropies

14.4.1 Partitions of Unit and Decompositions of States

14.4.2 CNT Entropy: Decompositions of States

14.4.3 AF Entropy: Partitions of Unit

14.5 Quantum Dynamical Entropies: Perspectives

14.5.1 Quantum Dynamical Entropies and Quantum Chaos

14.5.2 Dynamical Entropies and Quantum Information

14.5.3 Dynamical Entropies and Quantum Randomness

References

Chapter 15. Complexity and Information in Data

15.1 Introduction

15.2 Basics of Coding

15.3 Kolmogorov Sufficient Statistics

15.4 Complexity

15.5 Information

15.6 Denoising with Wavelets

References

Chapter 16. Entropy in Dynamical Systems

16.1 Background

16.1.1 Dynamical Systems

16.1.2 Topological and Metric Entropies

16.2 Summary

16.3 Entropy, Lyapunov Exponents, and Dimension

16.3.1 Random Dynamical Systems

16.4 Other Interpretations of Entropy

16.4.1 Entropy and Volume Growth

16.4.2 Growth of Periodic Points and Horseshoes

16.4.3 Large Deviations and Rates of Escape

References

Chapter 17. Entropy in Ergodic Theory

References

Combined References

Index
