What is... entropy?

This page hosts information on Yvon Vignaud's talk "What is ... entropy?" at the WhatIsSeminar.

Abstract

Entropy is a state function measuring the microscopic disorder of a system. Its importance stems from its role as an "arrow of time": the entropy of an isolated system increases over time, unlike its energy, which is a conserved quantity. It is a subtle yet fundamental notion, especially useful in information theory and statistical mechanics. We follow an intuitive approach to help grasp the concept of entropy.
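
For orientation, the two standard textbook definitions alluded to above (not quoted from the talk itself) are the following. In statistical mechanics, the Boltzmann entropy of a macrostate with W equally likely microstates is

   S = k_B \ln W,

and in information theory, the Shannon entropy of a discrete probability distribution (p_1, ..., p_n) is

   H = -\sum_{i=1}^{n} p_i \log p_i.

Both expressions measure the spread of a system over its possible states; the second reduces to the first (up to the constant k_B) when all the p_i are equal.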

Comments

 
