Thermodynamic approach to entropy

🔗Dominique Larré <dominique.larre@...>

11/29/2002 2:58:06 AM

Hello members, this timid lurker of the "main" tuning list has just
discovered the wealth of this group's archives. Somehow the discussions here
are often clearer to me than the big group's.

As a chemical engineer by education my most immediate internal image of
entropy is related to thermodynamics.

In my mind's simplistic image, an amount of energy is represented by the
product of an intensive factor by an extensive factor. For example, work is
the product of force by distance, force being intensive, displacement being
the extensity. Similar "Intensity times Extensity" formulas exist for
electric energy, chemical energy, etc. Entropy is the extensity of thermal
energy, with temperature being its intensity, so that dQ = T multiplied by
dS.

Having read Paul's answer to David Finnamore's request (Message 24 of this
group's archives), and being well aware that it is the same "entropy" that
appears in the probability distribution approach, I wonder if a simple
representation can be given for our "harmonic entropy" which would link it
to the thermodynamic model.
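[Editorial note: the formal link Dominique asks about can at least be sketched. The Gibbs entropy of statistical mechanics, S = -k_B Σ p_i ln p_i, has exactly the same form as the Shannon entropy used to define harmonic entropy; they differ only by Boltzmann's constant k_B and the choice of logarithm base. A minimal illustration, not part of the original thread (the function name is ours):]

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon entropy H = -sum p_i * log(p_i), the form used for
    harmonic entropy (terms with p = 0 contribute nothing)."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Boltzmann's constant in J/K (exact value in the 2019 SI).
k_B = 1.380649e-23

# A uniform distribution over 4 microstates has H = ln 4 nats;
# the Gibbs entropy of the same distribution is just k_B times that.
H = shannon_entropy([0.25] * 4)   # in nats
S = k_B * H                        # in J/K
```

The same sum, read over the probabilities of candidate ratios for an interval, gives harmonic entropy; read over the probabilities of microstates and scaled by k_B, it gives thermodynamic entropy.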

Greetings to all

Dominique Larré

- - - - - - - - - - - - - -
P.S. Is anyone aware of Scala users in France, particularly (but not
exclusively) in the Paris area?

🔗wallyesterpaulrus <wallyesterpaulrus@...>

11/29/2002 6:28:59 AM

--- In harmonic_entropy@y..., Dominique Larré <dominique.larre@w...>
wrote:

> Having read Paul's answer to David Finnamore's request (Message 24 of
> this group's archives), and being well aware that it is the same
> "entropy" that appears in the probability distribution approach, I
> wonder if a simple representation can be given for our "harmonic
> entropy" which would link it to the thermodynamic model.

fascinating, but i can't think of any such analogy . . .

🔗Carl Lumma <clumma@...>

11/29/2002 5:42:02 PM

>>Entropy is the extensity of thermal energy, with temperature
>>being its intensity, so that dQ = T multiplied by dS.

Sorry, I don't follow. Do you mean dQ = T(dS)? If so, what
are d and Q?

>>I wonder if a simple representation can be given for our
>>"harmonic entropy" which would link it to the thermodynamic
>>model.
>
>fascinating, but i can't think of any such analogy . . .

They are ultimately the same entropy, in the sense that they
correspond to a lack of knowledge about a system. You can
see that I don't follow the chemical side of the analogy, but
probably the answer is that the type of entropy we use on
this list is wrapped up in the concept of "temperature".

Feynman gives a thought experiment... a box with a removable
piston and a single gas molecule inside. With the piston
removed, the pressure of the gas inside the box is based on
the total volume of the box. If you know where in the box
the molecule is at a given instant, you can position the
piston in such a way as to extract work from the gas. IIRC he
is able to derive Shannon's entropy from 19th-century
thermodynamics with this analogy.
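[Editorial note: Carl's description matches the one-molecule "Szilard engine." Knowing which half of the box the molecule occupies is one bit of information, and letting the single-molecule gas expand isothermally from half the box to the whole box extracts exactly k_B·T·ln 2 of work, the thermodynamic value of that bit. A rough numerical sketch, assuming an ideal single-molecule gas (function and variable names are ours, not Feynman's):]

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def isothermal_work(T, V_initial, V_final, n_molecules=1):
    """Work extracted by reversible isothermal expansion of an ideal gas:
    W = N * k_B * T * ln(V_final / V_initial)."""
    return n_molecules * k_B * T * math.log(V_final / V_initial)

T = 300.0  # kelvin
# The piston initially confines the molecule to half the box (volume 0.5
# in arbitrary units); remove the constraint and expand to the full box.
W = isothermal_work(T, V_initial=0.5, V_final=1.0)

# This equals k_B * T * ln 2: one bit of Shannon entropy (converted to
# nats) cashed out as mechanical work, linking the two notions of entropy.
one_bit_in_nats = math.log(2)
```

On this reading, the "lack of knowledge" Carl mentions is literal: each bit of missing information about the molecule's position costs k_B·T·ln 2 of extractable work.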

-Carl

🔗wallyesterpaulrus <wallyesterpaulrus@...>

11/30/2002 10:18:37 PM

--- In harmonic_entropy@y..., "Carl Lumma" <clumma@y...>
wrote:
> >>Entropy is the extensity of thermal energy, with temperature
> >>being its intensity, so that dQ = T multiplied by dS.
>
> Sorry, I don't follow. Do you mean dQ = T(dS)? If so, what
> are d and Q?

d means differential, and Q is energy. so

T = dQ/dS,

that is, temperature is the derivative of energy with respect to
entropy . . .

> >>I wonder if a simple representation can be given for our
> >>"harmonic entropy" which would link it to the thermodynamic
> >>model.
> >
> >fascinating, but i can't think of any such analogy . . .
>
> IIRC he
> is able to derive Shannon's entropy from 19th-century
> thermodynamics with this analogy.

this i'd love to see!

🔗Carl Lumma <clumma@...>

12/1/2002 11:20:58 AM

> d means differential,

thought so.

> and Q is energy.

ah.

> so T = dQ/dS,
>
> that is, temperature is the derivative of energy with respect
> to entropy . . .

Ah, yes.

>> IIRC he is able to derive Shannon's entropy from 19th-century
>> thermodynamics with this analogy.
>
> this i'd love to see!

pp. 140-148 in the "Lectures on Computation".

-Carl