
RE: [tuning] Concordance Ex Nihilo [tuning experiment]

Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

8/23/2000 1:19:19 PM

Joseph Pehrson wrote,

>If I'm understanding the concept of harmonic entropy at all, when
>higher values of N are used... i.e. larger numbers for the ratios and
>more included pitches, there is not so much "entropy," since there is
>more choice and the smaller values don't "stand out" so much as when
>there is less choice and N is smaller (??)

I think you have the right insight as to what is going on, though you've got
the definition of "entropy" upside-down (as the graph should show) -- there
is more entropy when there is more choice and the simpler ratios don't
"stand out" so much as when there is less choice and N is smaller . . .
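Paul's point can be sketched numerically. The toy Python below is not Erlich's actual harmonic-entropy model (which weights each candidate ratio by the width of its mediant-to-mediant region in a Farey series); the candidate set, the plain Gaussian weighting, and the 17-cent spread are illustrative assumptions. It builds a probability distribution over candidate ratios near a heard interval and takes the Shannon entropy of that distribution:

```python
import math

def toy_harmonic_entropy(cents, ratios, s=17.0):
    """Toy sketch: weight each candidate ratio n/d by a Gaussian
    (spread s, in cents) centred on the heard interval, normalise
    the weights into probabilities, and return the Shannon entropy
    -sum(p * log2(p)) of that distribution."""
    weights = []
    for n, d in ratios:
        ratio_cents = 1200.0 * math.log2(n / d)
        weights.append(math.exp(-((cents - ratio_cents) ** 2) / (2 * s * s)))
    total = sum(weights)
    probs = [w / total for w in weights]
    return -sum(p * math.log2(p) for p in probs if p > 0)

candidates = [(1, 1), (6, 5), (5, 4), (4, 3), (3, 2), (8, 5), (5, 3), (2, 1)]

# A near-just fifth: 3/2 "stands out", so entropy is low.
print(toy_harmonic_entropy(702.0, candidates))
# A neutral third between 6/5 and 5/4: two ratios compete, so entropy is high.
print(toy_harmonic_entropy(350.0, candidates))
```

With a denser candidate set (a higher N), more ratios fall within reach of any given interval, the probability mass spreads over more choices, and the entropy rises, which is the sense in which more choice means more entropy.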

>But, it will be complete if I could kindly post some of the results
>as AUDIBLE .mp3 files on the little "Tuning Lab" site... I'm anxious
>to do that...

When I get a chance, I'll make these files for you . . .

Joseph Pehrson <pehrson@pubmedia.com>

8/23/2000 1:36:54 PM

--- In tuning@egroups.com, "Paul H. Erlich" <PERLICH@A...> wrote:

> I think you have the right insight as to what is going on, though
>you've got the definition of "entropy" upside-down (as the graph
>should show)

Whoops. Just forgot what the term "entropy" meant exactly. You can
see it's been a while since I've taken a physics class. Hopefully,
this list will make up for it a little bit!

[JP]:
> >But, it will be complete if I could kindly post some of the results
> >as AUDIBLE .mp3 files on the little "Tuning Lab" site... I'm
>anxious to do that...
>
> When I get a chance, I'll make these files for you . . .

GREAT!!!!!!!!!

________ ____ ___ __ _
Joseph Pehrson

jacky_ekstasis@yahoo.com

8/23/2000 2:22:07 PM

"Joseph Pehrson" <pehrson@p...> wrote:

> > I think you have the right insight as to what is going on, though
> >you've got the definition of "entropy" upside-down (as the graph
> >should show)
>
> Whoops. Just forgot what the term "entropy" meant exactly.

I too must admit that I've been assuming the whole time that we were
speaking of the variety of "Entropy" used in Communication Theory:

"A measure of the efficiency of a system (as a code or a language) in
transmitting information, being equal to the logarithm of the number
of different messages that can be sent by selection from the same set
of symbols and thus indicating the degree of initial uncertainty that
can be resolved by any one message."
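For what it's worth, the communication-theory quantity quoted above and the physics/Shannon quantity are the same formula: for M equally likely messages the entropy is log(M), and for unequal probabilities it generalises to H = -sum(p * log p), which is the form harmonic entropy applies to candidate ratios. A quick illustration (the message counts here are arbitrary):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For M equally likely messages, H reduces to log2(M), matching
# "the logarithm of the number of different messages".
for m in (2, 8, 26):
    uniform = [1.0 / m] * m
    print(m, math.log2(m), entropy_bits(uniform))

# Unequal probabilities give less entropy than the uniform case.
print(entropy_bits([0.9, 0.05, 0.05]))
```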

Are we in fact talking about the physics variety of entropy? I guess
I'm a little slow on this. Paul, would you be so kind as to guide me
to a past post or paper that will clarify this for me? I will
endeavor to follow more closely. Forgive my naivete.

Thanks,

Jacky