
Information Theory

David Finnamore <daeron@...>

9/30/2000 10:14:09 AM

Paul,

Maybe a good thing to do near the beginning of this new list is to
cover the bare basics of Info Theory as they apply to Harmonic
Entropy Theory. I think I might have read a little about it once in
Discovery Magazine or some such place. As I understand it, "entropy"
in Info Theory has to do with the efficiency with which a
communiqué communicates, essentially. You've used the term
"compressibility." If a message is concise and clear, it has low
entropy. If it's either wordy or vague, it has high entropy. Is
that in the ballpark? If so, let's expand on the specific application
to H. E. This was probably roughly covered on the Tuning List a
couple of years ago, but it might be wise to give a refresher here for
the sake of getting off on the right foot.

David Finnamore

Paul Erlich <PERLICH@...>

10/1/2000 1:23:50 PM

--- In harmonic_entropy@egroups.com, "David Finnamore" <daeron@b...>
wrote:
> Paul,
>
> Maybe a good thing to do near the beginning of this new list is to
> cover the bare basics of Info Theory as they apply to Harmonic
> Entropy Theory. I think I might have read a little about it once in
> Discovery Magazine or some such place. As I understand it, "entropy"
> in Info Theory has to do with the efficiency with which a
> communiqué communicates, essentially. You've used the term
> "compressibility." If a message is concise and clear, it has low
> entropy. If it's either wordy or vague, it has high entropy. Is
> that in the ballpark?

Yup!
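
(A small made-up illustration of that point, in Shannon's terms, not
anything from the original posts: entropy is -sum p*log2(p) bits per
symbol, so a source whose next symbol is nearly certain is highly
compressible and has low entropy, while a uniform, "vague" source has
the maximum.)

from math import log2

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), in bits per symbol; zero-probability terms are skipped.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # nearly certain: ~0.24 bits, very compressible
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 symbols: 2 bits, the maximum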

> If so, let's expand on the specific application
> to H. E. This was probably roughly covered on the Tuning List a
> couple of years ago, but it might be wise to give a refresher here for
> the sake of getting off on the right foot.

From http://www.ixpres.com/interval/td/entropy.htm:

[In a Farey or other series of fractions, representing dyads from the
harmonic series that the brain could ideally recognize as such, the
simpler-integer ratios take up a lot of room, defined as the interval
between the mediant below and the mediant above, in interval space,
and so are associated with large "slices" of the probability
distribution, while the more complex ratios are more crowded and
therefore are associated with smaller "slices."

Now the harmonic entropy is defined, just like in information theory,
as minus the sum over all ratios of a certain function of the
probability associated with that ratio. The function is x*log(x).
(See an information theory text to find out why.) When the true
interval is near a simple-integer ratio, there will be one large
probability and many much smaller ones. When the true interval is far
from any simple-integer ratios, many more complex ratios will all
have roughly equal probabilities. The entropy function will come out
quite small in the former case, and quite large in the latter case.

In the case of 700 cents, 3/2 will have far more probability than any
other ratio, and the harmonic entropy is nearly minimal. In the case
of 300 cents, 6/5 will have the largest probability in most cases,
but 7/6, 13/11, and 19/16 will all have non-negligible amounts of
probability, so the harmonic entropy is moderate. In the case of 100
cents, 15/14, 16/15, 17/16, 18/17, 19/18, 20/19, and 1/1 will all
have significant probability, and the harmonic entropy is nearly
maximal.]
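
To make that recipe concrete, here is a minimal Python sketch of the
calculation (not the exact code behind the page linked above). The
Farey order N = 80, the Gaussian "hearing" width s = 17 cents, and the
restriction to dyads between 1/1 and 2/1 are all illustrative
assumptions; each ratio's probability is taken as the Gaussian mass,
centered on the heard interval, that falls between the ratio's lower
and upper mediants.

from fractions import Fraction
from math import erf, log, log2, sqrt

def cents(r):
    # Size of the frequency ratio r in cents.
    return 1200 * log2(float(r))

def harmonic_entropy(interval_cents, N=80, s=17.0):
    # All reduced ratios p/q with 1 <= p/q <= 2 and q <= N (a shifted Farey series).
    ratios = sorted({Fraction(p, q) for q in range(1, N + 1)
                     for p in range(q, 2 * q + 1)})
    # The mediant between neighbouring ratios marks the edge of each ratio's "slice".
    mediants = [cents(Fraction(a.numerator + b.numerator,
                               a.denominator + b.denominator))
                for a, b in zip(ratios, ratios[1:])]
    # Slice boundaries in cents, crudely capped at 1/1 and 2/1 on the ends.
    edges = [cents(ratios[0])] + mediants + [cents(ratios[-1])]

    def cdf(x):
        # Cumulative Gaussian centered on the heard interval, width s cents.
        return 0.5 * (1.0 + erf((x - interval_cents) / (s * sqrt(2.0))))

    # Probability of each ratio = Gaussian mass falling inside its slice.
    probs = [cdf(hi) - cdf(lo) for lo, hi in zip(edges, edges[1:])]
    total = sum(probs)
    # Harmonic entropy = -sum p*log(p) over the normalized slice probabilities.
    return -sum((p / total) * log(p / total) for p in probs if p > 0)

print(harmonic_entropy(700))   # near 3/2: low entropy
print(harmonic_entropy(300))   # near 6/5 but crowded: moderate
print(harmonic_entropy(100))   # many competing ratios: nearly maximal

With these toy parameters the three test intervals should come out in
the order the post describes: 700 cents lowest, 300 cents in between,
100 cents near the top of the range.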

David J. Finnamore <daeron@...>

10/2/2000 8:51:05 PM

Thanks, Paul, for the re- over-view. That clears things up for me for now.

--
David J. Finnamore
Nashville, TN, USA
http://personal.bna.bellsouth.net/bna/d/f/dfin/index.html
--