
Harmonic entropy

genewardsmith@juno.com

11/11/2001 2:31:50 PM

Is someone ever going to give a precise definition? One uploaded to
the files area of the entropy group would be nice. If you can
calculate it, you can define it--if nothing else works, say exactly
what you are calculating.

Paul Erlich <paul@stretch-music.com>

11/12/2001 5:12:22 PM

--- In tuning-math@y..., genewardsmith@j... wrote:
> Is someone ever going to give a precise definition? One uploaded to
> the files area of the entropy group would be nice. If you can
> calculate it, you can define it--if nothing else works, say exactly
> what you are calculating.

Well, there are variations. But it's always a function (meant to
reflect one component of dissonance) of a precisely specified input
interval or chord. And entropy is always defined as in information
theory:

-sum(p*log(p))

where the sum is over all possible states and p is the probability of
each state.
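
In code that definition amounts to a one-line sum over the distribution. A
minimal Python illustration (the function name and the example distributions
here are mine, purely for illustration):

import math

def shannon_entropy(probs):
    """Entropy -sum(p*log(p)) of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# A distribution concentrated on one state has low entropy;
# a spread-out one has high entropy.  (Made-up example numbers.)
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.17
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.39 (= log 4)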

For harmonic entropy, each "state" is a just dyad (or n-ad in future
versions), i.e., a ratio. The universe of possible ratios is
determined by a rule (such as max(p,q)<N or p*q<N, generally any rule
such that p(j)*q(i) - p(i)*q(j) = 1 for any pair of adjacent ratios
p(i)/q(i), p(j)/q(j), and with N tending toward infinity). The
probability of each dyad is determined by computing the
corresponding area under a normal curve (whose s.d. is an input
parameter), centered around the actual input value. The
width of the "slice" corresponding to each dyad is determined
assuming that it occupies the full "range" between the adjacent
mediants.
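
To make that recipe concrete, here is one way it can be sketched in Python (a
purely illustrative reading of mine, not the actual implementation: the
function names are made up, the ratio list is restricted to the octave between
1/1 and 2/1, the two end slices are simply truncated, and the example values
of s and N at the bottom are arbitrary):

import math
from fractions import Fraction

def tenney_ratios(N):
    """All ratios p/q in lowest terms with 1 <= p/q <= 2 and p*q < N
    (the universe of "states" under the Tenney rule, restricted to one
    octave to keep the sketch short), in ascending order."""
    ratios = set()
    for q in range(1, math.isqrt(N) + 1):
        for p in range(q, 2 * q + 1):
            if p * q < N and math.gcd(p, q) == 1:
                ratios.add(Fraction(p, q))
    return sorted(ratios)

def to_cents(x):
    return 1200.0 * math.log2(x)

def normal_cdf(x, mu, s):
    """Cumulative distribution of a normal curve with mean mu and s.d. s."""
    return 0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0))))

def harmonic_entropy(cents, s, N):
    """Dyadic harmonic entropy of an interval of `cents` cents, with a
    normal curve of s.d. s (in cents) centered on it, over the ratios
    allowed by p*q < N."""
    ratios = tenney_ratios(N)
    # Slice boundaries: the mediants between adjacent ratios.  The two
    # end slices are simply truncated at 1/1 and 2/1 in this sketch.
    bounds = [to_cents(ratios[0])]
    for r1, r2 in zip(ratios, ratios[1:]):
        bounds.append(to_cents(Fraction(r1.numerator + r2.numerator,
                                        r1.denominator + r2.denominator)))
    bounds.append(to_cents(ratios[-1]))
    # Probability of each ratio = area under the normal curve over its
    # slice; entropy = -sum(p*log(p)).
    h = 0.0
    for lo, hi in zip(bounds, bounds[1:]):
        p = normal_cdf(hi, cents, s) - normal_cdf(lo, cents, s)
        if p > 0.0:
            h -= p * math.log(p)
    return h

# Illustrative values only: s = 17 cents, N = 10000 for the p*q < N rule.
print(harmonic_entropy(to_cents(Fraction(5, 4)), 17.0, 10000))       # near the 5/4 minimum
print(harmonic_entropy(to_cents(Fraction(5, 4)) + 30, 17.0, 10000))  # a mistuned third; higher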

My Matlab entropy function is of the form

output = entropy(cents,s,N)

where cents is the size of the input interval in cents, s is the s.d.
of the normal curve, and N is (these days) typically used for the rule
p*q<N. Why? Because:

It was found that the local minima P/Q tend to satisfy P*Q<C no
matter which "rule" is chosen. The curve as a whole has an
overall "slope" unless the Tenney rule (p*q<N) is chosen; with that
rule, it is often found that the entropy at the simpler local minima
is proportional to the Tenney Harmonic Distance, log(P*Q). It is also
found that using 1/sqrt(p*q) as a proxy for each dyad's "range", and
simply multiplying this by the height of the bell curve _exactly_ at
p/q, leads to a nearly identical functional appearance, except that
there is less sensitivity to tiny changes in N.
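
That proxy version can be sketched the same way (again a purely illustrative
Python sketch of mine, with the same octave restriction; the description
above leaves the normalization implicit, so here the weights are simply
normalized into probabilities):

import math
from fractions import Fraction

def to_cents(x):
    return 1200.0 * math.log2(x)

def normal_pdf(x, mu, s):
    """Height of a normal curve with mean mu and s.d. s at the point x."""
    return math.exp(-(x - mu) ** 2 / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))

def harmonic_entropy_sqrt_proxy(cents, s, N):
    """Approximate dyadic harmonic entropy: each ratio p/q is weighted by
    the height of the bell curve exactly at p/q times 1/sqrt(p*q), a proxy
    for its "range"; the weights are then normalized into probabilities
    (an assumption of this sketch)."""
    weights = []
    for q in range(1, math.isqrt(N) + 1):
        for p in range(q, 2 * q + 1):                 # ratios within one octave
            if p * q < N and math.gcd(p, q) == 1:
                w = normal_pdf(to_cents(Fraction(p, q)), cents, s) / math.sqrt(p * q)
                weights.append(w)
    total = sum(weights)
    return -sum(w / total * math.log(w / total) for w in weights if w > 0.0)

# Illustrative values only.
print(harmonic_entropy_sqrt_proxy(386.3, 17.0, 10000))   # near 5/4
print(harmonic_entropy_sqrt_proxy(416.3, 17.0, 10000))   # a mistuned third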

Open questions:

Is there a function, F(x,y), such that F(entropy,s) is invariant to
changes in N? For s=1%, F(entropy,1%) = exp(entropy/2.3) seemed to
work.

Can we explicitly calculate what this function converges to for
N->infinity? Or at least prove that it does converge, and calculate
the limit to within some computational error?

Can we prove that the observations mentioned above (about the local
minima and about proxying for the width being OK) are in some sense
true?

I prepared a full plan for calculating triadic harmonic entropy. See
the harmonic_entropy@yahoogroups.com archives. How can we optimize
the calculation?

genewardsmith@juno.com

11/16/2001 7:09:33 PM

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> For harmonic entropy, each "state" is a just dyad (or n-ad in future
> versions), i.e., a ratio. The universe of possible ratios is
> determined by a rule (such as max(p,q)<N or p*q<N, generally any rule
> such that p(j)*q(i) - p(i)*q(j) = 1 for any pair of adjacent ratios
> p(i)/q(i), p(j)/q(j), and with N tending toward infinity).

> The probability of each dyad is determined by computing the
> corresponding area under a normal curve (whose s.d. is an input
> parameter), centered around the actual input value.

The "actual input value" is a certain number of cents, so how can a
normal curve be centered around it?

> The width of the "slice" corresponding to each dyad is determined
> assuming that it occupies the full "range" between the adjacent
> mediants.

A "dyad" is a fraction reduced to lowest terms, e.g 5/4, if I am
following you. The "adjacent mediants" is not clear, but perhaps you
mean the fractions on each side of the Farey sequence in which
the "dyad" first appears? In the case of 5/4, that would be
1/1 < 5/4 < 4/3, so we would integrate a normal function between 1
and 4/3 to get p?

Paul Erlich <paul@stretch-music.com>

11/17/2001 6:14:55 PM

--- In tuning-math@y..., genewardsmith@j... wrote:
> > The probability of each dyad is determined by computing the
> > corresponding area under a normal curve (whose s.d. is an input
> > parameter), centered around the actual input value.
>
> The "actual input value" is a certain number of cents,

Right.

> so how can a
> normal curve be centered around it?

If the number of cents is c, the curve is

y = 1/(s*sqrt(2*pi)) * exp(-(x-c)^2/(2*s^2))

where x is the position on the interval axis (have you looked at any
of the harmonic entropy curves?)
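
In code, just to be explicit about the centering (the numbers here are
arbitrary example values):

import math

def normal_curve(x, c, s):
    """Normal curve centered at c cents with standard deviation s cents."""
    return math.exp(-(x - c) ** 2 / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))

c, s = 386.3, 17.0                      # arbitrary example values
print(normal_curve(c, c, s))            # the peak sits at x = c
print(normal_curve(c - 10, c, s), normal_curve(c + 10, c, s))   # symmetric about c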

> > The width of the "slice" corresponding to each dyad is determined
> > assuming that it occupies the full "range" between the adjacent
> > mediants.
>
> A "dyad" is a fraction reduced to lowest terms, e.g 5/4, if I am
> following you.

Yes.

> The "adjacent mediants" is not clear, but perhaps you
> mean the fractions on each side of the Farey sequence in which
> the "dyad" first appears?

No, I mean the mediants between the fraction and its immediate
neighbors in a Farey sequence of order N.

> In the case of 5/4, that would be
> 1/1 < 5/4 < 4/3, so we would integrate a normal function between 1
> and 4/3 to get p?

You would integrate between two complicated ratios (the mediants),
typically both very close to 5/4, but far closer still to 5/4's
immediate successor and immediate predecessor in the Farey sequence
of order N (because N is large). So 5/4's slice, though narrow in
absolute terms, is far wider than the slices of its complicated
neighbors, and 5/4 would typically end up with a much greater
probability than they do.
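
To make that concrete, here is a small Python sketch (purely illustrative; it
assumes a fraction in lowest terms with numerator and denominator both at
least 2, and Python 3.8+ for the modular inverse) that finds 5/4's immediate
neighbors in the Farey sequence of order N from the adjacency condition, and
then the mediants that bound 5/4's slice:

from fractions import Fraction

def farey_neighbors(frac, N):
    """Immediate neighbors of frac in the Farey sequence of order N, found
    from the adjacency condition (adjacent fractions a/b < p/q satisfy
    p*b - q*a = 1), taking the largest admissible denominators <= N.
    Assumes frac is in lowest terms with numerator and denominator >= 2."""
    p, q = frac.numerator, frac.denominator
    # Left neighbor a/b: p*b - q*a = 1 with the largest denominator b <= N.
    b = pow(p, -1, q)                  # smallest b with p*b = 1 (mod q)
    a = (p * b - 1) // q
    k = (N - b) // q                   # push the denominator as close to N as allowed
    a, b = a + k * p, b + k * q
    # Right neighbor c/d: c*q - d*p = 1 with the largest denominator d <= N.
    c = pow(q, -1, p)                  # smallest c with c*q = 1 (mod p)
    d = (c * q - 1) // p
    k = (N - d) // q
    c, d = c + k * p, d + k * q
    return Fraction(a, b), Fraction(c, d)

N = 1000
left, right = farey_neighbors(Fraction(5, 4), N)
left_mediant = Fraction(left.numerator + 5, left.denominator + 4)
right_mediant = Fraction(5 + right.numerator, 4 + right.denominator)
print(left, right)                   # 1246/997 1249/999
print(left_mediant, right_mediant)   # 1251/1001 1254/1003, both within ~0.35 cents of 5/4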