
Poor man's harmonic entropy?

🔗Gene Ward Smith <gwsmith@svpal.org>

7/20/2003 12:49:55 PM

If x is a positive real number representing an interval, it's
suggested that Pc(x) = exp((log(p/q)-x)^2/(2c)) can model the
probability that x is heard as p/q; here c is a parameter. If we take
the sum

sum q^(-d) Pc(x)

over positive rationals p/q, then it isn't hard to see that this
converges absolutely for high enough values of d--anything above 2,
at any rate.

A problem with this is that it mixes multiplicative and implicitly
additive distance measures, since |log(p/q)-x| is multiplicative,
while q does not depend on the octave and, like the Farey sequence,
is implicitly additive.
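For concreteness, the proposed sum can be prototyped numerically. A minimal sketch (Python; the negative exponent -- equivalently c < 0 in the formula as written -- the values c = 0.01 and d = 2.5, and the truncation at a maximum denominator are all assumptions on top of the post):

```python
import math

def Pc(x, ratio, c):
    # Gaussian weight for hearing the log-interval x as `ratio`.
    # A negative exponent with c > 0 is assumed here, so each term
    # is a proper bell curve centered on log(ratio).
    return math.exp(-(math.log(ratio) - x) ** 2 / (2 * c))

def poor_mans_sum(x, d=2.5, c=0.01, qmax=100):
    # Partial sum of  sum q^(-d) Pc(x, p/q)  over reduced p/q,
    # truncated at denominator qmax; only ratios within ~5 sigma
    # of x contribute appreciably, so p is restricted accordingly.
    sigma = math.sqrt(c)
    total = 0.0
    for q in range(1, qmax + 1):
        lo = max(1, int(q * math.exp(x - 5 * sigma)))
        hi = int(q * math.exp(x + 5 * sigma)) + 1
        for p in range(lo, hi + 1):
            if math.gcd(p, q) == 1:
                total += q ** (-d) * Pc(x, p / q, c)
    return total

print(poor_mans_sum(math.log(1.5)))  # evaluate near a just fifth
```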

🔗Gene Ward Smith <gwsmith@svpal.org>

7/21/2003 4:09:38 AM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> If x is a positive real number representing an interval, it's
> suggested that Pc(x) = exp((log(p/q)-x)^2/(2c)) can model the
> probability that x is heard as p/q; here c is a parameter. If we
take
> the sum
>
> sum q^(-d) Pc(x)
>
> over positive rationals p/q, then it isn't hard to see that this
> converges absolutely for high enough values of d--anything above 2,
> at any rate.

I think maybe the plan should be to get a continuous function the
Tenney Height way, by

sum_{p/q > 0} Pc(x, p/q)/(p*q)
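A truncated numerical sketch of this Tenney-weighted sum (again assuming the negative Gaussian exponent, i.e. c < 0 in the formula as first written; c = 0.01 and the cutoff qmax are arbitrary choices, not anything from the post):

```python
import math

def Pc(x, ratio, c):
    # Gaussian weight; negative exponent assumed
    return math.exp(-(math.log(ratio) - x) ** 2 / (2 * c))

def tenney_weighted(x, c=0.01, qmax=200):
    # Truncation of  sum_{p/q > 0} Pc(x, p/q)/(p*q)  at denominator
    # qmax; p is restricted to ratios within ~5 sigma of x, where the
    # Gaussian factor is non-negligible.
    sigma = math.sqrt(c)
    total = 0.0
    for q in range(1, qmax + 1):
        lo = max(1, int(q * math.exp(x - 5 * sigma)))
        hi = int(q * math.exp(x + 5 * sigma)) + 1
        for p in range(lo, hi + 1):
            if math.gcd(p, q) == 1:
                total += Pc(x, p / q, c) / (p * q)
    return total

print(tenney_weighted(math.log(1.5)))  # evaluate near a just fifth
```

How the truncated value behaves as qmax grows is exactly the convergence question the later posts in the thread take up.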

🔗Carl Lumma <ekin@lumma.org>

7/23/2003 11:15:09 AM

Gene wrote...

>>If x is a positive real number representing an interval, it's
>>suggested that Pc(x) = exp((log(p/q)-x)^2/(2c)) can model the
>>probability that x is heard as p/q; here c is a parameter. If
>>we take the sum
>>
>>sum q^(-d) Pc(x)
>>
>>over positive rationals p/q, then it isn't hard to see that
>>this converges absolutely for high enough values of d--anything
>>above 2, at any rate.

Why are you summing? Presumably to get harmonic entropy for x,
which is the entropy of the distribution of probabilities for all
p/q. Does the q^(-d) term do that, and if so, how?

>>A problem with this is that it mixes multiplicative and
>>implicitly additive distance measures, since |log(p/q)-x| is
>>multiplicative, while q does not depend on the octave and,
>>like the Farey sequence, is implicitly additive.
>
>I think maybe the plan should be to get a continuous function
>the Tenney Height way, by
>
>sum_{p/q > 0} Pc(x, p/q)/(p*q)

Um, not clear how this could possibly work. Paul empirically
verified that p*q approximates the "width" (and also the entropy?)
for x. Where I'm totally at a loss for how to define "width".

Anyway, goal number 1 is to extend things to triads and
up. Where I think p*q*r is supposed to tell you something about the
space around triads on a 2-D plot... or something. It's been
a long time...

-Carl

🔗Gene Ward Smith <gwsmith@svpal.org>

7/23/2003 1:13:02 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:
> Gene wrote...
>
> >>If x is a positive real number representing an interval, it's
> >>suggested that Pc(x) = exp((log(p/q)-x)^2/(2c)) can model the
> >>probability that x is heard as p/q; here c is a parameter. If
> >>we take the sum
> >>
> >>sum q^(-d) Pc(x)
> >>
> >>over positive rationals p/q, then it isn't hard to see that
> >>this converges absolutely for high enough values of d--anything
> >>above 2, at any rate.
>
> Why are you summing? Presumably to get harmonic entropy for x,
> which is the entropy of the distribution of probabilities for all
> p/q. Does the q^(-d) term do that, and if so, how?

I've replaced the q^(-d) with (pq)^(-d), and I think taking d=1 is
fine. The weighting gives more weight to the better consonances, and
it makes the series converge to a continuous function. What values of
c have mostly been used? A graph would be nice.

> >I think maybe the plan should be to get a continuous function
> >the Tenney Height way, by
> >
> >sum_{p/q > 0} Pc(x, p/q)/(p*q)
>
> Um, not clear how this could possibly work. Paul empirically
> verified that p*q approximates the "width" (and also the entropy?)
> for x. Where I'm totally at a loss for how to define "width".

I'm not clear why it wouldn't work.

> Anyway, goal number 1 is to extend things to triads and
> up. Where I think p*q*r is supposed to tell you something about the
> space around triads on a 2-D plot... or something. It's been
> a long time...

You could take a similar sum over all p:q:r:

F(x) = sum_{p:q:r reduced} 1/(pqr) (Pc(x, p/q)+Pc(x, p/r)+Pc(x,r/q))
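Gene's formula leaves the meaning of x for a triad implicit. One plausible reading, sketched here purely as an assumption: take the heard triad as the interval pair (x1, x2) (log of lower interval, log of upper interval) and match each Pc term to the corresponding interval of p:q:r:

```python
import math

def Pc(x, ratio, c):
    # Gaussian weight; negative exponent assumed, as in earlier posts
    return math.exp(-(math.log(ratio) - x) ** 2 / (2 * c))

def F_triad(x1, x2, c=0.01, nmax=30):
    # Truncated version of Gene's F over reduced triads p:q:r with
    # p < q < r <= nmax. Each of the three Pc terms compares one
    # interval of the heard chord to the matching interval of p:q:r.
    total = 0.0
    for p in range(1, nmax + 1):
        for q in range(p + 1, nmax + 1):
            for r in range(q + 1, nmax + 1):
                if math.gcd(math.gcd(p, q), r) != 1:
                    continue
                total += (Pc(x1, q / p, c) + Pc(x2, r / q, c)
                          + Pc(x1 + x2, r / p, c)) / (p * q * r)
    return total

# a just major triad, 4:5:6 -> intervals 5/4 and 6/5
print(F_triad(math.log(5 / 4), math.log(6 / 5)))
```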

🔗Paul Erlich <perlich@aya.yale.edu>

7/23/2003 1:52:55 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:

> Um, not clear how this could possibly work. Paul empirically
> verified that p*q approximates the "width"

that's 1/sqrt(p*q), carl. i also verified that for triads, the "area"
is approximately 1/cuberoot(p*q*r), for p:q:r in lowest terms . . .

the total set of dyads or triads in each case is delimited by a very
high (much higher than the p*q or p*q*r above) maximum value for the
product of the terms. this ("tenney" as opposed to "farey") is the
only rule that gives a "uniform" (trendless) density over dyad space
or triad space.

> (and also the entropy?)

yes, if p and q are small enough, log(p*q) + c approximates the
entropy. tenney's harmonic distance function is log(p*q).

> for x. Where I'm totally at a loss for how to define "width".

in harmonic entropy, you can use mediant-to-mediant width, or you can
just use midpoint-to-midpoint width (which generalizes to voronoi
cells for triads).
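The mediant-to-mediant widths described here are easy to compute for a Tenney-limited dyad set. A sketch (the product limit N = 10000 is an arbitrary choice) that checks the widths, measured in log terms, against the 1/sqrt(p*q) proportionality:

```python
import math
from fractions import Fraction

def tenney_set(N):
    # all reduced p/q in [1, 2] with p*q <= N, sorted ascending
    out = set()
    for q in range(1, math.isqrt(N) + 1):
        for p in range(q, 2 * q + 1):
            if p * q <= N and math.gcd(p, q) == 1:
                out.add(Fraction(p, q))
    return sorted(out)

def log_widths(N):
    # mediant-to-mediant width of each interior fraction, in log terms
    s = tenney_set(N)
    widths = {}
    for left, mid, right in zip(s, s[1:], s[2:]):
        lo = Fraction(left.numerator + mid.numerator,
                      left.denominator + mid.denominator)
        hi = Fraction(mid.numerator + right.numerator,
                      mid.denominator + right.denominator)
        widths[mid] = math.log(hi) - math.log(lo)
    return widths

w = log_widths(10000)
for r in (Fraction(3, 2), Fraction(4, 3), Fraction(5, 3), Fraction(7, 4)):
    pq = r.numerator * r.denominator
    print(r, w[r] * math.sqrt(pq))  # roughly constant if width ~ 1/sqrt(p*q)
```

The printed products hovering around a common value is the proportionality in question; it degrades as p*q approaches the cap N.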

> Anyway, goal number 1 is to extend things to triads and
> up. Where I think p*q*r is supposed to tell you something about the
> space around triads on a 2-D plot... or something. It's been
> a long time...

yup, we've done that on the harmonic entropy list.

🔗Carl Lumma <ekin@lumma.org>

7/23/2003 2:42:20 PM

Hooray!

>> Um, not clear how this could possibly work. Paul empirically
>> verified that p*q approximates the "width"
>
>that's 1/sqrt(p*q), carl.

Sure.

>i also verified that for triads, the "area" is approximately
>1/cuberoot(p*q*r), for p:q:r in lowest terms . . .

I thought the rationals were infinitely dense. So how do you
define "width"?

>the total set of dyads or triads in each case is delimited by a very
>high (much higher than the p*q or p*q*r above) maximum value for the
>product of the terms. this ("tenney" as opposed to "farey") is the
>only rule that gives a "uniform" (trendless) density over dyad space
>or triad space.

Thanks for the recap.

But don't we want to eventually say the total set is infinite?
Can we still define "width" then?

-Carl

🔗shelly roberts <naughty_nina_4u@yahoo.com>

7/23/2003 3:14:26 PM

Carl, thanks for your reply, but it doesn't make a bit of sense.... thanks, shelly

Carl Lumma <ekin@lumma.org> wrote:

Gene wrote...

>>If x is a positive real number representing an interval, it's
>>suggested that Pc(x) = exp((log(p/q)-x)^2/(2c)) can model the
>>probability that x is heard as p/q; here c is a parameter. If
>>we take the sum
>>
>>sum q^(-d) Pc(x)
>>
>>over positive rationals p/q, then it isn't hard to see that
>>this converges absolutely for high enough values of d--anything
>>above 2, at any rate.

Why are you summing? Presumably to get harmonic entropy for x,
which is the entropy of the distribution of probabilities for all
p/q. Does the q^(-d) term do that, and if so, how?

>>A problem with this is that it mixes multiplicative and
>>implicitly additive distance measures, since |log(p/q)-x| is
>>multiplicative, while q does not depend on the octave and,
>>like the Farey sequence, is implicitly additive.
>
>I think maybe the plan should be to get a continuous function
>the Tenney Height way, by
>
>sum_{p/q > 0} Pc(x, p/q)/(p*q)

Um, not clear how this could possibly work. Paul empirically
verified that p*q approximates the "width" (and also the entropy?)
for x. Where I'm totally at a loss for how to define "width".

Anyway, goal number 1 is to extend things to triads and
up. Where I think p*q*r is supposed to tell you something about the
space around triads on a 2-D plot... or something. It's been
a long time...

-Carl


🔗Paul Erlich <perlich@aya.yale.edu>

7/23/2003 3:25:43 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:
> Hooray!
>
> >> Um, not clear how this could possibly work. Paul empirically
> >> verified that p*q approximates the "width"
> >
> >that's 1/sqrt(p*q), carl.
>
> Sure.
>
> >i also verified that for triads, the "area" is approximately
> >1/cuberoot(p*q*r), for p:q:r in lowest terms . . .
>
> I thought the rationals were infinitely dense.

sorry, i should have said *proportional to*, the width is
*proportional* to 1/sqrt(p*q) (for p and q not too large), and the
harmonic entropy is *proportional* to log(p*q) (for p and q quite
small).

> But don't we want to eventually say the total set is infinite?

yes, we want to take the limit.

> Can we still define "width" then?

yes, the width will go to zero in the limit, but the whole entropy
expression may still converge (or some function of it may), as you
should know from calculus.

🔗Carl Lumma <ekin@lumma.org>

7/23/2003 5:05:34 PM

>Carl, thanks for your reply, but it doesn't make a bit of
>sense....thanks, shelly

Hi Shelly. Does Gene's original post make sense to you?
How long have you been on this list?

-Carl

🔗Carl Lumma <ekin@lumma.org>

7/23/2003 5:06:44 PM

>>>I think maybe the plan should be to get a continuous function
>>>the Tenney Height way, by
>>>
>>>sum_{p/q > 0} Pc(x, p/q)/(p*q)
>>
>>Um, not clear how this could possibly work. Paul empirically
>>verified that p*q approximates the "width" (and also the
>>entropy?) for x. Where I'm totally at a loss for how to
>>define "width".
>
>I'm not clear why it wouldn't work.

Do you just see this stuff, or does your thought process
involve multiple steps?

-Carl

🔗Carl Lumma <ekin@lumma.org>

7/23/2003 5:07:15 PM

>> Can we still define "width" then?
>
>yes, the width will go to zero in the limit, but the whole entropy
>expression may still converge (or some function of it may), as you
>should know from calculus.

Ok, good. So are there any open problems in harmonic entropy?
Would it be accurate to say we've empirically verified the p*q*r
stuff, but that we don't have a clean picture of why it works?

-Carl

🔗Gene Ward Smith <gwsmith@svpal.org>

7/23/2003 10:01:04 PM

--- In tuning-math@yahoogroups.com, "Carl Lumma" <ekin@l...> wrote:

> Do you just see this stuff, or does your thought process
> involve multiple steps?

Experience tells me I could be completely out to lunch, working on a
different problem than the one I think I am, seeing things as obvious
which aren't, or mired in a host of other problems. Do you understand
I'm working with an expression which, at least, involves all rational
intervals and converges to a continuous function? It seems to me that
has to be worth something.

🔗Paul Erlich <perlich@aya.yale.edu>

7/23/2003 11:16:14 PM

--- In tuning-math@yahoogroups.com, "Carl Lumma" <ekin@l...> wrote:
> >> Can we still define "width" then?
> >
> >yes, the width will go to zero in the limit, but the whole entropy
> >expression may still converge (or some function of it may), as you
> >should know from calculus.
>
> Ok, good. So are there any open problems in harmonic entropy?

yup -- summarized them for gene a while back.

> Would it be accurate to say we've empirically verified the p*q*r
> stuff, but that we don't have a clean picture of why it works?
>
> -Carl

i'm sure gene could tell us why the 1/cuberoot(p*q*r) rule works. as
for the entropy for the very simplest triads being log(p*q*r)+c, that
actually hasn't been verified yet. i'm hoping gene will help us do it
analytically rather than numerically . . .

🔗monz@attglobal.net

7/24/2003 12:26:26 AM

hi Gene,

> From: Gene Ward Smith [mailto:gwsmith@svpal.org]
> Sent: Wednesday, July 23, 2003 10:01 PM
> To: tuning-math@yahoogroups.com
> Subject: [tuning-math] Re: Poor man's harmonic entropy?
>
>
> --- In tuning-math@yahoogroups.com, "Carl Lumma" <ekin@l...> wrote:
>
> > Do you just see this stuff, or does your thought process
> > involve multiple steps?
>
> Experience tells me I could be completely out
> to lunch, working on a different problem than
> the one I think I am, seeing things as obvious
> which aren't, or mired in a host of other problems.
> Do you understand I'm working with an expression
> which, at least, involves all rational intervals
> and converges to a continuous function? It seems
> to me that has to be worth something.

i really haven't been following this thread, or
any developments in harmonic entropy theory since
around early 2000, but my intuition has a sense
that you're onto something good here. can you
elaborate and then explain the math a little?

PS to paul -- i'd like to update my sonic-arts
pages on harmonic entropy so that they reflect
some of the further developments of the theory.
can you write an addendum to my current pages?
i can just add it on at the bottom.

-monz

🔗Carl Lumma <ekin@lumma.org>

7/24/2003 12:54:39 AM

>> Do you just see this stuff, or does your thought process
>> involve multiple steps?
>
>Experience tells me I could be completely out to lunch, working
>on a different problem than the one I think I am, seeing things
>as obvious which aren't, or mired in a host of other problems.

:)

>Do you understand I'm working with an expression which, at least,
>involves all rational intervals and converges to a continuous
>function?

Yes indeed. Though I don't see how you know it converges. Did
you test a few values? No, you say it "isn't hard to see" that
it does... maybe you can walk us through it.

>It seems to me that has to be worth something.

No argument here.

-Carl

🔗Paul Erlich <perlich@aya.yale.edu>

7/24/2003 1:51:55 AM

--- In tuning-math@yahoogroups.com, <monz@a...> wrote:

> PS to paul -- i'd like to update my sonic-arts
> pages on harmonic entropy so that they reflect
> some of the further developments of the theory.
> can you write an addendum to my current pages?
> i can just add it on at the bottom.
>
>
>
> -monz

this exchange from an e-mail from me to you dated apr. 14th:

*******************************************************************

>if any text or links should be added to my harmonic entropy
>definition, would you be so kind as to send them over? :)

could you add, right before "certain chords of three or more notes",
the following:

assuming an error distribution that follows an exponential decay on
either side of the actual interval, instead of the usual default bell
curve, yields harmonic entropy curves that have pointy, instead of
round, local minima. accentuating the differences among the more
dissonant intervals by taking the exponential of the entropy (in
information theory, the exponential of the entropy is the so-
called "alphabet size"), and assuming a tuning resolution
representing the best human ears (s=0.6%), this "pointy" assumption
lets us paint the most generous possible picture (within the harmonic
entropy paradigm) for the ability to distinguish complex ratios by
their relative consonance:

{reproduce
/tuning-math/files/dyadic/margo.gif}

thanks,
paul

********************************************************************
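As an aside on the "alphabet size" remark in the quoted email: for a uniform distribution over n outcomes the entropy is log(n), so exponentiating the entropy recovers n, the effective number of distinguishable outcomes. A toy illustration (not anything from the harmonic entropy computation itself):

```python
import math

def entropy(ps):
    # Shannon entropy (in nats) of a probability distribution
    return -sum(p * math.log(p) for p in ps if p > 0)

# uniform over 4 outcomes: entropy log(4), so exp(entropy) recovers 4
print(math.exp(entropy([0.25] * 4)))
# a peaked distribution has fewer "effective" outcomes
print(math.exp(entropy([0.7, 0.1, 0.1, 0.1])))
```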

also, on any of the harmonic entropy pages, you might want to display

/tuning-math/files/trivoro.gif

the 2-dimensional space here is triad space, just like on the graphs
on your eqtemp page. the center of each cell represents a triad a:b:c
such that a*b*c is less than a million, and the cell itself contains
all (and only) points that are closer to its center than to the
center of any other cell. the largest cells are the ones with the
smallest value of a*b*c, as shown more clearly in this closeup:

/tuning-math/files/Erlich/fun.gif

the area of each cell as a function of the geometric mean of a, b,
and c, in other words cuberoot(a*b*c), is shown at

/tuning-math/files/triadic.gif

showing that the areas are proportional to 1/cuberoot(a*b*c), at
least for a*b*c not too large.
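A rough way to reproduce this numerically without the actual voronoi construction: estimate each cell's area by Monte Carlo nearest-center counts. Everything here is an assumption for illustration -- the octave restriction c <= 2a, the a*b*c cap of 1000 (far below the million used for the gifs), and the sample count:

```python
import math, random

def triad_centers(cap):
    # 2-D coordinates (log(b/a), log(c/b)) for reduced triads a:b:c
    # with a < b < c <= 2a (chord within an octave) and a*b*c <= cap
    pts = {}
    for a in range(1, cap):
        if a * (a + 1) * (a + 2) > cap:
            break
        for b in range(a + 1, 2 * a + 1):
            for c in range(b + 1, 2 * a + 1):
                if a * b * c > cap:
                    break
                if math.gcd(math.gcd(a, b), c) == 1:
                    pts[(a, b, c)] = (math.log(b / a), math.log(c / b))
    return pts

def mc_cell_areas(pts, samples=20000, seed=0):
    # Monte Carlo estimate of each center's nearest-center cell area
    # (a stand-in for its voronoi cell), sampling the triangle
    # 0 <= x, 0 <= y, x + y <= log(2) that this triad space occupies
    rng = random.Random(seed)
    L = math.log(2)
    counts = dict.fromkeys(pts, 0)
    items = list(pts.items())
    n = 0
    while n < samples:
        x, y = rng.uniform(0, L), rng.uniform(0, L)
        if x + y > L:
            continue  # reject points outside the triangle
        n += 1
        best = min(items,
                   key=lambda kv: (kv[1][0] - x) ** 2 + (kv[1][1] - y) ** 2)[0]
        counts[best] += 1
    area = L * L / 2
    return {k: counts[k] / samples * area for k in counts}

centers = triad_centers(1000)
areas = mc_cell_areas(centers)
for t in ((4, 5, 6), (5, 6, 8), (6, 8, 9)):
    if t in areas:
        # roughly constant if area ~ 1/cuberoot(a*b*c)
        print(t, areas[t] * (t[0] * t[1] * t[2]) ** (1 / 3))
```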

finally, there's a bunch of stuff that can go on the "old ideas" page
to connect them better to the "new ideas" assumed in the definition
page. let me get back to you on that.

🔗Gene Ward Smith <gwsmith@svpal.org>

7/24/2003 2:45:57 AM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
wrote:

> i'm sure gene could tell us why the 1/cuberoot(p*q*r) rule works.

I can? I don't even know what you mean.

🔗Gene Ward Smith <gwsmith@svpal.org>

7/24/2003 3:07:09 AM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:

> Yes indeed. Though I don't see how you know it converges. Did
> you test a few values? No, you say it "isn't hard to see" that
> it does... maybe you can walk us through it.

Sorry!

Assume d>1 and chop everything up into octaves. For the 1-2 octave,
for a given denominator q of p/q, compare to the sum

(p_i q)^(-d) (p_0/q + ... + p_phi(q)/q)

where p_0 = q and we go through values p_i relatively prime to q. Our
terms are all positive, and this expression is less than

q^(-2d) (q/q + (q+1)/q + ... + (2*q-1)/q) < 2q^(-d)

Since d>1, this converges by comparison to the Zeta function
Dirichlet series 1+2^(-d) + 3^(-d) + ... Hence, the sum of the series
over the octave is bounded by 2 Zeta(d), and in fact for each octave,
the sum of the functions of x is bounded by 2 Zeta(d) M_O, where M_O
is the maximum of Ps(x, p/q) over the ocatve O. The normal
distribution tends rapidly to zero, so you have a uniformly
convergent series of continuous functions summing over the octaves,
leading to a continuous function.

This is the sort of thing which was in my head; maybe it can be made
clearer.
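A numeric sanity check of the comparison argument (with the (pq)^(-d) weight from the later posts, d = 1.5 chosen arbitrarily, and the Gaussian factor replaced by its upper bound 1):

```python
import math

def octave_block(q, d):
    # one denominator's contribution to the 1-2 octave, with the
    # Gaussian factor bounded above by 1: sum of (p*q)^(-d) over
    # p in (q, 2q] coprime to q
    return sum((p * q) ** (-d)
               for p in range(q + 1, 2 * q + 1) if math.gcd(p, q) == 1)

d = 1.5
blocks = [octave_block(q, d) for q in range(1, 1001)]
zeta_d = sum(n ** (-d) for n in range(1, 200001))  # crude Zeta(d)
# each block should sit under Gene's 2*q^(-d) bound, and the octave
# total under 2*Zeta(d)
print(sum(blocks), "bounded by", 2 * zeta_d)
```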

🔗Graham Breed <graham@microtonal.co.uk>

7/24/2003 5:50:40 AM

Gene Ward Smith wrote:

> Assume d>1 and chop everything up into octaves. For the 1-2 octave,
> for a given denominator q of p/q, compare to the sum
>
> (p_i q)^(-d) (p_0/q + ... + p_phi(q)/q)
>
> where p_0 = q and we go through values p_i relatively prime to q. Our
> terms are all positive, and this expression is less than
>
> q^(-2d) (q/q + (q+1)/q + ... + (2*q-1)/q) < 2q^(-d)

Does that mean the original sum is over only those p/q in their lowest terms?

> Since d>1, this converges by comparison to the Zeta function
> Dirichlet series 1+2^(-d) + 3^(-d) + ... Hence, the sum of the series
> over the octave is bounded by 2 Zeta(d), and in fact for each octave,
> the sum of the functions of x is bounded by 2 Zeta(d) M_O, where M_O
> is the maximum of Pc(x, p/q) over the octave O. The normal
> distribution tends rapidly to zero, so you have a uniformly
> convergent series of continuous functions summing over the octaves,
> leading to a continuous function.

If c < 0.

Graham

🔗Paul Erlich <perlich@aya.yale.edu>

7/24/2003 11:35:40 AM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
> wrote:
>
> > i'm sure gene could tell us why the 1/cuberoot(p*q*r) rule works.
>
> I can? I don't even know what you mean.

where did you lose me? i thought the last few posts were quite clear,
in particular the graphs that show the above to be true.

🔗Gene Ward Smith <gwsmith@svpal.org>

7/24/2003 1:12:11 PM

--- In tuning-math@yahoogroups.com, Graham Breed <graham@m...> wrote:

> Does that mean the original sum is over only those p/q in their
> lowest terms?

Of course.