
Quantitative harmonic entropy

🔗Graham Breed <graham@microtonal.co.uk>

9/25/2000 7:48:19 AM

Hello there Paul and friends!

I haven't been following the ongoing discussion on harmonic entropy,
so to write this message I've refreshed myself from this page:

http://www.ixpres.com/interval/td/erlich/entropy.htm

There doesn't seem to be enough information there to reproduce the
graphs. The important quotes seem to be

Paul Erlich:

"When I worked out a model for harmonic entropy, which should also
describe critical band roughness if the partials decrease in amplitude
in some specific fashion, I derived that to a good approximation, the
complexity of a just ratio is directly related to its DENOMINATOR."

Joe Monzo:

"DENOMINATOR should read "the smaller term in the ratio", because in
some cases the numerator can be the smaller term."

(To comment on Monz's commentary, the intervals are actually defined
as >=1, so the denominator must be the smaller term.)

and

Paul Erlich:

"...the assumption that our brain can ideally recognize ratios with
numerator up to N but our hearing of frequencies is blurred in the
form of a normal distribution with standard deviation 1% (based on
Goldstein's work)."

The first part, about the denominator as an indicator of complexity, I
remember from way back, and will discuss below.

The second part seems to cover the process of moving from a measure of
complexity of integer ratios to a continuous function that depends on
pitch. I'd like more details on this, including the algorithm used to
generate the graphs.

Right, to the denominator limit then.

Paul did explain this a long time ago on the Mills list, but I don't
have that post to hand. So here's how I remember it.

You start by assuming that no intervals with a virtual root (is that
the right word?) below a certain pitch can be resolved. This is
justified with a reference to psychoacoustics (Terhardt?). If the
higher note is held at a constant pitch, this means that the
resolvable approximations must belong to a Farey series. It is then
shown that, in the limit where the size of the Farey series tends to
infinity, a measure of complexity equal to the frequency of the guide
tone comes out. As the higher note is held constant, this means
complexity is proportional to the denominator.

Now, my objection to this is that the denominator result is arbitrary.
The very approximation used to prove this -- of the resolvable ratios
becoming arbitrarily large -- also means the size of the intervals
only changes by an infinitesimal amount for different rational
approximations. So the size of the denominator must be directly
proportional to the size of the numerator! Hence either can be used
as an index of complexity.

So, we're left with the same conclusion as Partch: small integer
ratios are more concordant.

However, Paul's result still has some connection with entropy I think.
The question is: how to apply it where intervals differ by a finite
amount? It seemed to me that the most important result is that the
complexity is proportional to the guide tone. The problem is that
transposing an interval will change its complexity. This isn't what
we want: a fourth should have the same complexity wherever it occurs.

So, we have to decide on what to hold constant when comparing
intervals.

When the higher note is held constant, the guide tone is proportional
to the denominator.

When the lower note is held constant, the guide tone is proportional
to the numerator.

When the arithmetic mean of the frequencies is held constant, the
guide tone is proportional to the arithmetic mean of the numerator and
denominator.

When the geometric mean of the frequencies is held constant, the guide
tone is proportional to the geometric mean of the numerator and
denominator.

I consider these functions to all be equally valid indicators of
complexity, in the light of Paul's derivation. He thought otherwise,
and we had an argument about it that didn't go anywhere.
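To make Graham's four cases concrete, here is a minimal sketch (my own illustration, not code from the thread; the function name is invented) returning each guide-tone complexity for an interval n/d:

```python
from math import sqrt

def dyad_complexity(n, d):
    """Guide-tone complexity of an interval n/d (with n >= d), under
    each choice of what is held constant when comparing intervals."""
    return {
        "higher note fixed": d,                # ~ denominator
        "lower note fixed": n,                 # ~ numerator
        "arithmetic mean fixed": (n + d) / 2,  # ~ mean of n and d
        "geometric mean fixed": sqrt(n * d),   # ~ sqrt(n*d)
    }

# The fourth 4/3 under the four normalizations:
print(dyad_complexity(4, 3))
```

All four agree on rank order for ratios of similar size; they diverge only when intervals of very different sizes are compared.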

I notice that some charts are drawn using the "Mann series". A bit
higher up, Monz says "... a series such as used by Mann where the sum
of numerator and denominator does not exceed a certain limit." I take
it, then, that the Mann series results are those you'd get from
holding the arithmetic mean constant. Sure enough, the graphs are
qualitatively the same.

My opinion is that the geometric mean is the one to go for, because
that connects with pitch being perceived on a logarithmic scale.

Now, the interesting question is: how do we generalize this to chords?
It's to be hoped that we come up with a non-arbitrary measure of
accordance that rates otonal chords as more concordant than utonal
ones. To find this, I'd prefer to look at psychoacoustics, rather
than number theory.

The simplest measure is to stay with the guide tone. It's easy to
find this for a chord: multiply the virtual root by the lcm.
Unfortunately, this gives the wrong results: utonal chords would be
more concordant than otonal ones. It's also not intuitively obvious
why Paul's result should be generalized in this way.

So, let's take a step back. Like in Paul's derivation of complexity of
dyads, set an arbitrary limit on the virtual pitch. This would lead
to chords being more concordant the higher the virtual pitch.

I'm not sure if this result is the equivalent of the guide-tone rule
for chords, or if a better rule needs to be derived from it. If the
latter, it may be doable numerically.

Anyway, a virtual root limit gives the following measures of
complexity of chords. I'm considering the chord as a ratio like 4:5:6
or 10:12:15. Complexity is the reciprocal of the virtual root (this
is easier to describe than the vr itself).

When the highest note is held constant, complexity is proportional to
the highest number in the ratio.

When the lowest note is held constant, complexity is proportional to
the lowest number in the ratio.

When the arithmetic mean of frequencies is held constant, complexity
is proportional to the arithmetic mean of the numbers in the ratio.

When the geometric mean of frequencies (arithmetic mean of pitches) is
held constant, complexity is proportional to the geometric mean of the
numbers in the ratio.

For choosing rational approximations infinitesimally close to a given
interval, these measures should all give the same result. As a
higher-level measure of complexity, the last is my preference, because
the average pitch is a good indicator of the pitch of a chord.
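As a sketch of that last measure (my own illustration; `chord_complexity` is an invented name), the geometric mean of the otonal numbers does rate 4:5:6 below its utonal mirror 10:12:15:

```python
from math import prod

def chord_complexity(ratio):
    """Geometric mean of the numbers in an otonal ratio: the complexity
    when the average pitch (geometric mean of frequencies) is fixed."""
    return prod(ratio) ** (1 / len(ratio))

major = chord_complexity((4, 5, 6))     # cube root of 120
minor = chord_complexity((10, 12, 15))  # cube root of 1800
print(major, minor)
```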

I'd be interested in seeing some kind of harmonic entropy plots using
these measures plugged into the relevant part.

I take it that, for the calculation to be performed, an algorithm for
generating all resolvable approximations is required. The easiest
chordal measure for doing this is that where the highest note is held
constant, so a limit is placed on the highest number in the ratio.

The following Python function should do the job:

def getRatios(nNotes, cap):
    """Return a list of lists of numbers.

    Each entry in the big list contains
    nNotes numbers, all smaller than cap,
    in order lowest first.

    All such ratios are returned.
    """
    chords = []
    for note in xrange(nNotes, cap):
        if nNotes == 1:
            chords.append([note])
        else:
            for each in getRatios(nNotes - 1, note):
                chords.append(each + [note])
    return chords

# and this for testing:

import string

def printRatios(nNotes, cap):
    for ratio in getRatios(nNotes, cap):
        print string.join(map(str, ratio), ':'),

The "cap" parameter is one greater than the largest number you want to
see in the ratios. Plugging in 3 and 5 gives the Pythagorean triads:

1:2:3 1:2:4 1:3:4 2:3:4

This isn't a very efficient function, because adding a magic "xrange"
doesn't do much to reduce the memory requirements of holding all
possible ratios in memory at once. Returning all 156,849 triads with
two digit numbers in their ratios takes about 45 seconds on my PC.
Three digit numbers will take ... a great deal longer. Two digit
numbers with four note chords gives a MemoryError after 4'45''.
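A generator version (a sketch in modern Python, not from the original post; `gen_ratios` is an invented name) sidesteps the MemoryError by yielding one chord at a time instead of building the whole list:

```python
def gen_ratios(n_notes, cap):
    """Yield each strictly increasing list of n_notes numbers below cap,
    lowest first, in the same order as getRatios above."""
    if n_notes == 0:
        yield []
        return
    for note in range(n_notes, cap):
        for rest in gen_ratios(n_notes - 1, note):
            yield rest + [note]

# The same Pythagorean triads as before, without storing them all:
for chord in gen_ratios(3, 5):
    print(":".join(map(str, chord)))
```

Counting instead of collecting confirms the 156,849 two-digit triads without ever holding them in memory.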

So, we have a measure of complexity of chords that is calculable, and
has some relationship with Paul's original "denominator-limit". What
do harmonic entropy fans make of it?

Graham

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/25/2000 2:30:15 PM

Graham wrote,

>I haven't been following the ongoing discussion on harmonic entropy,
>so to write this message I've refreshed myself from this page:

>http://www.ixpres.com/interval/td/erlich/entropy.htm

I'm afraid you're very much behind, then, in regard to the specific
questions you ask below.

>There doesn't seem to be enough information there to reproduce the
>graphs.

Manuel Op de Coul can reproduce them, and I'd be happy to give you whatever
you need to get up to speed.

>The important quotes seem to be

>Paul Erlich:

>"When I worked out a model for harmonic entropy, which should also
>describe critical band roughness if the partials decrease in amplitude
>in some specific fashion, I derived that to a good approximation, the
>complexity of a just ratio is directly related to its DENOMINATOR."

That's actually a result in the pre-entropy, Van Eck model and is no longer
relevant anyway.

>Now, my objection to this is that the denominator result is arbitrary.
>The very approximation used to prove this -- of the resolvable ratios
>becoming arbitrarily large -- also means the size of the intervals
>only changes by an infinitesimal amount for different rational
>approximations. So the size of the denominator must be directly
>proportional to the size of the numerator! Hence either can be used
>as an index of complexity.

Absolutely right!

>So, we're left with the same conclusion as Partch: small integer
>ratios are more concordant.

Well, no: 3000:2001 is more concordant than 13:9.

>However, Paul's result still has some connection with entropy I think.
> The question is: how to apply it where intervals differ by a finite
>amount? It seemed to me that the most important result is that the
>complexity is proportional to the guide tone. The problem is that
>transposing an interval will change its complexity. This isn't what
>we want: a fourth should have the same complexity wherever it occurs.

>So, we have to decide on what to hold constant when comparing
>intervals.

>When the higher note is held constant, the guide tone is proportional
>to the denominator.

>When the lower note is held constant, the guide tone is proportional
>to the numerator.

>When the arithmetic mean of the frequencies is held constant, the
>guide tone is proportional to the arithmetic mean of the numerator and
>denominator.

>When the geometric mean of the frequencies is held constant, the guide
>tone is proportional to the geometric mean of the numerator and
>denominator.

>I consider these functions to all be equally valid indicators of
>complexity, in the light of Paul's derivation. He thought otherwise,
>and we had an argument about it that didn't go anywhere.

>I notice that some charts are drawn using the "Mann series". A bit
>higher up, Monz says "... a series such as used by Mann where the sum
>of numerator and denominator does not exceed a certain limit." I take
>it, then, that the Mann series results are those you'd get from
>holding the arithmetic mean constant. Sure enough, the graphs are
>qualitatively the same.

>My opinion is that the geometric mean is the one to go for, because
>that connects with pitch being perceived on a logarithmic scale.

You'll be happy to know that in the posts you haven't been following, I've
independently come to the same conclusion (well, I use the product or the
log-product, but that gives the same rank-order as the geometric mean, and
the rank-order is all that matters because we're only choosing a single
value as the cutoff for which ratios go into the harmonic entropy
calculation).

>Now, the interesting question is: how do we generalize this to chords?
>It's to be hoped that we come up with a non-arbitrary measure of
>accordance that rates otonal chords as more concordant than utonal
>ones. To find this, I'd prefer to look at psychoacoustics, rather
>than number theory.

I don't see why this should be any different in principle from the two-tone
case.

>The simplest measure is to stay with the guide tone. It's easy to
>find this for a chord: multiply the virtual root by the lcm.
>Unfortunately, this gives the wrong results: utonal chords would be
>more concordant than otonal ones.

Wouldn't they be identically concordant?

>When the highest note is held constant, complexity is proportional to
>the highest number in the ratio.

>When the lowest note is held constant, complexity is proportional to
>the lowest number in the ratio.

>When the arithmetic mean of frequencies is held constant, complexity
>is proportional to the arithmetic mean of the numbers in the ratio.

>When the geometric mean of frequencies (arithmetic mean of pitches) is
>held constant, complexity is proportional to the geometric mean of the
>numbers in the ratio.

>For choosing rational approximations infinitesimally close to a given
>interval, these measures should all give the same result. As a
>higher-level measure of complexity, the last is my preference, because
>the average pitch is a good indicator of the pitch of a chord.

>I'd be interested in seeing some kind of harmonic entropy plots using
>these measures plugged into the relevant part.

Good idea! I'll do it for dyads first, and then try it for triads. (I was
actually thinking the same thing earlier, before I read your post, because
the list seems ever more eager for some true chordal harmonic entropy
results). As you may or may not know, I've been using the actual distance
between mediants or midpoints in my calculations so far, but I could
certainly use some pre-defined measure of complexity instead.

>I take it that, for the calculation to be performed, an algorithm for
>generating all resolvable approximations is required. The easiest
>chordal measure for doing this is that where the highest note is held
>constant, so a limit is placed on the highest number in the ratio.

Wait a minute! That kind of defeats everything we just went through! It
seems to me that the only reasonable way to proceed, if one expects to
compare chords with very different-sized intervals, is to set a limit on the
geometric mean (or product or log-product) of the numbers in the "ratio".
But, in case you didn't know, I already tried your suggestion using 64 as
the highest number, and though I didn't go ahead and calculate the entropy
function, you can see the triads in this Voronoi graph:
http://www.egroups.com/files/tuning/triads.jpg (where the red dots are the
three inversions of the 4:5:6 major triad, the blue dots are the three
inversions of the 10:12:15 minor triad, and the green dots are the three
inversions of the 16:19:24 minor triad) which, if you blur your eyes just
the right amount, gives you a sense of how the resulting entropy function
would turn out (except that the axes should be at a 60-degree angle and some
of the cells may change as a result).

🔗Carl Lumma <CLUMMA@NNI.COM>

9/25/2000 9:25:59 PM

>I'd be interested in seeing some kind of harmonic entropy plots using
>these measures plugged into the relevant part.

Graham, that's a great idea!! Only... don't we have to show that the
measure being used sums to a meaningful total as some function of the
limit used? For example, if we use the set of all triads whose Tenney
limit (a.k.a. geometric mean) is less than 30, the Tenney limits of
all of them ought to sum to some special value? The widths of the
ratios in a given order of the Farey series sum to that order, anyway...
I suppose the series-limiting condition could be different from the
measure used to evaluate each member of the series, but I can't imagine
it would be a good thing if the total evaluation fluctuated wildly as
the limiting condition was incremented. That sound right, anybody?

=carl

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/25/2000 9:29:26 PM

Graham wrote,

>When the higher note is held constant, the guide tone is proportional
>to the denominator.

and the width is proportional to the denominator.

>When the lower note is held constant, the guide tone is proportional
>to the numerator.

and the width is proportional to the numerator.

>When the arithmetic mean of the frequencies is held constant, the
>guide tone is proportional to the arithmetic mean of the numerator and
>denominator.

And the width is _not_ proportional to the arithmetic mean!

>When the geometric mean of the frequencies is held constant, the guide
>tone is proportional to the geometric mean of the numerator and
>denominator.

And the width _is_ proportional to the geometric mean!!!

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/25/2000 11:13:30 PM

Carl wrote,

>Graham, that's a great idea!! Only... don't we have to show that the
>measure being used sums to a meaningful total as some function of the
>limit used? For example, if we use the set of all triads whose Tenney
>limit (a.k.a. geometric mean) is less than 30, the Tenney limits of
>all of them ought to sum to some special value?

Nah, you can always normalize them to sum to one . . . which probabilities
always do. And use sqrt(n*d) for the width in the diadic case. I found this
to be a great approximation for the simpler ratios in the limit.

>The widths of the
>ratios in a given order of the Farey series sum to that order, anyway...

Hmm?

>I suppose the series-limiting condition could be different from the
>measure used to evaluate each member of the series, but I can't imagine
>it would be a good thing if the total evaluation fluctuated wildly as
>the limiting condition was incremented. That sound right, anybody?

Confused at 2:20.

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/26/2000 11:43:53 AM

I wrote,

>And use sqrt(n*d) for the width in the diadic case.

Whoops, I meant 1/sqrt(n*d). I'm about to try this . . . should eliminate
the necessity of calculating a normal cdf integral, etc.

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/26/2000 2:01:20 PM

I wrote,

>Whoops, I meant 1/sqrt(n*d). I'm about to try this . . . should eliminate
>the necessity of calculating a normal cdf integral, etc.

Here's a comparison of the result this way with the result of using the
integrals over the mediant-to-mediant intervals:

http://www.egroups.com/files/tuning/perlich/tenney/tcmp3.jpg.

I'm pretty happy with the approximation.

This is based on my finding (two years ago?) that the mediant-to-mediant
widths of the simpler ratios n/d in the "Tenney series" are proportional to
1/sqrt(n*d). (Can anyone prove that?) So to calculate the blue curve, I used
1/sqrt(n*d) times the height of the bell curve as the probability for each
ratio, and then normalized so that all the probabilities sum to one. The
calculation ran very quickly, needless to say.
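Paul's proportionality finding is easy to probe numerically. The sketch below (my own check, not Paul's code; the product limit of 10000 is an arbitrary choice) builds a product-limited series of ratios between 1/1 and 2/1, measures each simple ratio's mediant-to-mediant width, and shows that width times sqrt(n*d) comes out roughly constant:

```python
from fractions import Fraction
from math import gcd, sqrt

LIMIT = 10_000  # arbitrary product limit for the series

# All reduced ratios n/d in [1/1, 2/1] with n*d <= LIMIT, in order.
series = sorted(Fraction(n, d)
                for d in range(1, int(sqrt(LIMIT)) + 1)
                for n in range(d, 2 * d + 1)
                if gcd(n, d) == 1 and n * d <= LIMIT)

def width(r):
    """Distance between the lower and upper mediants around r."""
    i = series.index(r)
    lo, hi = series[i - 1], series[i + 1]
    lower = Fraction(lo.numerator + r.numerator,
                     lo.denominator + r.denominator)
    upper = Fraction(r.numerator + hi.numerator,
                     r.denominator + hi.denominator)
    return upper - lower

for r in (Fraction(3, 2), Fraction(4, 3), Fraction(13, 9)):
    print(r, width(r), float(width(r)) * sqrt(r.numerator * r.denominator))
```

For these three ratios the width-times-sqrt(n*d) products should agree to within roughly 15%, consistent with the 1/sqrt(n*d) approximation for simple ratios.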

It is straightforward to generalize this to triads, using a Chalmers-like
60-degree angle plot and a bivariate normal distribution. But there's a leap
of faith involved. Is there an analogue to mediants such that the triadic
surface is divided into cells, one for each triad within a certain (high)
product-limit, where, for the simpler triads, the area of the cell is
inversely proportional to the geometric mean of the numbers in the otonal
representation of the triad? It would be nice if we could actually construct
this decomposition, but I'm clueless at the moment. Anyway, I'll get to
working on this soon . . .

🔗graham@microtonal.co.uk

9/26/2000 2:21:00 PM

Paul wrote:

> and the width is proportional to the denominator.
<snip>
> and the width is proportional to the numerator.
<snip>
> And the width is _not_ proportional to the arithmetic mean!
<snip>
> And the width _is_ proportional to the geometric mean!!!

Okay, Paul, what's so special about the arithmetic mean? And what is the
width here, the interval between the highest and lowest note in the chord?

🔗graham@microtonal.co.uk

9/26/2000 2:21:00 PM

Paul H. Erlich wrote:

>> It's to be hoped that we come up with a non-arbitrary measure of
>>accordance that rates otonal chords as more concordant than utonal
>>ones. To find this, I'd prefer to look at psychoacoustics, rather
>>than number theory.

> I don't see why this should be any different in principle from the
> two-tone case.

Voronoi cells got mentioned previously. They look suspiciously like
number theory to me, although I'm starting to get the hang of what you're
doing with them.

Like Monzo said recently, chords are rated according to how often they
occur in the limit specified. So it's a highest-number rule by the back
door.

> >The simplest measure is to stay with the guide tone. It's easy to
> >find this for a chord: multiply the virtual root by the lcm.
> >Unfortunately, this gives the wrong results: utonal chords would be
> >more concordant than otonal ones.
>
> Wouldn't they be identically concordant?

No, not unless you give both the chords the same virtual root, which would
be silly. Or I got my terminology wrong. Otonal and utonal chords have
the same lcm. So for such chords, the virtual root would be proportional
to the guide tone. As the virtual root should be high, and the guide tone
low, for simple chords the two measures are in conflict when the lcms are
the same.
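Graham's point can be made concrete with a quick sketch (my own illustration; the 600 Hz highest note is an arbitrary choice): the two triads share lcm = 60, so pinning the highest note makes the virtual roots, and hence the guide tones, differ:

```python
from functools import reduce
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def root_and_guide(ratio, highest=600.0):
    """Virtual root and guide tone of an otonal ratio, with the highest
    note pinned at an arbitrary frequency (in Hz, say)."""
    root = highest / ratio[-1]          # implied fundamental
    guide = root * reduce(lcm, ratio)   # lowest common overtone
    return root, guide

print(root_and_guide((4, 5, 6)))     # otonal major
print(root_and_guide((10, 12, 15)))  # utonal minor
```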

> >I'd be interested in seeing some kind of harmonic entropy plots using
> >these measures plugged into the relevant part.
>
> Good idea! I'll do it for dyads first, and then try it for triads.
> (I was actually thinking the same thing earlier, before I read your
> post, because the list seems ever more eager for some true chordal
> harmonic entropy results). As you may or may not know, I've been
> using the actual distance between mediants or midpoints in my
> calculations so far, but I could certainly use some pre-defined
> measure of complexity instead.

This isn't clear to me. Are these the midpoints you're using for
describing the Voronoi cells?

> >I take it that, for the calculation to be performed, an algorithm for
> >generating all resolvable approximations is required. The easiest
> >chordal measure for doing this is that where the highest note is held
> >constant, so a limit is placed on the highest number in the ratio.
>
> Wait a minute! That kind of defeats everything we just went through!
> It seems to me that the only reasonable way to proceed, if one
> expects to compare chords with very different-sized intervals, is to
> set a limit on the geometric mean (or product or log-product) of the
> numbers in the "ratio".

Don't know about the only reasonable way to proceed. Highest number's the
simplest. Geometric mean's the best, if you can get it to work.

> But, in case you didn't know, I already tried your suggestion using
> 64 as the highest number, and though I didn't go ahead and calculate
> the entropy function, you can see the triads in this Voronoi graph:
> http://www.egroups.com/files/tuning/triads.jpg

Right, yes, I alluded to that above.

> (where the red dots are <snip>) which, if you blur your eyes just
> the right amount, gives you a sense of how the resulting entropy
> function would turn out (except that the axes should be at a
> 60-degree angle and some of the cells may change as a result).

What's the significance of this angle?

I notice that the consonant intervals have large cells surrounding them.
Presumably another way of calculating the mediants would put them nearer
to the more complex ratios?

Graham

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/26/2000 4:48:15 PM

Graham wrote,

>>> It's to be hoped that we come up with a non-arbitrary measure of
>>>accordance that rates otonal chords as more concordant than utonal
>>>ones. To find this, I'd prefer to look at psychoacoustics, rather
>>>than number theory.

>> I don't see why this should be any different in principle from the
>> two-tone case.

>Voronoi cells got mentioned previously. They look suspiciously like
>number theory to me, although I'm starting to get the hang of what you're
>doing with them.

In the diadic case, you can either use midpoints, or mediants. The latter
are kind of number-theoretic (?). Voronoi cells are the analogue of
midpoints, so if anything, they're less number-theoretic . . .

>Like Monzo said recently, chords are rated according to how often they
>occur in the limit specified.

Not directly . . .

>So it's a highest-number rule by the back
>door.

Hmm . . . in the diadic analogue, the Farey series gives you (approximately)
a lowest-number rule. Why would this flip to highest in the triadic case?

>> Good idea! I'll do it for dyads first, and then try it for triads. (I
>> was
>> actually thinking the same thing earlier, before I read your post,
>> because
>> the list seems ever more eager for some true chordal harmonic entropy
>> results). As you may or may not know, I've been using the actual
>> distance
>> between mediants or midpoints in my calculations so far, but I could
>> certainly use some pre-defined measure of complexity instead.

>This isn't clear to me. Are these the midpoints you're using for
>describing the Voronoi cells?

The Voronoi cells are the 2-d analogue of midpoints. Every point in a
Voronoi cell is closer to the "nucleus" of that cell (the chord defining it)
than to any other nucleus.
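As a sketch of that definition (my own illustration, using plain Euclidean distance in cents; the coordinates and names are invented), nearest-nucleus classification looks like this:

```python
from math import hypot, log2

def cents(ratio):
    return 1200 * log2(ratio)

def nucleus(triad):
    """A triad a:b:c as (lower interval, upper interval) in cents."""
    a, b, c = triad
    return cents(b / a), cents(c / b)

def voronoi_cell(point, triads):
    """The triad whose nucleus is nearest: every point of a Voronoi
    cell is closer to its own nucleus than to any other."""
    return min(triads, key=lambda t: hypot(point[0] - nucleus(t)[0],
                                           point[1] - nucleus(t)[1]))

triads = [(4, 5, 6), (10, 12, 15), (16, 19, 24)]
# A slightly mistuned major triad still lands in the 4:5:6 cell:
print(voronoi_cell((390.0, 310.0), triads))
```

Swapping `hypot` for another metric (e.g. a Tenney-style distance) changes the cell shapes but not the idea.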

>Don't know about the only reasonable way to proceed. Highest number's the
>simplest. Geometric mean's the best, if you can get it to work.

I intend to use that going forward, since the Farey (highest number) method
is biased against large intervals.

>What's the significance of this angle?

The two axes are the lower interval and the upper interval. What if you
chose one of the axes to be the outer interval? You'd want the resulting
diagram to be simply a rotation of the original one, without any distortion.
The only way to do that is to use an equilateral triangular diagram, in
which the angle between the axes is either 120 degrees or 60 degrees -- ask
John Chalmers.

>I notice that the consonant intervals have large cells surrounding them.
>Presumably another way of calculating the mediants would put them nearer
>to the more complex ratios?

. . . making the consonant chords occupy even larger cells. Yes, a
substitute for Voronoi cells which was analogous to mediants rather than
midpoints would have this effect. But it doesn't matter much for harmonic
entropy, once the cells are much smaller than s.

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/26/2000 4:58:09 PM

I wrote,

>> and the width is proportional to the denominator.
<snip>
>> and the width is proportional to the numerator.
<snip>
>> And the width is _not_ proportional to the arithmetic mean!
<snip>
>> And the width _is_ proportional to the geometric mean!!!

Graham wrote,

>Okay, Paul, what's so special about the arithmetic mean?

I don't know! I left out "inversely" in all of the above. But I don't think
the arithmetic mean is special; I think the other ones are special.

>And what is the
>width here, the interval between the highest and lowest note in the chord?

Nope. Remember that derivation of mine that caused you so much trouble? It
was concerned with finding a simple formula for the width of each of the
simple ratios, where width means the distance between the lower mediant and
the upper mediant. In that derivation, I used the Farey series, where the
numerator has a limit, and I showed that the width was approximately
inversely proportional to the denominator. If you use a denominator limit,
the width is approximately inversely proportional to the numerator. If you
use a product limit, the width is approximately inversely proportional to
the geometric mean. And if you use a sum limit, the width is _not_
approximately inversely proportional to the arithmetic mean . . .

Anyway, this "width" is the relevant quantity in the harmonic entropy
calculation. The analogue in the triadic case would be the area of the
Voronoi-like cell. Since, in the diadic case, if you use a product limit,
the width for the simple ratios is approximately inversely proportional to
the geometric mean, and, as I just posted, the resulting harmonic entropy
curve is essentially the same as the one that doesn't use this
approximation, I'm willing to take a leap of faith and suppose that
something similar holds in the triadic case, so that I can try to calculate
a triadic harmonic entropy surface.

🔗Carl Lumma <CLUMMA@NNI.COM>

9/26/2000 9:52:02 PM

>>Graham, that's a great idea!! Only... don't we have to show that the
>>measure being used sums to a meaningful total as some function of the
>>limit used? For example, if we use the set of all triads whose Tenney
>>limit (a.k.a. geometric mean) is less than 30, the Tenney limits of
>>all of them ought to sum to some special value?
>
>Nah, you can always normalize them to sum to one . . . which probabilities
>always do.

I know. But if our total didn't increase in some way as we upped the
limit on the series, it wouldn't bode well for our original assumption
that discordance is related to the limiting function we chose. That is,
doesn't our usage of a complexity measure instead of real widths (we
imply the complexities can be used _as_ widths) depend on whether the
particular measure means something acoustically?

Also, wouldn't we like to see some stability to the ordering of the few
widest ratios as the limit is upped... in fact, a _slow_, uniform
narrowing of their widths that doesn't change their ordering too much.
So we can approximate an infinite limit with a finite one, as we did with
the Farey series?

>>The widths of the ratios in a given order of the Farey series sum to
>>that order, anyway...
>
>Hmm?

I realized while going to sleep that this was an error. But I haven't
found any errors with my point yet, which is simply that for mediant-
based widths on the Farey series, the total of all the widths is trivially
bound by the width of the entire series, and this is a good thing. Check
it out...

For a Farey series of order z, the boundaries of the series on the
number line are 1/1 and z/1. By mistake, I forgot that you can't get
widths for the boundary points, so the total will be less than z. It
will in fact be:

   (2z - 1)/2  -  (z + 1)/z  =  z - 3/2 - 1/z
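That total can be checked exactly with rational arithmetic. The sketch below (my own verification, not Carl's code) builds the series between 1/1 and z/1 -- all ratios whose numerator is at most z -- sums the mediant-to-mediant widths of the interior points, and confirms the formula:

```python
from fractions import Fraction

def mediant(a, b):
    # "Freshman sum" of two fractions in lowest terms.
    return Fraction(a.numerator + b.numerator,
                    a.denominator + b.denominator)

def total_width(z):
    series = sorted({Fraction(n, d)
                     for n in range(1, z + 1) for d in range(1, n + 1)})
    # Each interior ratio's cell runs from its mediant with the ratio
    # below to its mediant with the ratio above; the sum telescopes.
    return sum(mediant(series[i], series[i + 1]) -
               mediant(series[i - 1], series[i])
               for i in range(1, len(series) - 1))

for z in (3, 5, 10, 20):
    assert total_width(z) == Fraction(z) - Fraction(3, 2) - Fraction(1, z)
print("formula confirmed")
```

The telescoping makes the result obvious in hindsight: the total is just the top boundary mediant (2z-1)/2 minus the bottom one (z+1)/z.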

>Confused at 2:20.

And how!

>>I suppose the series-limiting condition could be different from the
>>measure used to evaluate each member of the series,

We could, say, use Farey limit to choose our ratios, and then Tenney
limit to approximate their widths, just so the sum of the latter is
well-behaved as we change the former. Or do you think that simply
expressing the complexities in terms of the total is enough?

-Carl

🔗Graham Breed <graham@microtonal.co.uk>

9/27/2000 3:21:26 AM

Paul Erlich wrote:

> >Voronoi cells got mentioned previously. They look suspiciously like
> >number theory to me, although I'm starting to get the hang of what
> >you're doing with them.
>
> In the diadic case, you can either use midpoints, or mediants. The
> latter are kind of number-theoretic (?). Voronoi cells are the
> analogue of midpoints, so if anything, they're less number-theoretic
> . . .

Yes, more crystallography than number theory. The "musical"
equivalent of a mid-point would have to use a "musical" metric, as
mentioned below.

> >Like Monzo said recently, chords are rated according to how often
> >they occur in the limit specified.
>
> Not directly . . .

Well, it looks like that's what's going on.

> >So it's a highest-number rule by the back
> >door.
>
> Hmm . . . in the diadic analogue, the Farey series gives you
> (approximately) a lowest-number rule. Why would this flip to highest
> in the triadic case?

Because a different method's being used? If the highest number in a
ratio is 64, then 4:5:6 can also occur as 8:10:12, 12:15:18, 16:20:24,
and so on up to 40:50:60. The next one, 44:55:66, is too big because
the highest number's too high. The number of times a ratio can occur
is the integer part of the limit divided by the highest number in the
ratio. In this case, 64/6 = 10.7, so it's there 10 times.
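The counting rule in the last paragraph is easy to put into code; this sketch (mine, not Graham's) just takes the integer part of the limit over the highest term:

```python
# Number of times a reduced chord occurs under a highest-number limit:
# the multiples k*a : k*b : k*c all qualify while k * max(chord) stays
# within the limit, so the count is limit // max(chord).
def occurrences(chord, limit):
    return limit // max(chord)

print(occurrences((4, 5, 6), 64))   # 10, as computed above (64/6 = 10.7)
print(occurrences((3, 4, 5), 64))   # 12
```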

> >> As you may or may not know, I've been using the actual distance
> >> between mediants or midpoints in my calculations so far, but I
> >> could certainly use some pre-defined measure of complexity instead.
>
> >This isn't clear to me. Are these the midpoints you're using for
> >describing the Voronoi cells?
>
> The Voronoi cells are the 2-d analogue of midpoints. Every point in
> a Voronoi cell is closer to the "nucleus" of that cell (the chord
> defining it) than to any other nucleus.

And at the moment "closer" is by the Euclidean distance as you see it
on the page? So that metric could be replaced by something like the
Tenney harmonic distance.

It's been mentioned (like here
<http://www.egroups.com/message/tuning/13529>) that the Tenney
distance between chords is measured by the product or geometric mean
of the ratios. Did Tenney propose this, or is it a generalization of
his dyadic measure?

> >I notice that the consonant intervals have large cells surrounding
> >them. Presumably another way of calculating the mediants would put
> >them nearer to the more complex ratios?
>
> . . . making the consonant chords occupy even larger cells. Yes, a
> substitute for Voronoi cells which was analogous to mediants rather
> than midpoints would have this effect. But it doesn't matter much
> for harmonic entropy, once the cells are much smaller than s.

What's s? We'll see. Could you do a Voronoi-like plot where the
divisions are chosen so that each point is as near as possible to the
middle of the cell? That would give the right result, but be more
computationally intensive. Fudging it with a complexity measure would
likely come out the same. If a ratio occurs 5 times as often as the
one next door, draw the division 5/6 of the way between them.

Since I put "quantitative" in the title, I could work this out.

The 4:5:6 is at point (415,467) on triads.jpg. The next ratio on the
spine is at (429, 446), 25.2 pixels from 4:5:6. And the one after
that (431,444), another 2.8 pixels away. Assuming these other ratios
have the lowest consonance index of 1, that means the difference
between a 10 and 1 ratio is 8.9 times that between two 1 ratios.
Taking some other points, we have (375,427) and (370,423), giving a
proportion of 8.8.

So it's a bit smaller than 9. Unfortunately, this isn't what I'd have
predicted. If each ratio has the same area, then 4:5:6 would have 10
times the area of the most complex ratios through being there 10
times. Then, the distance would be scaled by the square root of 10,
or 3.2. Taking the 5/6 logic I used above, the distances should be
scaled by 11/2=5.5. So it isn't that either.

With a pair of points near 3:4:5, I get (587,195), (553,139) and
(549,133) giving a proportion of 9.1. 3:4:5 should be represented 12
times.

So, 10 gives 8.9 and 12 gives 9.1. I dunno ...
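The pixel arithmetic above can be reproduced directly; a quick sketch (editor's, using the coordinates quoted in the text):

```python
from math import dist  # Euclidean distance, Python 3.8+

# Ratio of the distance between the first two points to the distance
# between the last two -- the "proportion" measured off triads.jpg.
def proportion(a, b, c):
    return dist(a, b) / dist(b, c)

print(round(proportion((415, 467), (429, 446), (431, 444)), 1))  # 8.9
print(round(proportion((587, 195), (553, 139), (549, 133)), 1))  # 9.1
```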

Ah, but the diagram's drawn to a pitch, not frequency, scale!

Did anybody find free software for drawing these diagrams?

Graham

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/27/2000 8:40:31 AM

Carl wrote,

>>>Graham, that's a great idea!! Only... don't we have to show that the
>>>measure being used sums to a meaningful total as some function of the
>>>limit used? For example, if we use the set of all triads whose Tenney
>>>limit (a.k.a. geometric mean) is less than 30, the Tenney limits of
>>>all of them ought to sum to some special value?

I wrote,

>>Nah, you can always normalize them to sum to one . . . which probabilities
>>always do.

Carl wrote,

>I know. But if our total didn't increase in some way as we upped the
>limit on the series, it wouldn't bode well for our original assumption
>that discordance is related to the limiting function we chose. That is,
>doesn't our usage of a complexity measure instead of real widths (we
>imply the complexities can be used _as_ widths) depend on if the particular
>measure means something acoustically?

It should be an approximation to the widths. I showed that using 1/sqrt(n*d)
gives you virtually the same results as integrating over the actual
mediant-to-mediant widths, for the "Tenney series".
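Paul's comparison can be sketched like this (an editor's reconstruction, not his original code; restricting to ratios >= 1 with n*d below a limit, and measuring widths in octaves via log2, are my assumptions about his setup):

```python
from fractions import Fraction
from math import log2, sqrt

# "Tenney series": ratios n/d >= 1 in lowest terms with n*d <= limit.
def tenney_series(limit):
    return sorted(
        Fraction(n, d)
        for d in range(1, limit + 1)
        for n in range(d, limit // d + 1)
        if Fraction(n, d).denominator == d  # keep lowest terms only
    )

def mediant(a, b):
    return Fraction(a.numerator + b.numerator, a.denominator + b.denominator)

# Mediant-to-mediant width of each interior ratio on a log-frequency
# axis, printed next to 1/sqrt(n*d) so the two can be compared by eye.
ratios = tenney_series(30)
for lo, r, hi in zip(ratios, ratios[1:], ratios[2:]):
    width = log2(mediant(r, hi)) - log2(mediant(lo, r))
    print(r, round(width, 4), round(1 / sqrt(r.numerator * r.denominator), 4))
```

The printout only shows the shapes tracking each other up to a scale factor; it is not a substitute for Paul's actual integration.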

>For a Farey series of order z, the boundaries of the series on the
>number line are 1/1 and z/1.

1/z and z/1.

>By mistake, I forgot that you can't get
>widths for the boundary points

Sure you can -- the mediant between 0/1 and 1/z is 1/(z+1), and the mediant
between z/1 and 1/0 is (z+1)/1.
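With the formal endpoints 0/1 and 1/0 admitted, the boundary mediants come out as Paul states; a small sketch (mine), using (numerator, denominator) pairs since 1/0 is not a legal fraction object:

```python
# Mediant of two fractions given as (numerator, denominator) pairs,
# so the formal boundary fractions 0/1 and 1/0 can participate.
def mediant(a, b):
    return (a[0] + b[0], a[1] + b[1])

z = 5  # any Farey order
print(mediant((0, 1), (1, z)))  # (1, 6): the lower boundary 1/(z+1)
print(mediant((z, 1), (1, 0)))  # (6, 1): the upper boundary (z+1)/1
```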

>We could, say, use Farey limit to choose our ratios, and then Tenney
>limit to approximate their widths, just so the sum of the latter is
>well-behaved as we change the former.

Oh, behave!

>Or do you think that simply
>expressing the complexities in terms of the total is enough?

Some function of the complexities which is an approximation to width, yes. I
think my graphs yesterday showed that.

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/27/2000 9:17:19 AM

Graham wrote,

>>>So it's a highest-number rule by the back
>>>door.

I wrote,

>> Hmm . . . in the dyadic analogue, the Farey series gives you
>> (approximately) a lowest-number rule. Why would this flip to highest
>> in the triadic case?

Graham wrote,

>Because a different method's being used? If the highest number in a
>ratio is 64, then 4:5:6 can also occur as 8:10:12, 12:15:18, 16:20:24,
>and so on up to 40:50:60. The next one, 44:55:66, is too big because
>the highest number's too high. The number of times a ratio can occur
>is the integer part of the limit divided by the highest number in the
>ratio. In this case, 64/6 = 10.7, so it's there 10 times.

And why doesn't this argument carry over to dyads?

>And at the moment "closer" is by the Euclidean distance as you see it
>on the page?

Right.

>So that metric could be replaced by something like the
>Tenney harmonic distance.

I think you're getting confused. Midpoint, mediant, these are in
pitch-space, not Tenney harmonic lattice space.

>It's been mentioned (like here
><http://www.egroups.com/message/tuning/13529>) that the Tenney
>distance between chords is measured by the product or geometric mean
>of the ratios. Did Tenney propose this, or is it a generalization of
>his dyadic measure?

I think Carl was just thinking about generalizing the dyadic measure.

>What's s?

s -- the standard deviation of hearing errors assumed in the harmonic
entropy calculation.

>We'll see. Could you do a Voronoi-like plot where the
>divisions are chosen so that each point is as near as possible to the
>middle of the cell? That would give the right result, but be more
>computationally intensive. Fudging it with a complexity measure would
>likely come out the same. If a ratio occurs 5 times as often as the
>one next door, draw the division 5/6 of the way between them.

It's easy to say these things, but to actually construct a set of
polygons that satisfies these properties is very difficult. If you can
find a way to create polygons that satisfy your last condition above,
you'll win my personal . . . uh . . . _noble_ prize.

>Since I put "quantitative" in the title, I could work this out.

God bless you!

🔗Carl Lumma <CLUMMA@NNI.COM>

9/27/2000 12:33:22 PM

>>I know. But if our total didn't increase in some way as we upped the
>>limit on the series, it wouldn't bode well for our original assumption
>>that discordance is related to the limiting function we chose. That is,
>>doesn't our usage of a complexity measure instead of real widths (we
>>imply the complexities can be used _as_ widths) depend on if the particular
>>measure means something acoustically?
>
>It should be an approximation to the widths.

Which was my only point.

>I showed that using 1/sqrt(n*d) gives you virtually the same results as
>integrating over the actual mediant-to-mediant widths, for the "Tenney
>series".

Right. And a good thing it was. I was just pointing out that we'll
want some similar result for whatever complexity measure we use... Graham
was throwing around suggestions there...

>>For a Farey series of order z, the boundaries of the series on the
>>number line are 1/1 and z/1.
>
>1/z and z/1.

I don't normally allow fractions less than 1. Which shouldn't matter,
since there's mirror symmetry there; the Stern-Brocot tree is symmetrical
between 0/1 - 1/1 and 1/1 - 1/0.

>>By mistake, I forgot that you can't get
>>widths for the boundary points
>
>Sure you can -- the mediant between 0/1 and 1/z is 1/(z+1), and the mediant
>between z/1 and 1/0 is (z+1)/1.

Those fractions with zero can bite me.

>>We could, say, use Farey limit to choose our ratios, and then Tenney
>>limit to approximate their widths, just so the sum of the latter is
>>well-behaved as we change the former.
>
>Oh, behave!

:)

>>Or do you think that simply expressing the complexities in terms of the
>>total is enough?
>
>Some function of the complexities which is an approximation to width, yes.

But otherwise, simply expressing the complexities in terms of their total
is _not_ enough, right?

>I think my graphs yesterday showed that.

Who? Missed 'em.

-Carl

🔗Joseph Pehrson <pehrson@pubmedia.com>

9/27/2000 12:40:16 PM

--- In tuning@egroups.com, Carl Lumma <CLUMMA@N...> wrote:

http://www.egroups.com/message/tuning/13684

Carl...

Zies post shoot be over at zie neue "Harmonic Entropy" egroups:

http://www.egroups.com/messages/harmonic_entropy

______________ ___ _ _
Joseph Pehrson

🔗Carl Lumma <CLUMMA@NNI.COM>

9/27/2000 6:31:37 PM

[Joseph Pehrson wrote...]
>http://www.egroups.com/message/tuning/13684
>
>Carl...
>
>Zies post shoot be over at zie neue "Harmonic Entropy" egroups:

If it belonged there, I would have posted it there.

[David Beardsley wrote...]
>>Good luck, boys! We'll be counting on regular summaries of
>>what you've discovered!
>
>That would kind of defeat the point of having
>SEPARATE lists.

Not true. There's a difference between a conclusion and the many
exchanges it took to reach it. To ask to completely remove a topic
like harmonic entropy from this list defeats _its_ purpose.

-Carl

🔗Joseph Pehrson <josephpehrson@compuserve.com>

9/27/2000 7:42:23 PM

--- In tuning@egroups.com, Carl Lumma <CLUMMA@N...> wrote:

http://www.egroups.com/message/tuning/13700

> [Joseph Pehrson wrote...]
> >http://www.egroups.com/message/tuning/13684
> >
> >Carl...
> >
> >Zies post shoot be over at zie neue "Harmonic Entropy" egroups:
>
> If it belonged there, I would have posted it there.
>

Whatever, Carl. I was never one to advocate for separate lists in
the first place (!!)
________ ___ __ __ _ _
Joseph Pehrson

🔗Paul H. Erlich <PERLICH@ACADIAN-ASSET.COM>

9/27/2000 9:47:00 PM

Carl Lumma wrote,

>But otherwise, simply expressing the complexities in terms of their total
>is _not_ enough, right?

Complexities . . . how about simplicities (1/sqrt(n*d)). It produces the
blue curve in the graph below. It's enough.

>>I think my graphs yesterday showed that.

>Who? Missed 'em.

http://www.egroups.com/files/tuning/perlich/tenney/tcmp3.jpg

🔗Carl Lumma <CLUMMA@NNI.COM>

9/28/2000 6:49:39 AM

>>But otherwise, simply expressing the complexities in terms of their total
>>is _not_ enough, right?
>
>Complexities . . . how about simplicities (1/sqrt(n*d)). It produces the
>blue curve in the graph below. It's enough.

I said _otherwise_! We already know that 1/sqrt(n*d) is cool. But what
about for triads, as Graham was asking? What if the measure was the
totient of the smallest number in the ratio? It wouldn't _necessarily_
work.
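Carl's hypothetical measure can at least be written down; this sketch (editor's, purely illustrative, since Carl himself doubts the measure would work) computes the totient of the smallest term:

```python
from math import gcd

# Euler's totient by direct count -- fine for the small numbers here.
def totient(n):
    return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

# The speculative measure: totient of the smallest term in the chord.
for chord in [(4, 5, 6), (3, 4, 5), (10, 12, 15)]:
    print(chord, totient(min(chord)))  # 2, 2, 4 respectively
```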

>>>I think my graphs yesterday showed that.
>
>>Who? Missed 'em.
>
>http://www.egroups.com/files/tuning/perlich/tenney/tcmp3.jpg

Groovy! I'll take a close look tonight.

-Carl