Symmetric 7-limit comma badness and 2401/2400

🔗Gene Ward Smith <gwsmith@svpal.org>

3/17/2004 2:46:12 PM

If you take, for a 7-limit interval q and a symmetric lattice distance
dist, the function cents(q) dist(q)^4, you get a log-flat symmetric
badness measure for 7-limit commas. Euclidean and Hahn are not very
different; below I use Hahn distance, and order some commas with
badness less than 3000 from best to worst. The list is completely
dominated by superparticulars, and it looks to me as if 2401/2400 is
likely to be an absolute minimum in badness. At any rate looking at
this makes the Erlich phenomenon--the great importance of 2401/2400
for 7-limit micro ets--more understandable.

2401/2400 184.626652
8/7 231.174094
7/6 266.870906
6/5 315.641287
5/4 386.313714
4/3 498.044999
50/49 559.609830
49/48 571.148984
7/5 582.512193
10/7 617.487808
3/2 701.955001
36/35 780.326114
8/5 813.686286
5/3 884.358713
12/7 933.129094
4375/4374 950.211153
7/4 968.825906
126/125 1117.376095
25/24 1130.758839
21/20 1351.475096
16/15 1787.700573
15/14 1911.084922
225/224 1974.150012
250047/250000 2135.221095
1029/1024 2158.776169
64/63 2208.391452
35/32 2482.233925
|-92 -17 21 25> 2860.311932
10/9 2918.459394
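
[A minimal Python sketch of the badness figure tabulated above. The Hahn
distance is not spelled out in the post; the formula below is a
reconstruction (the word-length norm on the octave-equivalent 3-5-7 lattice,
with the consonances 3, 5, 7, 5/3, 7/3, 7/5 and their inverses as unit
steps), but it reproduces the figures in the list, e.g. ~184.63 for
2401/2400 and ~1974.15 for 225/224.]

from fractions import Fraction
from math import log2

def monzo_357(q):
    """Exponents of 3, 5 and 7 in q; the power of 2 is ignored
    (octave equivalence)."""
    exps = []
    for p in (3, 5, 7):
        n, d, e = q.numerator, q.denominator, 0
        while n % p == 0:
            n //= p
            e += 1
        while d % p == 0:
            d //= p
            e -= 1
        exps.append(e)
    return exps

def hahn_distance(q):
    """Reconstructed Hahn distance: the positive-part sum of the zero-sum
    vector (-(a+b+c), a, b, c), i.e. the number of consonant steps needed
    to reach the octave class of q."""
    a, b, c = monzo_357(q)
    return max(a, 0) + max(b, 0) + max(c, 0) + max(-(a + b + c), 0)

def cents(q):
    return 1200.0 * log2(q.numerator / q.denominator)

def symmetric_badness(q):
    """cents(q) * dist(q)^4, as in the post above."""
    return cents(q) * hahn_distance(q) ** 4

print(symmetric_badness(Fraction(2401, 2400)))  # ~184.626652
print(symmetric_badness(Fraction(225, 224)))    # ~1974.150012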

🔗Carl Lumma <ekin@lumma.org>

3/17/2004 4:36:36 PM

>If you take, for a 7-limit interval q and a symmetric
>lattice distance dist, the function cents(q) dist(q)^4,

Why 4? It used to be pi(lim)-1, which would be 3 in the
7-limit.

>you get a log-flat symmetric
>badness measure for 7-limit commas. Euclidean and Hahn are
>not very different; below I use Hahn distance, and order
>some commas with badness less than 3000 from best to worst.
>The list is completely dominated by superparticulars, and
>it looks to me as if 2401/2400 is likely to be an absolute
>minimum in badness. At any rate looking at this makes the
>Erlich phenomenon--the great importance of 2401/2400
>for 7-limit micro ets--more understandable.
>
>2401/2400 184.626652
>8/7 231.174094
>7/6 266.870906
>6/5 315.641287
>5/4 386.313714
>4/3 498.044999
>50/49 559.609830
>49/48 571.148984
>7/5 582.512193
>10/7 617.487808

If q = n/d, then using (n-d)/d instead of cents(q) and
log(d)^3 instead of dist(q)^4, I get...

((0.19645692845300844 7 2401/2400)
(0.4419896533813025 3 4/3)
(0.6660493039778589 5 5/4)
(0.7075223009389495 7 225/224)
(0.8337823128571303 5 6/5)
(0.9004848978857011 7 126/125)
(0.9587113625980545 7 7/6)
(1.0518045661034596 5 81/80)
(1.0526168298843461 7 8/7)
(1.1239582004626365 3 9/8))

...I'm only returning the 10 best results, and I only
search q with d <= 3000 and cents(q) <= 600. What bounds
does your method require?

-Carl
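
[A Python sketch of the search Carl describes. It assumes his log is the
natural log (with that choice the formula gives 0.196457 for 2401/2400 and
0.441990 for 4/3, matching the first two entries), and it guesses that the
middle column of his output is the prime limit of each ratio.]

from fractions import Fraction
from math import gcd, log, log2

def seven_limit(n):
    """True if n has no prime factor larger than 7."""
    for p in (2, 3, 5, 7):
        while n % p == 0:
            n //= p
    return n == 1

def lumma_badness(n, d):
    """(n - d)/d * ln(d)^3 for the reduced ratio n/d."""
    return (n - d) / d * log(d) ** 3

results = []
for d in range(2, 3001):                        # d <= 3000
    if not seven_limit(d):
        continue
    n = d + 1
    while 1200 * log2(n / d) <= 600:            # cents(n/d) <= 600
        if gcd(n, d) == 1 and seven_limit(n):   # reduced 7-limit ratio
            results.append((lumma_badness(n, d), Fraction(n, d)))
        n += 1

for b, q in sorted(results)[:10]:               # the 10 best results
    print(f"{b:.6f}  {q}")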

🔗Gene Ward Smith <gwsmith@svpal.org>

3/18/2004 10:30:47 AM

--- In tuning-math@yahoogroups.com, "Carl Lumma" <ekin@l...> wrote:
> >If you take, for a 7-limit interval q and a symmetric
> >lattice distance dist, the function cents(q) dist(q)^4,
>
> Why 4? It used to be pi(lim)-1, which would be 3 in the
> 7-limit.

The exponent should be rank(Group)/rank(Kernel); the rank of the group
for p-limit will be pi(p), and the rank of the kernel for codimension
one temperaments is of course one.
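
[A worked instance of the formula: the 7-limit group is generated by 2, 3, 5
and 7, so rank(Group) = pi(7) = 4, while a single comma such as 2401/2400
generates a rank-one kernel, giving the exponent 4/1 = 4.]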

🔗Carl Lumma <ekin@lumma.org>

3/18/2004 2:03:33 PM

>> >If you take, for a 7-limit interval q and a symmetric
>> >lattice distance dist, the function cents(q) dist(q)^4,
>>
>> Why 4? It used to be pi(lim)-1, which would be 3 in the
>> 7-limit.
>
>The exponent should be rank(Group)/rank(Kernel); the rank of
>the group for p-limit will be pi(p), and the rank of the
>kernel for codimension one temperaments is of course one.

You were among the people to review my code, which used
pi(lim)-1. Is the above because we're no longer assuming
octave equivalence, or something?

I also asked:

>...I'm only returning the 10 best results, and I only
>search q with d <= 3000 and cents(q) <= 600. What bounds
>does your method require?

-Carl

🔗Carl Lumma <ekin@lumma.org>

10/22/2005 2:43:15 AM

>If you take, for a 7-limit interval q and a symmetric lattice distance
>dist, the function cents(q) dist(q)^4, you get a log-flat symmetric
>badness measure for 7-limit commas. Euclidean and Hahn are not very
>different; below I use Hahn distance, and order some commas with
>badness less than 3000 from best to worst.
//
>2401/2400 184.626652
>8/7 231.174094
>7/6 266.870906
>6/5 315.641287
>5/4 386.313714
>4/3 498.044999
>50/49 559.609830
>49/48 571.148984
>7/5 582.512193
>10/7 617.487808
>3/2 701.955001

Shouldn't it be dist(q)^3 for the 7-limit Hahn distance, since
that is an octave-equivalent measure with only 3 significant
dimensions?

-Carl

🔗Gene Ward Smith <gwsmith@svpal.org>

10/22/2005 11:42:15 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:

> Shouldn't it be dist(q)^3 for the 7-limit Hahn distance, since
> that is an octave-equivalent measure with only 3 significant
> dimensions?

I used (rank of p-limit)/(rank of kernel), which is the standard
formula for the exponent. The Hahn distance seems to me to have the
right dimensions for a complexity measure; twice the distance looks
like twice the complexity to me.

🔗Carl Lumma <ekin@lumma.org>

10/24/2005 5:14:45 PM

>>>If you take, for a 7-limit interval q and a symmetric lattice
>>>distance dist, the function cents(q) dist(q)^4, you get a
>>>log-flat symmetric badness measure for 7-limit commas.
>>
>>Shouldn't it be dist(q)^3 for the 7-limit Hahn distance, since
>>that is an octave-equivalent measure with only 3 significant
>>dimensions?
>
>I used (rank of p-limit)/(rank of kernel), which is the standard
>formula for the exponent.

I don't remember this formula. What's the derivation/reasoning?

My code does (n-d)log(d)^e / d for single-comma badness, where
e = rank(plim) - 1, and I believe this had your blessing.
size(q) ~ (n-d)/d here and dist(q) ~ log(d). So that leaves
me to try to get rank(plim) - 1 ~ rank(plim)/rank(kernel).
What's rank(kernel) when we're talking about a comma?

>The Hahn distance seems to me to have the right dimensions for
>a complexity measure; twice the distance looks like twice the
>complexity to me.

Sure. Since it features octave-equivalence, I was suggesting
that rank(plim) = 3 instead of 4. Then if rank(kernel) = 1 as
I think it does, this would be cents(q)dist(q)^3.

-Carl

🔗Gene Ward Smith <gwsmith@svpal.org>

10/24/2005 6:53:32 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:

> >I used (rank of p-limit)/(rank of kernel), which is the standard
> >formula for the exponent.
>
> I don't remember this formula. What's the derivation/reasoning?

I discovered it via a convoluted derivation. It's simple enough that
there is probably a way of looking at it which makes it obvious, but I
haven't given the matter a lot of thought.

> My code does (n-d)log(d)^e / d for single-comma badness, where
> e = rank(plim) - 1, and I believe this had your blessing.
> size(q) ~ (n-d)/d here and dist(q) ~ log(d). So that leaves
> me to try to get rank(plim) - 1 ~ rank(plim)/rank(kernel).
> What's rank(kernel) when we're talking about a comma?

By the rank of the kernel I just mean how many commas it takes to
generate it. If there is one comma, it is one, and so the exponent is
just rank(plim). Paul's comma heuristic is (n-d)/(d log(d)), and
multiplying that by log(d)^n where n = rank(plim) would give
(n-d)/d log(d)^(n-1), which is your formula.
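
[A numeric check using a figure from earlier in the thread: for 81/80, Paul's
heuristic gives (81 - 80)/(80 ln 80) ~ 0.002853, and multiplying by
ln(80)^4 ~ 368.7 gives ~ 1.0518, the value Carl's formula returned for 81/80
above.]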

🔗Carl Lumma <ekin@lumma.org>

10/24/2005 7:38:53 PM

>> My code does (n-d)log(d)^e / d for single-comma badness, where
>> e = rank(plim) - 1, and I believe this had your blessing.
>> size(q) ~ (n-d)/d here and dist(q) ~ log(d). So that leaves
>> me to try to get rank(plim) - 1 ~ rank(plim)/rank(kernel).
>> What's rank(kernel) when we're talking about a comma?
>
>By the rank of the kernel I just mean how many commas it takes to
>generate it. If there is one comma, it is one, and so the exponent is
>just rank(plim). Paul's comma heuristic is (n-d)/(d log(d)),

That's the error heuristic for the resulting temperament, I think.
I was hoping for the badness of the resulting temperament.

>and multiplying that by log(d)^n where n = rank(plim) would give
>(n-d)/d log(d)^(n-1), which is your formula.

True (though you shouldn't have used n again!)...

Anyway, if you ever recall the derivation of rank(limit)/rank(kernel),
let me know. ... I could see, if you were trying to get notes,
that it would be dist^codimension, which would be rank(limit)-rank(kernel),
no?

-Carl

🔗Gene Ward Smith <gwsmith@svpal.org>

10/24/2005 8:07:16 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@l...> wrote:

> That's the error heuristic for the resulting temperament, I think.
> I was hoping for the badness of the resulting temperament.

But I was just explaining that putting that heuristic together with
log(d) as a complexity measure leads to your formula.

> Anyway, if you ever recall the derivation of rank(limit)/rank(kernel),
> let me know.

I started out from the vals, where I had something derived from the
theory of multiple Diophantine approximation. Then putting vals
together led me to the exponent, which turned out to have a simple
formula. I think I may have posted something on it which was fairly
indigestible when that happened.

🔗Carl Lumma <ekin@lumma.org>

10/24/2005 8:14:35 PM

>> That's the error heuristic for the resulting temperament, I think.
>> I was hoping for the badness of the resulting temperament.
>
>But I was just explaining that putting that heuristic together with
>log(d) as a complexity measure leads to your formula.

Ah, yes. (*click*)

-Carl