
Temperament agreement

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/7/2004 11:20:45 PM

Continued from the tuning list.
Paul:
>With my (Tenney) complexity and (all-interval-Tenney-minimax) error
>measures?

With these it seems I need to scale the parameters to
k=0.002
p=0.5
and max badness = 75

where badness = complexity * exp((error/k)**p)

I'd be very interested to see how that compares with your other cutoff
lines.

These errors and complexities don't seem to have meaningful units.

Complexity used to have units of generators per diamond and error used
to have units of cents, both things you could relate to fairly directly.
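
For reference, the badness formula above is easy to state in code. The following is a minimal sketch using the quoted values of k, p and the cutoff; the function and variable names are illustrative only, not anything proposed in the thread:

```python
from math import exp

K = 0.002          # error-scaling parameter k
P = 0.5            # exponent p
MAX_BADNESS = 75   # proposed cutoff

def badness(complexity, error, k=K, p=P):
    """badness = complexity * exp((error/k)**p)"""
    return complexity * exp((error / k) ** p)

def within_cutoff(complexity, error, cutoff=MAX_BADNESS):
    """True if the temperament falls inside the proposed badness cutoff."""
    return badness(complexity, error) <= cutoff
```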

🔗Paul Erlich <perlich@aya.yale.edu>

1/8/2004 12:35:17 AM

--- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
wrote:
> Continued from the tuning list.
> Paul:
> >With my (Tenney) complexity and (all-interval-Tenney-minimax) error
> >measures?
>
> With these it seems I need to scale the parameters to
> k=0.002
> p=0.5
> and max badness = 75
>
> where badness = complexity * exp((error/k)**p)
>
> I'd be very interested to see how that compares with your other
> cutoff lines.

See

/tuning-math/files/Erlich/dave3.gif

It looks like going back to logarithmic error scaling would be better
for seeing what is going on here . . . Matlab is chugging . . .

> These errors and complexities don't seem to have meaningful units.
>
> Complexity used to have units of generators per diamond and error used
> to have units of cents, both things you could relate to fairly directly.

Here, complexity is length in the Tenney lattice = log2(n*d), and
error is the maximum over all intervals (or merely the simplest few, if
you wish) of (cents error)/(interval complexity), where interval
complexity is again log2(n*d). Most intervals achieve this maximum in
the tuning in question. One advantage is that, if you choose to add
more intervals into your optimization criterion, the optimum doesn't
change.
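
A minimal sketch of these two measures, assuming a hypothetical tuning_map callable that returns the tempered size of a just interval in cents (the thread doesn't specify an interface):

```python
from fractions import Fraction
from math import log2

def tenney_complexity(ratio):
    """Tenney lattice length of n/d in lowest terms: log2(n*d)."""
    r = Fraction(ratio)
    return log2(r.numerator * r.denominator)

def cents(ratio):
    return 1200 * log2(float(Fraction(ratio)))

def weighted_error(tempered_cents, just_ratio):
    """(cents error) / (interval complexity) for one interval."""
    return abs(tempered_cents - cents(just_ratio)) / tenney_complexity(just_ratio)

def max_weighted_error(tuning_map, intervals):
    """Maximum weighted error over a chosen set of just intervals.

    tuning_map: callable mapping a just ratio to its tempered size in cents
                (hypothetical; supplied by whatever optimizer is in use).
    """
    return max(weighted_error(tuning_map(r), r) for r in intervals)
```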

🔗Paul Erlich <perlich@aya.yale.edu>

1/8/2004 12:38:00 PM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
> wrote:
> > Continued from the tuning list.
> > Paul:
> > >With my (Tenney) complexity and (all-interval-Tenney-minimax) error
> > >measures?
> >
> > With these it seems I need to scale the parameters to
> > k=0.002
> > p=0.5
> > and max badness = 75
> >
> > where badness = complexity * exp((error/k)**p)
> >
> > I'd be very interested to see how that compares with your other
> > cutoff lines.
>
> See
>
> /tuning-math/files/Erlich/dave3.gif
>
> It looks like going back to logarithmic error scaling would be better
> for seeing what is going on here . . . Matlab is chugging . . .

See /tuning-math/files/Erlich/dave4.gif

Your criterion (the cyan curve marked '75') allows in two commas with
about the complexity of the amity and orwell commas, but with about
the same error as diaschismic. I was hoping the numbers would be
readable on the latest graph; they aren't, but after all, they're
just numbers. But I don't know if anyone really wants to consider
adding these to their list at this point. I'd penalize complexity
more.

Anyway, it's not clear whether your function will allow in an
infinite number of commas. If we can find a similar function that is
b-e-p lower than a given epimericity contour (<1, right Gene?), then
we can be guaranteed that it gives a finite number, and perhaps find
them all.

🔗Gene Ward Smith <gwsmith@svpal.org>

1/8/2004 1:16:42 PM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
wrote:

> Anyway, it's not clear whether your function will allow in an
> infinite number of commas. If we can find a similar function that is
> b-e-p lower than a given epimericity contour (<1, right Gene?), then
> we can be guaranteed that it gives a finite number, and perhaps find
> them all.

Right. If Dave will give a list of commas he must have, we could
simply find out the maximum epimericity. Then we could produce a list
of commas with epimericity less than that, and see if there are any
commas on it Dave won't tolerate.
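
Epimericity isn't restated in this thread; the sketch below assumes the usual definition, log(n - d)/log(d) for a comma n/d > 1 in lowest terms, and should be read as an illustration only:

```python
from fractions import Fraction
from math import log

def epimericity(comma):
    """Assumed definition: log(n - d) / log(d) for a comma n/d > 1
    in lowest terms (requires d > 1)."""
    r = Fraction(comma)
    n, d = r.numerator, r.denominator
    return log(n - d) / log(d)

def max_epimericity(commas):
    """Maximum epimericity over a list of required commas,
    which could then be used to set the bound."""
    return max(epimericity(c) for c in commas)

# e.g. max_epimericity([Fraction(81, 80), Fraction(2048, 2025), Fraction(128, 125)])
```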

🔗Paul Erlich <perlich@aya.yale.edu>

1/8/2004 1:23:06 PM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
> wrote:
>
> > Anyway, it's not clear whether your function will allow in an
> > infinite number of commas. If we can find a similar function that is
> > b-e-p lower than a given epimericity contour (<1, right Gene?), then
> > we can be guaranteed that it gives a finite number, and perhaps find
> > them all.
>
> Right. If Dave will give a list of commas he must have, we could
> simply find out the maximum epimericity. Then we could produce a list
> of commas with epimericity less than that, and see if there are any
> commas on it Dave won't tolerate.

Obviously there will be! 2/1, 3/2, 4/3, 5/4, 6/5, 9/8, 10/9, and
16/15 automatically qualify.

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/8/2004 3:54:25 PM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...> wrote:
> --- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
> wrote:
> > --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
> > wrote:
> > > Continued from the tuning list.
> > > Paul:
> > > >With my (Tenney) complexity and (all-interval-Tenney-minimax) error
> > > >measures?
> > >
> > > With these it seems I need to scale the parameters to
> > > k=0.002
> > > p=0.5
> > > and max badness = 75
> > >
> > > where badness = complexity * exp((error/k)**p)
> > >
> > > I'd be very interested to see how that compares with your other
> > > cutoff lines.
> >
> > See
> >
> > /tuning-math/files/Erlich/dave3.gif
> >
> > It looks like going back to logarithmic error scaling would be better
> > for seeing what is going on here . . .

Yes. I like the log error scale.

> Matlab is chugging . . .
>
> See /tuning-math/files/Erlich/dave4.gif
>

Thanks very much for doing this Paul.

> Your criterion (the cyan curve marked '75') allows in two commas with
> about the complexity of the amity and orwell commas, but with about
> the same error as diaschismic.

Right. Well, it was just an estimated modification of the original
formula to suit your new complexity and error measures. It runs very
close to a lot of temperaments. I'd probably want to reduce the badness
cutoff slightly so they all fall outside it.

> I was hoping the numbers would be
> readable on the latest graph; they aren't, but after all, they're
> just numbers. But I don't know if anyone really wants to consider
> adding these to their list at this point. I'd penalize complexity
> more.

I'll go along with that. A bit more penalty for complexity.

> Anyway, it's not clear whether your function will allow in an
> infinite number of commas. If we can find a similar function that is
> b-e-p lower than a given epimericity contour (<1, right Gene?), then
> we can be guaranteed that it gives a finite number, and perhaps find
> them all.

It looks like I'd be just as happy with straight lines on this chart.
These should be easy enough to parameterise, e.g. by the point (possibly
off the chart) at which they all intersect. Then the badness is the
inverse of the slope, or some such.
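
A sketch of what such a straight-line cutoff might look like in (complexity, log error) coordinates; the pivot point and slope here are free parameters for illustration, not values anyone has proposed:

```python
from math import log

def passes_line_cutoff(complexity, error, pivot_complexity, pivot_log_error, slope):
    """Accept a temperament if it lies on or below the straight line through
    (pivot_complexity, pivot_log_error) with the given (negative) slope,
    plotted with complexity on one axis and log(error) on the other."""
    cutoff = pivot_log_error + slope * (complexity - pivot_complexity)
    return log(error) <= cutoff
```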

🔗Gene Ward Smith <gwsmith@svpal.org>

1/8/2004 6:47:36 PM

--- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
wrote:

> It looks like I'd be just as happy with straight lines on this chart.

Could you enlighten the rest of us and give a list of the commas you
want?

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/9/2004 6:30:54 AM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
> wrote:
>
> > It looks like I'd be just as happy with straight lines on this chart.
>
> Could you enlighten the rest of us and give a list of the commas you
> want?

I wouldn't want to include any outside the 5-limit linear temperaments
having the following 18 vanishing commas. And I wouldn't mind leaving
off the last four.

81/80
32805/32768
2048/2025
15625/15552
128/125
3125/3072
250/243
78732/78125
20000/19683
25/24
648/625
135/128
256/243
393216/390625

1600000/1594323
16875/16384
2109375/2097152
531441/524288

I'm afraid I disagree with Herman about including the temperament
where the apotome (2187/2048) vanishes.

I admit I haven't heard it. My rejection is based purely on the fact
that it has errors of a similar size to others that I find marginal
(as approximations of 5-limit JI) - pelogic (135/128) and
quintuple-thirds (Blackwood's decatonic) (256/243) - while also having
about 1.5 times their complexity.

The only argument I've heard in favour of it is that Blackwood wrote
something in 21-ET that sounds good. But does it sound good because it
approximates 5-limit harmony, or despite not approximating it?

🔗Gene Ward Smith <gwsmith@svpal.org>

1/9/2004 1:55:45 PM

--- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...> wrote:

> I wouldn't want to include any outside the 5-limit linear temperaments
> having the following 18 vanishing commas. And I wouldn't mind leaving
> off the last four.

Your wishes can be accommodated by setting bounds for size and
epimericity. For the short list, we have size < 93 cents and
epimericity < 0.62; the only 5-limit comma which would be added to
the list under these bounds is 1600000/1594323. Presumably
you have no objection to that, as it appears on your long list.

> 81/80
> 32805/32768
> 2048/2025
> 15625/15552
> 128/125
> 3125/3072
> 250/243
> 78732/78125
> 20000/19683
> 25/24
> 648/625
> 135/128
> 256/243
> 393216/390625

The long list has size < 93 and epimericity < 0.68. If we were to use
these bounds, we would add 6561/6250 and 20480/19683. The second of
these, 20480/19683, has epimericity 0.6757, which is a sliver higher
than the actual maximum epimericity of your long list, 0.6739, and so
setting the bound at 0.675 would leave it off. What do you make of the
6561/6250 comma? If you had no objection to letting it on to an
amended long list, you'd be in business there as well.

> 1600000/1594323
> 16875/16384
> 2109375/2097152
> 531441/524288
>
> I'm afraid I disagree with Herman about including the temperament
> where the apotome (2187/2048) vanishes.

I'd like to see Herman's list too.
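
For reference, the size/epimericity filtering described above can be brute-forced over small 3- and 5-exponents. This is only a sketch: the exponent bound is arbitrary, and the epimericity definition (log(n - d)/log(d)) is assumed rather than quoted from the thread:

```python
from fractions import Fraction
from math import log, log2

def octave_reduce(r):
    """Bring a ratio into the range [1, 2)."""
    while r >= 2:
        r /= 2
    while r < 1:
        r *= 2
    return r

def cents(r):
    return 1200 * log2(float(r))

def epimericity(r):
    n, d = r.numerator, r.denominator
    return log(n - d) / log(d)

def five_limit_commas(max_exp, max_cents, max_epimericity):
    """Octave-reduced 5-limit commas 3^a * 5^b (|a|, |b| <= max_exp)
    satisfying the given size and epimericity bounds."""
    found = set()
    for a in range(-max_exp, max_exp + 1):
        for b in range(-max_exp, max_exp + 1):
            if a == 0 and b == 0:
                continue
            r = octave_reduce(Fraction(3) ** a * Fraction(5) ** b)
            if r == 1:
                continue
            if cents(r) < max_cents and epimericity(r) < max_epimericity:
                found.add(r)
    return sorted(found, key=cents)

# e.g. five_limit_commas(max_exp=12, max_cents=93, max_epimericity=0.62)
```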

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/9/2004 3:45:00 PM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...> wrote:
>
> > I wouldn't want to include any outside the 5-limit linear temperaments
> > having the following 18 vanishing commas. And I wouldn't mind leaving
> > off the last four.
>
> Your wishes can be accommodated by setting bounds for size and
> epimericity. For the short list, we have size < 93 cents and
> epimericity < 0.62; the only 5-limit comma which would be added to
> the list under these bounds is 1600000/1594323. Presumably
> you have no objection to that, as it appears on your long list.

I could live with it, but I'd rather not.

> The long list has size < 93 and epimericity < 0.68. If we were to use
> these bounds, we would add 6561/6250 and 20480/19683. The second of
> these, 20480/19683, has epimericity 0.6757, which is a sliver higher
> than the actual maximum epimericity of your long list, 0.6739, and so
> setting the bound at 0.675 would leave it off. What do you make of the
> 6561/6250 comma? If you had no objection to letting it on to an
> amended long list, you'd be in business there as well.

I'd rather not.

What I don't like about both of these proposals is the "corners" in
the cutoff line. I prefer straight or smoothly curved cutoffs.

🔗Gene Ward Smith <gwsmith@svpal.org>

1/9/2004 4:48:02 PM

--- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
wrote:

> What I don't like about both of these proposals is the "corners" in
> the cutoff line. I prefer straight or smoothly curved cutoffs.

It gives you the commas on your list, but you reject it anyway
because it doesn't make use of your personal fetish about smooth
curves? You may be happy to know that the constant-epimericity lines
*are* curved on Paul's graph.

As for the rest, your obsession with curves is preposterous.

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/9/2004 6:26:09 PM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
> wrote:
>
> > What I don't like about both of these proposals is the "corners" in
> > the cutoff line. I prefer straight or smoothly curved cutoffs.
>
> It gives you the commas on your list, but you reject it anyway

Read again.
/tuning-math/message/8521
I said I could live with it.

> because it doesn't make use of your personal fetish about smooth
> curves? You may be happy to know that the constant-epimericity lines
> *are* curved on Paul's graph.
>
> As for the rest, your obsession with curves is preposterous.

It is neither a fetish nor an obsession, nor is it preposterous. (I
guess I asked for that :-) And note that I said a single straight line
would be fine.

But rather, it comes from an understanding of how neural nets work (as
in human perception).

🔗Paul Erlich <perlich@aya.yale.edu>

1/11/2004 1:57:09 PM

I don't like these two-curve boundaries when it's clear one simple
curve could do. I personally could do without 78732/78125 and
20000/19683, but not without 531441/524288.

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
wrote:
>
> > I wouldn't want to include any outside the 5-limit linear
> > temperaments having the following 18 vanishing commas. And I
> > wouldn't mind leaving off the last four.
>
> Your wishes can be accommodated by setting bounds for size and
> epimericity. For the short list, we have size < 93 cents and
> epimericity < 0.62; the only 5-limit comma which would be added to
> the list under these bounds is 1600000/1594323. Presumably
> you have no objection to that, as it appears on your long list.
>
> > 81/80
> > 32805/32768
> > 2048/2025
> > 15625/15552
> > 128/125
> > 3125/3072
> > 250/243
> > 78732/78125
> > 20000/19683
> > 25/24
> > 648/625
> > 135/128
> > 256/243
> > 393216/390625
>
> The long list has size < 93 and epimericity < 0.68. If we were to use
> these bounds, we would add 6561/6250 and 20480/19683. The second of
> these, 20480/19683, has epimericity 0.6757, which is a sliver higher
> than the actual maximum epimericity of your long list, 0.6739, and so
> setting the bound at 0.675 would leave it off. What do you make of the
> 6561/6250 comma? If you had no objection to letting it on to an
> amended long list, you'd be in business there as well.
>
> > 1600000/1594323
> > 16875/16384
> > 2109375/2097152
> > 531441/524288
> >
> > I'm afraid I disagree with Herman about including the temperament
> > where the apotome (2187/2048) vanishes.
>
> I'd like to see Herman's list too.

🔗Paul Erlich <perlich@aya.yale.edu>

1/11/2004 1:58:50 PM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Dave Keenan" <d.keenan@b...>
> wrote:
>
> > What I don't like about both of these proposals is the "corners" in
> > the cutoff line. I prefer straight or smoothly curved cutoffs.
>
> It gives you the commas on your list, but you reject it anyway
> because it doesn't make use of your personal fetish about smooth
> curves?

Uh-oh.

> You may be happy to know that the constant-epimericity lines
> *are* curved on Paul's graph.
>
> As for the rest, your obsession with curves is preposterous.

It may be time to run for the hills again :)

🔗Dave Keenan <d.keenan@bigpond.net.au>

1/11/2004 2:38:32 PM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...> wrote:
> --- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
> wrote:
> > It gives you the commas on your list, but you reject it anyway
> > because it doesn't make use of your personal fetish about smooth
> > curves?
>
> Uh-oh.
>
> > You may be happy to know that the constant-epimericity lines
> > *are* curved on Paul's graph.
> >
> > As for the rest, your obsession with curves is preposterous.
>
> It may be time to run for the hills again :)

Hee hee. Sorry to disappoint. :-)

🔗Gene Ward Smith <gwsmith@svpal.org>

1/11/2004 8:21:29 PM

--- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
wrote:

> I don't like these two-curve boundaries when it's clear one simple
> curve could do. I personally could do without 78732/78125 and
> 20000/19683, but not without 531441/524288.

Is this subjective, or can you quantify it?

🔗Paul Erlich <perlich@aya.yale.edu>

1/12/2004 11:42:23 AM

--- In tuning-math@yahoogroups.com, "Gene Ward Smith" <gwsmith@s...>
wrote:
> --- In tuning-math@yahoogroups.com, "Paul Erlich" <perlich@a...>
> wrote:
>
> > I don't like these two-curve boundaries when it's clear one simple
> > curve could do. I personally could do without 78732/78125 and
> > 20000/19683, but not without 531441/524288.
>
> Is this subjective, or can you quantify it?

Actually, it's absurd. I misremembered the four curves that I drew,
which was easy since no one has referred to them yet. What I really
meant (:)) was that probably all three of these should be in, or all
three should be out.