
Re: consistency in scales and lattices

🔗Robert C Valentine <BVAL@IIL.INTEL.COM>

5/7/2001 5:41:18 AM

> From: paul@stretch-music.com
> Re: consistency in scales and lattices (was DAVE KEENAN'S MIRACLE SCALE)
>
> --- In tuning@y..., Robert C Valentine <BVAL@I...> wrote:
> >
> > Ahah, I forgot to mention, this program calculates a minimum
> > while considering all rotations (modes) of the scale.
>
> Oh good.
>

Well, not necessarily. For instance, one might want to optimize the
seven-note diatonic without consideration of how complex the locrian
mode turned out. But that's the way I have it right now. (Originally
I looked at each mode separately, and will probably go back to that
when I have more intuition about how to cull the data.)

> One other point -- I'd be wary of inconsistency -- in 72-tET, this may
> occur when using ratios of 19 or higher odd numbers. In particular,
> if one interval falls into the bucket for a:b, and another
> interval falls into the bucket for b:c, you should make sure the
> bucket a:c contains no interval different from the "sum" of the
> first two intervals. Otherwise, a scale that seems good for dyads
> might fall apart when trying to construct chords.
>

And/or, things that may look a bit too spicy may in fact represent
something quite reasonable.

A flaw/feature of the program is that it takes the extreme RI
interpretation of every interval. As an example, 408 cents
currently falls into the 19/15 bucket. Tuning the program so
that 81/64 appears here instead makes the pockets around
the 'real' pockets (5/4, 3/2) much too narrow. So, though the
scale may well be consistent in stacking all its 3/2's, this
information is not communicated in the flattened scale; it only
communicates how that interval will be interpreted against the
current root. Of course, this just means that 81/64 and 19/15
are going to pun in the scales that this program analyzes, which
may be a reasonable assumption. [The program is not intended
to deal with a single scale of many notes, though a parent
scale or composite scale with a few transpositions may have
many notes. ("Many" here is > 20.)]
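
[Editorial note: Bob's program isn't shown; the 'bucket' lookup described above might be sketched roughly as follows. The function names and the denominator cutoff are assumptions, not his code.]

```python
from math import gcd, log2

def cents(n, d):
    # Size of the ratio n/d in cents.
    return 1200 * log2(n / d)

def nearest_ratio(interval_cents, max_den=21):
    # Hypothetical 'bucket' lookup: among ratios n/d within one octave
    # whose denominator stays small, return the one closest in cents.
    best = None
    for d in range(1, max_den + 1):
        for n in range(d, 2 * d + 1):
            if gcd(n, d) == 1:
                err = abs(cents(n, d) - interval_cents)
                if best is None or err < best[0]:
                    best = (err, n, d)
    return best[1], best[2]

# With small-ratio candidates only, 408 cents lands in the 19/15
# bucket (~409.2c); 81/64 (~407.8c) is closer but out of range.
```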

One way to approach this leads to the difference between prime
and odd limits. If the complexity function is rewritten to think
prime limit (for instance, complexity being the product of
unique prime factors and exponents, so that

81/64 => 3*4*2*6 = 144 < 19/15 => 19*3*5 = 285),

then a certain number of interval stacks will be optimized
prior to falling into a higher prime bucket. This in turn is
taking a more 'lattice-like' approach to optimizing the scale,
since to a certain extent, travelling along a low prime dimension
is preferable to adding a new dimension.
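
[Editorial note: the complexity measure described above, the product of each unique prime factor with its exponent, can be written out directly. A sketch; the function names are mine.]

```python
def prime_exponents(m):
    # Prime factorization of m as {prime: exponent}, by trial division.
    f, p = {}, 2
    while m > 1:
        while m % p == 0:
            f[p] = f.get(p, 0) + 1
            m //= p
        p += 1
    return f

def complexity(n, d):
    # Product of each unique prime factor and its exponent, taken
    # over both numerator and denominator, as described above.
    prod = 1
    for m in (n, d):
        for p, e in prime_exponents(m).items():
            prod *= p * e
    return prod

# 81/64 = 3^4 / 2^6  ->  (3*4) * (2*6) = 144
# 19/15 = 19 / (3*5) ->  19*1 * 3*1 * 5*1 = 285
```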

Disclaimer: I am NOT proposing anything here as a psychoacoustic
theory; it is just automating some search techniques through the
infinite tuning space.

Bob Valentine

🔗paul@stretch-music.com

5/7/2001 3:08:18 PM

--- In tuning@y..., Robert C Valentine <BVAL@I...> wrote:
> > From: paul@s...
> > Re: consistency in scales and lattices (was DAVE KEENAN'S MIRACLE SCALE)
> >
> > --- In tuning@y..., Robert C Valentine <BVAL@I...> wrote:
> > >
> > > Ahah, I forgot to mention, this program calculates a minimum
> > > while considering all rotations (modes) of the scale.
> >
> > Oh good.
> >
>
> Well, not necessarily. For instance, one might want to optimize the
> seven-note diatonic without consideration of how complex the locrian
> mode turned out.

I think you're missing my point. The way I see it, the major mode and
the locrian mode have _exactly the same consonances_, namely, seven
consonant thirds, seven consonant sixths, six consonant fifths, and
six consonant fourths. So in my view, a correct optimization would
give exactly the same result regardless of whether you started with
the locrian or the major mode.

> > One other point -- I'd be wary of inconsistency -- in 72-tET, this may
> > occur when using ratios of 19 or higher odd numbers. In particular,
> > if one interval falls into the bucket for a:b, and another
> > interval falls into the bucket for b:c, you should make sure the
> > bucket a:c contains no interval different from the "sum" of the
> > first two intervals. Otherwise, a scale that seems good for dyads
> > might fall apart when trying to construct chords.
> >
>
> And/or, things that may look a bit too spicy may in fact represent
> something quite reasonable.
>
> A flaw/feature of the program is that it takes the extreme RI
> interpretation of every interval. As an example, 408 cents
> currently falls into the 19/15 bucket. Tuning the program so
> that 81/64 appears here instead makes the pockets around
> the 'real' pockets (5/4, 3/2) much too narrow. So, though the
> scale may well be consistent in stacking all its 3/2's, this
> information is not communicated in the flattened scale; it only
> communicates how that interval will be interpreted against the
> current root. Of course, this just means that 81/64 and 19/15
> are going to pun in the scales that this program analyzes, which
> may be a reasonable assumption.

Of course. Punning does not imply inconsistency.

However, I'm a bit skeptical of a 19/15 'bucket' -- the most complex
interval I could tune by ear, in the most ideal conditions when I was
younger (and my ears more sensitive), was 17:13. However, 19:15 may
be tunable when a large otonal chord, such as 8:10:12:15:19, is the
context.

The harmonic entropy approach I mentioned, even though it doesn't
have local minima for 19:15 or even 17:13, does allow for ratios as
complex as 37:18 to be the most likely ratio-interpretation for a
given interval. However, the harmonic entropy graph doesn't get a
local minimum near this interval, since the second-most-likely, third-
most-likely, etc. interpretations are taken into account at
each point, smoothing the graph.
>
> One way to approach this leads to the difference between prime
> and odd limits. If the complexity function is rewritten to think
> prime limit (for instance, complexity being the product of
> unique prime factors and exponents, so that
>
> 81/64 => 3*4*2*6 = 144 < 19/15 => 19*3*5 = 285),
>
> then a certain number of interval stacks will be optimized
> prior to falling into a higher prime bucket. This in turn is
> taking a more 'lattice-like' approach to optimizing the scale,
> since to a certain extent, travelling along a low prime dimension
> is preferable to adding a new dimension.

Yes, but isn't it relevant HOW MANY STEPS along each prime dimension
you're travelling? I would analyze the above comparison as follows:

81/64 => 3*3*3*3*2*2*2*2*2*2 = 5184 > 19/15 => 19*3*5 = 285
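
[Editorial note: Paul's counting, multiplying in one copy of the prime per lattice step, can be sketched the same way. Names mine; for a ratio in lowest terms it reduces to n*d, the Tenney height.]

```python
def prime_factors(m):
    # All prime factors of m, with multiplicity, by trial division.
    out, p = [], 2
    while m > 1:
        while m % p == 0:
            out.append(p)
            m //= p
        p += 1
    return out

def step_counted_complexity(n, d):
    # Multiply in every prime factor of numerator and denominator,
    # counted with multiplicity, so four steps along the 3-axis
    # cost 3*3*3*3 rather than 3*4.
    prod = 1
    for p in prime_factors(n) + prime_factors(d):
        prod *= p
    return prod

# 81/64 -> 3^4 * 2^6 = 5184; 19/15 -> 19*3*5 = 285.
# For n/d in lowest terms this equals n*d.
```
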
>
> Disclaimer: I am NOT proposing anything here as a psychoacoustic
> theory; it is just automating some search techniques through the
> infinite tuning space.
>
Tenney does a nice job of unifying these matters, and I think it's
worth it to take both into account whenever possible.

🔗Robert C Valentine <BVAL@IIL.INTEL.COM>

5/8/2001 2:59:42 AM

> From: paul@stretch-music.com
> Subject: Re: consistency in scales and lattices
>
> > > > Ahah, I forgot to mention, this program calculates a minimum
> > > > while considering all rotations (modes) of the scale.
> > >
> > > Oh good.
> > >
> >
> > Well, not necessarily. For instance, one might want to optimize the
> > seven-note diatonic without consideration of how complex the locrian
> > mode turned out.
>
> I think you're missing my point. The way I see it, the major mode and
> the locrian mode have _exactly the same consonances_, namely, seven
> consonant thirds, seven consonant sixths, six consonant fifths, and
> six consonant fourths. So in my view, a correct optimization would
> give exactly the same result regardless of whether you started with
> the locrian or the major mode.

I'll buy that, and since that's the way I'm doing it, I don't even
have to rewrite my code. Nonetheless, as I do a better job with
the data-mining, I'll probably look at both scales with minimum
<*|*> and modes with minimum <*|*>. It may turn out they always
coincide, but I'm not sure.

[ <*|*> refers to the fact that whatever this measure I'm using
is, it is similar to harmonic entropy in that it deals with both
the complexity of the interval and its accuracy. There is another
element that I try to factor in, which is the maximum error. This
will relate to the general idea of consistency, but so far, I haven't
quite factored it in automatically. ]

> <good stuff snipped>
>
> The harmonic entropy approach I mentioned, even though it doesn't
> have local minima for 19:15 or even 17:13, does allow for ratios as
> complex as 37:18 to be the most likely ratio-interpretation for a
> given interval. However, the harmonic entropy graph doesn't get a
> local minimum near this interval, since the second-most-likely, third-
> most-likely, etc. interpretations are taken into account at
> each point, smoothing the graph.

My graph is much bumpier than yours, with a local minimum at any
interval that has minimal complexity at that point, and walls going
up from that point till they collide with a different bucket. If
I could just figure out the algorithm to produce the same graph
you did...

> >
> > One way to approach this leads to the difference between prime
> > and odd limits. If the complexity function is rewritten to think
> > prime limit (for instance, complexity being the product of
> > unique prime factors and exponents, so that
> >
> > 81/64 => 3*4*2*6 = 144 < 19/15 => 19*3*5 = 285),
> >
> > then a certain number of interval stacks will be optimized
> > prior to falling into a higher prime bucket. This in turn is
> > taking a more 'lattice-like' approach to optimizing the scale,
> > since to a certain extent, travelling along a low prime dimension
> > is preferable to adding a new dimension.
>
> Yes, but isn't it relevant HOW MANY STEPS along each prime dimension
> you're travelling? I would analyze the above comparison as follows:
>
> 81/64 => 3*3*3*3*2*2*2*2*2*2 = 5184 > 19/15 => 19*3*5 = 285
> >

I wrote :

"for instance, complexity being the product of
unique prime factors and exponents,"

81/64 = 3^4/2^6 => 3*4*2*6 = 144.

You can see that it includes both "primes" and "distance",
although I don't know if it will produce interesting results for
any ratios other than this case. (I suppose I should take the
root of the distance product for a Euclidean distance, or use
the sum for "Manhattan"... but that's a matter of refinement. Also,
one would want to consider the treatment of the number '2'...)
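
[Editorial note: one possible reading of the Euclidean-vs-Manhattan remark, as a sketch only. Bob's actual refinement is unspecified, and the log2(p) axis weights are an assumption.]

```python
from math import log2, sqrt

def monzo(n, d):
    # Exponent vector of n/d over its primes, e.g. 81/64 -> {2: -6, 3: 4}.
    acc = {}
    for m, sign in ((n, 1), (d, -1)):
        p = 2
        while m > 1:
            while m % p == 0:
                acc[p] = acc.get(p, 0) + sign
                m //= p
            p += 1
    return acc

def manhattan(n, d):
    # Sum of |exponent| * log2(p) over the prime axes; for n/d in
    # lowest terms this equals log2(n*d).
    return sum(abs(e) * log2(p) for p, e in monzo(n, d).items())

def euclidean(n, d):
    # Euclidean variant of the same weighting.
    return sqrt(sum((e * log2(p)) ** 2 for p, e in monzo(n, d).items()))
```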

> Tenney does a nice job of unifying these matters, and I think it's
> worth it to take both into account whenever possible.
>

Hacker-readable reference?

Congratulations on purchasing Neil's guitar. I decided to go with
the microtones-G&L in 31tet for other guitar-related reasons
(pickups and electronics, etc.). Hope I didn't make a big "oops"
here (like "oh, we do website design now").

Bob Valentine

🔗paul@stretch-music.com

5/8/2001 11:59:26 AM

--- In tuning@y..., Robert C Valentine <BVAL@I...> wrote:
>
> My graph is much bumpier than yours, with a local minimum at any
> interval that has minimal complexity at that point,

How is that defined?

> and walls going
> up from that point till they collide with a different bucket. If
> I could just figure out the algorithm to produce the same graph
> you did...

You haven't figured it out yet? Well, come over to
harmonic_entropy@yahoogroups.com and I am at your service!

> (I suppose I should take the
> root of the distance product for a Euclidean distance, or use
> the sum for "Manhattan"...

I use the "Manhattan" distance, where the length of one step along
each prime axis is log(p). This is Tenney's Harmonic Distance.
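
[Editorial note: in code, Tenney's harmonic distance as just defined collapses to a single logarithm for a ratio in lowest terms.]

```python
from math import log2

def harmonic_distance(n, d):
    # Tenney's harmonic distance: Manhattan length in the prime
    # lattice with log2(p) step weights; for n/d in lowest terms
    # this is just log2(n * d).
    return log2(n * d)

# harmonic_distance(3, 2) == log2(6), about 2.585
```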

> but that's a matter of refinement. Also,
> one would want to consider the treatment of the number '2'...)

Be very careful about that! Octave-equivalence does not simply mean
you drop all the 2s . . .
>
> > Tenney does a nice job of unifying these matters, and I think it's
> > worth it to take both into account whenever possible.
> >
>
> Hacker-readable reference?

Tenney's Harmonic Distance, defined above.
>
> Congratulations on purchasing Neil's guitar. I decided to go with
> the microtones-G&L in 31tet for other guitar-related reasons
> (pickups and electronics, etc.). Hope I didn't make a big "oops"
> here (like "oh, we do website design now").
>
Which G&L model did you get? I have a G&L S-500, which is my main
guitar, but I think I'm liking Neil's Carvin better (with its
splittable humbuckers, active electronics, flat radius, ebony
fingerboard, and maple top).