
Triadic HE

🔗martinsj013 <martinsj@...>

9/29/2010 5:58:42 AM

All,

1) Does anyone have access to Paul Erlich's outline of the Triadic HE calculation? I have Paul's outline for dyadic HE (which I think I found in this group's archive). Working from the latter, I have implemented something based on my best guesses (and approximations), but would like to see the real thing. One of my simplifications means that I get the same answer for 4:5:6 as for 10:12:15 - I assume that Paul's would not do this - views?

2) But perhaps I should not be asking here. Mike B mentioned triadic HE recently, and in particular he said he intends to post about it at the new "research" group. Is that the best place for it? There is an HE group, but is it still active? What's wrong with HE discussion here?

Steve M.

🔗Mike Battaglia <battaglia01@...>

9/29/2010 6:59:17 AM

On Wed, Sep 29, 2010 at 7:58 AM, martinsj013 <martinsj@...> wrote:
>
> All,
>
> 1) Does anyone have access to Paul Erlich's outline of the Triadic HE calculation? I have Paul's outline for dyadic HE (which I think I found in this group's archive). Based on the latter, I have implemented something based on my best guesses (and approximations), but would like to see the real thing. One of my simplifications means that I get the same answer for 4:5:6 as for 10:12:15 - I assume that Paul's would not do this - views?

How are you doing it? Paul said something to the effect that one of
the problems in calculating triadic HE was in generalizing mediants to
triads and such. I was going to try just doing it the "naive" way and
seeing what curve comes out. I also need to figure out if the
probability curve for the triad would just be a 2d Gaussian curve or
what not, or if that's been fleshed out.

> 2) But perhaps I should not be asking here. Mike B mentioned triadic HE recently, and in particular he said he intends to post about it at the new "research" group. Is that the best place for it? There is an HE group, but is it still active? What's wrong with HE discussion here?

It can go anywhere. I don't care. I'm just tired of the flamewars on this list.

-Mike

🔗Carl Lumma <carl@...>

9/29/2010 11:30:57 AM

Hi Steve!

> 1) Does anyone have access to Paul Erlich's outline of the
> Triadic HE calculation? I have Paul's outline for dyadic HE
> (which I think I found in this group's archive). Based on
> the latter, I have implemented something based on my best
> guesses (and approximations), but would like to see the real
> thing. One of my simplifications means that I get the same
> answer for 4:5:6 as for 10:12:15 - I assume that Paul's would
> not do this - views?

Yep, that's wrong. Here, to quote George Clinton, are
the main ingredients:

* Plot triads in 2-D with log-frequency axes at 60deg
(representing the two interior 'thirds').

* On each triad (at a resolution of, say, 1 cent in each
axis) center a bivariate Gaussian with standard dev. of 1%
in both directions (IIRC 1% of frequency, e.g. 17 cents).

* On the plot, identify all just intonation triads with
Tenney height < n.

* For each JI triad, get the value of the Gaussian above
and multiply by the cube root of the Tenney height. That
gives p_t for the JI triads.

* Compute the entropy by -Sum p_t log(p_t)

* Jack up n until your computer melts.

* Profit!

> 2) But perhaps I should not be asking here.

This is a great place to ask. Another good place is
Paul Erlich's facebook profile. :)

> Mike B mentioned triadic HE recently, and in particular
> he said he intends to post about it at the new "research"
> group.

Mike's working on the interesting idea of coming up with
a digital filter that will approximate harmonic entropy.
Then any number of notes (triads, tetrads, spectra of
arbitrary timbres and so on) could be computed in roughly
constant time. It appears Mike and I independently came
up with the idea around the same time.

> There is an HE group, but is it still active?

It is not active.

> What's wrong with HE discussion here?

Nothing! What ever gives you such an idea?

-Carl
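Carl's recipe above can be sketched end to end in a few lines. A minimal Python sketch, with some assumptions: Tenney height of a:b:c is taken as a*b*c, the distance is computed on plain 90-degree cents axes (the thread later refines this to the triangular metric), and the cube root is *divided* rather than multiplied, per Carl's correction further down the thread. The function names (`ji_triads`, `triadic_entropy`) and the small Tenney limit are illustrative only.

```python
import math
from math import gcd

def cents(r):
    # Interval size of a frequency ratio, in cents.
    return 1200.0 * math.log2(r)

def ji_triads(limit):
    # All triads a:b:c with a < b < c, gcd 1, and Tenney height a*b*c < limit.
    out = []
    a = 1
    while a * (a + 1) * (a + 2) < limit:
        b = a + 1
        while a * b * (b + 1) < limit:
            for c in range(b + 1, limit // (a * b) + 1):
                if a * b * c < limit and gcd(gcd(a, b), c) == 1:
                    out.append((a, b, c))
            b += 1
        a += 1
    return out

def triadic_entropy(lower, upper, triads, s=17.0):
    # lower, upper: the chord's two interior thirds, in cents.
    ps = []
    for a, b, c in triads:
        dx = cents(b / a) - lower
        dy = cents(c / b) - upper
        g = math.exp(-(dx * dx + dy * dy) / (2.0 * s * s))
        # Divide by the cube root of the Tenney height -- the approximate
        # area 'belonging' to this JI triad -- per Carl's correction.
        ps.append(g / (a * b * c) ** (1.0 / 3.0))
    total = sum(ps)
    ps = [p / total for p in ps]                  # normalize so Sum p_t = 1
    return -sum(p * math.log(p) for p in ps if p > 0)

triads = ji_triads(10000)
e_maj = triadic_entropy(cents(5 / 4), cents(6 / 5), triads)   # 4:5:6
e_min = triadic_entropy(cents(6 / 5), cents(5 / 4), triads)   # 10:12:15
print(e_maj, e_min)
```

With the height weighting in place, 4:5:6 and 10:12:15 no longer come out equal, which addresses Steve's original worry.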

🔗Carl Lumma <carl@...>

9/29/2010 11:35:13 AM

Mike wrote:

> Paul said something to the effect that one of
> the problems in calculating triadic HE was in generalizing
> mediants to triads and such.

That was solved long ago. There are no known problems
remaining with triadic h.e.

> 2d Gaussian curve or
> what not, or if that's been fleshed out.

See my latest.

-Carl

🔗Carl Lumma <carl@...>

9/29/2010 5:49:42 PM

I wrote:

> * For each JI triad, get the value of the Gaussian above
> and multiply by the cube root of the Tenney height.

Sorry, the *reciprocal* of the cube root of the Tenney height.

-Carl

🔗cameron <misterbobro@...>

9/29/2010 2:26:10 PM

Is harmonic entropy about psychoacoustics or about music?


🔗Carl Lumma <carl@...>

9/29/2010 7:16:42 PM

Cameron wrote:

> Is harmonic entropy about psychoacoustics or about music?

Psychoacoustics.

We used a few bits of psychoacoustics to formulate the regular
mapping paradigm, which in turn spits out generalizations of
common practice music theory. Well not quite; it just spits
out scales and chords and such, not rules about voice leading
or sonata form or such.

Of course you don't need any of it to make music. Even if you
happen to be interested in generalizations of common practice
music, you can come up with them without knowing about regular
mapping (in fact many did, which is confirmation of the
approach... Robert Valentine came up with what we now call
valentine temperament, Herman Miller with what we call porcupine
temperament, etc).

-Carl

🔗martinsj013 <martinsj@...>

9/30/2010 2:03:51 AM

--- Carl wrote [but I added the letters and noted Carl's correction]:
> A) Plot triads in 2-D with log-frequency axes at 60deg
> (representing the two interior 'thirds').
> B) On each triad (at a resolution of, say, 1 cent in each
> axis) center a bivariate Gaussian with standard dev. of 1%
> in both directions (IIRC 1% of frequency, e.g. 17 cents).
> C) On the plot, identify all just intonation triads with
> Tenney height < n.
> D) For each JI triad, get the value of the Gaussian above
> and DIVIDE by the cube root of the Tenney height. That
> gives p_t for the JI triads.
> E) Compute the entropy by -Sum p_t log(p_t)
> F) Jack up n until your computer melts.

Carl,
thank you. I had A, B, E. I knew I needed to do C but did something simpler to start with. My only question is re D:
* is "dividing by the cube root of the Tenney height" a way of approximating the area of the plot "belonging" to that JI triad? Is it proven or just surmised? I did something more complicated (but not necessarily more accurate).

Also re D, I assume that p_t needs to be normalised such that the sum p_t = 1.

Steve M.

🔗Carl Lumma <carl@...>

9/30/2010 12:26:36 PM

Hi Steve,

> * is "dividing by the cube root of the Tenney height" a way of
> approximating the area of the plot "belonging" to that JI triad?
> Is it proven or just surmised?

It's been empirically demonstrated for dyads and triads.

> Also re D, I assume that p_t needs to be normalised such that
> the sum p_t = 1.

Correct.

-Carl

🔗martinsj013 <martinsj@...>

10/2/2010 2:34:17 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > * is "dividing by the cube root of the Tenney height" a way of
> > approximating the area of the plot "belonging" to that JI triad?
> > Is it proven or just surmised?
> It's been empirically demonstrated for dyads and triads.

Carl, thank you. I now have some plausible-looking results but no time to work out how to present them yet - will do soon.

Steve M.

🔗Mike Battaglia <battaglia01@...>

10/2/2010 2:42:39 PM

On Thu, Sep 30, 2010 at 2:26 PM, Carl Lumma <carl@...> wrote:
> > * is "dividing by the cube root of the Tenney height" a way of
> > approximating the area of the plot "belonging" to that JI triad?
> > Is it proven or just surmised?
>
> It's been empirically demonstrated for dyads and triads.

How so? Through listening tests here?

-Mike

🔗Mike Battaglia <battaglia01@...>

10/2/2010 2:51:44 PM

On Wed, Sep 29, 2010 at 1:30 PM, Carl Lumma <carl@...> wrote:
>
> * Plot triads in 2-D with log-frequency axes at 60deg
> (representing the two interior 'thirds').

I had planned to do log-frequency axes at 90deg, and have them both
represent the low-middle dyad and the low-high dyad. This seems to be
saying to have the axes at 60deg and have them represent the
low-middle dyad and the middle-high dyad. What's the advantage of the
second way vs the first way?

> * On each triad (at a resolution of, say, 1 cent in each
> axis) center a bivariate Gaussian with standard dev. of 1%
> in both directions (IIRC 1% of frequency, e.g. 17 cents).

I'm confused by the phrase "1% of frequency, e.g. 17 cents." 1% of
frequency seems like it should be something in Hz. You mean that the
Gaussian is logarithmically distributed, right? So if a normal
Gaussian is G(x), this would be G(e^x) or something, so that it ends
up looking "straight" on a log-axis.

-Mike
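On the "1% of frequency, e.g. 17 cents" point: a fixed percentage deviation in frequency corresponds to a fixed offset in cents at any register, which is why a standard deviation specified as 1% of frequency turns into an ordinary Gaussian on a cents axis. A quick check (the `cents` helper is just the usual conversion):

```python
import math

def cents(ratio):
    # 1200 * log2 converts a frequency ratio to cents.
    return 1200.0 * math.log2(ratio)

# A 1% frequency deviation is ~17.2 cents regardless of register:
print(cents(1.01))              # the bare 1% ratio
print(cents(202.0 / 200.0))     # 1% deviation around 200 Hz -- same value
print(cents(4040.0 / 4000.0))   # 1% deviation around 4 kHz -- same value
```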

🔗Carl Lumma <carl@...>

10/2/2010 3:38:49 PM

Mike wrote:

> > It's been empirically demonstrated for dyads and triads.
>
> How so? Through listening tests here?

It's proportional to the mediant-mediant widths for dyads,
and Voronoi cell areas for triads. -Carl
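The dyadic half of that claim is easy to check numerically. A sketch, assuming a Tenney series n*d < 10000 restricted to ratios strictly between 1/1 and 2/1: each ratio's "cell" runs between the mediants with its immediate neighbors, and width times sqrt(n*d) should come out roughly constant if the 1/sqrt(n*d) area rule holds.

```python
import math
from fractions import Fraction

def cents(x):
    return 1200.0 * math.log2(float(x))

N = 10000
# Reduced dyads n/d strictly inside the octave with Tenney height n*d < N.
ratios = sorted(Fraction(n, d)
                for d in range(2, math.isqrt(N) + 1)
                for n in range(d + 1, 2 * d)
                if n * d < N and math.gcd(n, d) == 1)

def mediant(a, b):
    # Freshman sum: the classic cell boundary between Farey neighbors.
    return Fraction(a.numerator + b.numerator, a.denominator + b.denominator)

for r in (Fraction(3, 2), Fraction(5, 4), Fraction(7, 4)):
    i = ratios.index(r)
    lo, hi = mediant(ratios[i - 1], r), mediant(r, ratios[i + 1])
    width = cents(hi) - cents(lo)
    # width * sqrt(n*d) should be roughly the same number for each ratio:
    print(r, round(width, 2), round(width * math.sqrt(r.numerator * r.denominator), 1))
```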

🔗Carl Lumma <carl@...>

10/3/2010 1:54:37 AM

Mike wrote:

> I had planned to do log-frequency axes at 90deg,

That doesn't work; since the axes are orthogonal, you don't
get the outer dyad. Read Paul's The Geometry of Triangular
Plots for more info.

> So if a normal Gaussian is G(x), this would be G(e^x) or
> something, so that it ends up looking "straight" on a log-axis.

I think it's like this...

List JI triads as 3-tuples of integers, e.g. [4,5,6] for 4:5:6.
Let j be an index returning a triad from this list.

Let t be the actual chord in [3rd,3rd] format, e.g. [400,300]
for the 12-ET major triad.

Then

p(j|t) = A/geomean(j) * exp(-(x^2 + y^2)/2s^2)

where

x = cents(j[1]/j[0]) - t[0]
y = cents(j[2]/j[1]) - t[1]

and A is a scaling factor such that

Sum p(j|t) = 1
j

and s is the standard deviation in cents (e.g. 17 for 1%).

Then at last

entropy(t) = - Sum (p(j|t) log(p(j|t)))
j

What do you think?

-Carl

🔗Mike Battaglia <battaglia01@...>

10/3/2010 10:11:21 AM

Carl wrote:
> What do you think?

I think it looks fine. As for the first part, I still don't get why
just creating orthogonal axes and having them represent the l-m dyad
and the l-h dyad wouldn't work though. It seems more intuitive for me
to find a major triad by looking for 386 cents and 702 cents than
looking for 386 cents and 316 cents.

-Mike

🔗genewardsmith <genewardsmith@...>

10/3/2010 10:31:51 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> Carl wrote:
> > What do you think?
>
> I think it looks fine. As for the first part, I still don't get why
> just creating orthogonal axes and having them represent the l-m dyad
> and the l-h dyad wouldn't work though.

Or we could use Euclidean interval space.

🔗Mike Battaglia <battaglia01@...>

10/3/2010 11:13:45 AM

On Sun, Oct 3, 2010 at 12:31 PM, genewardsmith
<genewardsmith@...> wrote:
>
> --- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> >
> > Carl wrote:
> > > What do you think?
> >
> > I think it looks fine. As for the first part, I still don't get why
> > just creating orthogonal axes and having them represent the l-m dyad
> > and the l-h dyad wouldn't work though.
>
> Or we could use Euclidean interval space.

I assume that's different than what I said...? If so, how?

-Mike

🔗genewardsmith <genewardsmith@...>

10/3/2010 11:39:21 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> On Sun, Oct 3, 2010 at 12:31 PM, genewardsmith
> <genewardsmith@...> wrote:

> > Or we could use Euclidean interval space.
>
> I assume that's different than what I said...? If so, how?

http://xenharmonic.wikispaces.com/Monzos+and+Interval+Space

🔗Mike Battaglia <battaglia01@...>

10/3/2010 2:56:42 PM

On Sun, Oct 3, 2010 at 1:39 PM, genewardsmith
<genewardsmith@...> wrote:
>
> --- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> >
> > On Sun, Oct 3, 2010 at 12:31 PM, genewardsmith
> > <genewardsmith@...> wrote:
>
> > > Or we could use Euclidean interval space.
> >
> > I assume that's different than what I said...? If so, how?
>
> http://xenharmonic.wikispaces.com/Monzos+and+Interval+Space

OK, but how does that apply here? We need to be using a space for
triads, and the whole point of this is that the triad space we choose
is going to have to deal with irrational numbers too.

-Mike

🔗Carl Lumma <carl@...>

10/3/2010 3:18:10 PM

Mike wrote:

> > What do you think?
>
> I think it looks fine.

Actually I think I may have screwed up. The distance
function (from j to t) I gave assumes 90deg axes I think...

> As for the first part, I still don't get
> why just creating orthogonal axes and having them represent the
> l-m dyad and the l-h dyad wouldn't work though. It seems more
> intuitive for me to find a major triad by looking for 386 cents
> and 702 cents than looking for 386 cents and 316 cents.

With a triangular mapping it doesn't matter which two
dyads you use. With a rectangular one, it does.

-Carl

🔗Mike Battaglia <battaglia01@...>

10/3/2010 3:24:03 PM

On Sun, Oct 3, 2010 at 5:18 PM, Carl Lumma <carl@...> wrote:
>
> With a triangular mapping it doesn't matter which two
> dyads you use. With a rectangular one, it does.
>
> -Carl

OK, so let's say you're analyzing two 3/2 fifths on top of each other.
So you go to 702 cents on the horizontal x' axis, and you go to 702
cents on the slanted y' axis. Let's say 702 cents is like an inch on
your computer screen, so you're effectively going to the point that's
an inch to the right and then an inch up and to the right from that at
a 60 degree angle. How does the 1404 cent dyad appear here, in a way
that it won't on a 90 deg plot?

-Mike

🔗Carl Lumma <carl@...>

10/3/2010 8:41:57 PM

Mike wrote

> OK, so let's say you're analyzing two 3/2 fifths on top of each other.
> So you go to 702 cents on the horizontal x' axis, and you go to 702
> cents on the slanted y' axis. Let's say 702 cents is like an inch on
> your computer screen, so you're effectively going to the point that's
> an inch to the right and then an inch up and to the right from that at
> a 60 degree angle. How does the 1404 cent dyad appear here, in a way
> that it won't on a 90 deg plot?

Ibid. -Carl

🔗Mike Battaglia <battaglia01@...>

10/3/2010 8:52:08 PM

On Sun, Oct 3, 2010 at 10:41 PM, Carl Lumma <carl@...> wrote:
>
> Mike wrote
>
> > OK, so let's say you're analyzing two 3/2 fifths on top of each other.
> > So you go to 702 cents on the horizontal x' axis, and you go to 702
> > cents on the slanted y' axis. Let's say 702 cents is like an inch on
> > your computer screen, so you're effectively going to the point that's
> > an inch to the right and then an inch up and to the right from that at
> > a 60 degree angle. How does the 1404 cent dyad appear here, in a way
> > that it won't on a 90 deg plot?
>
> Ibid. -Carl

I've read it twice now, I still don't really understand. Can you
comment on my specific example?

-Mike

🔗Mike Battaglia <battaglia01@...>

10/3/2010 8:59:31 PM

On Sun, Oct 3, 2010 at 10:52 PM, Mike Battaglia <battaglia01@...> wrote:
> I've read it twice now, I still don't really understand. Can you
> comment on my specific example?

I think I get it now. The problem comes from your initial description:
you said they should be at 60 degrees to each other, but if you do it
like that, it won't work. If the axes are 120 degrees away from each
other (e.g. one is up and to the left and one is straight to the
right), then I see what happens.

-Mike

🔗Carl Lumma <carl@...>

10/3/2010 11:22:53 PM

Mike wrote:

> I think I get it now. The problem comes from your initial
> description: you said they should be at 60 degrees to each
> other, but if you do it like that, it won't work. If the
> axes are 120 degrees away from each other (e.g. one is up
> and to the left and one is straight to the right), then I
> see what happens.

Have a look at Paul's hexagonal plots

/tuning/files/PaulErlich/trimap.jpg

-Carl

🔗martinsj013 <martinsj@...>

10/4/2010 6:04:28 AM

Mike B> As for the first part, I still don't get
> > why just creating orthogonal axes and having them represent the
> > l-m dyad and the l-h dyad wouldn't work though. It seems more
> > intuitive for me to find a major triad by looking for 386 cents
> > and 702 cents than looking for 386 cents and 316 cents.

Carl L> With a triangular mapping it doesn't matter which two
> dyads you use. With a rectangular one, it does.

Sorry I missed this discussion so far; I have had no time this weekend to read the forum or to prepare any graphs of my results.

The way I am seeing this is:
* in a triad a:b:c, I call a:b "lower", b:c "upper" and a:c "outer".
* converting to cents, I have x and y for the lower and upper (i.e. like Carl, and not Mike B); and z for the outer.
* z = x+y is not independent of x and y; the possible values of (x,y,z) lie on an inclined plane (subspace of the full 3D x,y,z space) (which I assume is equivalent to triad space which is new to me).
* the distances within this space incorporate a contribution from all three dyads within the triad, and are therefore what is needed for the calculation.
* distances crop up in two places - the Gaussian function "height" depends on the distance between the actual chord and the candidate JI chord; and this function needs to be integrated over an area "belonging" to the candidate JI chord.
* initially I thought it was going to be essential to "plot" the points in this space in order to determine the neighbours and hence the Voronoi cells correctly; however, with the area approximation given by Carl, there is no need to do this.
* therefore, I simply express the distance in terms of the x and y differences - in effect, use x^2 + y^2 + (x+y)^2 in the Gaussian.

How does this sound?

Steve M.

🔗Carl Lumma <carl@...>

10/4/2010 11:38:32 AM

Steve wrote:

> * distances crop up in two places - the Gaussian function "height"
> depends on the distance between the actual chord and the candidate
> JI chord; and this function needs to be integrated over an area
> "belonging" to the candidate JI chord.

It's the former I think I got wrong. The latter is resolved
because we estimate this integral by multiplying the 'height'
of the Gaussian at the location of the candidate by the inverse
geomean of the candidate's identities (which approximates the
area 'belonging' to it).

> * initially I thought it was going to be essential to "plot"
> the points in this space in order to determine the neighbours
> and hence the Voronoi cells correctly; however, with the area
> approximation given by Carl, there is no need to do this.

Right. However it is still a good plot to view the results,
e.g. in a heat map or 3-D landscape plot.

> * therefore, I simply express the distance in terms of the
> x and y differences - in effect, use x^2 + y^2 + (x+y)^2 in
> the Gaussian.
>
> How does this sound?

I will have to review The Geometry of Triangular Plots
myself... how did you get the (x+y)^2 term?

-Carl

🔗Carl Lumma <carl@...>

10/4/2010 11:39:38 PM

Hi Steve (& Mike),

> * therefore, I simply express the distance in terms of the
> x and y differences - in effect, use x^2 + y^2 + (x+y)^2 in
> the Gaussian.

I think we need to use this (2y'+x')/sqrt(3) thing

http://lumma.org/tuning/erlich/2000.10.05.TheGeometryOfTriangularPlots.txt

Going back to the expression I gave before

/tuning/topicId_93392.html#93468
> p(j|t) = A/geomean(j) * exp(-(x^2 + y^2)/2s^2)
> where
> x = cents(j[1]/j[0]) - t[0]
> y = cents(j[2]/j[1]) - t[1]

We change it to

p(j|t) = A/geomean(j) * exp(-(@x^2 + @y^2)/2s^2)

where

@x = jx - tx
@y = jy - ty

jx = cents(j[1]/j[0])
tx = t[0]
jy' = cents(j[2]/j[1])
ty' = t[1]
jy = (2jy'+jx)/sqrt(3)
ty = (2ty'+tx)/sqrt(3)

No?

-Carl
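Carl's change of coordinates can be packaged as a small function. A sketch assuming his convention (x unchanged, y mapped to (2y'+x')/sqrt(3)); the names `to_plane` and `dist` are illustrative only:

```python
import math

def to_plane(lower, upper):
    # Map (lower third, upper third) in cents to Cartesian coordinates in
    # which ordinary Euclidean distance realizes the triangular metric:
    # x stays put, y becomes (2*upper + lower)/sqrt(3).
    return lower, (2.0 * upper + lower) / math.sqrt(3.0)

def dist(t1, t2):
    (x1, y1), (x2, y2) = to_plane(*t1), to_plane(*t2)
    return math.hypot(x1 - x2, y1 - y2)

# Under this metric the squared distance expands to
# (4/3)*(dx'^2 + dx'*dy' + dy'^2), so all three dyads contribute and the
# lower and upper thirds are treated alike:
print(dist((0, 0), (100, 0)))    # pure lower-third mistuning
print(dist((0, 0), (0, 100)))    # pure upper-third mistuning: same size
```

A plain 90-degree treatment of (lower, upper) would give those two mistunings the same size but score mixed mistunings differently, which is the asymmetry Carl is flagging.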

🔗martinsj013 <martinsj@...>

10/6/2010 10:11:11 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> I will have to review The Geometry of Triangular Plots
> myself... how did you get the (x+y)^2 term?

Carl (sorry for delay),
it is simply z^2; remember that I am using 3 perpendicular axes x,y,z - so my x,y are not the same as in the GofTP post (thanks for re-posting it). I have reviewed it myself and am now pretty convinced that mine is equivalent; my inclined plane is equivalent to a stretch by sqrt(3) along the line y=x so that a square maps to a parallelogram with angles 60deg and 120deg, etc ...

I do have some results, but no way of displaying them nicely. Also, I seem to see the HE tailing off as the outer interval approaches 2/1 - a sign of not enough JI points "plotted", in my experience. What Tenney limit would you consider the minimum to obtain decent results?

Steve M.

🔗martinsj013 <martinsj@...>

10/6/2010 10:16:30 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... I think we need to use this (2y'+x')/sqrt(3) thing
> ... p(j|t) = A/geomean(j) * exp(-(@x^2 + @y^2)/2s^2)
> where
> @x = jx - tx
> @y = jy - ty
> jx = cents(j[1]/j[0])
> tx = t[0]
> jy' = cents(j[2]/j[1])
> ty' = t[1]
> jy = (2jy'+jx)/sqrt(3)
> ty = (2ty'+tx)/sqrt(3)
>
> No?

Please see my very recent post - I think my calculation is equivalent, but will check ...

🔗Carl Lumma <carl@...>

10/6/2010 3:24:56 PM

Hi Steve,

> I have reviewed it myself and am now pretty convinced that
> mine is equivalent; my inclined plane is equivalent to a
> stretch by sqrt(3) along the line y=x so that a square maps
> to a parallelogram with angles 60deg and 120deg, etc ...

Ok, sounds promising.

> I do have some results, but no way of displaying them nicely.

Can you give us a top 20 list of most consonant chords?
Eventually, we want to list the local minima and maxima.
What programming language are you using?

> Also, I seem to see the HE tailing off as the outer interval
> approaches 2/1 - a sign of not enough JI points "plotted", in
> my experience. What Tenney limit would you consider the minimum
> to obtain decent results?

Paul used n*d < 10,000. That would suggest a*b*c < 1,000,000.
Ideally, start with 1000 and double it each round until the
results stop changing.

-Carl

🔗Mike Battaglia <battaglia01@...>

10/6/2010 3:40:23 PM

On Wed, Oct 6, 2010 at 6:24 PM, Carl Lumma <carl@...> wrote:
>
> > I do have some results, but no way of displaying them nicely.
>
> Can you give us a top 20 list of most consonant chords?
> Eventually, we want to list the local minima and maxima.
> What programming language are you using?

Or post the code... I really want to see if the triadic case
calculated "properly" matches up to the Gaussian convolution approach.

-Mike

🔗Carl Lumma <carl@...>

10/7/2010 12:50:30 AM

Mike wrote:

> Or post the code... I really want to see if the triadic case
> calculated "properly" matches up to the Gaussian convolution
> approach.

Have you done dyadic comparisons?

-Carl

🔗martinsj013 <martinsj@...>

10/7/2010 3:11:45 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > ... I think we need to use this (2y'+x')/sqrt(3) thing
> > ... p(j|t) = A/geomean(j) * exp(-(@x^2 + @y^2)/2s^2)
> Please see my very recent post - I think my calculation is equivalent, but will check ...

Using the notation of The Geometry of Triangular Plots...
(x' and y' are the lower and upper intervals; x and y are where it is plotted in triadic space)

distance^2 of mapped point (x,y) from the origin:
x^2 + y^2 = x'^2 + ((2y'+x')/sqrt(3))^2
= x'^2 + (4y'^2+4x'y'+x'^2)/3
= 4/3*(x'^2+x'y'+y'^2)

My version: x'^2+ y'^2+ (x'+y')^2
= 2*(x'^2+x'y'+y'^2)

differing only by a constant factor.

Now I think this makes a difference - in effect my "s" is smaller.

Steve M.
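Steve's algebra checks out numerically. A quick sketch (the names `carl_d2` and `steve_d2` are mine; `xp`, `yp` stand for his x', y'):

```python
import math
import random

def carl_d2(xp, yp):
    # x'^2 + ((2y' + x')/sqrt(3))^2, from the Geometry of Triangular Plots mapping.
    return xp ** 2 + ((2 * yp + xp) / math.sqrt(3)) ** 2

def steve_d2(xp, yp):
    # x'^2 + y'^2 + (x' + y')^2, i.e. contributions from all three dyads.
    return xp ** 2 + yp ** 2 + (xp + yp) ** 2

# Both reduce to multiples of (x'^2 + x'y' + y'^2); the ratio is the
# constant 2 / (4/3) = 1.5 at every nonzero point:
random.seed(0)
for _ in range(3):
    xp, yp = random.uniform(-500, 500), random.uniform(-500, 500)
    print(round(steve_d2(xp, yp) / carl_d2(xp, yp), 9))
```

So the two metrics differ only by the factor 1.5: plugging Steve's form into exp(-d^2/2s^2) with the same s is equivalent to using Carl's metric with s divided by sqrt(1.5) (about 1.22), which is exactly his "in effect my 's' is smaller".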

🔗Mike Battaglia <battaglia01@...>

10/7/2010 6:18:07 AM

On Thu, Oct 7, 2010 at 2:50 AM, Carl Lumma <carl@...> wrote:
>
> Mike wrote:
>
> > Or post the code... I really want to see if the triadic case
> > calculated "properly" matches up to the Gaussian convolution
> > approach.
>
> Have you done dyadic comparisons?
>
> -Carl

Yes, they're posted in my files. They look identical to the HE curves
except they have a general upward slope because I fludged up the Farey
Series generator. I got those curves by just plotting all of the
rationals in Farey Series x in freq/log space and giving them a height
of sqrt(n*d), then convolving the signal with a logarithmically skewed
Gaussian (that would look linear on a log plot), and then doing some
-xlogx of something calculation that flipped it upside down and turned
it into something that looks pretty much like HE. Paul's not convinced
it's entirely mathematically equivalent (I think it might be), but
it's close enough that it's certainly useful as a model, and if it's
also as close for the triadic case then to extend it to tetrads,
pentads, etc would be trivial.

-Mike

🔗Carl Lumma <carl@...>

10/7/2010 10:43:07 AM

Mike wrote:

> > Have you done dyadic comparisons?
>
> Yes, they're posted in my files. They look identical to the HE
> curves except they have a general upward slope because I fludged
> up the Farey Series generator.

They're a bit hard for me to read with all the overlapping
numbers. The Farey series formulation itself is known to
have an upward slope, and N=100 is bound to cause problems too.

> it's entirely mathematically equivalent (I think it might be),
> but it's close enough that it's certainly useful as a model,
> and if it's also as close for the triadic case then to extend
> it to tetrads, pentads, etc would be trivial.

Yes, understood. It might be good to see how well you can
reproduce the dyadic case first.

-Carl

🔗Mike Battaglia <battaglia01@...>

10/7/2010 12:34:10 PM

On Thu, Oct 7, 2010 at 12:43 PM, Carl Lumma <carl@...> wrote:
>
> > it's entirely mathematically equivalent (I think it might be),
> > but it's close enough that it's certainly useful as a model,
> > and if it's also as close for the triadic case then to extend
> > it to tetrads, pentads, etc would be trivial.
>
> Yes, understood. It might be good to see how well you can
> reproduce the dyadic case first.

I've just lost all interest, really, and would rather spend time
working on the Laplace transform, especially since you and Steve seem to
be handling the triadic entropy part of all of this. But if anyone's
interested, I'll post up the code when I get back to America (I
apparently didn't copy it right. Sacre bleu!).

Conceptually, it just makes the opposite set of assumptions as HE -
instead of giving the incoming dyad a Gaussian curve, and the basis
dyads "rectangular" domains of height 1 and width sqrt(n*d), it gives
the basis dyads each "Gaussian" domains height sqrt(n*d) and a
standard deviation of s, and treats the basis dyad as an impulse in
dyad space.

-Mike

🔗Carl Lumma <carl@...>

10/7/2010 12:46:58 PM

Mike wrote:

> > Yes, understood. It might be good to see how well you can
> > reproduce the dyadic case first.
>
> I've just lost all interest, really, and would rather spend
> time working the Laplace transform, especially since you and
> Steve seem to be handling the triadic entropy part of all
> of this. But if anyone's interested, I'll post up the code
> when I get back to America (I apparently didn't copy it right.
> Sacre bleu!).
>
> Conceptually, it just makes the opposite set of assumptions
> as HE - instead of giving the incoming dyad a Gaussian curve,
> and the basis dyads "rectangular" domains of height 1 and
> width sqrt(n*d), it gives the basis dyads each "Gaussian"
> domains height sqrt(n*d) and a standard deviation of s, and
> treats the basis dyad as an impulse in dyad space.

I think it's a very promising approach.

-Carl

🔗martinsj013 <martinsj@...>

10/10/2010 4:38:28 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... Can you give us a top 20 list of most consonant chords?
> Eventually, we want to list the local minima and maxima.

I have uploaded some files to a SteveMartin folder, including:

* 610 local minima. These were found in a grid of 14641 pts (lower interval 0-1200cents x upper interval 0-1200cents, increments of 10 cents). Tenney limit for the JI triads was only 10K; also, the minima at the edge of the grid could be spurious.

* some cross-sections of the surface plot, e.g. for lower=386c, or lower+upper=702c. As these have fewer points (e.g. 1200) they could be done at a higher Tenney limit (65536) without taking hours.

> What programming language are you using? ...

Perl, but I am thinking of re-writing it in C.

> Paul used n*d < 10,000. That would suggest a*b*c < 1,000,000.
> Ideally, start with 1000 and double it each round until the
> results stop changing.

Is it known that they will stop changing? It may be that the entropy continues to increase as N increases. I could look into this. However, my current idea is to use the local minima found above as a starting point to search for refined minima using a much higher Tenney limit (I have the list for 16M but have not tried to use it yet!).

Comments welcome.

Steve M.

🔗Carl Lumma <carl@...>

10/10/2010 7:31:55 PM

> I have uploaded some files to a SteveMartin folder, including:

Fantastic, Steve! I saw the cross-sections earlier and they
all looked reasonable.

I don't know what the mystery object is.

HE-means.png says Mann series, but on the graph it says n*d.
Mann series is n+d, no? At any rate, I can't tell if minima/
maxima are going away or just scaling out of sight. You can
compare to:
/tuning/files/PaulErlich/manuel2.jpg

On HE-mediants.png, the Tenney limit is probably too low.
At any rate, compare:
/tuning/files/PaulErlich/tenney/tcmp3.jpg

tminima.png has 90deg axes, which is unfortunate but I
understand if it's a limitation of your tools.

Thanks for tminima.csv, I'll look at it more closely
later tonight.

> * 610 local minima. These were found in a grid of 14641 pts
> (lower interval 0-1200cents x upper interval 0-1200cents,
> increments of 10 cents). Tenney limit for the JI triads was
> only 10K; also, the minima at the edge of the grid could be
> spurious.

1200-ET resolution would be ideal but in the meantime you
might try an ET near 120 with better JI approximations, like
118 or 121.

> > What programming language are you using? ...
>
> Perl, but I am thinking of re-writing it in C.

I was hoping you'd say something that I knew of a good
scientific computing/graphing library for... :)

> > Paul used n*d < 10,000. That would suggest a*b*c < 1,000,000.
> > Ideally, start with 1000 and double it each round until the
> > results stop changing.
>
> Is it known that they will stop changing? It may be that
> the entropy continues to increase as N increases.

That may be true, but the relative rankings of the minima
and maxima should stabilize.

One thing about computing harmonic entropy, even if it
takes hours, is that once you get it right you never have
to do it again. It's only gonna be 1.5MB or something.

-Carl

🔗Carl Lumma <carl@...>

10/12/2010 1:12:59 AM

Steve wrote:

> * 610 local minima. These were found in a grid of 14641 pts
> (lower interval 0-1200cents x upper interval 0-1200cents,
> increments of 10 cents). Tenney limit for the JI triads was
> only 10K; also, the minima at the edge of the grid could be
> spurious.

Congratulations on being the first person to compute triadic
harmonic entropy!

Your csv file was separated by semicolons instead of commas
for some reason, but this was easily fixed. I also had to
change the fwd slashes to semicolons to stop Excel from
formatting the chords like dates or times (what a frustrating
'feature'!).

I'm not seeing the magic chords I might expect, such as 490-490,
but this isn't damning and it needs further examination.

What are the "ct" and "pmax" columns?

-Carl

🔗martinsj013 <martinsj@...>

10/12/2010 4:53:37 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... HE-means.png says Mann series, but on the graph it says n*d.
> Mann series is n+d. ? At any rate, I can't tell if minima/
> maxima are going away or just scaling out of sight. You can
> compare to:
> /tuning/files/PaulErlich/manuel2.jpg

Carl thanks for the comments and the links to diagrams which I'd not found before.

re. "Mann" - my mistake, for some reason I thought that was the accepted name for the type of series we are using. Just to confirm, both in the dyadic and the triadic case I am using a set of ratios defined by the limit on Tenney height, n*d or a*b*c respectively. If that's wrong, I'd better throw away the results! (but not the code, which can be easily fixed).

> On HE-mediants.png, the Tenney limit is probably too low.
> At any rate, compare:
/tuning/files/PaulErlich/tenney/tcmp3.jpg

Well spotted, that was a quick look to see how results compared; I need to redo it with a higher limit; at this low limit anyway, the "definition" is much better with mediants than with means. Of course, with the triadic case, I am using neither (strictly).

> tminima.png has 90deg axes, which is unfortunate but I
> understand if it's a limitation of your tools.

I can project the results onto the other axes, once I have studied the layout in the link you gave.

> 1200-ET resolution would be ideal but in the meantime you
> might try an ET near 120 with better JI approximations, like
> 118 or 121.

Processing ...

> I was hoping you'd say something that I knew of a good
> scientific computing/graphing library for... :)

such as? I may be able to oblige. Or, do you have a tool that can process a CSV or Excel file? Are we talking open source or commercial? I assume you are thinking of surface charts and the like?

Steve M.

🔗martinsj013 <martinsj@...>

10/12/2010 5:12:59 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Congratulations on being the first person to compute triadic
> harmonic entropy!

Thank you! (Am I really?)

> Your csv file was separated by semicolons instead of commas
> for some reason, but this was easily fixed. I also had to
> change the fwd slashes to semicolons to stop Excel from
> formatting the chords like dates or times (what a frustrating
> 'feature'!).

I'll try to prevent that in future.

> I'm not seeing the magic chords I might expect, such as 490-490,
> but this isn't damning and it needs further examination.

Thanks, another line of exploration ...

> What are the "ct" and "pmax" columns?

Right, should have explained these additional/intermediate outputs of the calculation: for each point x,y: "ct" is the number of JI triads (from the "Tenney" set) giving a probability > 0.01, "pmax" is the largest probability amongst them, and the last column identifies which JI triad has this pmax. (I had these for the dyadic case, to give me some understanding of what was going on, and kept them for the triadic case. Are they of interest / are there any others you'd like to see?)
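To illustrate where those columns come from, here is the shape of my probability step in Python (my real code is Perl). The unnormalized 2D Gaussian weighting and s = 17 cents are my own simplifications, not necessarily Paul's real thing:

```python
import math

def triad_cents(a, b, c):
    """(lower, upper) intervals of a:b:c in cents: lower = b/a, upper = c/b."""
    cents = lambda r: 1200 * math.log2(r)
    return cents(b / a), cents(c / b)

def he_at(x, y, triads, s=17.0):
    """Entropy at (lower, upper) = (x, y) cents, plus ct and pmax.
    Each JI triad gets an unnormalized 2D Gaussian weight by distance
    (an assumed simplification); weights are normalized to probabilities."""
    weights = []
    for a, b, c in triads:
        tx, ty = triad_cents(a, b, c)
        d2 = (x - tx) ** 2 + (y - ty) ** 2
        weights.append(math.exp(-d2 / (2 * s * s)))
    total = sum(weights)
    probs = [w / total for w in weights]
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    ct = sum(1 for p in probs if p > 0.01)   # triads with probability > 0.01
    pmax = max(probs)                        # largest single probability
    return entropy, ct, pmax
```

On a point sitting exactly on a simple JI triad, pmax is near 1, ct is 1, and the entropy is near zero; between triads the probability spreads out and the entropy rises.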

Steve M.

🔗Chris Vaisvil <chrisvaisvil@...>

10/12/2010 5:28:14 AM

Hi!

I missed the CSV file link - or is it in the yahoo files section?
I'd love to see it. And congratulations!

Chris

On Tue, Oct 12, 2010 at 8:12 AM, martinsj013 <martinsj@...> wrote:
>
>
>
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > Congratulations on being the first person to compute triadic
> > harmonic entropy!
>
> Thank you! (Am I really?)
>

🔗cityoftheasleep <igliashon@...>

10/12/2010 10:11:17 AM

Yes, a link to the file would be appreciated!

-Igs

--- In tuning@yahoogroups.com, Chris Vaisvil <chrisvaisvil@...> wrote:
>
> Hi!
>
> I missed the CSV file link - or is it in the yahoo files section?
> I'd love to see it. And congratulations!
>
> Chris
>
> On Tue, Oct 12, 2010 at 8:12 AM, martinsj013 <martinsj@...> wrote:
> >
> >
> >
> > --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > > Congratulations on being the first person to compute triadic
> > > harmonic entropy!
> >
> > Thank you! (Am I really?)
> >
>

🔗Carl Lumma <carl@...>

10/12/2010 10:53:05 AM

/tuning/files/SteveMartin/

-Carl

--- In tuning@yahoogroups.com, "cityoftheasleep" <igliashon@...> wrote:
>
> Yes, a link to the file would be appreciated!
>
> -Igs
>
> --- In tuning@yahoogroups.com, Chris Vaisvil <chrisvaisvil@> wrote:
> >
> > Hi!
> >
> > I missed the CSV file link - or is it in the yahoo files section?
> > I'd love to see it. And congratulations!
> >
> > Chris

🔗Carl Lumma <carl@...>

10/12/2010 11:36:38 AM

Steve wrote:

> > Congratulations on being the first person to compute triadic
> > harmonic entropy!
>
> Thank you! (Am I really?)

Yes!

> re. "Mann" - my mistake, for some reason I thought that was
> the accepted name for the type of series we are using. Just
> to confirm, both in the dyadic and the triadic case I am using
> a set of ratios defined by the limit on Tenney height, n*d or
> a*b*c respectively. If that's wrong, I'd better throw away
> the results! (but not the code, which can be easily fixed).

You're doing it right. That's called (informally) a Tenney
series (in the dyadic case) or Tenney limit (generally).
Mann is n+d. It was tested and found to be inferior to n*d.

> > tminima.png has 90deg axes, which is unfortunate but I
> > understand if it's a limitation of your tools.
>
> I can project the results onto the other axes, once I have
> studied the layout in the link you gave.

Great!

> > I was hoping you'd say something that I knew of a good
> > scientific computing/graphing library for... :)
>
> such as?

You seem to be using a spreadsheet for plotting. Had you
said python, I could have suggested matplotlib, chaco, or
RPy. But I don't know from perl.

> Or, do you have a tool that can process a CSV or Excel file?

I'm using Excel to read the data, but I wouldn't use it
to plot the data.

> > What are the "ct" and "pmax" columns?
>
> Right, should have explained these additional/intermediate
> outputs of the calculation: for each point x,y: "ct" is the
> number of JI triads (from the "Tenney" set) giving a
> probability > 0.01, "pmax" is the largest probability amongst
> them, and the last column identifies which JI triad has
> this pmax. (I had these for the dyadic case, to give me
> some understanding of what was going on, and kept them for
> the triadic case. Are they of interest / are there any
> others you'd like to see?)

Aha, great work. They're of interest.

-Carl

🔗genewardsmith <genewardsmith@...>

10/12/2010 11:09:22 AM

--- In tuning@yahoogroups.com, "cityoftheasleep" <igliashon@...> wrote:
>
> Yes, a link to the file would be appreciated!

Something with results in a form other than an image file would be nice. What are the best 100 triads, or something.

🔗Carl Lumma <carl@...>

10/12/2010 11:45:18 AM

Gene wrote:

> Something with results in a form other than an image file
> would be nice. What are the best 100 triads, or something.

All the local minima are in tminima.csv, along with their
entropies, so they can be sorted. But it isn't a valid
csv file. I've placed an Excel version in my folder if
that helps:

/tuning/files/CarlLumma/tminima.xls

I'm giving Steve some breathing time before hitting him up
for the maxima. :)

-Carl

🔗Mike Battaglia <battaglia01@...>

10/12/2010 12:09:53 PM

On Tue, Oct 12, 2010 at 7:12 AM, martinsj013 <martinsj@...> wrote:
>
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > Congratulations on being the first person to compute triadic
> > harmonic entropy!
>
> Thank you! (Am I really?)

Sure are. Now, a question. You said "what is this graph?"

http://f1.grp.yahoofs.com/v1/IKK0TLZpk9LFdJKqxR-AxUUNcU1zgpHY9P2z_8Chn7OlzHSflvqOVPl9wVYcy7FjLkT4oc6YqQUHMdcxZuJHpPse0EPGggI/SteveMartin/f1024.png

Looks like a dyadic width vs complexity plot, where complexity might
be sqrt(n*d) or mediant to mediant widths, etc. Is that what this is
here? Either way, my approach was to first generate this graph and
then lowpass it by convolving it with a Gaussian... the lowpass
effectively creates "fields of attraction," then you flip it upside
down, etc.
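Concretely, something like this numpy sketch (the heights and s here are placeholders, not a worked-out complexity measure):

```python
import numpy as np

def attraction_field(ji_cents, heights, grid, s=17.0):
    """Sum of Gaussian bumps, one per JI point: equivalent to convolving a
    weighted spike train with a Gaussian lowpass kernel, creating
    'fields of attraction' around each ratio."""
    field = np.zeros_like(grid, dtype=float)
    for c, h in zip(ji_cents, heights):
        field += h * np.exp(-((grid - c) ** 2) / (2.0 * s * s))
    return field
```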

How did you generate this graph?

-Mike

🔗Chris Vaisvil <chrisvaisvil@...>

10/12/2010 1:04:25 PM

Thanks!

ok, I get it - this is great!!!

What we need now is a rotatable 3-d surface projection!

Chris

On Tue, Oct 12, 2010 at 1:53 PM, Carl Lumma <carl@...> wrote:

>
>
> /tuning/files/SteveMartin/
>
> -Carl
>
>
> ---
>

🔗cityoftheasleep <igliashon@...>

10/12/2010 1:37:35 PM

Wow, results NOT as expected! Never would have guessed that 4:5:7 has such lower entropy than 4:5:6! Also, it is VERY interesting to me that the minima are almost never at exact Just ratios. VERY interesting.

Will be very curious about the maxima!

-Igs

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> Gene wrote:
>
> > Something with results in a form other than an image file
> > would be nice. What are the best 100 triads, or something.
>
> All the local minima are in tminima.csv, along with their
> entropies, so they can be sorted. But it isn't a valid
> csv file. I've placed an Excel version in my folder if
> that helps:
>
> /tuning/files/CarlLumma/tminima.xls
>
> I'm giving Steve some breathing time before hitting him up
> for the maxima. :)
>
> -Carl
>

🔗Carl Lumma <carl@...>

10/12/2010 1:52:35 PM

Hi Igs,

> Also, it
> is VERY interesting to me that the minima are almost
> never at exact Just ratios. VERY interesting.

Can't be, since Steve is (so far) using a step size of
10 cents.

> Wow, results NOT as expected! Never would have guessed
> that 4:5:7 has such lower entropy than 4:5:6!

Yeah, that would be wrong. I haven't had time to do much
checking yet. And as I said, this is still at the formative
stage. ... Looks like it may be an artifact of the
resolution, since 7:4 happens to be close to a multiple
of 10 cents. ...

-Carl

🔗Graham Breed <gbreed@...>

10/13/2010 12:43:21 AM

On 12 October 2010 22:36, Carl Lumma <carl@...> wrote:

> You seem to be using a spreadsheet for plotting.  Had you
> said python, I could have suggested matplotlib, chaco, or
> RPy.  But I don't know from perl.
>
>> Or, do you have a tool that can process a CSV or Excel file?
>
> I'm using Excel to read the data, but I wouldn't use it
> to plot the data.

There is a CSV module for Python.

Graham

🔗martinsj013 <martinsj@...>

10/14/2010 4:52:18 AM

Igs> ... VERY interesting to me that the minima are almost never at exact Just ratios. VERY interesting.
Carl> ... Steve is (so far) using a step size of 10 cents.
Igs> Never would have guessed that 4:5:7 has such lower entropy than 4:5:6!
Carl> Yeah, that would be wrong. I haven't had time to do much checking yet. And as I said, this is still at the formative stage. ...

Agreed; in addition to the step-size of 10 cents (and the possibility that there is a fatal flaw in my implementation), I should reiterate that the minima list was done using a very low Tenney limit. I have been distracted a bit by other possibilities (e.g. using a method of descents to refine the minima), but I think my next task is simply to redo the same grid with a higher Tenney limit. Will do when I can ...

Steve M.

🔗Chris Vaisvil <chrisvaisvil@...>

10/14/2010 5:02:01 AM

It is possible that you introduced an offset somewhere.

Normalization with a known might correct that. For instance a Just major
chord should be a minimum. If you shift your data to show that the rest of
it may "snap into place".

Chris

On Thu, Oct 14, 2010 at 7:52 AM, martinsj013 <martinsj@lycos.com> wrote:

>
>
> Igs> ... VERY interesting to me that the minima are almost never at exact
> Just ratios. VERY interesting.
> Carl> ... Steve is (so far) using a step size of 10 cents.
> Igs> Never would have guessed that 4:5:7 has such lower entropy than 4:5:6!
> Carl> Yeah, that would be wrong. I haven't had time to do much checking
> yet. And as I said, this is still at the formative stage. ...
>
> Agreed; in addition to the step-size of 10 cents (and the possibility that
> there is a fatal flaw in my implementation), I should re-iterate that the
> minima list was done using a very low Tenney limit. I have been distracted a
> bit by other possibilities (e.g using method of descents to refine the
> minima), but I think my next task is simply redo the same grid with a higher
> Tenney limit. Will do when I can ...
>
> Steve M.
>
>
>

🔗martinsj013 <martinsj@...>

10/14/2010 5:14:47 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> Looks like a dyadic width vs complexity plot, where complexity might
> be sqrt(n*d) or mediant to mediant widths, etc. Is that what this is
> here? Either way, my approach was to first generate this graph and
> then lowpass it by convolving it with a Gaussian... the lowpass
> effectively creates "fields of attraction," then you flip it upside
> down, etc.
> > How did you generate this graph?

Mike,
it is indeed mediant-to-mediant widths (blue) and mean-to-mean widths ditto (red) plotted against dyad interval cents. Illustrating two things that may already have been obvious to some:
1) that they differ significantly only for the *neighbours* of the simplest ratios.
2) that this simple measure could be a useful complexity measure (indeed you said it above) offering something similar to HE.
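A Python sketch of how the mediant-to-mediant widths (the blue curve) can be computed, using the dyadic n*d limit discussed above (my actual script is Perl; the mean-to-mean version is analogous):

```python
from fractions import Fraction
import math

def tenney_series(limit):
    """Reduced ratios n/d in [1, 2] with n*d <= limit, sorted by size."""
    out = []
    for d in range(1, int(limit ** 0.5) + 1):
        for n in range(d, 2 * d + 1):
            if n * d <= limit and math.gcd(n, d) == 1:
                out.append(Fraction(n, d))
    return sorted(out)

def mediant_widths(ratios):
    """Width in cents between the mediants flanking each interior ratio."""
    cents = lambda q: 1200 * math.log2(q)
    widths = {}
    for prev, cur, nxt in zip(ratios, ratios[1:], ratios[2:]):
        lo = Fraction(prev.numerator + cur.numerator,
                      prev.denominator + cur.denominator)
        hi = Fraction(cur.numerator + nxt.numerator,
                      cur.denominator + nxt.denominator)
        widths[cur] = cents(hi) - cents(lo)
    return widths
```

Simple ratios like 3/2 get wide flanking mediants, hence large widths; that is what makes the width usable as a (crude) inverse complexity measure.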

Steve M.

🔗martinsj013 <martinsj@...>

10/14/2010 5:27:35 AM

--- In tuning@yahoogroups.com, Chris Vaisvil <chrisvaisvil@...> wrote:
> It is possible that you introduced an offset somewhere.
> Normalization with a known might correct that. For instance a Just major
> chord should be a minimum. If you shift your data to show that the rest of
> it may "snap into place".

Thanks for the thought, Chris, but I don't think so. A lot of the reported local minima involve the unison and the octave, which seems correct, and 700 cents which is very close to 3/2. Note also that 0:386:702 is not one of the plotted points; but 0:380:700, 0:390:700, 0:380:710 and 0:390:710 are, and one of them turns out to be a local minimum *on this grid*. Hence my talk of refining the grid, either in a targeted way, or in a "sledgehammer" way. Note also that the minimum may not be precisely at the Just chord - as already seen in dyadic HE, I believe.

Steve M.

🔗martinsj013 <martinsj@...>

10/18/2010 5:43:47 AM

>> Igs> Never would have guessed that 4:5:7 has such lower entropy than 4:5:6!
> Carl> Yeah, that would be wrong. I haven't had time to do much checking yet. And as I said, this is still at the formative stage. ...
Steve> ... the minima list was done using a very low Tenney limit. ...

I've not had much time for this, but some progress to report:
* I have prettified the pictures of the minima and maxima, and the lists are now proper CSVs (see Files / SteveMartin).
* I have refined the locations, by iteration, to a 1-cent grid - will upload the lists and new diagram when I can. The results near the edges of the diagram are still unreliable, for reasons I think I understand but have not fixed yet.
* 4:5:7 is still lower than 4:5:6 even on this grid (Tenney limit still 10K) - so I have redone the calc for just these locations with a Tenney limit of 64K, and then sure enough 4:5:7 is higher than 4:5:6.
* The trouble is that the calculations take so long (I am not using the most powerful PC)! And the last result above shows that trying to use a low Tenney limit is no good.

🔗Carl Lumma <carl@...>

10/18/2010 12:04:02 PM

> I've not had much time for this, but some progress to report:
> * I have prettified the pictures of the minima and maxima, and
> the lists are now proper CSVs (see Files / SteveMartin).
> * I have refined the locations, by iteration, to a 1-cent
> grid - will upload the lists and new diagram when I can. The
> results near the edges of the diagram are still unreliable,
> for reasons I think I understand but have not fixed yet.
> * 4:5:7 is still lower than 4:5:6 even on this grid (Tenney
> limit still 10K) - so I have redone the calc for just these
> locations with a Tenney limit of 64K, and then sure enough
> 4:5:7 is higher than 4:5:6.
> * The trouble is the calculations take so long (I am not using
> the most powerful PC)! and the last result above shows that
> trying to use a low Tenney limit is no good.

Excellent work as always Steve!

Tenney limits include many pathological triads, such as
1:2:5000. Since you're only going up to 1200+1200,
I assume these aren't slowing you down...

It is desirable to get to Tenney limit 1,000,000. At my
old job I had access to very fast machines. Now, just my
dual-core laptop (2008 vintage).

tmax2 and tmin2 have 1-cent resolution. Still Tenney = 10K?
And what do the column headers mean?

Thanks!

-Carl

🔗genewardsmith <genewardsmith@...>

10/18/2010 1:13:20 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:

> Excellent work as always Steve!

Sounds like it, but I'm still having trouble attempting to interpret this data. A Top 20 list posted here would be nice.

🔗martinsj013 <martinsj@...>

10/18/2010 3:10:29 PM

--- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
> A Top 20 list posted here would be nice.

OK but I stretched it to 30 ...

Lower Upper Triadic HE (Tenney 10K) approximate JI triad
-2 0 0.000000705412988722 1/1/1
1198 0 0.000242561620198951 1/2/2
-3 1202 0.00345878494540176 1/1/2
17 1192 0.00346120500244273 1/1/2
1202 700 0.00401834430083239 1/2/3
1201 1199 0.0304371308344252 1/2/4
706 1198 0.0407338793692221 2/3/6
705 497 0.0531527624437082 2/3/4
703 -3 0.0813516159500647 2/3/3
700 3 0.0813564666013889 2/3/3
701 887 0.0858430846795844 2/3/5
499 386 0.109669630036789 3/4/5
-5 704 0.169929711786534 2/2/3
15 694 0.169959373951981 2/2/3
881 584 0.195752224808331 3/5/7
499 1202 0.199678729963811 3/4/8
384 584 0.209450441664418 4/5/7
496 702 0.258238579495058 3/4/6
1196 388 0.265836042679753 2/4/5
886 1017 0.277640367888065 3/5/9
387 316 0.31341752153555 4/5/6
886 1199 0.325566952045427 3/5/10
703 266 0.356931201836506 4/6/7
966 437 0.408472614497952 4/7/9
319 496 0.4220164342477 5/6/8
885 815 0.423290978323534 3/5/8
701 1050 0.430589493220335 4/6/11
497 968 0.439087870167509 3/4/7
389 1198 0.45110534880223 4/5/10
583 231 0.453281456560949 5/7/8

Steve.

🔗martinsj013 <martinsj@...>

10/18/2010 3:14:49 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Tenney limits include many pathological triads, such as
> 1:2:5000. Since you're only going up to 1200+1200,
> I assume these aren't slowing you down...

Correct, I exclude these.

> tmax2 and tmin2 have 1-cent resolution. Still Tenney = 10K?

Yes, unfortunately.

> And what do the column headers mean?

Sorry I was in a rush. As I am again now, so:
the old (10cent step) location - lower then upper
Triadic HE for that (old) point
JI triad with highest probability for that (old) point
the new (1 cent step) location - lower then upper
Triadic HE for the new point.

HTH,
Steve.

🔗genewardsmith <genewardsmith@...>

10/18/2010 4:01:51 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> --- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@> wrote:
> > A Top 20 list posted here would be nice.
>
> OK but I stretched it to 30 ...

How does one take "lower" and "upper" and convert that to a triad?

🔗Carl Lumma <carl@...>

10/18/2010 6:33:03 PM

Gene wrote:

> > OK but I stretched it to 30 ...
>
> How does one take "lower" and "upper" and convert that to a triad?

In triad a:b:c, b/a is "lower" and c/b is "upper"
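In code form (Python, rounding to the nearest cent):

```python
import math

def triad_to_cents(a, b, c):
    """Convert a:b:c to (lower, upper) in cents: lower = b/a, upper = c/b."""
    cents = lambda r: 1200 * math.log2(r)
    return round(cents(b / a)), round(cents(c / b))
```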

I'm not sure where the negative numbers are coming from...

-Carl

🔗Carl Lumma <carl@...>

10/18/2010 6:40:34 PM

Steve wrote:

> > And what do the column headers mean?
>
> Sorry I was in a rush. As I am again now, so:
> the old (10cent step) location - lower then upper
> Triadic HE for that (old) point
> JI triad with highest probability for that (old) point
> the new (1 cent step) location - lower then upper
> Triadic HE for the new point.

Singing a song of longing for JIT(new)... :) -Carl

🔗martinsj013 <martinsj@...>

10/20/2010 10:09:29 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> In triad a:b:c, b/a is "lower" and c/b is "upper"
> I'm not sure where the negative numbers are coming from...

Short answer: edge effects. Longer answer:
Initially I calculated HE at points (x,y) on a grid 0..1200 step 10; found the local minima; then refined the location of each minimum by a search method; on occasion this resulted in x or y straying outside the original bounds. This is an artifact of the way I have done the calculation and needs to be fixed.

I believe that the function should be symmetrical about the lines x=0 and y=0 (and x+y=0 because we are really in triad space) and therefore any minimum near these lines should really be exactly on the line (unless there really is a pair straddling the line).
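These images can be generated by permuting the three notes of the triad; a quick Python check:

```python
from itertools import permutations

def triad_images(x, y):
    """The six (lower, upper) interval-pair images of a triad with intervals
    x and y, obtained by reordering its three notes (0, x, x+y)."""
    notes = (0, x, x + y)
    return sorted({(b - a, c - b) for a, b, c in permutations(notes)})
```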

In my attempts to limit the number of JI triads considered for each point (x,y) I may have ended up with not enough points outside the grid area. I am working on this, but any comments are welcome.

Steve M.

🔗martinsj013 <martinsj@...>

10/28/2010 4:05:47 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
> In my attempts to limit the number of JI triads considered for each point (x,y) I may have ended up with not enough points outside the grid area. I am working on this, but any comments are welcome.

Time for an update: I am aiming to fulfil Carl's request for a recalculation of the grid but using a Tenney limit of 1M. Good news: it is running now. Bad news: It has been running for 24 hours already and nowhere near finished!

Steve M.

🔗Carl Lumma <carl@...>

10/28/2010 10:30:23 AM

Steve wrote:

> --- In tuning@yahoogroups.com, "martinsj013" <martinsj@> wrote:
> > In my attempts to limit the number of JI triads considered for
> > each point (x,y) I may have ended up with not enough points
> > outside the grid area. I am working on this, but any comments
> > are welcome.
>
> Time for an update: I am aiming to fulfil Carl's request for a
> recalculation of the grid but using a Tenney limit of 1M. Good
> news: it is running now. Bad news: It has been running for
> 24 hours already and nowhere near finished!

Now that's science! :)

-Carl

🔗martinsj013 <martinsj@...>

10/31/2010 1:17:23 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
> I am aiming to fulfil Carl's request for a recalculation of the grid but using a Tenney limit of 1M. Good news: it is running now. Bad news: It has been running for 24 hours already and nowhere near finished!

Finally, I have updated a list of the local minima and local maxima for a Tenney limit of 1 million on the same grid as before (lower and upper interval both varying over 0-1200 cents in 10 cent steps). Please see
/tuning/files/SteveMartin/tmin_1M.csv and
/tuning/files/SteveMartin/tmax_1M.csv

Now running: refinement of the above lists to an accuracy of 1 cent.

Steve M.

🔗Carl Lumma <carl@...>

11/1/2010 7:16:56 PM

> Finally, I have updated a list of the local minima and local
> maxima for a Tenney limit of 1 million on the same grid as before
> (lower and upper interval both varying over 0-1200 cents in
> 10 cent steps).

Woohoo!

And thanks for creating the friendly links.

Did you find a bottleneck or just wait for it? How long did
it finally take?

> Now running: refinement of the above lists to an accuracy of 1 cent.

Exciting!

-Carl

🔗martinsj013 <martinsj@...>

11/2/2010 2:44:30 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Did you find a bottleneck or just wait for it? How long did
> it finally take?

Carl, and all,
I just waited for it. I didn't keep an accurate note of anything, but my poor little laptop took about 72 hours to calculate about 36,000 points (about a quarter of the grid). I then deployed my quadcore desktop (usually monopolised by my son, but he went away on a trip :-). One copy of the Perl program could only use one CPU (Task Manager reports "25%") but I ran three in parallel (getting Task Manager to report "75%"). Still it took about a further 48 hours.

Recall there were about 610 local minima at N=10K, and about the same number of maxima; now at N=1M there are 368 minima and about 400 maxima. The refinement calculation seems to typically involve say 40 points per minimum, so this will probably take about 48 hours as well!

I am sure the code is not particularly efficient; and I've not had time to re-write in C which I assume would be better. I explored cutting down on the number of JI triads to be compared to each grid point but this seemed to affect the results too much (i.e. points that I would have expected to be too far away to influence the result, seem in fact to influence the result). Settled for a largely brute force approach. Before I started the N=1M calcs, I did some sanity checks on results at N=10K. For example, 4/5/6 and its five "images" in triad space all have the same T.H.E. (or very nearly):

(use Courier font)
Lower Upper T.H.E. ct pmax JI@pmax
386, 316, 0.315979862507562, 11, 0.939027954830041, "4/5/6"
702, -316, 0.31597986250756, 11, 0.939027954830041, "4/6/5"
316, -702, 0.315979862507559, 11, 0.939027954830041, "5/6/4"
-316, -386, 0.31597986250756, 11, 0.939027954830041, "6/5/4"
-702, 386, 0.315979862507559, 11, 0.939027954830041, "6/4/5"
-386, 702, 0.31597986250756, 11, 0.939027954830041, "5/4/6"

My observations so far:
* a lot of the top minima have one interval of 0 cents, or 1200 cents - more so at N=1M than at N=10K it seems to me.
* refined minima do now seem to closely approximate JI intervals to the nearest 1 cent - more so than before.
* the order of the minima still does not follow strict Tenney height order - e.g. 5/5/9 lower than 5/5/8 lower than 5/5/7; and others less obvious.
* OTOH 4/5/6 is now lower than 4/5/7 (as previously advertised!), but by very little indeed (and note, both are further down the list than they were before).
* interesting that there's a clump of consecutive minima all sharing the 4/5 interval - not sure if this is significant or a coincidence. I can see other such pairs too, but maybe nothing in this.
* a lot of the top maxima have an interval around 30 cents, or 1170 cents.

The refined mins and maxs are churning out slowly; not finished yet but here is how far they've got:
/tuning/files/SteveMartin/tmin2_1M.csv
/tuning/files/SteveMartin/tmax2_1M.csv
Columns:
Lower, Upper, JI triad with pmax, (unrefined lower, upper), T.H.E.

Steve M.

🔗Carl Lumma <carl@...>

11/2/2010 11:39:16 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:

> Carl, and all,
> I just waited for it. I didn't keep an accurate note of
> anything, but my poor little laptop took about 72 hours to
> calculate about 36,000 points (about a quarter of the grid).
> I then deployed my quadcore desktop (usually monopolised by
> my son, but he went away on a trip :-). One copy of the
> Perl program could only use one CPU (Task Manager reports
> "25%") but I ran three in parallel (getting Task Manager to
> report "75%"). Still it took about a further 48 hours.

Zowie.

> I am sure the code is not particularly efficient; and I've
> not had time to re-write in C which I assume would be better.
> I explored cutting down on the number of JI triads to be
> compared to each grid point but this seemed to affect the
> results too much (i.e. points that I would have expected to
> be too far away to influence the result, seem in fact to
> influence the result).

I considered such an optimization but didn't have any ideas
for how to benignly exclude JI points. Anyway, good to know
the bad news that such an idea is needed.

> * the order of the minima still does not follow strict Tenney
> height order - e.g. 5/5/9 lower than 5/5/8 lower than 5/5/7;
> and others less obvious.

Yes, it seems to occur when one interval in both chords is a
unison. I also notice it when the lower-entropy triad had a
larger SPAN. For instance, 10:11:22 vs. 9:10:11. There are
also cases like 8:10:13 vs. 8:9:12 but at the moment the
10-cent grid may be to blame.

> * OTOH 4/5/6 is now lower than 4/5/7 (as previously advertised!),
> but by very little indeed (and note, both are further down the
> list than they were before).

Here again, we're still getting the entropy at 0-390-700,
right?

> * a lot of the top maxima have an interval around 30 cents,
> or 1170 cents.

That's good.

> Lower, Upper, JI triad with pmax, (unrefined lower, upper)

Can you explain why you chose the refinement approach, vs.
just using a 1-cent grid from the start?

Thanks,

-Carl

🔗Michael <djtrancendance@...>

11/3/2010 7:47:11 AM

Carl>"Yes, it seems to occur when one interval in both chords is a unison. I
also notice it when the lower-entropy triad had a larger SPAN. For instance,
10:11:22 vs. 9:10:11. "

Oh man, another issue I have with Tenney Height (if I understand it
correctly): 22/11/10 is the cube root of (22*11*10) (using Carl's Tenney Height
formula), and that "unreducible" 22 jacks up the result rather than reducing it
to, say, 11/5.
And only then doing a compromise of the original formula and the reduced version
IE cubed root(11/11/10)..where the outer 11/10 represents the octave-reduced
22/10 and then other 11/10 represents the original 11/10 in 10:11:22...and I
don't reduce 22/10 fully to 11/5 to help take into account the 11:10 (because
10: 5.5 : 5 is a little bit too optimistic an interpretation to plug into Tenney
Height IMVHO).

You'd wonder if there is a clever way to systematically do dyadic
"semi-reductions" kind of like that before applying the usual Tenney Height
formula, to help level things out... Tenney Height, in this and many other
ways, seems to really fall apart with even remotely high limits and treats
them as "exponentially worse than low limits" when, to the ear, it seems clear
to me they often aren't.

🔗martinsj013 <martinsj@...>

11/3/2010 3:05:52 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > ... 4/5/6 is now lower than 4/5/7 ... but by very little indeed ...
> Here again, we're still getting the entropy at 0-390-700, right?

Yes, I had in mind the list of local minima on the 10-cent grid, so was comparing the result for (390,310) with that for (390,580). But I then posted the refined results: yes, (386,316) has moved up but only by a few places (55th to 50th).

> Can you explain why you chose the refinement approach, vs.
> just using a 1-cent grid from the start?

I think it is the most efficient way to do it. I assume you didn't mean a 1-cent grid covering the whole area 0..1200 cents squared - for that would have 1201x1201 points, about 100 times as many as my 121x121 points, which already takes days to calculate. So I guess you mean to place a 1-cent grid around the vicinity of each local minimum on the 10-cent grid? As far as I can see that would need to be a 21x21 point grid, to give the best chance of ensuring that a local minimum would be found within that grid - that is 441 points per local minimum, and even then it is not guaranteed. My method in practice seems to use around 40 points per local minimum, and I think is guaranteed to find a minimum - however I would agree it is not guaranteed to find all minima. Am I missing another possibility?

IIRC, you mentioned way back a special point (was it 490,490?); would you like to see a 1-cent grid about that point, or any other place(s) in particular?

Steve M.

🔗Carl Lumma <carl@...>

11/3/2010 4:20:45 PM

Hi Steve,

> Yes, I had in mind the list of local minima on the 10-cent grid,
> so was comparing the result for (390,310) with that for (390,580).
> But I then posted the refined results: yes, (386,316) has moved
> up but only by a few places (55th to 50th).

Ok, I'm confused. I think I have your latest files but all
the intervals are multiples of 10.

> My method in practice seems to use around 40 points per local
> minimum, and I think is guaranteed to find a minimum - however
> I would agree it is not guaranteed to find all minima. Am I
> missing another possibility?

So you're checking a radius of about 6 cents from
the unrefined point? No, I can't think of anything better.

> IIRC, you mentioned way back a special point (was it 490,490?);
> would you like to see a 1-cent grid about that point, or any
> other place(s) in particular?

Thanks for the offer.

491 + 491
and
384 + 384

-Carl

🔗martinsj013 <martinsj@...>

11/5/2010 12:51:03 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Ok, I'm confused. I think I have your latest files but all
> the intervals are multiples of 10.

I assume you have the below two files, which are the latest:
/tuning/files/SteveMartin/tmin2_1M.csv
/tuning/files/SteveMartin/tmax2_1M.csv

If so, I probably confused you by failing to make clear that I changed the order of the columns from the previous (N=10K) results - the refined values are the *first* two columns now (but the T.H.E. is still the last - sorry). Look carefully as the first several results involve only 1200 and 0, but then 702, 498, 884, 386 start to appear.

> So you're checking about a radius of about 6 cents from
> the unrefined point? No, I can't think of anything better.

No, I use a search method - for each current point, check the function height for its eight neighbours on the grid; if one of them is lower, make it the current point and go to step 1; if not, reduce the grid distance and go to step 1; stop when reducing the grid distance would make it too small. The number of points calculated is unpredictable, but in my case turned out to be about 40 on average.
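[Editorial aside: the descent Steve describes can be sketched as follows. The step-shrinking rule (halving) and the toy quadratic surface are illustrative assumptions; the real objective is the triadic H.E. surface, not this stand-in `entropy` function.]

```python
def refine_minimum(entropy, start, step=10.0, min_step=1.0):
    """From `start`, repeatedly move to the lowest of the eight grid
    neighbours; when no neighbour is lower, shrink the step; stop
    once shrinking would take the step below min_step."""
    x, y = start
    best = entropy(x, y)
    while step >= min_step:
        neighbours = [(x + dx * step, y + dy * step)
                      for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                      if (dx, dy) != (0, 0)]
        cand_best, nx, ny = min((entropy(nx, ny), nx, ny)
                                for nx, ny in neighbours)
        if cand_best < best:
            best, x, y = cand_best, nx, ny   # move to the lower neighbour
        else:
            step /= 2.0                      # no improvement: refine the grid
    return (x, y), best

# toy surface with its minimum at (386, 316), seeded from a 10-cent point
f = lambda x, y: (x - 386) ** 2 + (y - 316) ** 2
print(refine_minimum(f, (390, 310)))
```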

> 491 + 491
> and
> 384 + 384

OK, will do. Still haven't done anything about graphical output though.

Steve.

🔗Carl Lumma <carl@...>

11/5/2010 1:26:46 AM

Steve wrote:

> > Ok, I'm confused. I think I have your latest files but all
> > the intervals are multiples of 10.
>
> I assume you have the below two files, which are the latest:

That was the problem!

I think the columns are:
new lower
new upper
most probable JI triad for new point
old lower
old upper
harmonic entropy for new point

Is this right?

> > So you're checking about a radius of about 6 cents from
> > the unrefined point? No, I can't think of anything better.
>
> No, I use a search method - for each current point, check the
> function height for its eight neighbours on the grid; if one
> of them is lower, make it the current point and go to step 1;
> if not, reduce the grid distance and go to step 1; stop when
> reducing the grid distance would go too small. The number of
> points calculated is unpredictable but in my case turned out
> to be about 40 on average.

Aha, good.

> > 491 + 491
> > and
> > 384 + 384
>
> OK, will do. Still haven't done anything about graphical
> output though.

With the minima and maxima published, anyone can take a stab
at it. Interestingly, there are more minima than maxima.
Oh, and I'm seeing some duplicate minima, like

162 949
162 949
164 1120
164 1120

It looks like they were found from different old points.
That's a good sign but they should be deleted after this
is verified, yes?

-Carl

🔗martinsj013 <martinsj@...>

11/5/2010 3:31:51 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> I think the columns are:
> new lower
> new upper
> most probable JI triad for new point
> old lower
> old upper
> harmonic entropy for new point ...

Correct (I think I did put this in a post, but, lazily, not in the files themselves) ...

> Oh, and I'm seeing some duplicate minima ...
> ... It looks like they were found from different old points.
> That's a good sign but they should be deleted after this
> is verified, yes? ...

... Correct; I will tidy this up, er, soon.

Steve.

🔗martinsj013 <martinsj@...>

11/6/2010 8:03:08 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > ... a 1-cent grid about that point ...
> 491 + 491
> 384 + 384

491 is done; however I don't see anything interesting; only a local minimum at (498,497); please see a not very good diagram at:
/tuning/files/SteveMartin/491.png

Steve M.

🔗Carl Lumma <carl@...>

11/6/2010 10:07:34 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > > ... a 1-cent grid about that point ...
> > 491 + 491
> > 384 + 384
>
> 491 is done; however I don't see anything interesting; only a
> local minimum at (498,497); please see a not very good diagram at:
> /tuning/files/SteveMartin/491.png

That's as I expected, actually, so that's good.
Thanks for doing it. I expect the same for 384 + 384,
but you never know...

-Carl

🔗martinsj013 <martinsj@...>

11/8/2010 5:12:20 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > > 491 + 491
> > > 384 + 384

384 is now done - there is no minimum or maximum within 10 cents (on a square lattice) of this point (so I've not even posted the diagram).

FWIW, the nearest minima are at:
388,408
345,419
435,347
386,316
316,386
389,453

and the nearest maxima are at:
387,358
360,445
432,303
450,395

Steve M.

🔗martinsj013 <martinsj@...>

11/8/2010 5:37:43 AM

Carl> ... I think we need to use this (2y'+x')/sqrt(3) thing
> > > ... p(j|t) = A/geomean(j) * exp(-(@x^2 + @y^2)/2s^2)

Steve>
> (x' and y' are the lower and upper intervals; x and y are where it is plotted in triadic space)
> distance^2 of mapped point (x,y) from the origin:
> x^2 + y^2 = x'^2 + ((2y'+x')/sqrt(3))^2
> = x'^2 + (4y'^2+4x'y'+x'^2)/3
> = 4/3*(x'^2+x'y'+y'^2)
> My version: x'^2+ y'^2+ (x'+y')^2
> = 2*(x'^2+x'y'+y'^2)
> differing only by a constant factor.
> Now I think this makes a difference - in effect my "s" is smaller.

Carl, and all, to return to an issue that has been on the "back burner": post 93550 re my calculation of the distance between two triads vs the triad space definition as defined by Chalmers.

I think my distance is always a constant multiple of his, so for any given value of S, the Gaussian will be less spread if I use my distance than if I use his. But which is correct? In the dyadic case, it is "obvious" that the distance between two dyads is a value in cents, and so is S; so S has a real world meaning as a standard deviation capturing our hearing acuity. But in the triadic case, I am not sure I understand why the triad space distance is more correct than mine, and (possibly a separate issue) whether using the same value of S is appropriate. Any thoughts?
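[Editorial aside: the quoted derivation can be checked numerically. Both squared distances below follow the formulas as posted, and their ratio is the constant 3/2, i.e. Steve's distance is always sqrt(3/2) times the triad-space one.]

```python
from math import sqrt

def chalmers_dist2(xp, yp):
    # map (lower, upper) = (x', y') into triad space and take the
    # squared distance from the origin: x = x', y = (2y' + x')/sqrt(3)
    return xp ** 2 + ((2 * yp + xp) / sqrt(3)) ** 2

def martin_dist2(xp, yp):
    # Steve's version: sum of squares of all three intervals
    return xp ** 2 + yp ** 2 + (xp + yp) ** 2

for xp, yp in [(386.0, 316.0), (100.0, 700.0), (1.0, 2.0)]:
    print(martin_dist2(xp, yp) / chalmers_dist2(xp, yp))  # 1.5 each time
```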

Steve M.

🔗Carl Lumma <carl@...>

11/9/2010 9:08:46 PM

Steve wrote:

> 384 is now done - there is no minimum or maximum within
> 10 cents (on a square lattice) of this point (so I've not
> even posted the diagram).

Ok, thanks.

> FWIW, the nearest minima are at:
> 388,408
> 345,419
> 435,347
> 386,316
> 316,386
> 389,453

Interesting. Am I right about the nearest JI points?

388 408 796
5/4 19/15 19/12 12:15:19

345 419 764
11/9 14/11 14/9 9:11:14

435 347 782
9/7 11/9 11/7 7:9:11

386 316 702
5/4 6/5 3/2 4:5:6

316 386 702
6/5 5/4 3/2 10:12:15

389 453 842
5/4 13/10 13/8 8:10:13

-Carl
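[Editorial aside: Carl's nearest-JI identifications can be sanity-checked by converting each ratio to cents; the listed minima fall within a few cents of the JI intervals. `cents` here is a trivial helper, nothing from the thread's own code.]

```python
from math import log2

def cents(p, q):
    """Size of the ratio p/q in cents."""
    return 1200 * log2(p / q)

# (lower, upper) minima vs. the JI triads Carl names
for (lo, up), (a, b, c) in [((386, 316), (4, 5, 6)),
                            ((316, 386), (10, 12, 15)),
                            ((435, 347), (7, 9, 11)),
                            ((389, 453), (8, 10, 13))]:
    print(lo, up, round(cents(b, a), 1), round(cents(c, b), 1))
```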

🔗Carl Lumma <carl@...>

11/9/2010 9:09:14 PM

Steve wrote:

> Carl, and all, to return to an issue that has been on the "back
> burner": post 93550 re my calculation of the distance between
> two triads vs the triad space definition as defined by Chalmers.
>
> I think my distance is always a constant multiple of his, so for
> any given value of S, the Gaussian will be less spread if I use
> my distance than if I use his. But which is correct? In the
> dyadic case, it is "obvious" that the distance between two dyads
> is a value in cents, and so is S; so S has a real world meaning
> as a standard deviation capturing our hearing acuity. But in the
> triadic case, I am not sure I understand why the triad space
> distance is more correct than mine, and (possibly a separate
> issue) whether using the same value of S is appropriate.
> Any thoughts?

Since the two distance measures differ by a constant factor,
it doesn't seem like a separate issue; it seems like we just
want to know the right value for S. I don't think we know
the right value though, and it may vary by listener/timbre
anyway. Ideally we would have a budget to run the calculations
with different S values, conduct experiments, etc.

Or is there another way the two distance formulations could
alter the results?

-Carl

🔗martinsj013 <martinsj@...>

11/13/2010 3:03:20 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... Am I right about the nearest JI points?
> 388 408 796
> 5/4 19/15 19/12 12:15:19
> 345 419 764
> 11/9 14/11 14/9 9:11:14
> 435 347 782
> 9/7 11/9 11/7 7:9:11
> 386 316 702
> 5/4 6/5 3/2 4:5:6
> 316 386 702
> 6/5 5/4 3/2 10:12:15
> 389 453 842
> 5/4 13/10 13/8 8:10:13

Yes, exactly.
I have now tidied up the lists of min and max for N=1M to 1-cent accuracy, and added two graphs.

/tuning/files/SteveMartin/tmin2_1M.csv
/tuning/files/SteveMartin/tmax2_1M.csv
/tuning/files/SteveMartin/tmin2_1M.png
/tuning/files/SteveMartin/tminmax2_1M.png

Steve M.

🔗Carl Lumma <carl@...>

11/15/2010 3:11:32 PM

Steve wrote:

> > ... Am I right about the nearest JI points?
> > 388 408 796
> > 5/4 19/15 19/12 12:15:19
> > 345 419 764
> > 11/9 14/11 14/9 9:11:14
> > 435 347 782
> > 9/7 11/9 11/7 7:9:11
> > 386 316 702
> > 5/4 6/5 3/2 4:5:6
> > 316 386 702
> > 6/5 5/4 3/2 10:12:15
> > 389 453 842
> > 5/4 13/10 13/8 8:10:13
>
> Yes, exactly.
> I have now tidied up the lists of min and max for N=1M to
> 1-cent accuracy, and added two graphs.

Excellent. The latter graph,

/tuning/files/SteveMartin/tminmax2_1M.png

is I think the cherry. But it needs to have higher
resolution. I can try my hand at this. But first:
What do you think of the idea of deleting the minima
where either interval is zero?

For those working in Excel, I put the extreme into a
single Excel file

/tuning/files/CarlLumma/TriadicEntropy.xls

-Carl

🔗martinsj013 <martinsj@...>

11/17/2010 1:41:28 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> /tuning/files/SteveMartin/tminmax2_1M.png
> ... higher resolution. I can try my hand at this.

Thank you. I am looking into Python and matplotlib but I don't have much time at the moment.

> What do you think of the idea of deleting the minima
> where either interval is zero?

Do you mean in the list (because it seems they can simply be ignored in the diagram)? So that we can say that 4:5:6 is (say) 20th among "interesting" triads rather than have to say it is 50th among "all" triads?

Steve.

🔗Carl Lumma <carl@...>

11/17/2010 11:01:35 AM

Steve wrote:

> > What do you think of the idea of deleting the minima
> > where either interval is zero?
>
> Do you mean in the list (because it seems they can simply be
> ignored in the diagram)?

Easier to ignore in the diagram, but why not delete both?
They seem like pathological cases to me.

-Carl

🔗Carl Lumma <carl@...>

11/17/2010 11:34:11 AM

I wrote:

> > Do you mean in the list (because it seems they can simply be
> > ignored in the diagram)?
>
> Easier to ignore in the diagram, but why not delete both?
> They seem like pathological cases to me.

Actually now I'm wondering if they adversely affect the
computation... -Carl

🔗martinsj013 <martinsj@...>

11/18/2010 5:48:59 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > Easier to ignore in the diagram, but why not delete both?
> > They seem like pathological cases to me.
> Actually now I'm wondering if they adversely affect the
> computation... -Carl

re deleting pathological (or "degenerate") cases from the results list - we could do so, I suppose, but those triads would still be used in compositions, so it would be useful to know their T.H.E. as much as for non-pathological ones.

re adversely affecting the computation - sounds like you are suggesting removing pathological cases from the seed list of JI triads? Surely this would not be valid - although I am not fully aware of all the arguments for the validity of the calculation, I assume that they rest on facts such as: entropy is a well-defined and tested concept; triad space ditto; Tenney limits ditto; therefore the calculation represents something "real".

BTW, would you also call points with lower or upper =1200 cents pathological? what about lower+upper = 1200?

Steve.

🔗martinsj013 <martinsj@...>

11/18/2010 5:57:03 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Since the two distance measures differ by a constant factor,
> it doesn't seem like a separate issue, it seems like we just
> want to know the right value for S. I don't think we know
> the right value though, and it may vary by listener/timbre
> anyway. Ideally we would have a budget to run the calculations
> different with different S values, conduct experiments, etc.

Agreed. What I meant was, suppose we are absolutely sure that 1.2% is the right value for dyads, then the two issues are (2) is 1.2% also the right value for triads? and (1) if so, should I apply it against the triad space distance or the Steve Martin defined distance? I think you are saying they are both moot, as we don't know the right value.

> Or is there another way the two distance formulations could
> alter the results?

Not that I can think of.

Steve.

🔗Carl Lumma <carl@...>

11/18/2010 11:44:07 AM

Steve wrote:

> re deleting pathological (or "degenerate") cases from the
> results list - could do so I suppose, but those triads would
> still be used in compositions, so it would be useful to know
> their T.H.E. as much as for non-pathological ones.

Wouldn't one use the dyadic entropy in such a case?

> re adversely affecting the computation - sounds like you are
> suggesting removing pathological cases from the seed list
> of JI triads?

Yes.

> Surely this would not be valid - although I am not fully aware
> of all the arguments for the validity of the calculation, I
> assume that they rest on facts such as: entropy is a well-
> defined and tested concept; triad space ditto; Tenny limits
> ditto; therefore the calculation represents something "real".

I don't have a strong feeling one way or the other. All
these things have justification to one extent or another,
but I'm not sure I see the justification for seeding with
things like 1:1:2.

> BTW, would you also call points with lower or upper =1200
> cents pathological? what about lower+upper = 1200?

No, because this isn't an octave-equivalent universe.
Unisons however...

-Carl

🔗Carl Lumma <carl@...>

11/18/2010 11:50:13 AM

Steve wrote:

> Agreed. What I meant was, suppose we are absolutely sure that
> 1.2% is the right value for dyads,

Sorry for not answering directly. No, I don't think that
follows. However copying what works on the dyads seems as
good a place to start as any.

> then the two issues are (2) is 1.2% also the right value for
> triads? and (1) if so, should I apply it against the triad
> space distance or the Steve Martin defined distance?

I suggested 1% in the triad space distance. That seemed to
best reflect listener's opinions in the dyadic case. But
really it's just a starting point.

> I think you are saying they are both moot, as we don't know
> the right value.

Pretty much.

-Carl

🔗martinsj013 <martinsj@...>

11/19/2010 3:31:42 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
SM> ... it would be useful to know [pathological triad] T.H.E. as much as for non-pathological ones.
> Wouldn't one use the dyadic entropy in such a case?

It wouldn't be possible to compare numerically the (dyadic) H.E. for the pathological triad with the (triadic) H.E. for the non-pathological - if that was important?

> ... I'm not sure I see the justification for seeding with
> things like 1:1:2.

Isn't the argument *for* it as follows: things like 1:1:2 are, no less than say 40:41:80, potential recognitions of a sounded triad, and so should be included in a calculation based on probabilities of the said potential recognitions. This kind of consideration led me to include not only 1:1:2 and 40:41:80 but also 41:40:80 - exactly the same as I would do in the middle of the space - e.g. 4:5:6 is near to both 40:51:60 and 41:50:60.

I guess the argument *against* is that because of the uniqueness of the 1:1 interval, the listener is in practice less likely to recognise a non-1:1 as a 1:1 than the Gaussian formula would suggest. But is that a justification for leaving it out altogether? By analogy, would you leave it out of the dyadic calculation?

Steve M.

🔗Carl Lumma <carl@...>

11/19/2010 1:59:59 PM

Steve wrote:

> > Wouldn't one use the dyadic entropy in such a case?
>
> It wouldn't be possible to compare numerically the (dyadic)
> H.E. for the pathological triad with the (triadic) H.E. for
> the non-pathological - if that was important?

Never say never... there might be a way to compare them.
One of the things Paul and I seem to agree on is the need
to use a "subsetting" approach to analyzing chords. See
for instance this post

/tuning/topicId_79751.html#80104?var=0&l=1

starting with the paragraph beginning "That said".

> > ... I'm not sure I see the justification for seeding with
> > things like 1:1:2.
>
> Isn't the argument *for* it as follows: things like 1:1:2 are,
> no less than say 40:41:80, potential recognitions of a sounded
> triad, and so should be included in a calculation based on
> probabilities of the said potential recognitions. This kind
> of consideration led me to include not only 1:1:2 and 40:41:80
> but also 41:40:80 - exactly the same as I would do in the middle
> of the space - e.g. 4:5:6 is near to both 40:51:60 and 41:50:60.

40:41:80 contains no unisons and is a completely different
kind of 'triad' than 1:1:2 or 1:1:3. I don't have a strong
feeling whether the brain interprets 1:1:anything as a triad
or not, but it seems to me to be a question worth pondering.
Maybe I'll ask Paul what he thinks. I'm leaning towards
omitting them from the results, but perhaps they need to be
in the computation to produce the important maxima near
the edges (?)

> By analogy, would you leave it
> out of the dyadic calculation?

I suppose that's a point: Paul always left 1:1 in the dyadic
stuff. Maybe I'm all wet.

-Carl

🔗Carl Lumma <carl@...>

11/19/2010 6:21:51 PM

I wrote:

> > By analogy, would you leave it
> > out of the dyadic calculation?
>
> I suppose that's a point: Paul always left 1:1 in the dyadic
> stuff. Maybe I'm all wet.

Paul says they should be left in. :)

-Carl

🔗martinsj013 <martinsj@...>

11/22/2010 9:25:16 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Never say never... there might be a way to compare them.
> One of the things Paul and I seem to agree on is the need
> to use a "subsetting" approach to analyzing chords. See
> for instance this post
> /tuning/topicId_79751.html#80104?var=0&l=1

Food for thought - I am reading into this some support for the idea that the purely Gaussian probabilities are not the whole story ... should I be?

> 40:41:80 contains no unisons and is a completely different
> kind of 'triad' than 1:1:2 or 1:1:3. I don't have a strong
> feeling whether the brain interprets 1:1:anything as a triad
> or not, but it seems to me to be a question worth pondering.

Ditto...

> Maybe I'll ask Paul what he thinks. I'm leaning towards
> omitting them from the results, but perhaps they need to be
> in the computation to produce the important maxima near
> the edges (?)

I'll give it a try when I can.
(But I did see your later post "Paul says leave them in".)

PS To acknowledge your answer (in another post) to one of my questions: (paraphrased) "this is not an octave-equivalence situation" - of course I agree, bad question.

PPS Had trouble installing matplotlib, but got it now; should produce some better graphics now.

Steve M.

🔗Carl Lumma <carl@...>

11/22/2010 9:41:09 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:

> > /tuning/topicId_79751.html#80104?var=0&l=1
>
> Food for thought - I am reading into this some support for the
> idea that the purely Gaussian probabilities are not the whole
> story ... should I be?

How is the probability distribution related to chord
subsetting?

> > Maybe I'll ask Paul what he thinks. I'm leaning towards
> > omitting them from the results, but perhaps they need to be
> > in the computation to produce the important maxima near
> > the edges (?)
>
> I'll give it a try when I can.
> (But I did see your later post "Paul says leave them in".)

It would be interesting to see... though it may not be
worth days of computation.

> PPS Had trouble installing matplotlib, but got it now; should
> produce some better graphics now.

Sweet. I should spend more time with R and less with
posting here. :)

-Carl

🔗martinsj013 <martinsj@...>

11/26/2010 10:10:46 AM

Carl> /tuning/topicId_79751.html#80104?var=0&l=1
Steve> I am reading into this some support for the idea that the purely Gaussian probabilities are not the whole story ...
Carl> How is the probability distribution related to chord subsetting?

What I meant was that in the H.E. algorithm as I know it, the probabilities are based entirely on the distance between chords and the size of each chord's neighbourhood in triad space; mention of a "tendency to hear as 2:3 plus crap" seems to imply something else may be in play - unless this is already included in the calculation by way of the fact that chords including a 2:3 mostly have a larger neighbourhood already. Does that make any sense?
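[Editorial aside: for readers following along, the distance-and-weight scheme Steve describes can be sketched as below, using the p(j|t) formula quoted earlier in the thread. Everything here is illustrative: the tiny hand-picked seed list, the value of s, the square-lattice distance, and 1/geomean standing in for a seed's neighbourhood size are all assumptions, not the actual calculation.]

```python
from math import exp, log

def triadic_entropy(lower, upper, seeds, s=12.0):
    """Sketch: p(j|t) proportional to (1/geomean(j)) * exp(-d^2/2s^2),
    normalized over the seeds, then Shannon entropy H = -sum p ln p."""
    weights = []
    for a, b, c in seeds:
        jlo = 1200 * log(b / a, 2)          # seed's lower interval, cents
        jup = 1200 * log(c / b, 2)          # seed's upper interval, cents
        d2 = (lower - jlo) ** 2 + (upper - jup) ** 2
        geomean = (a * b * c) ** (1 / 3)
        weights.append(exp(-d2 / (2 * s * s)) / geomean)
    total = sum(weights)
    probs = [w / total for w in weights if w > 0]
    return -sum(p * log(p) for p in probs)

seeds = [(4, 5, 6), (10, 12, 15), (6, 7, 9), (7, 9, 11), (5, 6, 7)]
print(triadic_entropy(386.3, 315.6, seeds))   # right on 4:5:6: low entropy
print(triadic_entropy(350.0, 350.0, seeds))   # ambiguous point: higher
```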

I've just uploaded a contour plot of the Triadic HE results. It uses square coordinates (x=lower, y=upper) instead of triad space, mainly because I haven't worked out how to get the axes where they should be for triad space. In addition to the contours, I've added the locations of local minima (in yellow) and maxima (in red).

/tuning/files/SteveMartin/tnxgc.svg

Steve M.

🔗Carl Lumma <carl@...>

11/26/2010 10:56:28 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> What I meant was that in the H.E. algorithm as I know it, the
> probabilities are based entirely on the distance between chords
> and size of chords neighbourhood in triad space; mention of a
> "tendency to hear as 2:3 plus crap" seems to imply something
> else may be in play

Yes, this idea of subsetting would mean triadic entropy alone
is not enough to understand triads. Paul seemed pretty adamant
that dyadic composition is also important, though I am more
ambivalent.

> unless this is already included in the
> calculation by way of the fact that chords including a 2:3 mostly
> have a larger neighbourhood already. Does that make any sense?

2:2:3 and 2:3:3 live at the edges, not near 10:12:15,
10:13:15, etc. So they don't much influence the latter,
right? If they are important, it would seem to be in
fixing the high entropy of chords containing small intervals.

> /tuning/files/SteveMartin/tnxgc.svg

Do you have an SVG viewer that supports zoom?

SVG is a W3C standard, yet it seems Adobe end-of-lifed
its viewer last year. All my browsers will display it,
but only Opera will zoom, and then quite cumbersomely.
A shame! I don't know of another vector format standard.

-Carl

🔗martinsj013 <martinsj@...>

11/27/2010 8:49:19 AM

Steve> ... chords including a 2:3 mostly have a larger neighbourhood already...
Carl> 2:2:3 and 2:3:3 live at the edges, not near 10:12:15, 10:13:15, etc. So they don't much influence the latter, right?

Ah, I wasn't sufficiently clear that I meant the outer interval to be 2:3, i.e. triads 2p:q:3p, of which some are near 10:12:15.

> Do you have an SVG viewer that supports zoom?

I have to admit I know very little about SVG and only tried this format because you or Gene suggested it, and matplotlib can do it. It turns out (!) that I am using "Eye of Gnome" on my Ubuntu laptop; it allows me to zoom into the picture but I guess it is cheating because I get blocking and blurring - am I right that you're expecting better from SVG?

🔗Mike Battaglia <battaglia01@...>

11/27/2010 9:17:23 AM

On Fri, Nov 26, 2010 at 1:56 PM, Carl Lumma <carl@...> wrote:
>
> --- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
> >
> > What I meant was that in the H.E. algorithm as I know it, the
> > probabilities are based entirely on the distance between chords
> > and size of chords neighbourhood in triad space; mention of a
> > "tendency to hear as 2:3 plus crap" seems to imply something
> > else may be in play
>
> Yes, this idea of subsetting would mean triadic entropy alone
> is not enough to understand triads. Paul seemed pretty adamant
> that dyadic composition is also important, though I am more
> ambivalent.

I'm also ambivalent from way over here in the peanut gallery.

I think that 2:3 plus crap pretty much nails it, except that it
doesn't actually get so far as saying "this other note is just crap,
treat it as noise." 2:3 plus white noise isn't quite as sad as 2:3
plus crap.

-Mike

🔗Carl Lumma <carl@...>

11/27/2010 10:08:45 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:

> I meant the outer interval to be 2:3, i.e. triads 2p:q:3p of
> which some are near 10:12:15.

That's a point. The minor chords are one test of whether
subsetting is needed. Magic chords are the other (what I
call super-saturated suspensions). That's the part where
we actually have to have competent listeners audition these
things and compare to the triadic h.e. values.

> > Do you have an SVG viewer that supports zoom?
>
> I have to admit I know v little about SVG and only tried
> this format because you or Gene suggested it, and matplotlib
> can do it. It turns out (!) that I am using "Eye of Gnome"
> on my Ubuntu laptop; it allows me to zoom into the picture
> but I guess it is cheating because I get blocking and
> blurring - am I right you're expecting better from SVG?

Yes, that's wrong. It's slow enough in Firefox that I can
see it drawing the paths, so I know it's vector and it
should scale perfectly.

Sigh... I haven't looked at this in years... and neither,
it seems, has anyone else. SVG was the promised land.
The only other reasonable vector format I know of,
postscript, had been eaten by PDF. Le sigh, le boo-hoo.
High-res PNG seems to be the way to go.

-Carl

🔗martinsj013 <martinsj@...>

11/28/2010 1:34:15 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... The minor chords are one test of whether
> subsetting is needed. Magic chords are the other (what I
> call super-saturated suspensions).

I'll have to read up on those!

> I haven't looked at [SVG] in years... and neither, it seems, has anyone else ... High-res PNG seems to be the way to go.

I found javascript code SVGpan from Andrea Leofreddi, which works well on his examples but not so well on mine (yet).

Steve M.

🔗Carl Lumma <carl@...>

11/28/2010 1:49:31 AM

Steve wrote:

> > ... The minor chords are one test of whether
> > subsetting is needed. Magic chords are the other
> > (what I call super-saturated suspensions).
>
> I'll have to read up on those!

The test points I gave were examples of them.
5/4 + 5/4 ~~ 14/9, but the idea is the consonance
can be improved by tempering 225/224 across the thirds
to yield a better 14/9 without making the thirds
much worse.
4/3 + 4/3 ~~ 7/4 is another, based on 64/63.
In listening tests, I've personally not found a big
case for these things, though others seem to report
that the tempered versions sound better.
Triadic entropy shouldn't catch them, and that's what
we found. So if you do think the tempered versions
sound better, it's a case where dyadic composition
trumps triadic entropy.
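[Editorial aside: the comma arithmetic here is easy to verify exactly with rational arithmetic - stacking two just thirds (or two just fourths) misses the stated target by precisely the comma Carl names.]

```python
from fractions import Fraction as F

# two 5/4 major thirds vs. 14/9: the discrepancy is 225/224
print((F(5, 4) ** 2) / F(14, 9))
# two 4/3 fourths vs. 7/4: the discrepancy is 64/63
print((F(4, 3) ** 2) / F(7, 4))
```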

-Carl

🔗martinsj013 <martinsj@...>

11/28/2010 1:45:32 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> 5/4 + 5/4 ~~ 14/9 ... tempering 225/224
> 4/3 + 4/3 ~~ 7/4 ... 64/63.

This gives me the idea to look at a cross-section of the surface, where "lower=upper" - not done yet.

In a similar vein I have already looked at other cross-sections:
* where lower=4:5 - some time ago, no surprises.
* where outer=2:3 - ditto, ditto.

* where lower=4t-2400, upper=2400-3t - the idea is that a minimum near t=700 is in some sense an optimum MT 5th for major triads. The answer I get is 696.1 cents (the extra precision seems appropriate here). Of course we might get a different answer if we used the inversions of the major triad, but I've not tried that. (By the way, there are lower minima at t=600 and t=800 but these are of course just dyads.)

* where lower=2400-3t, upper=4t-2400 - as above but for minor triads. There are three interesting minima (t=600 and t=800 again are not interesting) but none is as low as in the above.
~~~ t=709.2 I think this approximates the 6:7:9 triad
~~~ t=676.6 ditto 4:5:6 (but poorly!)
~~~ t=696.2 ditto 10:12:15. (but the weakest min of the three)

* where lower=4t-2400, upper=4800-6t - as above but for the "major-minor 7th" (5th omitted). Apart from the t=600 and t=800 minima, the graph is remarkably flat - the cross-section manages to dodge any noticeable minima.
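[Editorial aside: to see these parametrizations concretely, one can plug the quoted fifth sizes back into the minor-triad cross-section (lower = 2400-3t, upper = 4t-2400) and compare against just intervals. Only the identifications Steve himself suggests are used here.]

```python
from math import log2

cents = lambda r: 1200 * log2(r)   # ratio -> cents

for t, label in [(709.2, "6:7:9"), (676.6, "4:5:6"), (696.2, "10:12:15")]:
    lower, upper = 2400 - 3 * t, 4 * t - 2400
    print(f"t={t}: lower={lower:.1f}, upper={upper:.1f}  ({label})")

# e.g. t=709.2 gives lower ~272.4 (7/6 is ~266.9) and upper ~436.8
# (9/7 is ~435.1), hence the 6:7:9 reading.
```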

Steve M.

🔗Carl Lumma <carl@...>

11/30/2010 9:10:38 PM

Hi Steve,

> * where lower=4t-2400, upper=2400-3t - the idea is that a
> minimum near t=700 is in some sense an optimum MT 5th for
> major triads. The answer I get is 696.1 cents (the extra
> precision seems appropriate here).

Are you saying you ran the cross-section at 0.1-cent
resolution?

MT = meantone I assume, on account that those equations
enforce meantone (with pure octaves).

Of the various optimality criteria for pure-octaves
meantone, 696.1 is just short of the mean-squared 696.16
fifth (the Woolhouse fifth). Maybe Graham can tell us
what the pure-octaves version of TOP-RMS is.

> * where lower=2400-3t, upper=4t-2400 - as above but for
> minor triads. There are three interesting minima
> (t=600 and t=800 again are not interesting) but none is
> as low as in the above.
> ~~~ t=709.2 I think this approximates the 6:7:9 triad
> ~~~ t=676.6 ditto 4:5:6 (but poorly!)
> ~~~ t=696.2 ditto 10:12:15. (but the weakest min of
> the three)

Yeah, these equations don't actually enforce particular
temperaments because the mapping isn't fixed. You are
seeing a mavila in the middle there, again a bit shy of
the sum-squared-optimal 677.1 cents. The first one is
superpythagorean. I don't know about the last one.

> * where lower=4t-2400, upper=4800-6t - as above but for the
> "major-minor 7th" (5th omitted). Apart from the t=600 and
> t=800 minima, the graph is remarkably flat - the cross-
> section manages to dodge any noticeable minima.

That's the mapping for the "dominant" temperament, which
isn't particularly good with pure octaves. On top of the
fact that 4:5:7 may not be the strongest minimum around...
What happens if you try lower=4t-2400, upper=6t-3600?
Or maybe lower=4t-2390.5, upper=4780.9-6t?

-Carl

🔗Mike Battaglia <battaglia01@...>

11/30/2010 9:28:46 PM

On Wed, Dec 1, 2010 at 12:10 AM, Carl Lumma <carl@...> wrote:
>
> > * where lower=4t-2400, upper=4800-6t - as above but for the
> > "major-minor 7th" (5th omitted). Apart from the t=600 and
> > t=800 minima, the graph is remarkably flat - the cross-
> > section manages to dodge any noticeable minima.
>
> That's the mapping for the "dominant" temperament, which
> isn't particularly good with pure octaves. On top of the
> fact that 4:5:7 may not be the strongest minimum around...
> What happens if you try lower=4t-2400, upper=6t-3600?
> Or maybe lower=4t-2390.5, upper=4780.9-6t?

I am coming in shamelessly late, and I can see I missed a ton of stuff
in the move down here. Are you guys running some kind of triadic HE
minimizer? Do we have finished Triadic HE charts now...?

-Mike

🔗Carl Lumma <carl@...>

11/30/2010 10:03:41 PM

Mike wrote:

> I am coming in shamelessly late, and I can see I missed a
> ton of stuff in the move down here. Are you guys running
> some kind of triadic HE minimizer? Do we have finished
> Triadic HE charts now...?

Yup, Steve has computed triadic entropy for 0-1200 x 0-1200
at 1 cent resolution and, apparently, s=1.2%. See his folder
in the files section.

Of course it's one thing to compute and another to figure
out what the results mean. That's what we're working on
now. In this case, we're looking at fancy 2-D cross
sections through the thing. Have a look upthread...

-Carl

🔗Mike Battaglia <battaglia01@...>

11/30/2010 10:18:19 PM

On Wed, Dec 1, 2010 at 1:03 AM, Carl Lumma <carl@...> wrote:
>
> Yup, Steve has computed triadic entropy for 0-1200 x 0-1200
> at 1 cent resolution and, apparently, s=1.2%. See his folder
> in the files section.

Awesome, so this is it here then?
/tuning/files/SteveMartin/tnxgc.svg

> Of course it's one thing to compute and another to figure
> out what the results mean. That's what we're working on
> now. In this case, we're looking at fancy 2-D cross
> sections through the thing. Have a look upthread...

Also, in tandem with this, would it be too much trouble Steve to
generate a square plot that is lower dyad v outer dyad, instead of
lower dyad vs upper dyad? That would make things a lot easier to
figure out what's going on. I guess you could do it by just taking the
data for upper dyad and combining it with lower dyad to get the outer
one. I'd do it myself, but I'm going through a computer crisis now and
don't have MATLAB set up.

If it's a lot of trouble, no worries.

-Mike

🔗Carl Lumma <carl@...>

11/30/2010 10:44:13 PM

Mike wrote:

> Awesome, so this is it here then?
> /tuning/files/SteveMartin/tnxgc.svg

I recommend
/tuning/files/CarlLumma/TriadicEntropy.xls

> Also, in tandem with this, would it be too much trouble Steve to
> generate a square plot that is lower dyad v outer dyad, instead of
> lower dyad vs upper dyad? That would make things a lot easier to
> figure out what's going on.

With triangular axes you can have both at once. I think
Steve is working on it.

-Carl

🔗genewardsmith <genewardsmith@...>

12/1/2010 12:20:36 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:

> Of the various optimality criteria for pure-octaves
> meantone, 696.1 is just short of the mean-squared 696.16
> fifth (the Woolhouse fifth). Maybe Graham can tell us
> what the pure-octaves version of TOP-RMS is.

696.24 cents.

🔗Graham Breed <gbreed@...>

12/1/2010 1:39:41 AM

On 1 December 2010 09:10, Carl Lumma <carl@...> wrote:

> Of the various optimality criteria for pure-octaves
> meantone, 696.1 is just short of the mean-squared 696.16
> fifth (the Woolhouse fifth).  Maybe Graham can tell us
> what the pure-octaves version of TOP-RMS is.

I agree with Gene for the unstretched TOP-RMS (TE) fifth: 696.24
cents. I can't optimize the STD error right now so I don't know if
it's different to that precision.

A Tenney-weighted fifth should be slightly sharp of an equally
weighted 5-limit one, anyway, because the 3 gets more weight.

Graham

🔗martinsj013 <martinsj@...>

12/1/2010 3:10:51 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Are you saying you ran the cross-section at 0.1-cent
> resolution?
Yes, but only as a refinement, after looking for interesting areas at 1 cent resolution.

> MT = meantone I assume, on account that those equations
> enforce meantone (with pure octaves).
Yes, I meant meantone; and the minimum near 700 cents was significant; but there were other less significant ones (e.g. t=646, t=760) ...

> > * lower=2400-3t, upper=4t-2400 ... minor triads.
> > ~~~ t=709.2 ... 6:7:9
> > ~~~ t=676.6 ... 4:5:6
> > ~~~ t=696.2 ... 10:12:15.
> Yeah, these equations don't actually enforce particular
> temperaments because the mapping isn't fixed.
... I don't quite see why the first equations enforce meantone, but the above equations do not - surely the mapping is not fixed in either case?

> You are
> seeing a mavila in the middle there, again a bit shy of
> the sum-squared-optimal 677.1 cents. The first one is
> superpythagorean. I don't know about the last one.
Thanks for the info. I will look up the mappings. Isn't the last one just meantone again - certainly the value of t is almost the same, and isn't the mapping the same (4 generators = 1:5)?

> What happens if you try lower=4t-2400, upper=6t-3600?
> Or maybe lower=4t-2390.5, upper=4780.9-6t?
OK, will try these.

🔗Jacques Dudon <fotosonix@...>

12/1/2010 4:26:37 AM

Gene wrote :

> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> > Of the various optimality criteria for pure-octaves
> > meantone, 696.1 is just short of the mean-squared 696.16
> > fifth (the Woolhouse fifth). Maybe Graham can tell us
> > what the pure-octaves version of TOP-RMS is.
>
> 696.24 cents.

Just for info, this is close enough to the Golden meantone meta-temperament = 696.214473955 cents.
- - - - - - -
Jacques

🔗martinsj013 <martinsj@...>

12/1/2010 6:25:43 AM

Mike> ... lower dyad v outer dyad instead of lower dyad vs upper dyad ...
Not too hard; will upload shortly to:
/tuning/files/SteveMartin/tnxgclo.zip
(but I don't really see the advantage over the original).

Carl> With triangular axes you can have both at once. I think Steve is working on it.
I intended to, but have not spent any time on it recently - Mike, if you're familiar with MATLAB, do you know how to do oblique axes?

Meanwhile if you want to try zoom/pan with SVG: extract the .svg and .js from the below .zip into the same folder; open the .svg in Firefox or similar. Well it works on my system (but very slowly) - "mousewheel" to zoom in/out, "drag" to move around the diagram.
/tuning/files/SteveMartin/tnxgc.zip

I will upload the .png as well - if 1.4MB is not too large?
Steve.

🔗martinsj013 <martinsj@...>

12/1/2010 6:48:25 AM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > 5/4 + 5/4 ~~ 14/9 ... tempering 225/224
> > 4/3 + 4/3 ~~ 7/4 ... 64/63.
> This gives me the idea to look at a cross-section of the surface, where "lower=upper" - not done yet.

Now done - obviously l=u=0 and l=u=1200 are points of low T.H.E.; apart from those two, the minima are numerous (30 of them) but very shallow; they include (497,497) and (401,401), but the lowest 10 are at:
702 (4:6:9)
600 (5:7:10)
832 (5:8:13)
951 (4:7:12)
137 (12:13:14)
99 (16:17:18)
171 (9:10:11)
192 (8:9:10)
1008 (5:9:16)
770 (7:11:17)

(Note: in each case the JI triad is the most probable according to the T.H.E. program.)

There is also (351,351) (18:22:27). Anything interesting there?

Steve M.

🔗Carl Lumma <carl@...>

12/1/2010 12:37:34 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:

> > Yeah, these equations don't actually enforce particular
> > temperaments because the mapping isn't fixed.
>
> ... I don't quite see why the first equations enforce meantone,
> but the above equations do not - surely the mapping is not fixed
> in either case?

Right, I was correcting myself.

> Thanks for the info. I will look up the mappings. Isn't the
> last one just meantone again - certainly the value of t is almost
> the same, and isn't the mapping the same (4 generators = 1:5)?

Yeah, you might be getting 1/3-comma meantone there.

> > What happens if you try lower=4t-2400, upper=6t-3600?
> > Or maybe lower=4t-2390.5, upper=4780.9-6t?
>
> OK, will try these.

Great!

-Carl

🔗martinsj013 <martinsj@...>

12/2/2010 12:31:13 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Right, I was correcting myself.
Oh, right, I see.

> > > What happens if you try lower=4t-2400, upper=6t-3600?

The minima are deeper (compared with (4t-2400, 4800-6t)), at:
697.0 (4:5:7)
720.0 (3:4:6)
753.4 (7:10:17)
775.2 (4:6:11)

> > > Or maybe lower=4t-2390.5, upper=4780.9-6t?

Not as deep as the above, but deeper than for (4t-2400, 4800-6t):
656.4 (7:8:13)
666.2 (6:7:11)
679.4 (5:6:9)
699.2 (4:5:7)
773.6 (8:12:13)

Any comments?

Are there other equations I should try (e.g. no need to stick with 3t, 4t)? I assume that we are only expecting to confirm the rationale for temperaments that have been found by other means.

Steve M.

🔗Carl Lumma <carl@...>

12/2/2010 10:12:23 AM

>>> * where lower=4t-2400, upper=4800-6t - as above but for the
>>> "major-minor 7th" (5th omitted). Apart from the t=600 and
>>> t=800 minima, the graph is remarkably flat - the cross-
>>> section manages to dodge any noticeable minima.
>>
>> That's the mapping for the "dominant" temperament, which
>> isn't particularly good with pure octaves. On top of the
>> fact that 4:5:7 may not be the strongest minimum around...
>> What happens if you try lower=4t-2400, upper=6t-3600?
>
> The minima are deeper (compared with (4t-2400, 4800-6t)), at:
> 697.0 (4:5:7)
> 720.0 (3:4:6)
> 753.4 (7:10:17)
> 775.2 (4:6:11)

Before you said there were none (?).

>> Or maybe lower=4t-2390.5, upper=4780.9-6t?
>
> Not as deep as the above, but deeper than for (4t-2400, 4800-6t):
> 656.4 (7:8:13)
> 666.2 (6:7:11)
> 679.4 (5:6:9)
> 699.2 (4:5:7)
> 773.6 (8:12:13)
>
> Any comments?

It seems to be working. The mapping to 4:5:7 is within
a tenth-cent of the TOP tuning.

To tie this in to the other thread, the equation *does*
enforce a linear keyboard mapping, where the generator values
above will give those different chords under the same
fingering.

> Are there other equations I should try (e.g. no need to
> stick with 3t, 4t)? I assume that we are only expecting to
> confirm the rationale for temperaments that have been found
> by other means.

Yes, I'm already convinced. It is good news for both sides.
If you would like to go down the list and check more of
these, here are some 5-limit mappings

/tuning/database?method=reportRows&tbl=10

-Carl

🔗martinsj013 <martinsj@...>

12/2/2010 2:09:24 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> >> What happens if you try lower=4t-2400, upper=6t-3600?
> > The minima are deeper (compared with (4t-2400, 4800-6t)), at:
> Before you said there were none (?).
Oops, yes - I meant there were none that were obvious to the eye.
To try and quantify that - e.g. there is one near 4:5:7 but its T.H.E. is only 0.008 lower than the points 8 cents either side of it on the cross-section; whereas for the corresponding minima on your two suggested equations the corresponding figures are 0.037 and 0.096.

> /tuning/database?method=reportRows&tbl=10
Oops (again) - I am regretting having asked!

Steve M.

🔗Carl Lumma <carl@...>

12/2/2010 2:18:11 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > >> What happens if you try lower=4t-2400, upper=6t-3600?
> > > The minima are deeper (compared with (4t-2400, 4800-6t)), at:
> > Before you said there were none (?).

I'm baffled how what I posted turned into the above traffic
jam of ASCII.

> Oops, yes - I meant there were none that were obvious to
> the eye. To try and quantify that - e.g. there is one near
> 4:5:7 but its T.H.E. is only 0.008 lower than the points
> 8 cents either side of it on the cross-section; whereas for
> the corresponding minima on your two suggested equations the
> corresponding figures are 0.037 and 0.096.

Thanks. That's good. A very good check you thought of here.

> Oops (again) - I am regretting having asked!

:)

-Carl

🔗Mike Battaglia <battaglia01@...>

12/4/2010 3:46:00 PM

One more thing that seems particularly interesting:

On Sun, Nov 28, 2010 at 4:45 PM, martinsj013 <martinsj@...> wrote:
>
> * where lower=4t-2400, upper=2400-3t - the idea is that a minimum near t=700 is in some sense an optimum MT 5th for major triads. The answer I get is 696.1 cents (the extra precision seems appropriate here). Of course we might get a different answer if we used the inversions of the major triad, but I've not tried that. (By the way, there are lower minima at t=600 and t=800 but these are of course just dyads.)

That is a brilliant idea. And it spits out something close to
1/4-comma meantone. Wow.

> * where lower=2400-3t, upper=4t-2400 - as above but for minor triads. There are three interesting minima (t=600 and t=800 again are not interesting) but none is as low as in the above.
> ~~~ t=709.2 I think this approximates the 6:7:9 triad
> ~~~ t=676.6 ditto 4:5:6 (but poorly!)
> ~~~ t=696.2 ditto 10:12:15. (but the weakest min of the three)

And the strongest it spits out is superpyth. Wow. I wonder why it
ranks mavila above meantone though...? It ranks a really crappy 4:5:6
as being lower in entropy than a more in tune 10:12:15? I guess this
comes from the way entropy slopes as you travel further away from a
minimum. This might be another reason to use the Vos curve instead of
a Gaussian. I'm especially interested to see how that would look.

This is fascinating.

-Mike

🔗Mike Battaglia <battaglia01@...>

12/4/2010 4:17:06 PM

On Wed, Dec 1, 2010 at 9:48 AM, martinsj013 <martinsj@...> wrote:
>
> Now done - obviously l=u=0 and l=u=1200 are points of low T.H.E.; apart from those two, the minima are numerous (30 of them) but very shallow; they include (497,497) and (401,401), but the lowest 10 are at:
> 702 (4:6:9)
> 600 (5:7:10)
> 832 (5:8:13)
> 951 (4:7:12)
> 137 (12:13:14)
> 99 (16:17:18)
> 171 (9:10:11)
> 192 (8:9:10)
> 1008 (5:9:16)
> 770 (7:11:17)

Is this in order? It listed the approx 16:17:18 as being lower in
entropy than the approx 8:9:10?

> (Note: in each case the JI triad is the most probable according to the T.H.E. program.)
>
> There is also (351,351) (18:22:27). Anything interesting there?

I notice that a lot of these are just regular JI dyads that are being
bisected. So the 192 one is a bisected 5/4, the 99 cent one is a
bisected 9/8 (approximately), the 950 cent one is a bisected 3/1,
etc. It didn't put 9:12:16 on there - e.g. about 400 cents, doubled? I
would have expected that would be lower in entropy than something like
12:13:14 for sure.

Is there any way that we could come up with some method for coming up
with an "entropy" for temperaments? Perhaps we could analyze all of
the 5-limit temperaments in Carl's list, tweak them to see the ideal
4:5:6 that's generated, and then see how the resulting triads compare
in entropy.

For example, you came up with the local minimum that corresponded to
meantone, what happens if you do the same thing for augmented,
diminished, and porcupine? How do the resulting 5-limit triads compare
to one another? Are there any "surprises" along the way? :)

I guess you could compare the minor triads too, but I don't think it
matters as much, since the whole point of major triads is to have low
entropy, and for minor triads to have comparatively higher entropy.

I need to get MATLAB working on this computer stat, so I can jump in
on this. A triadic entropy minimizer would be awesome right now.

-Mike

🔗martinsj013 <martinsj@...>

12/5/2010 5:16:42 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> Is this in order? It listed the approx 16:17:18 as being lower in
> entropy than the approx 8:9:10?
Yes; my attempted explanation is that although the minimum for 8:9:10 may be deeper, it is further away from the l=u line.

> I notice that a lot of these are just regular JI dyads that are being bisected.
Yes; i.e. the outer is a JI dyad, which as you know carries equal weight with l and u in the calculation.

> It didn't put 9:12:16 on there - e.g. about 400 cents, doubled? I
> would have expected that would be lower in entropy than something like
> 12:13:14 for sure.

OK, 9:12:16 is about 500 cents doubled; 16:20:25 is about 400 cents doubled. Both *are* minima on this cross-section, but are lower down the list than the ten I showed. Why? I don't know; possibly with the latter it is because it is also near 12:15:19 and 15:19:24, and ambiguity makes for high entropy. Remember that all of these minima are much shallower than for the "meantone" cross-sections I showed before.

> Is there any way that we could come up with some method for coming up with an "entropy" for temperaments?

I'm thinking about this; happy to receive advice.

> ... what happens if you do the same thing for augmented,
> diminished, and porcupine?

Yes, I'm working through Carl's list (slowly). I need to present the results in a more consistent and informative way - thinking about this too.

🔗monz <joemonz@...>

12/5/2010 7:50:34 AM

hi guys ... i haven't been following any discussions on the tuning list at all lately, but happened to read this post and thought you'd be interested in my webpage if you don't know it:

http://tonalsoft.com/enc/m/meantone.aspx

In fact 696.1 cents is very close to the fifth sizes of both 7/26-comma meantone,
which is optimal under some criteria, and Golden Meantone, which has
its own special interval-size properties.

-monz
http://tonalsoft.com/tonescape.aspx
Tonescape microtonal music software

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> One more thing that seems particularly interesting:
>
> On Sun, Nov 28, 2010 at 4:45 PM, martinsj013 <martinsj@...> wrote:
> >
> > * where lower=4t-2400, upper=2400-3t - the idea is that a minimum near t=700 is in some sense an optimum MT 5th for major triads. The answer I get is 696.1 cents (the extra precision seems appropriate here). Of course we might get a different answer if we used the inversions of the major triad, but I've not tried that. (By the way, there are lower minima at t=600 and t=800 but these are of course just dyads.)
>
> That is a brilliant idea. And it spits out something close to
> 1/4-comma meantone. Wow.
>
> > * where lower=2400-3t, upper=4t-2400 - as above but for minor triads. There are three interesting minima (t=600 and t=800 again are not interesting) but none is as low as in the above.
> > ~~~ t=709.2 I think this approximates the 6:7:9 triad
> > ~~~ t=676.6 ditto 4:5:6 (but poorly!)
> > ~~~ t=696.2 ditto 10:12:15. (but the weakest min of the three)
>
> And the strongest it spits out is superpyth. Wow. I wonder why it
> ranks mavila above meantone though...? It ranks a really crappy 4:5:6
> as being lower in entropy than a more in tune 10:12:15? I guess this
> comes from the way entropy slopes as you travel further away from a
> minimum. This might be another reason to use the Vos curve instead of
> a Gaussian. I'm especially interested to see how that would look.
>
> This is fascinating.
>
> -Mike
>

🔗Carl Lumma <carl@...>

12/5/2010 11:57:13 AM

Hi Steve,

> Now done - obviously l=u=0 and l=u=1200 are points of low
> T.H.E.; apart from those two, the minima are numerous
> (30 of them) but very shallow; they include (497,497) and
> (401,401), but the lowest 10 are at:
>
> 702,(4;6;9),1/1,
> 600,(5;7;10),50/49,
> 832,(5;8;13),65/64,
> 951,(4;7;12),49/48,
> 137,(12;13;14),169/168,
> 99,(16;17;18),289/288,
> 171,(9;10;11),100/99,
> 192,(8;9;10),81/80,
> 1008,(5;9;16),81/80,
> 770,(7;11;17),121/119,
> 351,(18;22;27),243/242,

I've put in the commas tempered above, in a format Excel
should accept.

Can you put the t.h.e. values of the tempered point
at the end of each line?

-Carl

🔗Carl Lumma <carl@...>

12/5/2010 12:07:15 PM

Mike wrote:

> That is a brilliant idea. And it spits out something close to
> 1/4-comma meantone. Wow.

It shouldn't be surprising. If the tunings we optimize
didn't correspond to entropy minima, it would indicate a
problem.

> > * where lower=2400-3t, upper=4t-2400 - as above but for minor
> > triads.
> > ~~~ t=709.2 I think this approximates the 6:7:9 triad
> > ~~~ t=676.6 ditto 4:5:6 (but poorly!)
> > ~~~ t=696.2 ditto 10:12:15. (but the weakest min of the three)
>
> And the strongest it spits out is superpyth. Wow. I wonder
> why it ranks mavila above meantone though...?

Because 4:5:6 is a much deeper minimum than 10:12:15.
This is along the lines of Igs' objection to dyadic entropy,
that a wolf 3:2 has the same entropy as a pure 6:5.

6:7:9 is not quite as deep as 4:5:6, but superpyth is
a lot more accurate than mavila.

> This might be another reason to use the Vos curve instead
> of a Gaussian. I'm especially interested to see how that
> would look.

Vos curves are based on a single paper... Gaussians are
much more common and, it seems to me, less likely to be
wrong. I should read that Vos paper though.

> Is there any way that we could come up with some method
> for coming up with an "entropy" for temperaments?

The primes-based weighted error measures we use have the
benefit of working on any chord in the temperament.
Harmonic entropy isn't even octave-equivalent. So I imagine
you'd have to specify a representative chord for the whole
temperament, which strikes me as a lot less elegant.

> Perhaps we could analyze all of the 5-limit temperaments
> in Carl's list, tweak them to see the ideal 4:5:6 that's
> generated, and then see how the resulting triads compare
> in entropy.

We can certainly use entropy to find tunings this way, but
they would only be valid for 4:5:6. And the agreement is
so good with the weighted error we use, it seems like
overkill.

-Carl

🔗Mike Battaglia <battaglia01@...>

12/5/2010 12:15:12 PM

On Sun, Dec 5, 2010 at 8:16 AM, martinsj013 <martinsj@...> wrote:
>
> > It didn't put 9:12:16 on there - e.g. about 400 cents, doubled? I
> > would have expected that would be lower in entropy than something like
> > 12:13:14 for sure.
>
> OK, 9:12:16 is about 500 cents doubled; 16:20:25 is about 400 cents doubled. Both *are* minima on this cross-section, but are lower down the list than the ten I showed. Why? I don't know; possibly with the latter it is because it is also near 12:15:19 and 15:19:24, and ambiguity makes for high entropy. Remember that all of these minima are much shallower than for the "meantone" cross-sections I showed before.

Yeah, whoops, I meant 500 cents doubled. I can't believe a pure
9:12:16 is higher in entropy than a distorted 16:17:18.

> > Is there any way that we could come up with some method for coming up with an "entropy" for temperaments?
>
> I'm thinking about this; happy to receive advice.

You could try calculating all of the entropy for each triad for some
suitable scale (say porcupine[7]). That means, take porcupine[7] and
come up with the combination set for all notes taken in sets of 3, and
then get the THE for each of them.

Or, you could try something like what Paul did with his dyadic HE
Minimizer, but with a triadic HE Minimizer.

-Mike

🔗Mike Battaglia <battaglia01@...>

12/5/2010 12:23:23 PM

On Sun, Dec 5, 2010 at 3:07 PM, Carl Lumma <carl@...> wrote:
>
> Mike wrote:
> > And the strongest it spits out is superpyth. Wow. I wonder
> > why it ranks mavila above meantone though...?
>
> Because 4:5:6 is a much deeper minimum than 10:12:15.
> This is along the lines of Igs' objection to dyadic entropy,
> that a wolf 3:2 has the same entropy as a pure 6:5.
> //
>
> Vos curves are based on a single paper... Gaussians are
> much more common and, it seems to me, less likely to be
> wrong. I should read that Vos paper though.

I don't see the significance of the Gaussian here. It was picked as an
arbitrary way to represent a "probability curve" that some interval
were mistuned. When you're going beyond the local minima and comparing
mistuned versions of pure versions of other intervals, then the choice
of curve you pick becomes significant outside of any initial
conjecture.

If we all think that a mavila 4:5:6 is more dissonant than a pure
10:12:15 (done with sine waves), then that's good evidence to use some
other kind of distribution curve than a Gaussian. On the other hand -
is it? Perhaps the flat 4:5:6 really is less dissonant than 10:12:15,
if sines are involved.

> > Is there any way that we could come up with some method
> > for coming up with an "entropy" for temperaments?
>
> The primes-based weighted error measures we use have the
> benefit of working on any chord in the temperament.
> Harmonic entropy isn't even octave-equivalent. So I imagine
> you'd have to specify a representative chord for the whole
> temperament, which strikes me as a lot less elegant.

Or, my idea was to come up with a representative scale for the
temperament - so like porcupine[7] - and do an THE analysis of all of
the triads in that temperament. Find out what the lowest entropy
triads are, the highest ones, the mean, the variance, etc. Or do it on
porcupine[15], although that will probably skew the mean upwards.

-Mike

🔗Carl Lumma <carl@...>

12/5/2010 9:41:28 PM

Mike wrote:

> Or, you could try something like what Paul did with his dyadic
> HE Minimizer, but with a triadic HE Minimizer.

That's a thought. Paul used octave-equivalent entropy for
those. Here we might simply restrict voicing to 2 octaves.
Conveniently, that's the range of our precomputed t.h.e. values.

For a 10 tone scale, that would be 20 things 3 at a time, or
1,140 triads to sum the entropy of in each round. Probably
doable though there's no guarantee it'd converge.

To fill Steve in, the idea is roughly:

1. Given a scale size n, find a random scale with n notes to
the octave. Look up all the triads in two octaves of the scale
and sum their entropies.

2. Randomly add or subtract 2 cents from one of the notes in
the scale. Sum the entropies again. If the new sum is smaller
than the old sum, keep the new scale. Otherwise discard.

3. If you have discarded for n^2 consecutive iterations, return
current scale and goto 1. Else goto 2.

You do this a million times and then look at the best scoring
scales, as well as scales returned more than once.
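
A minimal sketch of steps 1-3 (the dyad-distance score standing in for the t.h.e. table lookup is a toy assumption; a real run would index the precomputed triadic entropy values instead):

```python
import random
from itertools import combinations

# Toy stand-in for triadic entropy: each dyad's distance from the
# nearest simple just interval, summed over the triad. (A real run
# would look the triad up in the precomputed t.h.e. table.)
JUST = [0, 316, 386, 498, 702, 884, 969, 1200, 1516, 1586, 1902, 2400]

def toy_triad_entropy(a, b, c):
    return sum(min(abs(x - j) for j in JUST) for x in (b - a, c - b, c - a))

def scale_score(scale):
    """Step 1: sum entropy over all triads in two octaves of the scale."""
    notes = sorted(list(scale) + [x + 1200 for x in scale])
    return sum(toy_triad_entropy(a, b, c) for a, b, c in combinations(notes, 3))

def minimize(scale, max_stale=None, rng=random):
    """Steps 2-3: keep 2-cent moves that lower the score; stop after
    max_stale (n^2 by default) consecutive rejections."""
    scale = list(scale)
    n = len(scale)
    if max_stale is None:
        max_stale = n * n
    score, stale = scale_score(scale), 0
    while stale < max_stale:
        trial = list(scale)
        trial[rng.randrange(n)] += rng.choice((-2.0, 2.0))
        trial_score = scale_score(trial)
        if trial_score < score:
            scale, score, stale = trial, trial_score, 0
        else:
            stale += 1
    return scale, score
```

The outer "do this a million times from random starts and collect repeat winners" loop just wraps `minimize` in the obvious way.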

See also: http://lumma.org/tuning/erlich/

-Carl

🔗Carl Lumma <carl@...>

12/6/2010 1:44:15 AM

Mike wrote:

> So looking at the latest HE graph,
> /tuning/files/SteveMartin/tnxgc.svg
> you will notice that there are a bunch of "valleys" that run
> through the chart that correspond randomly to low entropy?
> For example, there is a vertical line at l=700, and a vertical
> line at l=1200. Pretty obvious why this is, because if one of
> the dyads in the chord is 2/1 or 3/2, it's going to be of
> comparatively low entropy to the things around it no matter how
> bad the other dyad is.

The other two dyads, yes.

> But what about the line going from (0,1200) to (1200,702)? That
> one follows a line of 200u+83l=240,000. If we assume the line
> goes through actually (1200,700), not (1200,702), then a line
> can be traced through it that is 12u+5l=14,400. WTF does this
> mean?
> A clue might be given by looking at the points on this line:
> 1:2:3, 2:3:5, 3:4:7, 4:5:9, 5:6:11, 6:7:13.

The first is 'superparticular' -- the simplest triad given
a = 1. When a = 2 it'd be 2:3:4, except that contains an octave
and you've already acknowledged such chords, so 2:3:5 is next up.
Same for the rest when a = 3, 4, 5, and 6.

> There's also an obvious "curve" like structure below this that
> runs from 1:1:1 to 4:5:6 to 3:4:5 to 2:3:4 to 1:2:3. This also
> contains things like 3:5:7. This curve seems to contain all of
> the triads that are "isoharmonic," e.g. for any triad a:b:c,
> c-b = b-a. So these triads can be generated by a:a+D:a+2D,
> where D is the difference.

Again, I think this is more to do with JI than entropy.
Your formula is equivalent to a:((a+c)/2):c which is the
mediant between a and c.
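
Both families are easy to tabulate numerically; a quick sketch (coordinates are the (lower, upper) dyads in cents):

```python
from math import log2

def cents(ratio):
    """Size of a frequency ratio in cents."""
    return 1200 * log2(ratio)

def triad_coords(a, b, c):
    """(lower, upper) dyads of the triad a:b:c, in cents."""
    return cents(b / a), cents(c / b)

# a : b : (a+b) triads -- the ridge running from (0, 1200) to (1200, 702):
# 1:2:3, 2:3:5, 3:4:7, 4:5:9, 5:6:11, 6:7:13
ridge = [triad_coords(a, a + 1, 2 * a + 1) for a in range(1, 7)]

# isoharmonic triads a : (a+d) : (a+2d), with the middle note midway
# in frequency between the outer two (d = 1 shown)
iso = [triad_coords(n, n + 1, n + 2) for n in range(1, 5)]
```

For example `triad_coords(1, 2, 3)` gives (1200, 702) and `triad_coords(4, 5, 6)` gives (386.3, 315.6), matching the points named above.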

-Carl

🔗Carl Lumma <carl@...>

12/6/2010 1:50:12 AM

Steve wrote:

> Yes, I'm working through Carl's list (slowly). I need to present
> the results in a more consistent and informative way - thinking
> about this too.

I want to make it clear that if you're talking about the
"database" in this group, it is due to Paul Erlich.

I also don't think further validation of entropy's ability
to find optimal generators is a high priority right now... of
course it never hurts.

Re. your earlier question about Matlab, some of Paul's
hexagonal plots show 90deg axes at the edges. I take it this
means he was drawing his own 60deg axes on top of them...

-Carl

🔗Mike Battaglia <battaglia01@...>

12/6/2010 3:32:33 AM

On Mon, Dec 6, 2010 at 4:44 AM, Carl Lumma <carl@...> wrote:
>
> > But what about the line going from (0,1200) to (1200,702)? That
> > one follows a line of 200u+83l=240,000. If we assume the line
> > goes through actually (1200,700), not (1200,702), then a line
> > can be traced through it that is 12u+5l=14,400. WTF does this
> > mean?
> > A clue might be given by looking at the points on this line:
> > 1:2:3, 2:3:5, 3:4:7, 4:5:9, 5:6:11, 6:7:13.
>
> The first is 'superparticular' -- the simplest triad given
> a = 1. When a = 2 it'd be 2:3:4, except that contains an octave
> and you've already acknowledged such chords, so 2:3:5 is next up.
> Same for the rest when a = 3, 4, 5, and 6.

Yes, but 5:7:12 is also on the curve. And points like 4:5:9 look to
have a wider field of attraction than 4:5:7. The remarkable thing
about this curve is that the entire curve has a uniformly low entropy,
even between minima on the curve. The region surrounding 3:4:7, for
example, is lower in entropy than the region surrounding 3:4:6, where
there is a temporary "break" in the contour map up to blue until the
next two minima are hit. I'm not sure why this is.

The take home point is that even between labeled minima, the entire
curve is of relatively low entropy. So for any triad in which the
frequencies of the notes follow this pattern, even if the frequencies
aren't integer ratios, will be of relatively low entropy.

> > There's also an obvious "curve" like structure below this that
> > runs from 1:1:1 to 4:5:6 to 3:4:5 to 2:3:4 to 1:2:3. This also
> > contains things like 3:5:7. This curve seems to contain all of
> > the triads that are "isoharmonic," e.g. for any triad a:b:c,
> > c-b = b-a. So these triads can be generated by a:a+D:a+2D,
> > where D is the difference.
>
> Again, I think this is more to do with JI than entropy.
> Your formula is equivalent to a:((a+c)/2):c which is the
> mediant between a and c.

This curve has slightly higher entropy than the other one, but still
cuts a blue swathe through the surrounding red landscape. So any triad
in which the note frequencies have a constant difference between them
will be of relatively lower entropy than the immediately surrounding
triads. The effect isn't as prominent as with the a:b:a+b triads.
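
For concreteness, here's a quick sketch of the cents coordinates of these constant-difference a:a+D:a+2D triads (the helper names are mine):

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents."""
    return 1200 * math.log2(ratio)

def isoharmonic_triad(a, D):
    """(lower, upper) dyad sizes in cents for the triad a : a+D : a+2D."""
    lower = cents((a + D) / a)
    upper = cents((a + 2 * D) / (a + D))
    return lower, upper

# The D=1 family runs 4:5:6 -> 3:4:5 -> 2:3:4 -> 1:2:3 as a shrinks.
for a in (4, 3, 2, 1):
    l, u = isoharmonic_triad(a, 1)
    print(f"{a}:{a + 1}:{a + 2}  lower={l:.0f}c  upper={u:.0f}c")
```

Note that the lower dyad always comes out wider than the upper one for these triads.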

-Mike

🔗john777music <jfos777@...>

12/6/2010 4:57:26 PM

This might be relevant... The formula I have for the consonance value of an interval (using sine wave tones only) is:
(2 + 1/x + 1/y - y/x)/2, where y/x < 0.9375 (i.e. 15/16).
It seems to me that all intervals with a value less than 0.75 are dissonant. Values between 0.75 and 0.99999 are Minor, and values of 1.0 or higher are Major.
I then used my formula in a more convoluted way for chords with three or more notes and to my surprise I found that the six note E minor chord (again using only sine wave tones):
10:15:20:24:30:40
has a value of around -0.49.
Yet it still sounds okay.
This indicates to me that as long as all the intervals (pairs of notes) in a chord are good then the chord should be good.
In other words if the Harmonic Entropy theory is correct for pairs of notes (intervals) then there's no need to consider Triadic HE because if all the intervals that occur are good, then the chord should be good.
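
As a sketch of the interval formula, for a ratio x:y in lowest terms with x > y (the bands follow the thresholds above):

```python
from fractions import Fraction

def consonance(x, y):
    """Dyad value (2 + 1/x + 1/y - y/x)/2 for a ratio x:y with x > y,
    valid only where y/x < 15/16 (0.9375)."""
    if Fraction(y, x) >= Fraction(15, 16):
        raise ValueError("formula applies only where y/x < 15/16")
    return (2 + 1 / x + 1 / y - y / x) / 2

for x, y in ((2, 1), (3, 2), (4, 3), (6, 5)):
    v = consonance(x, y)
    band = "Major" if v >= 1.0 else "Minor" if v >= 0.75 else "dissonant"
    print(f"{x}:{y}  value={v:.3f}  ({band})")
```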

John.

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> Steve wrote:
>
> > Yes, I'm working through Carl's list (slowly). I need to present
> > the results in a more consistent and informative way - thinking
> > about this too.
>
> I want to make it clear that if you're talking about the
> "database" in this group, it is the due to Paul Erlich.
>
> I also don't think further validation of entropy's ability
> to find optimal generators is a high priority right now... of
> course it never hurts.
>
> Re. your earlier question about Matlab, some of Paul's
> hexagonal plots show 90deg axes at the edges. I take it this
> means he was drawing his own 60deg axes on top of them...
>
> -Carl
>

🔗Mike Battaglia <battaglia01@...>

12/6/2010 10:44:50 PM

On Mon, Dec 6, 2010 at 7:57 PM, john777music <jfos777@...> wrote:
>
> This indicates to me that as long as all the intervals (pairs of notes) in a chord are good then the chord should be good.
> In other words if the Harmonic Entropy theory is correct for pairs of notes (intervals) then there's no need to consider Triadic HE because if all the intervals that occur are good, then the chord should be good.

I'm not sure about that - the 13-limit otonality and the 13-limit
utonality sound a lot different, although they have the same
intervals. And 6:7:9 and 14:18:21 sound a lot different as well.

-Mike

🔗martinsj013 <martinsj@...>

12/7/2010 1:27:28 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> Can you put the t.h.e. values of the tempered point
> at the end of each line?

No problem:

702,(4;6;9),1/1,5.7729
600,(5;7;10),50/49,5.7903
832,(5;8;13),65/64,5.8033
951,(4;7;12),49/48,5.8228
137,(12;13;14),169/168,5.8362
99,(16;17;18),289/288,5.8445
171,(9;10;11),100/99,5.8446
192,(8;9;10),81/80,5.8469
1008,(5;9;16),81/80,5.8478
770,(7;11;17),121/119,5.8500
(17 others omitted including 497 and 401)
351,(18;22;27),243/242,5.8814

But are the T.H.E. values meaningful? I am thinking that the "percentage score" Igs wanted for dyadic H.E. could be useful here too.

Steve M.

🔗Carl Lumma <carl@...>

12/8/2010 1:26:54 AM

Steve wrote:

> > Can you put the t.h.e. values of the tempered point
> > at the end of each line?
>
> No problem:
>
> 702,(4;6;9),1/1,5.7729
> 600,(5;7;10),50/49,5.7903
> 832,(5;8;13),65/64,5.8033
> 951,(4;7;12),49/48,5.8228
> 137,(12;13;14),169/168,5.8362
> 99,(16;17;18),289/288,5.8445
> 171,(9;10;11),100/99,5.8446
> 192,(8;9;10),81/80,5.8469
> 1008,(5;9;16),81/80,5.8478
> 770,(7;11;17),121/119,5.8500
> (17 others omitted including 497 and 401)
> 351,(18;22;27),243/242,5.8814
>
> But are the T.H.E. values meaningful?

I guess my hypothesis would be that, for points on this
line, the greater the entropy rise above the most probable
JI chord, the higher the badness of the comma tempered out.

The badness of a comma can be given as the product of its
size in cents and the log of the product of its numerator
and denominator. That is, for a comma n/d, badness is
cents(n/d) * log(n*d).
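
A rough sketch (I'll use natural log; the choice of base only rescales every badness value by the same constant):

```python
import math

def badness(n, d):
    """Badness of the comma n/d: its size in cents times log(n*d).
    Natural log assumed; another base rescales all values equally."""
    size_cents = 1200 * math.log2(n / d)
    return size_cents * math.log(n * d)

for n, d in ((50, 49), (81, 80), (169, 168), (289, 288)):
    print(f"{n}/{d}: badness = {badness(n, d):.1f}")
```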

Are Mike and Igs calling these "isoharmonic" chords?
Because that's what I named this spreadsheet:

http://lumma.org/temp/isoharmonic.xls

-Carl

🔗Mike Battaglia <battaglia01@...>

12/8/2010 1:48:28 AM

On Wed, Dec 8, 2010 at 4:26 AM, Carl Lumma <carl@...> wrote:
>
> Are Mike and Igs calling these "isoharmonic" chords?
> Because that's what I named this spreadsheet:
>
> http://lumma.org/temp/isoharmonic.xls

My post about the isoharmonic triads was unrelated to this. I stole
the "isoharmonic" term from George Secor's 17-tet paper, I think. I
was using it to mean a chord like 9:11:13, or 7:11:15, or something
where all of the frequencies in the triad are separated by the same
"difference tone." I didn't want to use the "difference tone"
terminology because not only are there no "difference tones" happening
in the ear when these chords are perceived, but there's definitely no
"difference tones" happening in a model that doesn't model difference
tones.

This is a different concept, because it involves triads where the
frequencies are separated logarithmically by the same number of cents.

-Mike

🔗Carl Lumma <carl@...>

12/8/2010 2:03:07 AM

Mike wrote:
> >
> > Are Mike and Igs calling these "isoharmonic" chords?
> > Because that's what I named this spreadsheet:
> >
> > http://lumma.org/temp/isoharmonic.xls
>
> My post about the isoharmonic triads was unrelated to this.
> I stole the "isoharmonic" term from George Secor's 17-tet paper,
> I think. I was using it to mean a chord like 9:11:13,
> or 7:11:15, or something where all of the frequencies in the
> triad are separated by the same "difference tone." [snip]
> This is a different concept, because it involves triads where
> the frequencies are separated logarithmically by the same number
> of cents.

Alright then, howabout this:

http://lumma.org/temp/l=u.xls

-Carl

🔗genewardsmith <genewardsmith@...>

12/8/2010 10:10:53 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:

> This is a different concept, because it involves triads where the
> frequencies are separated logarithmically by the same number of cents.

I'm interested in such chords, but I can't follow this conversation because you guys don't use standard terminology for chords. Hence most of us, I am sure, have no idea what you are talking about.

🔗Mike Battaglia <battaglia01@...>

12/8/2010 10:21:56 AM

On Wed, Dec 8, 2010 at 1:10 PM, genewardsmith
<genewardsmith@...> wrote:
>
> --- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> > This is a different concept, because it involves triads where the
> > frequencies are separated logarithmically by the same number of cents.
>
> I'm interested in such chords, but I can't follow this conversation because you guys don't use standard terminology for chords. Hence most of us, I am sure, have no idea what you are talking about.

What's the confusion? I used the term in a single post I made a few
days ago in which I defined the chords I was referring to
algebraically.

-Mike

🔗genewardsmith <genewardsmith@...>

12/8/2010 10:57:21 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
>
> On Wed, Dec 8, 2010 at 1:10 PM, genewardsmith
> <genewardsmith@...> wrote:

> > I'm interested in such chords, but I can't follow this conversation because you guys don't use standard terminology for chords. Hence most of us, I am sure, have no idea what you are talking about.
>
> What's the confusion? I used the term in a single post I made a few
> days ago in which I defined the chords I was referring to
> algebraically.

Stuff like "600,(5;7;10),50/49,5.7903" is not normally how we refer to chords.

🔗Carl Lumma <carl@...>

12/8/2010 11:11:36 AM

Gene wrote:

> Stuff like "600,(5;7;10),50/49,5.7903" is not normally how we
> refer to chords.

This is comma-separated text meant to be viewed in a spreadsheet.
The columns are:

* The dyad which is the inner two intervals in the triad (cents)
* The most-probable JI triad approximated (semicolons instead of
colons else Excel will interpret them as times)
* The comma tempered out
* The entropy of the tempered triad
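
If it helps, a row can be unpacked without a spreadsheet (a quick sketch; the field handling follows the column description above):

```python
def parse_row(row):
    """Unpack one comma-separated row into (dyad_cents, triad, comma, entropy)."""
    dyad, triad, comma, entropy = row.split(",")
    a, b, c = (int(n) for n in triad.strip("()").split(";"))
    return int(dyad), (a, b, c), comma, float(entropy)

print(parse_row("600,(5;7;10),50/49,5.7903"))
```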

All of this is visible in my Excel sheet

http://lumma.org/temp/l=u.xls

-Carl

🔗genewardsmith <genewardsmith@...>

12/8/2010 11:42:53 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> Gene wrote:
>
> > Stuff like "600,(5;7;10),50/49,5.7903" is not normally how we
> > refer to chords.
>
> This is comma-separated text meant to be viewed in a spreadsheet.

Your spreadsheets are not in standard notation either, nor are they self-explanatory. But thanks for the explanation; the 5;7;10 in place of 5:7:10 was very confusing. Why not give results in people-readable form instead of spreadsheet gibberish? Why force people to use ^$#*(! spreadsheets?

🔗genewardsmith <genewardsmith@...>

12/8/2010 1:15:06 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> --- In tuning@yahoogroups.com, "Carl Lumma" <carl@> wrote:
> > Can you put the t.h.e. values of the tempered point
> > at the end of each line?
>
> No problem:
>
> 702,(4;6;9),1/1,5.7729
> 600,(5;7;10),50/49,5.7903

> But are the T.H.E. values meaningful? I am thinking that the "percentage score" Igs wanted for dyadic H.E. could be useful here too.

It seems to me they are fairly cockeyed, due to neglect of critical bands.

🔗Carl Lumma <carl@...>

12/8/2010 1:50:28 PM

--- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:

> It seems to me they are fairly cockeyed, due to neglect of
> critical bands.

Can you give examples?

-Carl

🔗genewardsmith <genewardsmith@...>

12/8/2010 3:05:50 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> --- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@> wrote:
>
> > It seems to me they are fairly cockeyed, due to neglect of
> > critical bands.
>
> Can you give examples?

137,(12;13;14),169/168,5.8362
99,(16;17;18),289/288,5.8445
171,(9;10;11),100/99,5.8446
192,(8;9;10),81/80,5.8469

16:17:18 is particularly egregious.

🔗Carl Lumma <carl@...>

12/8/2010 3:43:15 PM

Gene wrote:

> > Can you give examples?
>
> 137,(12;13;14),169/168,5.8362
> 99,(16;17;18),289/288,5.8445
> 171,(9;10;11),100/99,5.8446
> 192,(8;9;10),81/80,5.8469
>
> 16:17:18 is particularly egregious.

Relative to what?

Do you mean the fact that the entropy is going up here
while the JI approximations are getting simpler and the
interior intervals are (mostly) getting bigger?
One must also keep in mind that the temperament error is
going up. My spreadsheet shows that the entropy of the
JI chords goes down as expected. I'll tack those on the
end here:

99,(16;17;18),289/288,5.8445,5.8424
171,(9;10;11),100/99,5.8446,5.8264
192,(8;9;10),81/80,5.8469,5.8176

Not disagreeing with you, just thinking out loud. We
need ears on this.

Strictly speaking, these entropies pertain to three sine
tones only. But because complex ratios are both more
numerous and also capable of being smaller, results in
the dyadic case were generally respectful of critical
band effects and I would expect that to be true here also.

-Carl

🔗genewardsmith <genewardsmith@...>

12/8/2010 4:14:15 PM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
>
> Gene wrote:
>
> > > Can you give examples?
> >
> > 137,(12;13;14),169/168,5.8362
> > 99,(16;17;18),289/288,5.8445
> > 171,(9;10;11),100/99,5.8446
> > 192,(8;9;10),81/80,5.8469
> >
> > 16:17:18 is particularly egregious.
>
> Relative to what?

Relative to lots of stuff which would sound more consonant than two semitones in a row.

> Do you mean the fact that the entropy is going up here
> while the JI approximations are getting simpler and the
> interior intervals are (mostly) getting bigger?

No, I mean that the critical band dissonances 17/16 and 18/17, if I understand correctly, are being scored high by entropy; presumably because entropy ignores critical bands and therefore produces results which are cockeyed. I'm not making some kind of deep point here, merely stating the obvious if I'm understanding these numbers aright.

🔗Carl Lumma <carl@...>

12/8/2010 4:29:54 PM

--- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
>
> > > 137,(12;13;14),169/168,5.8362
> > > 99,(16;17;18),289/288,5.8445
> > > 171,(9;10;11),100/99,5.8446
> > > 192,(8;9;10),81/80,5.8469
> > >
> > > 16:17:18 is particularly egregious.
> >
> > Relative to what?
>
> Relative to lots of stuff which would sound more consonant
> than two semitones in a row.

These are minima on a cross-section of triadic entropy
defined by lower=upper. No claim is being made that these
are overall Concordant Things.

-Carl

🔗Mike Battaglia <battaglia01@...>

12/8/2010 7:51:09 PM

On Wed, Dec 8, 2010 at 7:29 PM, Carl Lumma <carl@...> wrote:
>
> --- In tuning@yahoogroups.com, "genewardsmith" <genewardsmith@...> wrote:
> >
> > > > 137,(12;13;14),169/168,5.8362
> > > > 99,(16;17;18),289/288,5.8445
> > > > 171,(9;10;11),100/99,5.8446
> > > > 192,(8;9;10),81/80,5.8469
> > > >
> > > > 16:17:18 is particularly egregious.
> > >
> > > Relative to what?
> >
> > Relative to lots of stuff which would sound more consonant
> > than two semitones in a row.
>
> These are minima on a cross-section of triadic entropy
> defined by lower=upper. No claim is being made that these
> are overall Concordant Things.

There is something weird about the list. Where is 9:12:16? Is it not
on the list of the lowest entropy intervals on that line? A pure
9:12:16 would be on the l=u line, and I don't see how it can have
higher entropy than a mistuned 16:17:18.

-Mike

🔗martinsj013 <martinsj@...>

12/10/2010 7:08:06 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> > > > > 137,(12;13;14),169/168,5.8362
> > > > > 99,(16;17;18),289/288,5.8445
> > > > > 171,(9;10;11),100/99,5.8446
> > > > > 192,(8;9;10),81/80,5.8469
> > ... minima on a cross-section of triadic entropy ...
> There is something weird about the list. Where is 9:12:16? Is it not
> on the list of the lowest entropy intervals on that line? A pure
> 9:12:16 would be on the l=u line, and I don't see how it can have
> higher entropy than a mistuned 16:17:18.

I can see what/why you're asking; but note that:
* the list was only the top 10 of 30 local minima found on the line; l=497 cents is in the list (#17).
* just looking at the contour plot, you can see that the cross-section falls into one of your valleys when l (=u) is around 70-200 cents, has climbed up to a higher level when l is around 300-350 cents, and is still there (just about) at 400 cents. So the list is consistent with the contour plot at least, not that I have a full explanation (but see below).
* l=u=497 is not a local minimum in the contour plot (but l=498, u=497 is).
* with reference to the full list (at the bottom of this post, and I will also upload it to
/tuning/files/SteveMartin/LUcrossect.csv):

~~~ "MLJI" is the member of the Tenney set with the highest probability;
~~~ "MLJI prob" is that highest probability;
~~~ the MLJI prob for 9/12/16 is higher than that for 16/17/18; I can have a go at explaining that, but what do you think?

Lower Upper T.H.E. MLJI MLJI prob
0 0 4.14220558530341 1/1/1 0.438880640820319
1200 1200 5.18528014235984 1/2/4 0.227818027359687
702 702 5.77289183012587 4/6/9 0.0775949263971548
600 600 5.7902606656242 5/7/10 0.0322615525833345
832 832 5.80334701482419 5/8/13 0.0318805282409328
951 951 5.82279554998273 4/7/12 0.0318693681306855
137 137 5.83617956474491 12/13/14 0.030927524226253
99 99 5.84452454608467 16/17/18 0.0252904547925563
171 171 5.84462948128287 9/10/11 0.0371988559907993
192 192 5.84689650581254 8/9/10 0.0392817300807022
1008 1008 5.84777103185681 5/9/16 0.039284704711604
770 770 5.85003390038553 7/11/17 0.0255134065797206
217 217 5.85201474943876 7/8/9 0.0377108756611819
458 458 5.85338102930826 10/13/17 0.0333002985956444
544 544 5.85512474719855 8/11/15 0.0377676064446722
1042 1042 5.85623988213157 6/11/20 0.0378681776223624
249 249 5.85979500772603 6/7/8 0.0318261728795526
427 427 5.86016106590877 11/14/18 0.0277025254078419
497 497 5.86231365589106 9/12/16 0.0385078295366118
644 644 5.86864491559875 9/13/19 0.026539267747204
1066 1066 5.86948219747604 7/13/24 0.0336007552733021
904 904 5.87002595240641 6/10/17 0.0221471780686454
881 881 5.87051556511398 9/15/25 0.028737488391564
1111 1111 5.87461190177319 10/19/36 0.0234822908826253
292 292 5.8751664019775 5/6/7 0.0194080198301852
923 923 5.87852042228243 10/17/29 0.0265498216964395
401 401 5.87878598449277 12/15/19 0.0210676508287809
793 793 5.87930387306207 12/19/30 0.0242112160662288
351 351 5.88140450799729 18/22/27 0.0205249435801608
747 747 5.88271670193765 11/17/26 0.0218315589050617
373 373 5.88362066350747 13/16/20 0.0189649491633541
333 333 5.8897191238034 19/23/28 0.0182054051394438

🔗martinsj013 <martinsj@...>

12/10/2010 7:39:39 AM

Gene> 16:17:18 is particularly egregious ...

Carl> These are minima on a cross-section of triadic entropy defined by lower=upper. No claim is being made that these are overall Concordant Things.

Just to amplify what Carl is saying, as the thread has got quite long and perhaps difficult to follow,
* a contour plot of the T.H.E. values for (lower interval, upper interval) on a square grid with 10 cents spacing (but with the local minima and maxima to a 1 cent accuracy superimposed) is at:
/tuning/files/SteveMartin/tnxgc.png
* the local minima from that 2d area are listed at:
/tuning/files/SteveMartin/tmin2_1M.csv
* the minima on a 1d cross-section having lower=upper are at:
/tuning/files/SteveMartin/LUcrossect.csv

Of course, a minimum on the 1d line is a constrained minimum, so in general may be found not to have a particularly low T.H.E. value; furthermore even a local minimum on the 2d area need not have a particularly low T.H.E. value if it is an area of generally high T.H.E. - indeed we do find that the weakest few of the minima have a value that is higher than that of the weakest few of the local maxima.

Steve M.

🔗martinsj013 <martinsj@...>

12/18/2010 2:56:51 AM

Steve> ... my calculation of the distance between two triads vs the triad space definition as defined by Chalmers. I think my distance is always a constant multiple of his ...
Carl> ... it seems like we just want to know the right value for S. I don't think we know the right value though, and it may vary by listener/timbre anyway ... Or is there another way the two distance formulations could alter the results?

re the last question I think the answer is no; that is, I agree that these two formulations differ only by a constant factor, not in shape; and unless we have a precise value of S to work with, this is not too important. I have realised, though, that a comparison with "dyad space" can be made: if the distance between 1:1 and 1:2 is 1200 cents, then it follows that the distance between 1:1:1 and 1:1:2 should be 1200 cents also. Or does it? And between 1:1:1 and 1:2:2? And between 1:1:2 and 1:2:2? Triad space says all three are 1200 cents whereas my formula says sqrt(2) times as big.

P.S. I have been thinking about what tetrad space would look like, and my very brief answer is:
http://en.wikipedia.org/wiki/Rhombic_dodecahedron

Steve M.

🔗Carl Lumma <carl@...>

12/18/2010 12:05:31 PM

Steve wrote:

> re the last question I think the answer is no; that is, I agree
> that these two formulations differ only by a constant factor,
> not in shape; and unless we have a precise value of S to
> work with, this is not too important. I have realised, though,
> that a comparison with "dyad space" can be made: if the distance
> between 1:1 and 1:2 is 1200 cents, then it follows that the
> distance between 1:1:1 and 1:1:2 should be 1200 cents also.
> Or does it? And between 1:1:1 and 1:2:2? And between 1:1:2
> and 1:2:2? Triad space says all three are 1200 cents whereas my
> formula says sqrt(2) times as big.

Sounds bad. If I were you I'd recode it.

> P.S. I have been thinking about what tetrad space would
> look like, and my very brief answer is:
> http://en.wikipedia.org/wiki/Rhombic_dodecahedron

Why do you think that? Three intervals are required to define
a tetrad, producing six. So without having thought about it
at all, I would suspect it's a tetrahedron.

-Carl

🔗martinsj013 <martinsj@...>

12/19/2010 2:28:04 AM

Steve> ... Triad space says all three are 1200 cents whereas my formula says sqrt(2) times as big.
Carl> Sounds bad. If I were you I'd recode it.

OK, I guess you're right. But I think you are implicitly agreeing with some of my points, can I ask for confirmation?

* the distance between 1:1:1 and 1:1:2, and between 1:1:1 and 1:2:2, and between 1:1:2 and 1:2:2, is 1200 cents.

* if my calculation multiplies all distances by sqrt(2) this does not affect the shape of triad space, but is tantamount to dividing the parameter S by sqrt(2).

* I thought I was using 1.2% i.e. 20.65 cents, for S; but because of the above, I am actually using 14.60 cents or 0.85%

* This is still a reasonable value for S, so the T.H.E. calculation is still reasonable.

Steve.

P.S. outside of the "+ve quadrant" (or I should say "+ve sextant") of triad space, we have four other points that only use the numbers 1 and 2 when stated as ratios in lowest terms - 1:2:1, 2:2:1, 2:1:1, 2:1:2. Each of these is 1200 cents distant from 1:1:1, but e.g. the distance between 1:1:2 and 1:2:1 is 1200*sqrt(3), and between 1:1:2 and 2:2:1 is 2400 cents, etc
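
These figures can be checked numerically; the sketch below assumes my distance is the Euclidean norm over the changes in all three dyads (lower, upper, outer), scaled by 1/sqrt(2) to agree with the triad space values:

```python
import math

def cents(ratio):
    return 1200 * math.log2(ratio)

def triad_coords(a, b, c):
    """(lower, upper, outer) dyad sizes in cents for the triad a:b:c."""
    return cents(b / a), cents(c / b), cents(c / a)

def distance(t1, t2):
    """Euclidean norm over the changes in all three dyads, divided by
    sqrt(2) so that 1:1:1 -> 1:1:2 comes out as 1200 cents."""
    p, q = triad_coords(*t1), triad_coords(*t2)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, q)) / 2)

print(distance((1, 1, 1), (1, 2, 1)))  # 1200
print(distance((1, 1, 2), (1, 2, 1)))  # 1200 * sqrt(3)
print(distance((1, 1, 2), (2, 2, 1)))  # 2400
```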

🔗Mike Battaglia <battaglia01@...>

12/19/2010 8:45:31 AM

On Sun, Dec 19, 2010 at 5:28 AM, martinsj013 <martinsj@...> wrote:
>
> Steve> ... Triad space says all three are 1200 cents whereas my formula says sqrt(2) times as big.
> Carl> Sounds bad. If I were you I'd recode it.
>
> OK, I guess you're right. But I think you are implicitly agreeing with some of my points, can I ask for confirmation?
>
> * the distance between 1:1:1 and 1:1:2, and between 1:1:1 and 1:2:2, and between 1:1:2 and 1:2:2, is 1200 cents.
>
> * if my calculation multiplies all distances by sqrt(2) this does not affect the shape of triad space, but is tantamount to dividing the parameter S by sqrt(2).

Aha! I knew something like this was up. That's why I was asking so
much about the triads in the other thread.

If you look at the "pathological" triads along the axes, i.e. where
one of the dyads is 1:1, there are a lot of "dips" that aren't there
in the dyadic version. I always suspected something was weird about
this, if s=1.2% is used...

> * I thought I was using 1.2% i.e. 20.65 cents, for S; but because of the above, I am actually using 14.60 cents or 0.85%

That makes so much more sense now. So we've been studying something
analogous to the "best case" HE curve.

How much more work would it take to recalculate at s=1.2%? My main
interest is in seeing what the model puts out for different types of
minor triads, as well as seeing if 9:11:13 has a dip in the curve
where 9:11 didn't.

-Mike

🔗Carl Lumma <carl@...>

12/19/2010 11:59:30 AM

>> Steve> ... Triad space says all three are 1200 cents whereas
>> my formula says sqrt(2) times as big.
>
> Carl> Sounds bad. If I were you I'd recode it.
>
> OK, I guess you're right. But I think you are implicitly
> agreeing with some of my points, can I ask for confirmation?
>
> * the distance between 1:1:1 and 1:1:2, and between 1:1:1
> and 1:2:2, and between 1:1:2 and 1:2:2, is 1200 cents.
>
> * if my calculation multiplies all distances by sqrt(2) this
> does not affect the shape of triad space, but is tantamount
> to dividing the parameter S by sqrt(2).
>
> * I thought I was using 1.2% i.e. 20.65 cents, for S; but
> because of the above, I am actually using 14.60 cents or 0.85%
>
> * This is still a reasonable value for S, so the T.H.E.
> calculation is still reasonable.

I agree with all of the above, except I thought you initially
shot for s = 1% but found your coordinate system made it 1.2 (?).
1% is the desired value.

> P.S. outside of the "+ve quadrant" (or I should say
> "+ve sextant") of triad space, we have four other points that
> only use the numbers 1 and 2 when stated as ratios in lowest
> terms - 1:2:1, 2:2:1, 2:1:1, 2:1:2. Each of these is
> 1200 cents distant from 1:1:1, but e.g. the distance between
> 1:1:2 and 1:2:1 is 1200*sqrt(3), and between 1:1:2 and 2:2:1
> is 2400 cents, etc.

Yep.

-Carl

🔗martinsj013 <martinsj@...>

12/20/2010 2:23:01 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... I thought you initially shot for s = 1% but found your coordinate system made it 1.2 (?). 1% is the desired value.

No, I was shooting for 1.2% (I remember one of the H.E. graphs definitely had 1.2% for "normal" and 0.6% for "acute" hearing) and my coordinate system I now realise made it 0.85%. Sorry if it wasn't clear, but earlier posts of mine were alluding to this uncertainty.

Redoing the entire calculation at 1% is simple - but will take CPU time!

Steve M.

🔗martinsj013 <martinsj@...>

12/20/2010 2:38:01 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> > (Steve:) tetrad space
> > http://en.wikipedia.org/wiki/Rhombic_dodecahedron
> Why do you think that? Three intervals are required to define
> a tetrad, producing six. So without having thought about it
> at all, I would suspect it's a tetrahedron.

I agree that the points 1:1:1:1, 1:1:1:2, 1:1:2:2 and 1:2:2:2 form a tetrahedron (I am guessing that's what you mean), but:
1) I don't think it is a regular tetrahedron, and
2) I am including the 11 other points (12, if you include 2:2:2:2) whose lowest terms ratio involves only 1 and 2.

I am drafting a fuller response but it's taking time, and I think the requests re triadic H.E. are probably more deserving of it.

Steve M.

🔗Carl Lumma <carl@...>

12/20/2010 12:27:19 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:
>
> No, I was shooting for 1.2% (I remember one of the H.E. graphs
> definitely had 1.2% for "normal" and 0.6% for "acute" hearing)
> and my coordinate system I now realise made it 0.85%. Sorry if
> it wasn't clear, but earlier posts of mine were alluding to this
> uncertainty.

Sorry I misunderstood.

We experimented with a wide range of values; 1% seemed to me
to be best. You can see the performance of three subjects in
the original Houtsma & Goldstein experiments here:
/tuning/files/CarlLumma/Fig.7.png

> Redoing the entire calculation at 1% is simple - but will
> take CPU time!

It will be interesting to compare the results!

-Carl

🔗Carl Lumma <carl@...>

12/20/2010 1:15:34 PM

--- In tuning@yahoogroups.com, "martinsj013" <martinsj@...> wrote:

> > Why do you think that? Three intervals are required to define
> > a tetrad, producing six. So without having thought about it
> > at all, I would suspect it's a tetrahedron.
>
> I agree that the points 1:1:1:1, 1:1:1:2, 1:1:2:2 and 1:2:2:2
> form a tetrahedron (I am guessing that's what you mean), but:

The purpose of these barycentric plots is to express all the
dyads in a chord, and their relationships, in as few spatial
dimensions as possible. So the number of spatial dimensions
should equal the minimum number of independent intervals
required to define the chord. This is always the number of
interior intervals in the chord, so tetrads are in 3-space.
Then we need an edge for each dyad; in this case 6.

> 2) I am including the 11 other points (12, if you include
> 2:2:2:2) whose lowest terms ratio involves only 1 and 2.

Ah, that might give you a 3-space unit cell. That's
analogous to the hexagon for triads, which includes the
inversions...

With our triads up to 1:2:4 sans inversions, we actually
have a parallelogram consisting of two regular triangles
sharing an edge. It's parts of two neighboring hexagonal
unit cells. We probably have parts of neighboring rhombic
dodecs with tetrads up to 1:2:4:8...

-Carl

🔗Carl Lumma <carl@...>

12/20/2010 7:22:11 PM

I wrote:
> This is always the number of
> interior intervals in the chord, so tetrads are in 3-space.
> Then we need an edge for each dyad; in this case 6.

And presumably the four triads are represented by the
projection of a point onto each of the four faces.

> With our triads up to 1:2:4 sans inversions, we actually
> have a parallelogram consisting of two regular triangles
> sharing an edge. It's parts of two neighboring hexagonal
> unit cells. We probably have parts of neighboring rhombic
> dodecs with tetrads up to 1:2:4:8...

Presumably we are talking about this guy
http://en.wikipedia.org/wiki/Disphenoid_tetrahedral_honeycomb
and indeed, these are not regular tetrahedra. Not sure if
that's a problem or not. . .

-Carl

🔗martinsj013 <martinsj@...>

12/21/2010 4:38:09 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> ... So the number of spatial dimensions
> should equal the minimum number of independent intervals
> required to define the chord ... so tetrads are in 3-space.
Agreed.

> Then we need an edge for each dyad; in this case 6.
Note that in triad space the subspaces where a dyad is constant are lines, but in tetrad space they are planes (UIAMM), so we need six of these; the faces of the rhombic dodecahedron form six pairs of parallels. I guess the normals (through the origin) to these planes are the axes; is that what you mean?

> ... [including the 11 other points (12, if you include
> > 2:2:2:2) whose lowest terms ratio involves only 1 and 2.]
> ... might give you a 3-space unit cell. That's
> analogous to the hexagon for triads
Exactly what I meant.

Also:
--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> And presumably the four triads are represented by the
> projection of a point onto each of the four faces.
I haven't thought about that.

> Presumably we are talking about this guy
> http://en.wikipedia.org/wiki/Disphenoid_tetrahedral_honeycomb
> and indeed, these are not regular tetrahedra. Not sure if
> that's a problem or not. . .

The sum-of-squares for the six intervals turns out to be 3l^2 + 4m^2 + 3u^2 + 4mu + 2lu + 4lm (in terms of lower, middle, upper) so m has a different effect from l or u. In the tetrahedron (it looks like the one I'm thinking of is the same as the "disphenoid" one) four edges are of unit size (1200 cents if the unit is the octave), and two are 2/sqrt3 times as big. This seems to make some sense in that 1:1:1:1 and 1:1:2:2 differ in two places, as do 1:1:1:2 and 1:2:2:2, whilst the other pairings differ in only one place (provided we allow that 1:1:1:1 is equivalent to 2:2:2:2). This is why I wasn't 100% sure that, in the triad space equilateral triangle, all three distances should be 1200 cents (though I can't justify a different value).
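
Here is a quick numerical check of both the expansion and the edge lengths (the sqrt(1/3) scaling is my assumption, chosen so that a single octave step measures 1200 cents):

```python
import itertools
import math
import random

def six_dyads(l, m, u):
    """The six intervals in a tetrad whose adjacent steps are l, m, u."""
    return (l, m, u, l + m, m + u, l + m + u)

def sum_of_squares(l, m, u):
    return sum(d * d for d in six_dyads(l, m, u))

# Numerical check of the expansion 3l^2 + 4m^2 + 3u^2 + 4mu + 2lu + 4lm.
for _ in range(100):
    l, m, u = (random.uniform(0, 1200) for _ in range(3))
    rhs = 3*l*l + 4*m*m + 3*u*u + 4*m*u + 2*l*u + 4*l*m
    assert math.isclose(sum_of_squares(l, m, u), rhs)

# Edge lengths of the tetrahedron, scaled by sqrt(1/3).
verts = {"1:1:1:1": (0, 0, 0), "1:1:1:2": (0, 0, 1200),
         "1:1:2:2": (0, 1200, 0), "1:2:2:2": (1200, 0, 0)}
for (na, a), (nb, b) in itertools.combinations(verts.items(), 2):
    d = math.sqrt(sum_of_squares(*(x - y for x, y in zip(a, b))) / 3)
    print(f"{na} -- {nb}: {d:.1f}")  # four edges of 1200, two of 1200*2/sqrt(3)
```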

🔗Carl Lumma <carl@...>

12/21/2010 1:07:13 PM

Hi Steve,

> Note that in triad space the subspaces where a dyad is constant
> are lines, but in tetrad space they are planes (UIAMM),

UIAMM?

I was thinking they would still be lines, hence six edges.

> I guess the normals (through the origin) to these planes are
> the axes; is that what you mean?

I was confusing barycentric coordinates and trilinear
coordinates, which are closely related (at least in 2-D).
It's trilinear we've been using.

> > And presumably the four triads are represented by the
> > projection of a point onto each of the four faces.
>
> I haven't thought about that.

Lines normal to the faces would additionally keep a constant
triad.

> The sum-of-squares for the six intervals turns out to be
> 3l^2 + 4m^2 + 3u^2 + 4mu + 2lu + 4lm
> (in terms of lower, middle, upper) so m has a different effect
> from l or u. In the tetrahedron (it looks like the one I'm
> thinking of is same as the "disphenoid" one) four edges are
> of unit size (1200 cents if the unit is the octave), and two
> are 2/sqrt3 times as big. This seems to make some sense in
> that 1:1:1:1 and 1:1:2:2 differ in two places, as do 1:1:1:2
> and 1:2:2:2, whilst the other pairings differ in only one
> place (provided we allow that 1:1:1:1 is equivalent to 2:2:2:2).
> This is why I wasn't 100% sure that, in the triad space
> equilateral triangle, all three distances should be 1200 cents
> (though I can't justify a different value).

I reckon my one-octave tetrahedron would have vertices
1:1:1:1, 1:1:1:2, 1:1:2:2, and 1:2:2:2 with all edges
being 1200 cents...

Sorry for the brainstorm, I'm headed out to vaccinate the
kids, then mailing stuff, then...

-Carl

🔗martinsj013 <martinsj@...>

12/22/2010 2:14:56 AM

--- In tuning@yahoogroups.com, "Carl Lumma" <carl@...> wrote:
> UIAMM?
"Unless I am much mistaken" - sorry, I made that one up :-)

> > The sum-of-squares for the six intervals turns out to be
> > 3l^2 + 4m^2 + 3u^2 + 4mu + 2lu + 4lm
> > (in terms of lower, middle, upper) so m has a different effect
> > from l or u. In the tetrahedron (it looks like the one I'm
> > thinking of is same as the "disphenoid" one) four edges are
> > of unit size (1200 cents if the unit is the octave), and two
> > are 2/sqrt3 times as big. This seems to make some sense in
> > that 1:1:1:1 and 1:1:2:2 differ in two places, as do 1:1:1:2
> > and 1:2:2:2, whilst the other pairings differ in only one
> > place (provided we allow that 1:1:1:1 is equivalent to 2:2:2:2).
> > This is why I wasn't 100% sure that, in the triad space
> > equilateral triangle, all three distances should be 1200 cents
> > (though I can't justify a different value).
>
> I reckon my one-octave tetrahedron would have vertices
> 1:1:1:1, 1:1:1:2, 1:1:2:2, and 1:2:2:2 with all edges
> being 1200 cents...
So it's one vote each (I'm joking). I suspect that there can only be one "correct" answer, and it can't be that difficult to show, but I haven't achieved it yet.

> Sorry for the brainstorm, I'm headed out to vaccinate the
> kids, then mailing stuff, then...
Yep, there are things that need to be done before Saturday!

Steve M.

🔗martinsj013 <martinsj@...>

1/3/2011 9:49:19 AM

Steve> No, I was shooting for 1.2% (I remember one of the H.E. graphs definitely had 1.2% for "normal" and 0.6% for "acute" hearing) and my coordinate system I now realise made it 0.85% ... Redoing the entire calculation at 1% is simple - but will take CPU time!
Carl> We experimented with a wide range of values; 1% seemed to me to be best ... It will be interesting to compare the results!

I have only done the comparison for three cross-sections - l=0, u=0 and l+u=1200; please see:
/tuning/files/SteveMartin/t1pc.xls

IIRC, Mike B said he thought there were too many wrinkles on (one of) the lines, but the differences do not look great to me. But then, my value of s was effectively 0.85%, not the 1.2% I thought it was. Any comments?

Steve M.
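[Editor's note: an aside on the 0.85% vs 1.2% discrepancy. This is speculation on my part, not something established in the thread, but a coordinate system scaled by a factor of √2 (e.g. distances measured along rotated axes) would shrink an intended s of 1.2% to almost exactly the 0.85% Steve reports:]

```python
from math import sqrt

s_intended = 1.2                     # the value Steve was aiming for (percent)
s_effective = s_intended / sqrt(2)   # hypothetical sqrt(2) coordinate scaling
print(f'{s_effective:.3f}%')         # ~0.849%, close to the reported 0.85%
```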

🔗Mike Battaglia <battaglia01@...>

1/3/2011 10:21:14 PM

On Mon, Jan 3, 2011 at 12:49 PM, martinsj013 <martinsj@...> wrote:
>
> Steve> No, I was shooting for 1.2% (I remember one of the H.E. graphs definitely had 1.2% for "normal" and 0.6% for "acute" hearing) and my coordinate system I now realise made it 0.85% ... Redoing the entire calculation at 1% is simple - but will take CPU time!
> Carl> We experimented with a wide range of values; 1% seemed to me to be best ... It will be interesting to compare the results!
>
> I have only done the comparison for three cross-sections - l=0, u=0 and l+u=1200; please see:
> /tuning/files/SteveMartin/t1pc.xls
>
> IIRC, Mike B said he thought there were too many wrinkles on (one of) the lines, but the differences do not look great to me. But then, my value of s was effectively 0.85%, not the 1.2% I thought it was. Any comments?
>
> Steve M.

Hi Steve,

That looks more like it. Thanks for posting this. The specific line
I was talking about was the one where the two inner dyads are the
same, e.g. where l = u and o = 2l = 2u. I thought it was weird that
9:12:16 didn't make the top few, being as it should be represented
perfectly, but a detuned 16:17:18 ended up there. Maybe it'll be
different in this version.

-Mike
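[Editor's note: Mike's l = u condition is easy to check directly. A quick sketch of my own, just restating the arithmetic behind the message above:]

```python
from math import log2

def cents(ratio):
    """Size of a frequency ratio in cents."""
    return 1200 * log2(ratio)

# Lower and upper dyads of each triad a:b:c.
for name, (a, b, c) in {'9:12:16': (9, 12, 16),
                        '16:17:18': (16, 17, 18)}.items():
    l, u = cents(b / a), cents(c / b)
    print(f'{name}: l = {l:.2f}c, u = {u:.2f}c, l - u = {l - u:.2f}c')
```

9:12:16 lands exactly on the l = u line (both dyads are 4:3, about 498.04 cents), while 16:17:18 misses it by about 6 cents — consistent with only a detuned version of it appearing on that cross-section.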

🔗Carl Lumma <carl@...>

1/4/2011 12:47:38 AM

Hi Steve,

> IIRC, Mike B said he thought there were too many wrinkles on
> (one of) the lines, but the differences do not look great
> to me. But then, my value of s was effectively 0.85%, not
> the 1.2% I thought it was. Any comments?

None at the moment... still trying to find time to read the
papers you pointed to... -C.

🔗martinsj013 <martinsj@...>

1/4/2011 7:04:51 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> On Mon, Jan 3, 2011 at 12:49 PM, martinsj013 <martinsj@...> wrote:
> > I have only done the comparison for three cross-sections - l=0, u=0 and l+u=1200 ...

> That looks more like it. Thanks for posting this. The line in specific
> I was talking about was the one where the two inner dyads are the
> same, e.g. where l = u ...

OK, I will do that one next, though I am not expecting much difference.

Steve M.

🔗martinsj013 <martinsj@...>

1/8/2011 2:38:46 AM

--- In tuning@yahoogroups.com, Mike Battaglia <battaglia01@...> wrote:
> ... The line in specific
> I was talking about was the one where the two inner dyads are the
> same, e.g. where l = u and o = 2l = 2u. I thought it was weird that
> 9:12:16 didn't make the top few, being as it should be represented
> perfectly, but a detuned 16:17:18 ended up there. Maybe it'll be
> different in this version.

I have now done the comparison (s=0.85% vs s=1%) for l=u, please see:
/tuning/files/SteveMartin/t1pclu.xls

There are some differences (e.g. slightly fewer minima) but not much that I can see.

Steve M.

🔗Mike Battaglia <battaglia01@...>

1/8/2011 6:16:36 AM

Hi Steve,

Thanks for this. Can we get another svg graph, if possible? Would be really
helpful.

-Mike

On Sat, Jan 8, 2011 at 5:38 AM, martinsj013 <martinsj@...> wrote:

>
>
> --- In tuning@yahoogroups.com, Mike Battaglia
> <battaglia01@...> wrote:
> > ... The line in specific
>
> > I was talking about was the one where the two inner dyads are the
> > same, e.g. where l = u and o = 2l = 2u. I thought it was weird that
> > 9:12:16 didn't make the top few, being as it should be represented
> > perfectly, but a detuned 16:17:18 ended up there. Maybe it'll be
> > different in this version.
>
> I have now done the comparison (s=0.85% vs s=1%) for l=u, please see:
> /tuning/files/SteveMartin/t1pclu.xls
>
> There are some differences (e.g. slightly fewer minima) but not much that I
> can see.
>
> Steve M.
>
>
>