Cangwu absolute

🔗genewardsmith <genewardsmith@sbcglobal.net>

1/1/2012 1:38:28 AM

If you divide the cangwu badness polynomial by TE complexity, the constant term becomes TE error, and the leading term is proportional to TE complexity. What would be a good name for this?

🔗gbreed@gmail.com

1/1/2012 3:56:25 AM

For equal temperaments you've canceled out complexity. The result is a function of TE error. For higher ranks I don't know how it works out.

Graham

🔗genewardsmith <genewardsmith@sbcglobal.net>

1/1/2012 7:31:35 AM

--- In tuning-math@yahoogroups.com, "gbreed@..." <gbreed@...> wrote:
>
> For equal temperaments you've canceled out complexity. The result is a function of TE error. For higher ranks I don't know how it works out.

You get a polynomial such that if Temperament B is lower in either complexity or error than Temperament A, but not both, there is some value of the parameter where the two badnesses are equal. If B is higher in both error and complexity, its polynomial is higher for all positive values of the parameter. This seems useful.

🔗gbreed@gmail.com

1/1/2012 8:43:39 AM

Hang on, what polynomial then?

Graham

🔗genewardsmith <genewardsmith@sbcglobal.net>

1/1/2012 10:15:35 AM

--- In tuning-math@yahoogroups.com, "gbreed@..." <gbreed@...> wrote:
>
> Hang on, what polynomial then?

The Cangwu badness polynomial divided by TE complexity.

🔗Carl Lumma <carl@lumma.org>

1/1/2012 1:44:00 PM

Gene wrote:

>> Hang on, what polynomial then?
>
>The Cangwu badness polynomial divided by TE complexity.

Can you write it out?

-Carl

🔗genewardsmith <genewardsmith@sbcglobal.net>

1/1/2012 4:44:31 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <carl@...> wrote:
>
> Gene wrote:
>
> >> Hang on, what polynomial then?
> >
> >The Cangwu badness polynomial divided by TE complexity.
>
> Can you write it out?

Cangwu badness polynomial (at least as normalized by me):

C(x) = det([(1+x)vi.vj - ai.aj])

Here the vi are weighted vals, and ai are (sum vi)/n, where n is the dimension of the val--in other words, the average of the coefficients of the weighted val.

TE complexity of wedgie W (Gene-normalized version) = RMS of the weighted wedgie W = sqrt(det[vi.vj] / C(n, r)), where n is the dimension, r is the rank, and C(n, r) is n choose r.

The polynomial is therefore P(x) = C(x)/(TE complexity).
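[A sketch of the rank-1 case of the formulas above, in Python. Assumptions not confirmed in the thread: ai is read as the constant vector whose entries are the coefficient average, so that ai.aj = n * mean(vi) * mean(vj); the 12-ET 5-limit val is just an illustrative choice.]

```python
import math

def weighted_val(val, primes):
    """Weight each coefficient of a val by 1/log2(prime)."""
    return [c / math.log2(p) for c, p in zip(val, primes)]

def cangwu_rank1(v, x):
    """C(x) = det[(1+x) vi.vj - ai.aj] for a single weighted val v,
    so the determinant is 1x1. Reading ai as the constant vector of
    coefficient averages gives ai.aj = n * mean(v)**2."""
    n = len(v)
    vv = sum(c * c for c in v)
    abar = sum(v) / n
    return (1 + x) * vv - n * abar * abar

def te_complexity_rank1(v):
    """sqrt(det[vi.vj] / C(n, r)) with r = 1: the RMS of the weighted val."""
    n = len(v)
    return math.sqrt(sum(c * c for c in v) / n)

# 12-ET in the 5-limit: val <12 19 28]
v12 = weighted_val([12, 19, 28], [2, 3, 5])

def P(x):
    """P(x) = C(x) / (TE complexity)."""
    return cangwu_rank1(v12, x) / te_complexity_rank1(v12)

# The constant term P(0) comes out small (error-like), while the slope
# is vv / complexity, i.e. proportional to complexity, matching the
# claim in the first message of the thread.
```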

🔗gbreed@gmail.com

1/2/2012 1:08:20 AM

Whatever vj and aj are supposed to be, you must be dividing the square of Cangwu badness by complexity. That will give a non-trivial result but what's the use of it?

Graham

🔗genewardsmith <genewardsmith@sbcglobal.net>

1/2/2012 7:01:19 AM

--- In tuning-math@yahoogroups.com, "gbreed@..." <gbreed@...> wrote:
>
> Whatever vj and aj are supposed to be, you must be dividing the square of Cangwu badness by complexity. That will give a non-trivial result but what's the use of it?

One thing it does is related to the Olympic medal procedure as defined by Herman. If one such polynomial lies entirely under another, the corresponding temperament must be lower in both error and complexity. If two polynomials are equal at some point, that is the value of the parameter where the weighting of error versus complexity makes them tie, since the polynomial runs from error (the constant term) to complexity (the leading term). Error and complexity are how we usually define these things, so why not with the polynomial?
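[The crossing behaviour described above can be checked numerically for a pair of equal temperaments. A sketch under the same reading of the formula as earlier in the thread (ai.aj taken as n * mean(v)**2); the choice of 12-ET and 19-ET in the 5-limit is an illustration, not something from the thread.]

```python
import math

def p_of_x(val, primes, x):
    """P(x) = C(x) / (TE complexity) for the rank-1 case, with
    C(x) = (1+x) * (v.v) - n * mean(v)**2 on the weighted val v."""
    v = [c / math.log2(p) for c, p in zip(val, primes)]
    n = len(v)
    vv = sum(c * c for c in v)
    abar = sum(v) / n
    c_x = (1 + x) * vv - n * abar * abar
    return c_x / math.sqrt(vv / n)

primes = [2, 3, 5]

def p12(x):
    return p_of_x([12, 19, 28], primes, x)   # 12-ET val <12 19 28]

def p19(x):
    return p_of_x([19, 30, 44], primes, x)   # 19-ET val <19 30 44]

# 19-ET has the lower constant term (lower error) but the steeper slope
# (higher complexity), so neither polynomial dominates the other and the
# two lines cross at exactly one positive value of the parameter.
```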