for Gene - computing harmonic entropy

🔗Carl Lumma <ekin@lumma.org>

6/22/2007 1:57:24 AM

Gene,

Have you seen this?

-Carl

________________________________________________________________________

Date: Sun, 1 Feb 2004.
Subject: Re: Newbie question: How is harmonic entropy computed?

> 1) I must be missing something, but I did not find any clear
> indication of a how harmonic entropy is computed. What I see is a lot
> of nice figures and diagrams but I am still looking for clues to how
> they are obtained... I understand that it is directly related to
> Shannon's entropy definition Xlog(X), where X is a probability density
> function, but could not derive an algorithm by myself. If available,
> could anyone give a pointer to a place where the computations are
> explained more precisely? The best introductory site I could find
> is at < http://tonalsoft.com/enc/harmentr.htm > but this page is
> lacking the maths I am looking for.

The probability function you are looking for begins as a three-parameter
function, where the first parameter (call it j) is an index into your
big series of ratios (Farey series or n*d < 10000 or 65536 or whatever),
the second (call it i) is the interval actually being heard, and the
third (call it s) is the standard deviation, in cents, of the Gaussian
below.

We define p(j,i,s) as 1/(s*sqrt(2*pi)) times the integral from
mediant(j-1,j) to mediant(j,j+1) of

exp( -(cents(t)-cents(i))^2 / (2s^2) ) dt

IMPORTANT: the units of t should be logarithmic, e.g., cents.

The function you're integrating is nearly constant, so it makes little
difference if we replace this with

( cents(mediant(j,j+1)) - cents(mediant(j-1,j)) ) * exp( -(cents(j)-cents(i))^2 / (2s^2) )

and a constant of proportionality determined such that

SUM (p(j,i,s)) = 1
j

It apparently also makes little difference in the result if, for the
partitioning, we replace the mediants with means (which will allow us to
use Voronoi cells in the generalization to higher dimensions), giving

1/2*(cents(j+1)-cents(j-1)) * exp( -(cents(j)-cents(i))^2 / (2s^2) )

In any case, the harmonic entropy HE(i,s) is then simply

-SUM (p(j,i,s) log(p(j,i,s)))
j

> 2) Actually I am interested in finding a way to measure the
> consonance/dissonance of complex chords: not only for dyads, triads
> or tetrads, but for an arbitrary multi-component cluster. I
> understand that in this case the decomposition into individual dyads
> does not lead to a correct measure of harmonic entropy, as stated at
> < http://tonalsoft.com/td/erlich/entropy.htm >. How did you obtain
> the HE for triads and tetrads?

I haven't yet, but I've described how the calculation is to be done,
shown some partitionings for triads using Voronoi cells, and shown how
the sizes of these cells depend on the product of the numbers in the
three-term ratio in the same way that, in the dyadic case, the
mediant-to-mediant widths depend on the product of the numbers in the
ordinary ratio. So some features of triadic and tetradic harmonic
entropy are already known with a high degree of certainty.

> 3) Last question, how closely is harmonic entropy related to Barlow's
> harmonicity? Harmonicity seems less mathematically founded than
> harmonic entropy, but it has a straightforward generalization to
> complex chords by substituting LCM/GCD to the original p/q ratio.

For a two-note chord, LCM/GCD gives you p*q, not p/q, so I'm not sure I
understand this substitution. But the harmonic entropy calculations and
graphs seem to be saying that for any selection of not-too-complex
ratios or chords, the product of the numbers in the ratio (be it two-
term, three-term . . .) gives the correct dissonance ranking, and in
fact the log of the product gives the approximate relative entropy. See
the "files" folder of this group.
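A quick numeric illustration of that ranking (my example ratios, not
from the post) -- sorting by p*q and taking its log:

```python
# The ranking heuristic stated above: order ratios by the product of
# their terms; log of the product approximates relative entropy.
from math import log

for p, q in [(2, 1), (3, 2), (4, 3), (5, 4), (7, 4), (16, 9)]:
    print(f"{p}/{q}: p*q = {p * q:4d}, log(p*q) = {log(p * q):.2f}")
```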

But that doesn't explain what you then *do* with the ratio in Barlow's
Harmonicity, which doesn't seem related to how we perceive consonance,
especially in that it assumes that simple ratios have zero tolerance for
mistuning, and can't even accept irrational inputs! This substitution
you mention, to the extent that it's valid, really has little to do with
the rest of Barlow's formulation, correct?

Best,
Paul

🔗Gene Ward Smith <genewardsmith@sbcglobal.net>

6/22/2007 1:11:22 PM

--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@...> wrote:
>
> Gene,
>
> Have you seen this?

Doesn't seem familiar. Where's it from?

🔗Carl Lumma <ekin@lumma.org>

6/23/2007 1:36:20 AM

At 01:11 PM 6/22/2007, you wrote:
>--- In tuning-math@yahoogroups.com, Carl Lumma <ekin@...> wrote:
>>
>> Gene,
>>
>> Have you seen this?
>
>Doesn't seem familiar. Where's it from?

Beats me. Is it something you wanted to know?

-Carl

🔗Carl Lumma <ekin@lumma.org>

6/23/2007 8:09:14 PM

Reason I ask is, I feel discussion here often gets bogged
down in minutiae like which averaging function to use (it doesn't
matter much) or which lattice distance to use (ditto), while
the big questions go untouched. I'm far from guiltless in this.

What would it take to calculate the entropy of a 7-note chord?
I should think we'll need:

* A space in which 7-ads can be embedded.
* A way to partition the space into cells associated with
each chord.
* A generalization of the gaussian distribution to said space.
* A way to integrate under said distribution between cell
boundaries.
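One way to get all four ingredients at once, at least approximately:
draw samples from the Gaussian centered on the heard chord and assign
each sample to the nearest chord point -- nearest-point assignment is
exactly integration of the Gaussian over Voronoi cells. A Python sketch
for triads (everything here is my assumption, including the a*b*c < 1000
chord list and independent s = 17 cents per axis); 7-ads would be the
same with 6-dimensional points:

```python
# Monte Carlo sketch: Gaussian samples assigned to the nearest chord
# point estimate the integral of the Gaussian over that chord's
# Voronoi cell.  Triad case shown; a 7-ad would use 6-dim points.
# Assumed, not from the posts: a*b*c < 1000 chord list, s = 17 cents.
import random
from math import gcd, log

def cents(x):
    return 1200.0 * log(x, 2)

def triad_points(limit=1000):
    """Each chord a:b:c (within an octave, reduced) as a point
    (cents of b/a, cents of c/a)."""
    pts = []
    a = 1
    while a ** 3 < limit:           # c > b > a, so a*b*c > a^3
        for b in range(a + 1, 2 * a + 1):
            for c in range(b + 1, 2 * a + 1):
                if a * b * c < limit and gcd(gcd(a, b), c) == 1:
                    pts.append((cents(b / a), cents(c / a)))
        a += 1
    return pts

def triadic_entropy(heard, chords, s=17.0, n=5000, seed=0):
    """heard = (cents of middle voice, cents of top voice) above root."""
    rng = random.Random(seed)
    counts = [0] * len(chords)
    for _ in range(n):
        x = rng.gauss(heard[0], s)
        y = rng.gauss(heard[1], s)
        j = min(range(len(chords)),   # nearest chord point = Voronoi cell
                key=lambda k: (chords[k][0] - x) ** 2
                            + (chords[k][1] - y) ** 2)
        counts[j] += 1
    ps = [c / n for c in counts if c]
    return -sum(p * log(p) for p in ps)
```

The entropy should dip at 4:5:6 (386, 702 cents) relative to a nearby
neutral-ish triad such as (350, 650 cents).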

-Carl


🔗Carl Lumma <ekin@lumma.org>

6/27/2007 8:14:23 AM

For triads, Paul tried a triangular plot of dyads partitioned
by Voronoi cells. But after the n*d product worked so well
to estimate the mediant-to-mediant widths for dyads, I believe
he abandoned Voronoi cells in favor of a*b*c for triads.

-Carl
