Newbie question: How is harmonic entropy computed?

Denix 13 <denix13@...>

2/1/2004 5:41:21 PM

Hi there,

I am a new subscriber to the list so excuse me for my silly
preliminary questions.

1) I must be missing something, but I did not find any clear
indication of how harmonic entropy is computed. What I see is a lot
of nice figures and diagrams but I am still looking for clues to how
they are obtained... I understand that it is directly related to
Shannon's entropy definition Xlog(X), where X is a probability density
function, but could not derive an algorithm by myself. If available,
could anyone give a pointer to a place where the computations are
explained more precisely? The best introductory site I could find
is the <a href="http://tonalsoft.com/enc/harmentr.htm">harmonic
entropy page at Joe Monzo's Site</a>, but this page is lacking the
maths I am looking for.

2) Actually I am interested in finding a way to measure the
consonance/dissonance of complex chords: not only for dyads, triads,
or tetrads, but for an arbitrary multi-component cluster. I understand
that in this case the decomposition into individual dyads does not
lead to a correct measure of harmonic entropy, as stated in the
page <a href="http://tonalsoft.com/td/erlich/entropy.htm"> on Harmonic
Entropy</a>. How did you obtain the HE for triads and tetrads?

3) Last question, how closely is harmonic entropy related to Barlow's
harmonicity? Harmonicity seems less mathematically founded than
harmonic entropy, but it has a straightforward generalization to
complex chords by substituting LCM/GCD for the original p/q ratio.

Thanks.

wallyesterpaulrus <wallyesterpaulrus@...>

2/1/2004 9:04:47 PM

--- In harmonic_entropy@yahoogroups.com, Denix 13 <denix13@w...>
wrote:

> Hi there,
>
> I am a new subscriber to the list so excuse me for my silly
> preliminary questions.

Not silly at all -- don't be shy!

> 1) I must be missing something, but I did not find any clear
> indication of how harmonic entropy is computed. What I see is a lot
> of nice figures and diagrams but I am still looking for clues to how
> they are obtained... I understand that it is directly related to
> Shannon's entropy definition Xlog(X), where X is a probability density
> function, but could not derive an algorithm by myself. If available,
> could anyone give a pointer to a place where the computations are
> explained more precisely? The best introductory site I could find
> is the <a href="http://tonalsoft.com/enc/harmentr.htm">harmonic
> entropy page at Joe Monzo's Site</a>, but this page is lacking the
> maths I am looking for.

It might have helped to also look at
http://tonalsoft.com/td/erlich/entropy.htm . . .

Then again, it might not.

The probability function you are looking for begins as a three-
parameter function p(j,i,s), where the first parameter (call it j) is
an index into your big series of ratios (Farey series, or n*d < 10000
or 65536, or whatever), the second (call it i) is the interval
actually being heard, and the third (s) is the hearing resolution --
the standard deviation, in cents, of the Gaussian below.

We define p(j,i,s) as 1/(s*sqrt(2*pi)) times the integral, over t from
cents(mediant(j-1,j)) to cents(mediant(j,j+1)), of

exp( -(t - cents(i))^2 / (2s^2) ) dt

IMPORTANT: the integration variable t is in logarithmic units, e.g. cents.
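In code, the Gaussian integral over one mediant-to-mediant slice has a
closed form via the error function. A minimal sketch (the function name,
and the assumption that the slice bounds and heard interval are already
expressed in cents, are mine):

```python
import math

def slice_probability(lo, hi, heard, s):
    """p(j,i,s) for one ratio j whose mediant-to-mediant cell spans
    [lo, hi]: the integral of a normal density with mean `heard` and
    standard deviation `s` over that slice.  All arguments in cents."""
    z = lambda x: (x - heard) / (s * math.sqrt(2.0))
    # Difference of Gaussian CDF values, written with erf.
    return 0.5 * (math.erf(z(hi)) - math.erf(z(lo)))
```

Since the cells tile the whole axis, summing slice_probability over a
complete partition gives 1 automatically, with no extra normalization.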

The function you're integrating is nearly constant over each slice, so
it makes little difference if we replace the integral with

(cents(mediant(j,j+1)) - cents(mediant(j-1,j))) * exp( -(cents(j)-cents(i))^2 / (2s^2) )

times a constant of proportionality determined such that

SUM (p(j,i,s)) = 1
 j

It apparently also makes little difference in the result if, for the
partitioning, we replace the mediants with means (which will allow us
to use Voronoi cells in the generalization to higher dimensions),
giving:

1/2*(cents(j+1)-cents(j-1)) * exp( -(cents(j)-cents(i))^2 / (2s^2) )

In any case, the harmonic entropy HE(i,s) is then simply

-SUM (p(j,i,s) log(p(j,i,s)))
  j
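Putting the mean-based approximation together, here is a minimal sketch
of the whole dyadic calculation in Python (the n*d limit, the s value,
and the function names are my own choices, not a canonical
implementation):

```python
import math

def cents(r):
    """Frequency ratio -> cents."""
    return 1200.0 * math.log2(r)

def harmonic_entropy(interval_cents, s=17.0, nd_max=10000):
    """Dyadic harmonic entropy of one heard interval (in cents).

    Uses the mean-based cell widths 1/2*(cents(j+1)-cents(j-1)) in
    place of the exact Gaussian integrals over mediant cells."""
    # Candidate ratios: all reduced n/d with n*d < nd_max, ascending.
    ratios = sorted(
        n / d
        for n in range(1, nd_max)
        for d in range(1, nd_max // n + 1)
        if n * d < nd_max and math.gcd(n, d) == 1
    )
    c = [cents(r) for r in ratios]
    # Unnormalized p(j,i,s): cell width times the Gaussian at cents(j).
    p = []
    for j in range(1, len(c) - 1):
        width = 0.5 * (c[j + 1] - c[j - 1])
        p.append(width * math.exp(-((c[j] - interval_cents) ** 2)
                                  / (2.0 * s * s)))
    total = sum(p)
    p = [x / total for x in p]  # constant of proportionality: SUM p = 1
    # Shannon entropy -SUM p log p.
    return -sum(x * math.log(x) for x in p if x > 0.0)
```

With these (assumed) settings, an interval near 3/2 should come out with
lower entropy than one in the complex region around 650 cents.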

> 2) Actually I am interested in finding a way to measure the
> consonance/dissonance of complex chords: not only for dyads, triads,
> or tetrads, but for an arbitrary multi-component cluster. I understand
> that in this case the decomposition into individual dyads does not
> lead to a correct measure of harmonic entropy, as stated in the
> page <a href="http://tonalsoft.com/td/erlich/entropy.htm"> on Harmonic
> Entropy</a>. How did you obtain the HE for triads and tetrads?

I haven't yet, but I've described how the calculation is to be done,
shown some partitionings for triads using Voronoi cells, and shown how
the sizes of these cells follow the same dependence on the product of
the numbers in the three-term ratio as, in the dyadic case, the mediant
widths follow on the product of the numbers in the ordinary ratio. So
some features of triadic and tetradic harmonic entropy are already
known with a high degree of certainty.
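The dyadic half of that statement is easy to check numerically: in an
n*d-limited series, the mediant-to-mediant cell around a ratio shrinks
as the product of its terms grows, so simpler ratios get wider cells. A
sketch (the limit and names are mine):

```python
from fractions import Fraction
import math

def mediant(a, b):
    """Freshman sum of two fractions: (a+c)/(b+d) for a/b and c/d."""
    return Fraction(a.numerator + b.numerator,
                    a.denominator + b.denominator)

# All reduced ratios n/d with n*d < limit, in ascending order.
limit = 2000
ratios = sorted(
    Fraction(n, d)
    for n in range(1, limit)
    for d in range(1, limit // n + 1)
    if n * d < limit and math.gcd(n, d) == 1
)

def cell_width_cents(r):
    """Width, in cents, of the mediant-to-mediant cell around ratio r."""
    j = ratios.index(r)
    lo = mediant(ratios[j - 1], ratios[j])
    hi = mediant(ratios[j], ratios[j + 1])
    return 1200.0 * math.log2(hi / lo)
```

For example, the cell around 3/2 (product 6) comes out wider than the
one around 5/4 (product 20), which in turn is wider than the one around
7/4 (product 28).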

> 3) Last question, how closely is harmonic entropy related to Barlow's
> harmonicity? Harmonicity seems less mathematically founded than
> harmonic entropy, but it has a straightforward generalization to
> complex chords by substituting LCM/GCD for the original p/q ratio.

For a two-note chord, LCM/GCD gives you p*q, not p/q, so I'm not sure
I understand this substitution. But the harmonic entropy calculations
and graphs seem to be saying that for any selection of not-too-
complex ratios or chords, the product of the numbers in the ratio (be
it two-term, three-term . . .) gives the correct dissonance ranking,
and in fact the log of the product gives the approximate relative
entropy. See the "files" folder of this group.
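As a quick check of that identity: for coprime p and q, lcm(p,q) = p*q
and gcd(p,q) = 1, so LCM/GCD reduces to the product. A tiny sketch
(function names mine):

```python
import math

def lcm(a, b):
    """Least common multiple via the gcd identity."""
    return a * b // math.gcd(a, b)

def lcm_over_gcd(p, q):
    """The message's LCM/GCD substitute for a dyad p:q."""
    return lcm(p, q) // math.gcd(p, q)
```

Note that a non-reduced ratio like 9:6 collapses to the same value as
3:2, since the common factor cancels between the LCM and the GCD.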

But that doesn't explain what you then *do* with the ratio in
Barlow's Harmonicity, which doesn't seem related to how we perceive
consonance, especially in that it assumes that simple ratios have
zero tolerance for mistuning, and can't even accept irrational
inputs! This substitution you mention, to the extent that it's valid,
really has little to do with the rest of Barlow's formulation,
correct?

Best,
Paul