RE: [harmonic_entropy] Digest Number 188

Bohlen, Heinz <heinz.bohlen@...>

2/3/2004 8:30:45 AM

This is a message for Paul.
Your answer to the "newbie question" made me look at your entropy graphs
once more, and two comments resulted from that.

1. It would be helpful if the graphs didn't just end at 2:1, but would
include 3:1. Not only because this would aid an assessment of the BP scale
with regard to harmonic entropy, but also because it might generally shed
some light on the perception of intervals that exceed the octave span.

2. The differentiation between intervals as presented in the graphs
appears a bit low, and the entropy for intervals generally accepted
as strikingly consonant a bit high, at least from my admittedly
subjective viewpoint. To explain what I mean: using Shannon's entropy
expression, the probability resulting from the Mann series for the
fifth (HE ~ 2.5) turns out to be only ~18%, and for the major sixth
(HE ~ 3) only ~12%, while the worst case is still as high as ~7%
(HE ~ 3.8). Or am I interpreting the graphs incorrectly?

Heinz

-----Original Message-----
From: harmonic_entropy@yahoogroups.com
[mailto:harmonic_entropy@yahoogroups.com]
Sent: Monday, February 02, 2004 1:17 PM
To: harmonic_entropy@yahoogroups.com
Subject: [harmonic_entropy] Digest Number 188

There are 2 messages in this issue.

Topics in this digest:

1. Newbie question: How is harmonic entropy computed?
From: Denix 13 <denix13@...>
2. Re: Newbie question: How is harmonic entropy computed?
From: "wallyesterpaulrus" <wallyesterpaulrus@...>

________________________________________________________________________
________________________________________________________________________

Message: 1
Date: Mon, 2 Feb 2004 02:41:21 +0100
From: Denix 13 <denix13@...>
Subject: Newbie question: How is harmonic entropy computed?

Hi there,

I am a new subscriber to the list so excuse me for my silly
preliminary questions.

1) I must be missing something, but I did not find any clear
indication of how harmonic entropy is computed. What I see is a lot
of nice figures and diagrams, but I am still looking for clues to how
they are obtained... I understand that it is directly related to
Shannon's entropy definition X*log(X), where X is a probability
density function, but I could not derive an algorithm by myself. If
available, could anyone give a pointer to a place where the
computations are explained more precisely? The best introductory page
I could find is the harmonic entropy page at Joe Monzo's site,
http://tonalsoft.com/enc/harmentr.htm , but it is lacking the maths I
am looking for.

2) Actually, I am interested in finding a way to measure the
consonance/dissonance of complex chords: not only dyads, triads, or
tetrads, but an arbitrary multi-component cluster. I understand that
in this case the decomposition into individual dyads does not lead to
a correct measure of harmonic entropy, as stated on the Harmonic
Entropy page at http://tonalsoft.com/td/erlich/entropy.htm . How did
you obtain the HE for triads and tetrads?

3) Last question: how closely is harmonic entropy related to
Barlow's harmonicity? Harmonicity seems less mathematically founded
than harmonic entropy, but it has a straightforward generalization to
complex chords, substituting LCM/GCD for the original p/q ratio.

Thanks.

________________________________________________________________________
________________________________________________________________________

Message: 2
Date: Mon, 02 Feb 2004 05:04:47 -0000
From: "wallyesterpaulrus" <wallyesterpaulrus@...>
Subject: Re: Newbie question: How is harmonic entropy computed?

--- In harmonic_entropy@yahoogroups.com, Denix 13 <denix13@w...>
wrote:

> Hi there,
>
> I am a new subscriber to the list so excuse me for my silly
> preliminary questions.

Not silly at all -- don't be shy!

> 1) I must be missing something, but I did not find any clear
> indication of how harmonic entropy is computed. What I see is a lot
> of nice figures and diagrams, but I am still looking for clues to
> how they are obtained... I understand that it is directly related
> to Shannon's entropy definition X*log(X), where X is a probability
> density function, but I could not derive an algorithm by myself. If
> available, could anyone give a pointer to a place where the
> computations are explained more precisely? The best introductory
> page I could find is the harmonic entropy page at Joe Monzo's site,
> http://tonalsoft.com/enc/harmentr.htm , but it is lacking the maths
> I am looking for.

It might have helped to also look at
http://tonalsoft.com/td/erlich/entropy.htm . . .

Then again, it might not.

The probability function you are looking for begins as a three-
parameter function, where the first parameter (call it j) is an index
into your big series of ratios (Farey series, or n*d < 10000 or
65536, or whatever), the second (call it i) is the interval actually
being heard, and the third (s) is the standard deviation, in cents,
of the Gaussian below.

We define p(j,i,s) as

  p(j,i,s) = 1/(s*sqrt(2*pi)) * integral from mediant(j-1,j)
             to mediant(j,j+1) of exp( -(t - cents(i))^2 / (2*s^2) ) dt

IMPORTANT: the units of t should be logarithmic, e.g., cents.

The function you're integrating is nearly constant over the
integration range, so it makes little difference if we replace this
with

  (mediant(j,j+1) - mediant(j-1,j)) * exp( -(cents(j)-cents(i))^2 / (2*s^2) )

and a constant of proportionality determined such that

  SUM over j of p(j,i,s) = 1

It apparently also makes little difference in the result if, for the
partitioning, we replace the mediants with means (which will allow us
to use Voronoi cells in the generalization to higher dimensions),
giving:

  1/2 * (cents(j+1) - cents(j-1)) * exp( -(cents(j)-cents(i))^2 / (2*s^2) )

In any case, the harmonic entropy HE(i,s) is then simply

  HE(i,s) = -SUM over j of p(j,i,s) * log(p(j,i,s))
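
Putting the pieces together, here is a minimal numerical sketch in
Python -- my own naming throughout, using the mean-based widths from
above, a series of reduced ratios with n*d below a bound, and s in
cents; it's a sketch, not a definitive implementation:

from math import exp, gcd, log

def cents(ratio):
    # frequency ratio -> logarithmic units (1200 cents per octave)
    return 1200.0 * log(ratio, 2)

def ratio_cents(max_product):
    # cents of every reduced ratio n/d >= 1/1 with n*d <= max_product,
    # sorted by size (the "big series of ratios" above)
    out = []
    d = 1
    while d * d <= max_product:
        for n in range(d, max_product // d + 1):
            if gcd(n, d) == 1:
                out.append(cents(n / d))
        d += 1
    out.sort()
    return out

def harmonic_entropy(i_cents, s, cj):
    # p(j) ~ cell width of ratio j times a Gaussian centered on the
    # heard interval; the 1/(s*sqrt(2*pi)) prefactor cancels when we
    # normalize so that SUM over j of p(j) = 1
    p = []
    for k in range(1, len(cj) - 1):
        width = 0.5 * (cj[k + 1] - cj[k - 1])   # means, not mediants
        p.append(width * exp(-(cj[k] - i_cents) ** 2 / (2.0 * s * s)))
    total = sum(p)
    # natural log here; the base only rescales HE
    return -sum(q / total * log(q / total) for q in p if q > 0.0)

cj = ratio_cents(10000)
print(harmonic_entropy(702.0, 17.0, cj))   # a just fifth, s about 1%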

> 2) Actually, I am interested in finding a way to measure the
> consonance/dissonance of complex chords: not only dyads, triads,
> or tetrads, but an arbitrary multi-component cluster. I understand
> that in this case the decomposition into individual dyads does not
> lead to a correct measure of harmonic entropy, as stated on the
> Harmonic Entropy page at
> http://tonalsoft.com/td/erlich/entropy.htm . How did you obtain the
> HE for triads and tetrads?

I haven't yet, but I've described how the calculation is to be done,
shown some partitionings for triads using Voronoi cells, and shown
how the sizes of these partitions follow the same dependence on the
product of the numbers in the three-term ratio as, in the dyadic
case, the widths between mediants follow on the product of the
numbers in the ordinary ratio. So some features of triadic and
tetradic harmonic entropy are already known with a high degree of
certainty.
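
If you want to play with this yourself, here's a rough sketch (Python
with numpy/scipy; the planar layout is one natural choice, not
necessarily the one in my files) that places the three-term ratios in
a plane and lets you compare bounded Voronoi cell areas against the
product a*b*c:

import numpy as np
from scipy.spatial import Voronoi

def triad_points(max_product):
    # one point per reduced chord a:b:c (a <= b <= c, a*b*c <= bound),
    # placed at (cents of b/a, cents of c/b)
    pts, prods = [], []
    a = 1
    while a ** 3 <= max_product:
        for b in range(a, max_product // a + 1):
            for c in range(b, max_product // (a * b) + 1):
                if np.gcd.reduce([a, b, c]) == 1:
                    pts.append([1200 * np.log2(b / a),
                                1200 * np.log2(c / b)])
                    prods.append(a * b * c)
        a += 1
    return np.array(pts), np.array(prods)

def cell_area(vor, k):
    # shoelace area of point k's Voronoi cell; None if unbounded
    region = vor.regions[vor.point_region[k]]
    if -1 in region or len(region) == 0:
        return None
    x, y = vor.vertices[region, 0], vor.vertices[region, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

pts, prods = triad_points(1000)
vor = Voronoi(pts)
shown = 0
for k in np.argsort(prods):
    area = cell_area(vor, k)
    if area is not None:
        print(prods[k], area)   # areas tend to shrink as a*b*c grows
        shown += 1
    if shown == 5:
        break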

> 3) Last question: how closely is harmonic entropy related to
> Barlow's harmonicity? Harmonicity seems less mathematically founded
> than harmonic entropy, but it has a straightforward generalization
> to complex chords, substituting LCM/GCD for the original p/q ratio.

For a two-note chord, LCM/GCD gives you p*q, not p/q, so I'm not sure
I understand this substitution. But the harmonic entropy calculations
and graphs seem to be saying that for any selection of not-too-
complex ratios or chords, the product of the numbers in the ratio (be
it two-term, three-term . . .) gives the correct dissonance ranking,
and in fact the log of the product gives the approximate relative
entropy. See the "files" folder of this group.
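
A quick check of the LCM/GCD identity, illustrative only (base-2 log
is my arbitrary choice):

from math import gcd, log2

def lcm(a, b):
    # least common multiple (math.lcm exists only from Python 3.9)
    return a * b // gcd(a, b)

# for p:q in lowest terms, GCD = 1 and LCM = p*q, so LCM/GCD = p*q:
for p, q in [(2, 1), (3, 2), (5, 4), (9, 8), (16, 15)]:
    print(p, q, lcm(p, q) // gcd(p, q), round(log2(p * q), 2))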

But that doesn't explain what you then *do* with the ratio in
Barlow's Harmonicity, which doesn't seem related to how we perceive
consonance, especially in that it assumes that simple ratios have
zero tolerance for mistuning, and can't even accept irrational
inputs! This substitution you mention, to the extent that it's valid,
really has little to do with the rest of Barlow's formulation,
correct?

Best,
Paul

________________________________________________________________________
________________________________________________________________________


wallyesterpaulrus <wallyesterpaulrus@...>

2/3/2004 1:07:19 PM

--- In harmonic_entropy@yahoogroups.com, "Bohlen, Heinz"
<heinz.bohlen@c...> wrote:
> This is a message for Paul.
> Your answer to the "newbie question" made me look at your entropy
> graphs once more, and two comments resulted from that.
>
> 1. It would be helpful if the graphs didn't just end at 2:1, but
> would include 3:1. Not only because this would aid an assessment of
> the BP scale with regard to harmonic entropy, but also because it
> might generally shed some light on the perception of intervals that
> exceed the octave span.

Most of the graphs (including the one on the homepage of this group)
go to 4:1, some go even further.

> 2. The differentiation between intervals as presented in the
> graphs appears a bit low, and the entropy for intervals generally
> accepted as strikingly consonant a bit high, at least from my
> admittedly subjective viewpoint.

Sure. You may prefer the version of harmonic entropy which includes
unreduced ratios, though then the weighting of the ratios is more
arbitrary since the concept of "width" is no longer applicable. It
seems that in the usual case, the simplest ratios get an entropy
proportional to log(n*d) above the entropy of 1:1, while if you
include unreduced ratios, it's proportional to n*d above the entropy
of 1:1.
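
If you want to compare the two variants, the only change to a
dyadic-HE implementation is whether the gcd filter is applied when
building the series -- a sketch, with my own naming:

from math import gcd

def ratio_pairs(max_product, reduced=True):
    # candidate ratios n/d (n >= d) with n*d under the bound; with
    # reduced=False, entries like 4/2 and 6/4 stay in as extra
    # candidates at the same cents values -- "width" no longer
    # applies, so each candidate would get an equal prior weight
    out = []
    d = 1
    while d * d <= max_product:
        for n in range(d, max_product // d + 1):
            if (not reduced) or gcd(n, d) == 1:
                out.append((n, d))
        d += 1
    return out

print(len(ratio_pairs(10000)), len(ratio_pairs(10000, reduced=False)))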

> To explain what I mean: using Shannon's entropy expression, the
> probability resulting from the Mann series for the fifth (HE ~ 2.5)
> turns out to be only ~18%, and for the major sixth (HE ~ 3) only
> ~12%,

How are you obtaining these probabilities? You have to assume an
actual heard interval in addition to a putative ratio . . . Maybe
you're misinterpreting Shannon's expression? It's a sum of
(-p*log(p)) over probabilities which themselves sum to 1, but it
cannot relate to a single probability.
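
A toy example of the distinction (the distributions are made up):

from math import log

def shannon(ps):
    # entropy of a whole distribution (ps must sum to 1), in bits
    return -sum(p * log(p, 2) for p in ps if p > 0)

# very different distributions can share one entropy value, so an HE
# figure like ~2.5 cannot be inverted to a single probability
print(shannon([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits
print(shannon([0.3, 0.3, 0.2, 0.1, 0.1]))   # ~2.17 bits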

wallyesterpaulrus <wallyesterpaulrus@...>

2/3/2004 10:49:40 PM

--- In harmonic_entropy@yahoogroups.com, "wallyesterpaulrus"
<wallyesterpaulrus@y...> wrote:

> --- In harmonic_entropy@yahoogroups.com, "Bohlen, Heinz"
> <heinz.bohlen@c...> wrote:
>
> > This is a message for Paul.
> > Your answer to the "newbie question" made me look at your entropy
> > graphs once more, and two comments resulted from that.
> >
> > 1. It would be helpful if the graphs didn't just end at 2:1, but
> > would include 3:1. Not only because this would aid an assessment
> > of the BP scale with regard to harmonic entropy, but also because
> > it might generally shed some light on the perception of intervals
> > that exceed the octave span.
>
> Most of the graphs (including the one on the homepage of this group)
> go to 4:1, some go even further.

This one goes nearly to 18:1 --

/harmonic_entropy/files/dyadic/heinz.gif