
HE algorithms

John Chalmers <JHCHALMERS@...>

7/7/2002 5:49:37 PM

I've had a few free days this week and wanted to program the harmonic
entropy algorithm(s) in TrueBasic. Alas, I couldn't find them in the
list archives; can someone post them or email them to me?

--John

manuel.op.de.coul@...

7/8/2002 9:07:38 AM

John,

I've posted my code to this list on 1 Feb. 2001.
For the error function I use an approximation; this is the code:

function Erf (Of_Number : Long_Float) return Long_Float is

   --  Rational approximation to the cumulative normal distribution
   --  (Abramowitz & Stegun 26.2.19).  Note that despite the name,
   --  this returns the normal CDF, not the mathematical error function.
   function Norm (Zin : in Long_Float) return Long_Float is
      P : Long_Float;
      Z : constant Long_Float := abs Zin;
   begin
      P := 1.0 + Z * (0.04986735 + Z * (0.02114101 + Z * (0.00327763 +
           Z * (0.0000380036 + Z * (0.0000488906 + Z * 0.000005383)))));
      P := P * P;  P := P * P;  P := P * P;  --  P now holds P**8
      if Zin >= 0.0 then
         return 1.0 - 0.5 / (P * P);        --  1 - 0.5 * P**(-16)
      else
         return 0.5 / (P * P);
      end if;
   end Norm;

begin
   if Of_Number >= 5.0 then
      return 1.0;
   elsif Of_Number <= -5.0 then
      return 0.0;
   else
      return Norm (Of_Number);
   end if;
end Erf;
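Here is the same approximation sketched in Python (my translation; the coefficients are the Abramowitz & Stegun 26.2.19 polynomial, which the Ada constants truncate slightly -- and note again that what is computed is the cumulative normal distribution, not the mathematical erf):

```python
import math

# Abramowitz & Stegun 26.2.19 polynomial coefficients (d1..d6).
_D = (0.0498673470, 0.0211410061, 0.0032776263,
      0.0000380036, 0.0000488906, 0.0000053830)

def norm_cdf(x):
    """Approximate standard normal CDF, accurate to about 1.5e-7."""
    if x >= 5.0:
        return 1.0
    if x <= -5.0:
        return 0.0
    z = abs(x)
    p = 0.0
    for d in reversed(_D):       # Horner evaluation of d1*z + ... + d6*z**6
        p = (p + d) * z
    p = (1.0 + p) ** 16          # the Ada code squares P four times in all
    return 1.0 - 0.5 / p if x >= 0.0 else 0.5 / p
```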

Manuel

emotionaljourney22 <paul@...>

7/8/2002 4:14:29 PM

--- In harmonic_entropy@y..., manuel.op.de.coul@e... wrote:

> John,
>
> I've posted my code to this list on 1 Feb. 2001.

thanks manuel -- john, let us know if anything needs clarification. i've
also made a number of posts that sketch, conceptually, how harmonic
entropy is calculated -- those should be helpful.

certain things, though, have developed since the code manuel posted.

firstly, i don't remember when this issue was last discussed, but
approximating the error integral closely would rarely be
important -- a simple rectangular approximation, where the width is
the mediant-to-mediant distance and the height is measured at the
candidate ratio itself, should normally be fine.
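as a sketch (helper names are mine; everything is in log-frequency units, with the gaussian bell centered on the heard interval):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """normal density at x; all quantities in log-frequency units."""
    return (math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
            / (sigma * math.sqrt(2.0 * math.pi)))

def rect_mass(heard, ratio_pos, lo_mediant, hi_mediant, sigma):
    """rectangular approximation to one candidate ratio's probability:
    (mediant-to-mediant width) times (bell height at the ratio itself),
    instead of the exact difference of two error-function values."""
    return (hi_mediant - lo_mediant) * gaussian_pdf(ratio_pos, heard, sigma)

def exact_mass(heard, lo_mediant, hi_mediant, sigma):
    """exact mass via the error function, for comparison."""
    phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return (phi((hi_mediant - heard) / sigma)
            - phi((lo_mediant - heard) / sigma))
```

for realistic widths (small compared with sigma) the two agree to well under a percent.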

secondly, the farey limit has been almost entirely superseded by the
tenney limit (n*d), for which i've typically been using n*d < 65536
or n*d < 10000 (there's rather little difference between the two).
the reason is that the farey series, and other series that were tried
(all of which have superparticular "steps" [actually ratios between
adjacent interval sizes]), lead to a harmonic entropy curve with an
overall "trend". i observed, however, that regardless of the series
used and the trend observed, the local minima had a "strength" or
prominence that agreed very closely with an n*d ranking. when i tried
a tenney series (all ratios with n*d below a certain limit), which
also has superparticular "steps", the same harmonic entropy curve
showed up but without the "trend" -- i.e., the overall "trend" was
flat.
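generating a tenney seed takes only a couple of loops -- a sketch (function name is mine), using the strict n*d < limit convention:

```python
from math import gcd

def tenney_seed(limit):
    """all ratios n/d in lowest terms with n*d < limit."""
    seed = []
    n = 1
    while n < limit:
        d = 1
        while n * d < limit:
            if gcd(n, d) == 1:      # keep only reduced fractions
                seed.append((n, d))
            d += 1
        n += 1
    return seed
```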

next, the 1/sqrt(n*d) approximation is often used for the "widths" in
conjunction with the tenney seeding above. the "widths"
are "traditionally" computed as mediant-to-mediant distances. i
observed (and it should be easy to prove) that, except for the most
complex ratios near the tenney limit, the mediant-to-mediant distance
surrounding each ratio n/d is approximately proportional to
1/sqrt(n*d). any tenney limit is an arbitrary and artificial stopping
point, and if low enough this arbitrary choice will create visible
artifacts on the curve. using the 1/sqrt(n*d) approximation tends to
minimize the effects of these artifacts. i've posted graphs that
directly compare the results of using this approximation vs. using
the actual mediant-to-mediant widths -- those graphs can be found in
the files section of this list and/or the tuning list.
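the observation is easy to check numerically -- a sketch (names are mine) that sorts a tenney series by size and measures the log-scale mediant-to-mediant widths:

```python
from math import gcd, log, sqrt

def mediant_widths(limit):
    """log-scale mediant-to-mediant width around each interior ratio
    of the tenney series n*d < limit, keyed by (n, d)."""
    ratios = sorted(((n, d) for n in range(1, limit)
                     for d in range(1, limit)
                     if n * d < limit and gcd(n, d) == 1),
                    key=lambda nd: nd[0] / nd[1])
    widths = {}
    for (a, b), (n, d), (c, e) in zip(ratios, ratios[1:], ratios[2:]):
        lo = log((a + n) / (b + d))   # mediant with the left neighbor
        hi = log((n + c) / (d + e))   # mediant with the right neighbor
        widths[(n, d)] = hi - lo
    return widths
```

for limit 1000, for instance, width times sqrt(n*d) should come out nearly the same for 3/2 and 8/5, even though the raw widths differ by almost a factor of three.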

(then there's the unreduced fraction variation, which i'll skip over
for now)

finally, the vos curve. i posted a summary of some of joos vos's
research to the tuning list -- basically he found that the "purity"
rating for harmonic intervals, even with sine waves, follows an
absolute exponential, rather than bell, curve (with the peak, of
course, at the just ratio). it's easy enough to incorporate this idea
into the harmonic entropy function by using a probability function of
the form exp(-|r|/s) instead of exp(-r^2/s). there may be some
justification for this in the neurological timing mechanisms for
pitch perception that have come to the forefront in the last 20
years. the resulting curves seem to be fractal-like; there appears to
be an infinitude of ever-finer local minima at more and more complex
ratios, but only big minima, corresponding as always to the simplest
ratios, are visible. the results have seemed to provide a much more
satisfying dissonance function for various listeners such as margo,
gene, and george.
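in code, the swap is a one-liner -- a sketch (names are mine; both kernels unnormalized, with x the deviation from the just ratio in log-frequency units):

```python
import math

def gauss_kernel(x, s):
    """the original bell-shaped weighting, exp(-x^2/s)."""
    return math.exp(-x ** 2 / s)

def vos_kernel(x, s):
    """the vos ("absolute exponential") weighting, exp(-|x|/s);
    unlike the bell, it is sharply peaked at the just ratio itself."""
    return math.exp(-abs(x) / s)
```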

here's the matlab code i'm using to calculate the latest curves,
which are tenney-seeded, 1/sqrt(n*d)-width-approximated, vos-curve
entropy functions:

function entropy=t2vosinp(cents,r,s)
% tenney-seeded, 1/sqrt(n*d)-width, vos-curve harmonic entropy.
% cents: the interval in cents;  r: seed array [width, num, den];
% s: hearing resolution (e.g. .01 for 1%)
in=cents/1200*log(2);                  % interval in natural-log units
a=size(r,1);                           % number of seed ratios
% unnormalized probability of each seed ratio: width times the
% vos-curve kernel exp(-|x|/s)
p=r(:,1).*exp(-abs((in-log(r(:,2)./r(:,3)))/s));
n=sum(p);
p=p/n;                                 % normalize to sum to 1
p(find(p==0))=ones(size(find(p==0)));  % so 0*log(0) contributes 0 (log(1)=0)
entropy=-sum(p.*log(p))';              % shannon entropy

the input arguments are cents (the independent variable -- the actual
interval in cents), r (the seed-ratio array), and s (the hearing
resolution, usually .01 for 1%). r has three
columns: "width", numerator, and denominator. for example, for the
latest graph i produced for george, which was seeded with tenney
limit n*d <= 10000, r was an array with 63868 rows. here are the
first 10 rows:

0.01 1 10000
0.0100005000375031 1 9999
0.010001000150025 1 9998
0.0100015003375844 1 9997
0.0100020006002001 1 9996
0.0100025009378908 1 9995
0.0100030013506754 1 9994
0.0100035018385725 1 9993
0.0100040024016011 1 9992
0.0100045030397799 1 9991

clear? the r array itself required only a few lines of code to
generate. so altogether, not much is required in the way of coding,
and the entire harmonic entropy function could probably be written as
a single-line expression if you have fancy enough symbology for
taking sums over a pair of indices restricted to be mutually prime,
etc. . . .
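for anyone without matlab, here's a python sketch of the whole computation -- seed generation with the 1/sqrt(n*d) widths, plus the entropy function (names are mine; the seed uses the n*d <= limit convention of the graph above):

```python
from math import exp, gcd, log, sqrt

def make_seed(limit):
    """rows of (width, numerator, denominator) for all reduced ratios
    with n*d <= limit; width is the 1/sqrt(n*d) approximation."""
    return [(1.0 / sqrt(n * d), n, d)
            for n in range(1, limit + 1)
            for d in range(1, limit // n + 1)
            if gcd(n, d) == 1]

def harmonic_entropy(cents, seed, s=0.01):
    """vos-curve harmonic entropy (a translation of t2vosinp above)."""
    x = cents / 1200.0 * log(2.0)       # interval in natural-log units
    # unnormalized probability of each seed ratio: width times exp(-|x|/s)
    p = [w * exp(-abs(x - log(n / d)) / s) for w, n, d in seed]
    total = sum(p)
    # shannon entropy; terms with zero probability contribute nothing
    return -sum(q / total * log(q / total) for q in p if q > 0.0)
```

make_seed(10000) contains the same rows as the r array above (though not necessarily in the same order) -- e.g. the row for 1/9999 gets width 1/sqrt(9999) = 0.0100005000...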

happy computing!