RE: [harmonic_entropy] Re: Eureka part one (actually, complexity measures)

Paul H. Erlich <PERLICH@...>

5/9/2001 2:59:49 PM

Robert Valentine wrote,

>take output into excel and ooo and ahhh when sensible
> results appear (like 204c being the 'best' B for
> the Ionian, followed by 194c)

Can I ask you where this is coming from? I always get something like 194¢ as
best.

>Okay, so one thing that is interesting for input from this
>list is 'step 2'. Paul has produced his lovely graphs, and
>from the documentation I downloaded, I couldn't figure out
>a hack that produced similar ones.

Why don't you reproduce the calculation itself, as Manuel has?
Or if you really want a hack, you can probably just superimpose a bunch of
bell curves.
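
For what it's worth, here's a rough sketch of that bell-curve hack in Python. The ratio list, Gaussian width, and complexity weighting are my own illustrative choices, not the actual harmonic entropy calculation:

```python
import math

# Illustrative only: superimpose inverted bell curves centered on simple
# ratios to mimic the dips of a harmonic-entropy-like curve.
RATIOS = [(1, 1), (6, 5), (5, 4), (4, 3), (3, 2), (8, 5), (5, 3), (2, 1)]

def cents(p, q):
    return 1200.0 * math.log2(p / q)

def pseudo_entropy(c, width=20.0):
    """Higher = more 'dissonant'. Each simple ratio carves a Gaussian dip,
    deeper for lower-complexity (smaller n*d) ratios."""
    total = 0.0
    for p, q in RATIOS:
        depth = 1.0 / math.log2(p * q + 1)   # ad hoc complexity weighting
        total -= depth * math.exp(-((c - cents(p, q)) ** 2) / (2 * width ** 2))
    return total
```

Summing the curves (rather than taking the minimum at each point) is what gives the graph its smooth shape between the dips.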

>That's okay, I'm an engineer, not a mathematician. If I can
>hack a solution that produces 'good enough' results (like
>the 204c and 194c mentioned above) then I have something I
>can feel good about investigating.

>So, firstly, I believe that to produce something more like
>Pauls HE graph I would sum at each point, rather than taking
>minimum,

Yup.

>while still using the existing technique to identify
>the identity of the RI at that point on the graph?

Unnecessary.

>Regarding the complexity calculation. I am currently just
>using the product, with a pinch of fudging to favor otonal
>relationships

Any dyad is just as much "otonal" as "utonal". I think you mean "rooted"
relationships, with duality _not_ assumed.

>In point of fact, leaving
>out the otonal 'correction' does not have much effect on the
>final results once all rotations are considered.

That's because 5:4 and 8:5 must represent the same dyad, correct?

>It may have
>a greater effect on the modal minimums, which are also
>interesting,

I don't think the "modal minima" according to the algorithm you've described
will be useful except in a style of music where the modal tonic occurs as a
drone, a melody is played against that, and no further simultaneities
between notes occur.

>Oh, an IMPORTANT point (probably THE most important) is the
>data culling.

I'm not really understanding this, but it all sounds overly complicated.

>The point here is that harmonic entropy mixes complexity of
>the intervals being heard and the accuracy that they are being
>produced at. Until I feel that I am mixing them in a sensible
>manner, I'll get more trustworthy results by separating them.

Oh, OK. I'm here to help you with the mixing. I don't think the two can be
separated, actually -- in harmonic entropy they're pretty much one and the
same.

>Something that came up on the other list, and is also something
>to consider here, is ways to make my algorithm produce results
>which are more 'lattice-like'. This may be an interesting area
>for harmonic entropy anyhow, as I believe that the 9/8 pocket
>SHOULD be deeper than the 8/7 or 10/9, although if this opens
>a debate that can be settled by creating this structure with a
>different name, I'm all for it.

I think you may be thinking as follows: 9/8 will occur more often than 8/7
or 10/9, because 9/8 is just two 3/2s. You're right. But I think this result
has to (and will) come out as a result of the kinds of scale investigations
you're doing. I don't believe it should be built in as a premise or feature
of the underlying harmonic entropy curve.

>So (engineer, not scientist) a complexity measure that I
>posed in the other list was to consider both the prime factors
>and the distance on the lattice (exponents).

I'd agree with this in principle.

>For instance,
>just the product of these would be

>81/64 = 3^4/2^6 => 3*4*2*6 = 144

Why are you multiplying 3*4 with 2*6 rather than adding them? A lattice
approach would seem to indicate addition.

I'd argue that (as you can see on some of the plots I did) harmonic entropy
seems to indicate that the complexity associated with each prime factor p is
log(p). So I'd perform the calculation as follows:

81/64 = 3^4/2^6 => log(3)*4 + log(2)*6.
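
As a sketch (the function names are mine, not from any particular package), that calculation for an arbitrary ratio might look like:

```python
import math

def prime_factors(n):
    """Return {prime: exponent} for a positive integer n."""
    factors, d = {}, 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

def log_complexity(num, den):
    """Sum of log(p) * exponent over the factorizations of num and den.
    This equals log(num * den), i.e. Tenney harmonic distance."""
    total = 0.0
    for n in (num, den):
        for p, e in prime_factors(n).items():
            total += math.log(p) * e
    return total

# 81/64 = 3^4 / 2^6  ->  log(3)*4 + log(2)*6 (about 8.55 in natural logs)
```

Note that the lattice picture is why the terms add rather than multiply: distances along the axes of a lattice accumulate by addition.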

>The case I had which this seemed to solve was that 19/15 at
>409c swallowed (for better or worse) the 81/64 at 408. Although
>pythagorean major scales came out as a minimum in the program,
>the fact that they report the third as 19/15 suggests a usage
>model that may be misleading.

Aha -- now here's something we can agree on! You see, you shouldn't expect
every interval in the scale to come out as a local minimum of harmonic
entropy. For example, when you found B=194¢ for the diatonic scale, what
ratio is that? Answer: it's not a ratio! It's a mean-tone.
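
To spell out the arithmetic (a quick sketch): the meantone whole tone splits a just major third 5/4 into two equal steps, landing between 10/9 and 9/8 without matching any simple ratio:

```python
import math

def cents(ratio):
    return 1200.0 * math.log2(ratio)

# The meantone whole tone is half of a just major third:
meantone_second = cents(5 / 4) / 2   # about 193.16c -- close to the 194c found
just_9_8 = cents(9 / 8)              # about 203.91c
just_10_9 = cents(10 / 9)            # about 182.40c
# 193.16c lies between 10/9 and 9/8 and is itself no simple ratio.
```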

Similarly, if you're finding Pythagorean diatonic scales with your program,
it's only because your fifths are so strong. You _are_ getting 9/8s and
81/64s, but this is _unrelated_ to what may lie near these intervals in the
harmonic entropy graph. These intervals are _resultants_, _by-products_ of
the strong fifths. So 9/8 and 81/64 end up as important scale intervals
because of harmonic entropy only indirectly . . . it's sort of an "emergent"
feature of harmonic
entropy. That's a good thing. But it would be a mistake to attempt to go
back to the original harmonic entropy model and try to favor these intervals
from the start. They're already (implicitly, rather than explicitly) favored
enough!

>3*2* (4*6)^(1/2) uses a more accurate distance metric (2 comes
>from the number of dimensions one is travelling) and

>3*2* (4+6) uses more of a Manhattan distance.

I don't get either of these. If the length of one step along each prime axis
is p, then the Manhattan distance for 81/64 would be 3*4 + 2*6 -- right?

But again, I feel strongly that log(p) rather than p should be the length of
one step . . . because of Tenney and because of harmonic entropy.

>I haven't addressed octave equivalence here, and that's
>important.

You might want to look over the discussion of octave-equivalent harmonic
entropy models and graphs.