
Poor man's ?tropy

Gene Ward Smith <gwsmith@svpal.org>

1/20/2004 11:34:31 AM

If G_s(x) = exp(-x^2/(2s^2))/(sqrt(2 pi) s) is the Gaussian
distribution with standard deviation s, and ?' is the
distribution/generalized function which is the derivative of the
Minkowski ? function (extended to all reals, not just [0,1]), and if "*"
as usual denotes the convolution product, then the ? harmonic entropy
with parameter s is defined as

Ent_s = ?' * G_s
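Everything here rests on being able to evaluate ? itself. As a minimal sketch (the name minkowski_q and the term cutoffs are illustrative, not from the original post), the standard continued-fraction formula ?([0; a1, a2, ...]) = sum_k (-1)^(k+1) 2^(1 - (a1+...+ak)), together with ?(x + n) = ?(x) + n for integer n, gives:

```python
from math import floor

def minkowski_q(x, max_terms=30):
    """Minkowski question-mark function ?(x), extended to all reals
    via ?(x + n) = ?(x) + n for integer n."""
    n = floor(x)
    x -= n                      # reduce to [0, 1)
    result, sign, power = 0.0, 1.0, 0
    for _ in range(max_terms):
        if x == 0:
            break
        a = floor(1 / x)        # next continued-fraction digit
        power += a
        if power > 60:          # remaining terms fall below double precision
            break
        result += sign * 2.0 ** (1 - power)
        sign = -sign
        x = 1 / x - a           # shift the continued fraction
    return n + result
```

For example, since 1/3 = [0; 3], this gives ?(1/3) = 2^(1-3) = 1/4, and since 2/5 = [0; 2, 2], ?(2/5) = 1/2 - 1/8 = 3/8.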

This is very nice, but the convolution product can be much faster to
compute if we use something other than G_s. In particular, we can use
the uniform distribution of width s, U_s(x) = 1/s when
-s/2 <= x <= s/2, and zero otherwise. We have

(?' * U_s)(x) = (1/s) Integral_{x-s/2...x+s/2} d? = (?(x+s/2) - ?(x-s/2))/s,

which is a central difference operator (in place of a derivative)
applied to ?(x) and easy to compute, since ? is easy to compute. If
you convolve U_s with itself, you get a triangular distribution, and
if we use ?' * U_s * U_s we get something closer to Gaussian
smoothing. If Q(x) is the integral of ?(x), this is like taking two
successive central difference operators on Q, so a question arises as
to how difficult it is to compute Q.
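The two smoothings above can be sketched generically for any distribution function F (here F would be ?, passed in as a callable; the helper names and the trapezoid-rule quadrature for Q are illustrative assumptions, not from the original post). Convolving F' with U_s is one central difference on F; convolving with U_s * U_s is a second central difference on the antiderivative Q, (Q(x+s) - 2Q(x) + Q(x-s))/s^2, which is why computing Q matters:

```python
def smooth_uniform(F, x, s):
    """(F' * U_s)(x) = (F(x + s/2) - F(x - s/2)) / s,
    a central difference on F in place of a derivative."""
    return (F(x + s/2) - F(x - s/2)) / s

def smooth_twice(F, x, s, n=200):
    """(F' * U_s * U_s)(x) = (Q(x+s) - 2 Q(x) + Q(x-s)) / s^2,
    where Q is any antiderivative of F (the base point cancels in
    the second difference).  Q is approximated by the trapezoid
    rule with n panels, as a stand-in for an exact formula."""
    def integral(a, b):  # integral of F from a to b, trapezoid rule
        h = (b - a) / n
        total = 0.5 * (F(a) + F(b)) + sum(F(a + k * h) for k in range(1, n))
        return total * h
    return (integral(x, x + s) - integral(x - s, x)) / s**2
```

As a sanity check on the kernels rather than on ? itself: for F(t) = t^2, F' = 2t is linear, so uniform smoothing reproduces it exactly, and for F(t) = t, F' = 1, so the triangular smoothing returns 1 everywhere.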