
Adaptive step size for 1 bit entropy calculation


pgm...@googlemail.com
Jul 30, 2006, 9:23:36 AM
I need to find the step size required to get 1 bit of entropy from a signal, in order to code it with 1 bit per sample.
This figure should clear up some questions:
http://www1.file-upload.net/30.07.06/etw564.png

What I did is start from a very large step size R and divide it by 2, calculating the entropy and checking whether it is under 1.001, until I pass the 1.001 mark. Then I take a very small step size L and multiply it by 2 until the entropy is greater than 0.999 bits. I created a while loop for each of the two step sizes, L and R. The problem is that I get two step sizes which give an entropy close to one, but not close enough. How would I adjust the step sizes further so that the entropy gets closer to 1? I'm perplexed.
I'm using this method of division by 2 because I was told to use it. It was explained to me up to the point where the values pass their bounds (1.001 and 0.999); after that I was left with a blank.
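
(For concreteness, here is a minimal MATLAB sketch of one common way to compute the entropy for a given step size: uniform quantization followed by a histogram-based Shannon entropy. The function name and details are illustrative, not part of the original question.)

function H = stepEntropy(x, delta)
% Entropy in bits per sample of the signal x after uniform quantization
% with step size delta (histogram-based estimate; illustrative sketch).
    q = round(x(:) / delta);             % quantizer indices
    [~, ~, bin] = unique(q);             % map each index to a histogram bin
    p = accumarray(bin, 1) / numel(q);   % empirical bin probabilities
    H = -sum(p .* log2(p));              % Shannon entropy; all p > 0 here
end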

If somebody has a better method or any suggestions, I would greatly
appreciate it. Thank you,

Paul

Roger Stafford
Jul 30, 2006, 3:43:39 PM
In article <1154265816....@b28g2000cwb.googlegroups.com>,
pgm...@googlemail.com wrote:

-----------------
I think you may have misunderstood the advice you received. It sounds
as though it was meant to be the following "binary" strategy.

Get the entropies for a large step, R, and a small step, L. Assuming
these straddle the desired value of 1, do the following:

1. Choose the midpoint step size S = (R+L)/2.
2. Get the entropy for S, E(S).
3. If E(S) < 1, set L equal to S, and otherwise set R equal to S.
4. Repeat steps 1, 2, and 3 until E(S) is sufficiently close to 1.

You can do all of this with a single 'while' loop using the appropriate
if-else-end construct inside. The entropies for R and L will continue to
straddle 1, but the distance between R and L will be halved at each trip
through the loop, so the process should converge very rapidly.

Roger Stafford

Wiggie
Jul 31, 2006, 5:50:46 AM

Thanks Roger, you were right; I totally misunderstood. It worked.
However, if E(S) < 1 then R needs to be set to S, otherwise S increases
and the entropy tends towards zero.

Cheers,

Paul

Roger Stafford
Jul 31, 2006, 11:02:52 AM
In article <1154339446.4...@i3g2000cwc.googlegroups.com>,
"Wiggie" <pgm...@googlemail.com> wrote:

> However, if E(S) < 1 then R needs to be set to S, otherwise S increases
> and the entropy tends towards zero.

> Paul
---------------
Yes, apparently I had that backwards. If you start with an initial E(R)
<= 1 and E(L) >= 1, step 3 should always be such as to retain these
inequalities.
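
Putting the bisection strategy and the corrected update direction together, a minimal MATLAB sketch might look as follows. It assumes the entropy decreases as the step size grows, and it reuses an illustrative histogram-based entropy estimate; the function names, tolerance, and iteration cap are assumptions, not from the thread.

function S = findUnitEntropyStep(x, L, R, tol, maxIter)
% Bisect between a small step L (entropy >= 1 bit) and a large step R
% (entropy <= 1 bit) until the entropy at the midpoint S is within tol
% of 1 bit per sample. Illustrative sketch only.
    for k = 1:maxIter
        S = (L + R) / 2;
        E = stepEntropy(x, S);
        if abs(E - 1) <= tol
            return                % close enough to 1 bit per sample
        elseif E < 1
            R = S;                % step too coarse: shrink from the right
        else
            L = S;                % step too fine: shrink from the left
        end
    end
end

function H = stepEntropy(x, delta)
% Bits per sample after uniform quantization with step size delta.
    q = round(x(:) / delta);
    [~, ~, bin] = unique(q);
    p = accumarray(bin, 1) / numel(q);
    H = -sum(p .* log2(p));
end

A call such as S = findUnitEntropyStep(x, eps, 4*max(abs(x)), 1e-3, 60) would be one way to start from a bracket that straddles 1 bit. Note that the empirical entropy is a step function of the step size, so with a very tight tolerance the loop may stop at the iteration cap rather than at the tolerance.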

Roger Stafford
