How do you find two different unbiased estimators of mu^2?
I know that (x-bar)^2 is a biased estimator of mu^2.
So far, I've come up with two methods:
#1. using (E(x-bar))^2
#2. using the squared sampling error. Let lower-case q denote x-bar
and capital Q denote mu.
Since E(q) = Q, the bias is
bias = (E(q) - Q)^2 = (Q - Q)^2 = 0^2 = 0.
I don't know whether what I've come up with is correct; please guide
me. And if both are correct, how do I prove that one is more
efficient than the other?
Thank you very much to those who will reply.
Angel
> How do you find two different unbiased estimators of mu^2?
You seek an unbiased estimator of a PRODUCT of population moments ...
* Since the population mean = the first cumulant, we have:
mu = k1
Hence, mu^2 = k1*k1 = a product of cumulants
* An unbiased estimator of a product of cumulants is
given by the polykays (generalised k-statistics).
Using mathStatica, the polykay unbiased estimator is:
In[1]:= PolyK[{1, 1}]

Out[1]= (s1*s1 - s2) / (n(n-1))
where s1 = Sum[X_i, {i,1,n}]
s2 = Sum[(X_i)^2, {i,1,n}]
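As a quick sanity check (a Python sketch, not mathStatica; the two-point population is just an illustrative assumption), one can enumerate every equally likely sample of size n = 3 and confirm that the polykay averages to exactly mu^2:

```python
from fractions import Fraction
from itertools import product

# Toy two-point population: X = 0 or 1, each with probability 1/2,
# so mu = 1/2 and mu^2 = 1/4.
population = [Fraction(0), Fraction(1)]
n = 3

def polykay(sample):
    # The polykay estimator of mu^2: (s1*s1 - s2) / (n(n-1))
    s1 = sum(sample)                 # s1 = Sum of X_i
    s2 = sum(x * x for x in sample)  # s2 = Sum of (X_i)^2
    return (s1 * s1 - s2) / (n * (n - 1))

# Average the estimator over all 2^3 equally likely samples: the
# result is exactly mu^2, i.e. the estimator is exactly unbiased.
samples = list(product(population, repeat=n))
expected_value = sum(polykay(s) for s in samples) / len(samples)
print(expected_value)  # 1/4
```

Exact rational arithmetic (Fraction) is used so the expectation comes out as exactly 1/4 rather than a rounded float.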
Cheers
Colin
Dr Colin Rose
mathStatica.com
_______________
One estimator comes from the old-fashioned method of moments. From
E(xbar^2) = mu^2 + (sig^2)/N and
E(s^2) = sig^2,
an unbiased estimator of mu^2 is
t = xbar^2 - (s^2)/N
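Incidentally, Jack's t is the same statistic as the polykay quoted earlier in the thread. A small Python sketch (with an arbitrary made-up sample) checks the algebraic identity xbar^2 - s^2/N = (s1*s1 - s2)/(N(N-1)):

```python
from fractions import Fraction

# An arbitrary sample, in exact rational arithmetic.
sample = [Fraction(v) for v in (2, 5, 7, 7, 11)]
N = len(sample)

# Method-of-moments form: t = xbar^2 - s^2/N
xbar = sum(sample) / N
s_sq = sum((x - xbar) ** 2 for x in sample) / (N - 1)  # unbiased s^2
t = xbar ** 2 - s_sq / N

# Polykay form: (s1*s1 - s2) / (N(N-1))
s1 = sum(sample)
s2 = sum(x ** 2 for x in sample)
polykay = (s1 * s1 - s2) / (N * (N - 1))

print(t == polykay)  # True: the two forms are algebraically identical
```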
Jack
Why do you want to find even one unbiased estimator of mu^2?
Whatever is an unbiased estimator of mu^2 will be a biased
estimator of mu.
Why do we even want an unbiased estimator for mu, or for
any parameter, for that matter?
-- Bob.
I suppose that once you get down to the final
step of whatever you have, it is easy to opt
for 'minimum variance' instead of unbiased.
But, y'know, a lot of times we take a mean
and use it for computing something else,
and it can get nasty if those biases accumulate....
--
Rich Ulrich, wpi...@pitt.edu
http://www.pitt.edu/~wpilib/index.html
Not really.
>
> But, y'know, a lot of times we take a mean
> and use it for computing something else,
> and it can get nasty if those biases accumulate....
Ummmm... are you in a COMPUTING department or in a
STATISTICS related department? The notion of unbiasedness
has to do with the expected value of the estimator.
Did you know that the sample standard deviation s is
a BIASED estimator of the population standard deviation
sigma? s-squared is the unbiased estimate of the
population variance sigma-squared.
Just another silliness in the statistical criterion of
unbiasedness.
Do you know ANYONE who ever uses an unbiased estimate for
the population standard deviation sigma?
>
> --
> Rich Ulrich, wpi...@pitt.edu
> http://www.pitt.edu/~wpilib/index.html
-- Bob.
Your comment is quite understandable given your affiliation.
See my reply to Richard Ulrich on the notion of "unbiasedness"
in statistics.
-- Bob.
>
> Richard Ulrich wrote:
> > On 7 Feb 2005 17:31:15 -0800, Large_Nass...@Yahoo.com wrote:
> > [ ... ]
[ ... ]
>
> Did you know that the sample standard deviation s is
> a BIASED estimator of the population standard deviation
> sigma? s-squared is the unbiased estimate of the
> population variance sigma-squared.
>
[ ... ]
>
> Do you know ANYONE who ever uses an unbiased estimate for
> the population standard deviation sigma?
This seems to incidentally demonstrate the good
sense of my post.
Do you know ANYONE who does not choose to combine
sigmas, instead of sigma-squareds?
As I stated, unbiased isn't so important, once you get to
the final step of your conclusions. But you really don't
want bias for the numbers you combine further.
It seems you are evading my question.
>
> Did you know ANYONE who does not choose to combine
> sigmas, instead of sigma-squareds?
As in a pooled standard deviation? In the commonly used form,
the pooling is done on the VARIANCES, and the pooled standard
deviation is still biased.
So what's your point about the good sense of your post?
> --
> Rich Ulrich, wpi...@pitt.edu
> http://www.pitt.edu/~wpilib/index.html
-- Bob.
[...]
> >
> > Did you know ANYONE who does not choose to combine
> > sigmas, instead of sigma-squareds?
>
> As in a pooled standard deviation? In the commonly used form,
> the pooling is done on the VARIANCES, and the pooled standard
> deviation is still biased.
>
> So what's your point about the good sense of your post?
Oh. Sorry, I assumed you would immediately recognize
that the variances were unbiased. And that unbiasedness
was a main reason to combine variances, instead of
combining the deviations.
>Do you know ANYONE who ever uses an unbiased estimate for
>the population standard deviation sigma?
I actually do know someone who uses an unbiased estimate of sigma on
occasion, namely myself. When I need an unbiased estimate of the p-th
quantile of a normal distribution, particularly for small sample
sizes, I use
xbar + (z_p)*(s*a_N)
where a_N = sqrt((N-1)/2)*Gamma((N-1)/2)/Gamma(N/2).
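For what it's worth, this a_N is the reciprocal of the usual c4 constant from normal sampling theory, so s*a_N is unbiased for sigma. A short Python sketch (assuming the form a_N = sqrt((N-1)/2)*Gamma((N-1)/2)/Gamma(N/2)) shows the size of the correction:

```python
import math

def a(N):
    # Correction factor making s * a_N unbiased for sigma under normal
    # sampling (assumed form): sqrt((N-1)/2) * Gamma((N-1)/2) / Gamma(N/2)
    return math.sqrt((N - 1) / 2) * math.gamma((N - 1) / 2) / math.gamma(N / 2)

print(a(2))   # sqrt(pi/2) ~ 1.2533: s underestimates sigma badly at N = 2
print(a(30))  # only slightly above 1: the correction fades as N grows
```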
Jack
You missed my original point -- that the unbiasedness criterion
is plain SILLY, especially when one needs and uses the sample
standard deviation s, and NOT the sample variance s-squared.
s-squared is unbiased for sigma-squared, but s is biased for
sigma!
> And that unbiasedness
> was a main reason to combine variances, instead of
> combining the deviations.
Sorry, that's not correct.
May I suggest that you go back and review the theory of
statistical estimation and the various criteria that are
commonly used, including unbiasedness. One of the
properties unbiasedness lacks is the property of
INVARIANCE under a non-linear transformation.
You can never have an estimator t that is an unbiased estimate
of a population parameter theta, say, and also have f(t) be an
unbiased estimate of f(theta) when f(.) is a non-linear function.
s and s-squared are just one common example: people often insist
on unbiasedness, but NEVER use an unbiased estimate of the
quantity they actually use; that is, they always use the BIASED
estimate s for the standard deviation sigma.
Hope this clarified the issue.
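The s versus s-squared example can be checked exhaustively. A Python sketch on a toy two-point population (an assumption for illustration) shows E[s^2] hitting sigma^2 exactly while E[s] falls short of sigma:

```python
import math
from itertools import product

# Toy population: X = 0 or 1 with probability 1/2 each,
# so sigma^2 = 1/4 and sigma = 1/2.
population = [0, 1]
n = 2

def s_squared(sample):
    xbar = sum(sample) / n
    return sum((x - xbar) ** 2 for x in sample) / (n - 1)

# Average s^2 and s over all 2^2 equally likely samples.
samples = list(product(population, repeat=n))
mean_s2 = sum(s_squared(s) for s in samples) / len(samples)
mean_s = sum(math.sqrt(s_squared(s)) for s in samples) / len(samples)

print(mean_s2)  # 0.25: E[s^2] = sigma^2, so s^2 is unbiased
print(mean_s)   # ~0.354 < 0.5 = sigma: s is biased downward
```

This is exactly the non-invariance under the non-linear map f(x) = sqrt(x): unbiasedness of s^2 does not carry over to s.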
>
> --
> Rich Ulrich, wpi...@pitt.edu
> http://www.pitt.edu/~wpilib/index.html
-- Bob.
Congratulations! You must be one of the handful of people in the
world who even know that s*a_N is an/the unbiased estimate of
sigma. :-) Your estimator looks complicated enough that I
assume it is the correct one. I had seen it only once, years ago,
in the American Statistician.
-- Bob.
On 8 Feb 2005 11:21:01 -0800, Large_Nass...@Yahoo.com wrote:
>
> Richard Ulrich wrote:
> > On 8 Feb 2005 07:27:59 -0800, Large_Nass...@Yahoo.com wrote:
> >
[...]
>
> s-squared is unbiased for sigma-squared, but s is biased for
> sigma!
- exactly as I said.
>
me >
> > And that unbiasedness
> > was a main reason to combine variances, instead of
> > combining the deviations.
[ ... ]
- well, okay, probably not a 'main' reason in this instance.
Various reasons tie in together.
--
Rich Ulrich, wpi...@pitt.edu
http://www.pitt.edu/~wpilib/index.html