It's a perfectly good question, but we don't have anything ready-made
specifically for the generalized extreme value (GEV) distribution, or
for similar extreme value analysis.
If the initial estimate on your data is good, then I would use it as
the starting value for the optimization on the bootstrap samples. This
starts the optimization in the correct neighborhood and fmin_bfgs
should work; in many cases fmin_l_bfgs_b is more stable even without
bounds.
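A minimal sketch of that idea (the data and parameter values here are made up for illustration; scipy's `genextreme.fit` accepts a positional shape guess plus `loc`/`scale` keywords as starting values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# stand-in for your observed data (true parameters are made up)
data = stats.genextreme.rvs(-0.1, loc=10.0, scale=2.0, size=200, random_state=rng)

# fit once on the full sample; this plays the role of the "good initial estimate"
c0, loc0, scale0 = stats.genextreme.fit(data)

# reuse that estimate as the starting point for each bootstrap refit
boot_params = []
for _ in range(100):
    sample = rng.choice(data, size=data.size, replace=True)
    # positional arg = shape start, loc=/scale= = location/scale starts
    boot_params.append(stats.genextreme.fit(sample, c0, loc=loc0, scale=scale0))

boot_params = np.asarray(boot_params)
```

The spread of `boot_params` across columns then gives bootstrap standard errors or percentile intervals for the three parameters.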
The longer answer:
I haven't looked at this in several years.
(I stopped working on distributions when, for a while, I didn't feel
like giving away all my code.)
So, if I remember correctly:
Estimating the parameters of the GEV by maximum likelihood can be
tricky. Many alternative estimators have been developed because MLE
had a bad reputation for this case. However, MLE is not so bad if
given good starting values, especially when all parameters are
estimated and loc is not fixed.
IIRC, I got incorrect convergence when the sign of the starting values
was wrong, but relatively good convergence when the sign was right.
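As an illustrative sketch (my own example, not from the original post): writing out the GEV negative log-likelihood explicitly and minimizing it with L-BFGS-B from a starting value whose shape parameter has the correct sign, on made-up data:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# synthetic data; true parameters are made up for the example
data = stats.genextreme.rvs(-0.2, loc=0.0, scale=1.0, size=500, random_state=rng)

def nll(params, x):
    """Negative log-likelihood of the GEV in scipy's parameterization."""
    c, loc, scale = params
    if scale <= 0:
        return 1e10  # large finite penalty instead of inf for L-BFGS-B
    logpdf = stats.genextreme.logpdf(x, c, loc=loc, scale=scale)
    if not np.all(np.isfinite(logpdf)):
        return 1e10  # some observation fell outside the support
    return -logpdf.sum()

# starting values with the correct sign of the shape parameter
start = np.array([-0.1, data.mean(), data.std()])
res = optimize.minimize(nll, start, args=(data,), method="L-BFGS-B")
```

Starting with the wrong sign of the shape can put part of the sample outside the support, where the likelihood is flat at the penalty value, which is one way the optimizer ends up converging to the wrong place.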
Based on a quick look at the documentation of two R packages: `evd`
starts with moment estimators in a reparameterized model, while
`extRemes` seems to start with L-moments.
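I don't know the exact recipe those packages use, but one common moment-based start is to pretend the data are Gumbel (shape zero), for which the mean and variance have closed forms; a sketch on made-up data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# made-up data for illustration
data = stats.genextreme.rvs(-0.1, loc=5.0, scale=2.0, size=300, random_state=rng)

# Gumbel (shape = 0) moment estimators as crude GEV starting values:
# mean = loc + gamma * scale,  var = (pi * scale)**2 / 6
euler_gamma = 0.57721566490153286
scale0 = np.sqrt(6.0) * data.std() / np.pi
loc0 = data.mean() - euler_gamma * scale0

# hand these to scipy's MLE; a shape start of 0.0 corresponds to the Gumbel
c_hat, loc_hat, scale_hat = stats.genextreme.fit(data, 0.0, loc=loc0, scale=scale0)
```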
Scipy has basin-hopping as a global optimizer, which could work quite
well, but I never tried it for this case.
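A hedged sketch of what trying `basinhopping` on the GEV likelihood might look like (the objective, data, and settings here are my own choices for illustration, not anything from scipy's extreme value code):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
# synthetic data with made-up parameters
data = stats.genextreme.rvs(-0.15, loc=0.0, scale=1.0, size=400, random_state=rng)

def nll(params):
    # negative GEV log-likelihood; a large penalty keeps the search finite
    c, loc, scale = params
    if scale <= 0:
        return 1e10
    ll = stats.genextreme.logpdf(data, c, loc=loc, scale=scale)
    if not np.all(np.isfinite(ll)):
        return 1e10
    return -ll.sum()

# basin-hopping: repeated local minimizations from randomly perturbed starts
result = optimize.basinhopping(
    nll,
    x0=[0.0, data.mean(), data.std()],
    niter=25,
    minimizer_kwargs={"method": "L-BFGS-B"},
)
```

Because each hop restarts a local minimization from a perturbed point, this is less dependent on a single good starting value, at the cost of many more likelihood evaluations.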
(I have maximum spacing, GMM based on quantiles or the cdf, and
minimum distance based on characteristic functions in my trial code;
parts of it are in the sandbox, but those are not in an easily
digestible, clean form, and I don't think they will work better than
MLE with reasonable starting values.)
Josef