Hi there,
I think the long wait time is somewhat expected given the number of hyperparameter combinations. Also, gradient boosting and random forests, which are ensemble methods themselves, are not particularly cheap to fit. I would recommend setting verbose to 2 (or maybe even 3; off the top of my head, I am not sure whether 3 is different from 2) to get at least some output and feedback on how the search is progressing.
Instead of grid search, maybe you also want to consider random search (http://scikit-learn.org/stable/modules/generated/sklearn.model_selection.RandomizedSearchCV.html).
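For example, here is a minimal sketch of how you could swap in RandomizedSearchCV. Since I haven't seen your exact code, the estimator, parameter ranges, and dataset below are just placeholders for illustration:

```python
# Hypothetical example -- the estimator, parameter ranges, and data
# are assumptions, since the original code/attachment isn't shown.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV

# Toy dataset standing in for the real one
X, y = make_classification(n_samples=500, n_features=20, random_state=1)

param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [2, 3, 4],
    "learning_rate": [0.01, 0.1, 0.2],
}

# n_iter caps the number of sampled parameter combinations, so the
# search finishes in predictable time (here 10 candidates x 3 folds
# = 30 fits, instead of all 27 grid combinations x folds).
# verbose=2 prints progress for each fit so you can see it working.
search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=1),
    param_distributions,
    n_iter=10,
    cv=3,
    verbose=2,
    n_jobs=-1,
    random_state=1,
)
search.fit(X, y)
print(search.best_params_)
```

The key knob is `n_iter`: it bounds the total number of fits regardless of how large the parameter space is, which is usually the difference between a search that finishes and one that seems to run forever.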
Best,
Sebastian
> On Jul 3, 2018, at 9:56 PM, devarsh raghnathbhai patel <
devarsh...@gmail.com> wrote:
>
> I am trying to run the attached code. I have tried training it for 4 hours on 8 cores and for 40 minutes on 32 cores, but the training did not complete in either case. It just keeps running. When I set verbose=1, it shows me how many tasks are left and the elapsed time, but I don't understand why it takes so long to run and still gives no output. Kindly help me out.
>