Inquiry on batch_size in keras-tuner


Sun Li

Feb 2, 2021, 9:43:39 PM2/2/21
to keras...@googlegroups.com, j...@tamu.edu
Good day!

I am studying at the University of Johannesburg in South Africa.

Thank you very much for providing keras-tuner!

When I apply keras-tuner to train my model, I don't know how to set 'batch_size' for the model.

So far I have not set batch_size, and it seems to be chosen automatically. Could you please help me with how to read the batch_size of the optimised trial?

Best Regards
SunLi

François Chollet

Feb 3, 2021, 1:23:49 AM2/3/21
to Sun Li, keras...@googlegroups.com, j...@tamu.edu
You can only use `hp` from within the hypermodel building function (the function you pass to the tuner).

In this case, since you want the batch size to be a hyperparameter, you should create a custom tuner that does this. You can achieve this by subclassing the Tuner class and overriding the `run_trial` method.

The new method would look like this (the part that differs from the default method is where `batch_size` is registered as a hyperparameter and passed to `fit`):

```
    def run_trial(self, trial, *fit_args, **fit_kwargs):
        """Evaluates a set of hyperparameter values.

        This method is called during `search` to evaluate a set of
        hyperparameters.

        # Arguments:
            trial: A `Trial` instance that contains the information
              needed to run this trial. `Hyperparameters` can be accessed
              via `trial.hyperparameters`.
            *fit_args: Positional arguments passed by `search`.
            **fit_kwargs: Keyword arguments passed by `search`.
        """
        # Handle any callbacks passed to `fit`, working on a copy so the
        # original kwargs are left untouched.
        copied_fit_kwargs = copy.copy(fit_kwargs)
        callbacks = copied_fit_kwargs.pop('callbacks', [])
        callbacks = self._deepcopy_callbacks(callbacks)
        self._configure_tensorboard_dir(callbacks, trial)
        callbacks.append(tuner_utils.TunerCallback(self, trial))
        copied_fit_kwargs['callbacks'] = callbacks

        # Build the model for this trial and register `batch_size` as a
        # tunable hyperparameter, so `fit` uses the value chosen per trial.
        model = self.hypermodel.build(trial.hyperparameters)
        copied_fit_kwargs['batch_size'] = trial.hyperparameters.Int(
            'batch_size', 10, 40, step=4)
        model.fit(*fit_args, **copied_fit_kwargs)
```
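
Once the custom tuner is in place, the batch size chosen for each trial is stored with the trial's other hyperparameters, so it can be read back after the search. Here is a minimal usage sketch (assuming a `RandomSearch`-based tuner on keras-tuner ~1.0; `BatchSizeTuner`, `build_model`, `x_train`, `y_train`, `x_val` and `y_val` are placeholder names):

```
from kerastuner.tuners import RandomSearch  # the package is `keras_tuner` in newer releases

class BatchSizeTuner(RandomSearch):
    def run_trial(self, trial, *fit_args, **fit_kwargs):
        ...  # same body as the overridden `run_trial` shown above

tuner = BatchSizeTuner(
    build_model,                  # the hypermodel building function
    objective='val_loss',
    max_trials=20,
    directory='tuner_dir',
    project_name='batch_size_search')

tuner.search(x_train, y_train, epochs=10,
             validation_data=(x_val, y_val))

# Read back the batch_size used by the best trial.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]
print(best_hp.get('batch_size'))
```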


Sun Li

Feb 3, 2021, 4:55:16 AM2/3/21
to François Chollet, keras...@googlegroups.com, j...@tamu.edu
Good day!

Thanks for your quick response!

If I still use the hypermodel building function as-is, is the default value of batch_size 32? If not, how can I get the value of batch_size that was selected/trained?

Best Regards
SunLi

From: François Chollet <francois...@gmail.com>
Sent: February 3, 2021, 14:23
To: Sun Li <SunLi...@outlook.com>
Cc: keras...@googlegroups.com <keras...@googlegroups.com>; j...@tamu.edu <j...@tamu.edu>
Subject: Re: Inquiry on batch_size in keras-tuner
 

Sun Li

Feb 3, 2021, 10:17:57 PM2/3/21
to François Chollet, keras...@googlegroups.com, j...@tamu.edu
Good day!

Thanks for your help!

How can I use the model to predict directly?



Best Regards
SunLi


From: Sun Li <SunLi...@outlook.com>
Sent: February 3, 2021, 17:55
To: François Chollet <francois...@gmail.com>
Subject: Re: Inquiry on batch_size in keras-tuner
 

Lance Norskog

Feb 4, 2021, 12:41:52 AM2/4/21
to Sun Li, François Chollet, keras...@googlegroups.com, j...@tamu.edu
Here is a basic example of training a model and using model.predict().
You should understand the parts of this before trying to use the Hypertuner.
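
A minimal sketch of such an example, using randomly generated data purely for illustration:

```
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Random data, just for illustration.
x_train = np.random.rand(100, 8)
y_train = np.random.rand(100, 1)

# A small fully connected model.
model = keras.Sequential([
    layers.Dense(16, activation='relu', input_shape=(8,)),
    layers.Dense(1)])
model.compile(optimizer='adam', loss='mse')

# Train, then predict on new inputs.
model.fit(x_train, y_train, epochs=5, batch_size=32)

x_new = np.random.rand(10, 8)
predictions = model.predict(x_new)
print(predictions.shape)  # (10, 1)
```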




--
Lance Norskog
lance....@gmail.com
Redwood City, CA