You can only use `hp` from within the model-building function (the function you pass to the tuner), so arguments of `fit()` such as the batch size are out of its reach by default.
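For context, `hp` is the object that the tuner passes to your model-building function; something like this (a minimal sketch, with arbitrary layer sizes and hyperparameter names):

```
from tensorflow import keras


def build_model(hp):
    # `hp` is only in scope here, so only model-level choices
    # (units, learning rate, ...) can be tuned this way.
    model = keras.Sequential([
        keras.layers.Dense(
            units=hp.Int('units', min_value=32, max_value=256, step=32),
            activation='relu'),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')
    return model
```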
In this case, since you want the batch size to be a hyperparameter, you need a custom tuner: subclass the `Tuner` class and override its `run_trial` method.
The new method would look like this (the part that differs from the default method is marked with a comment):
```
import copy

import keras_tuner as kt
# Note: in older versions the package is `kerastuner`; the private
# helpers used below may also differ slightly between releases.
from keras_tuner.engine import tuner_utils


class MyTuner(kt.Tuner):
    def run_trial(self, trial, *fit_args, **fit_kwargs):
        """Evaluates a set of hyperparameter values.

        This method is called during `search` to evaluate a set of
        hyperparameters.

        # Arguments:
            trial: A `Trial` instance that contains the information
                needed to run this trial. `Hyperparameters` can be
                accessed via `trial.hyperparameters`.
            *fit_args: Positional arguments passed by `search`.
            **fit_kwargs: Keyword arguments passed by `search`.
        """
        # Handle any callbacks passed to `fit`.
        copied_fit_kwargs = copy.copy(fit_kwargs)
        callbacks = copied_fit_kwargs.pop('callbacks', [])
        callbacks = self._deepcopy_callbacks(callbacks)
        self._configure_tensorboard_dir(callbacks, trial)
        callbacks.append(tuner_utils.TunerCallback(self, trial))
        copied_fit_kwargs['callbacks'] = callbacks

        # This is the part that differs from the default method:
        # register the batch size as a hyperparameter and forward
        # it to `fit` (the 32-256 range is just an example).
        copied_fit_kwargs['batch_size'] = trial.hyperparameters.Int(
            'batch_size', 32, 256, step=32)

        model = self.hypermodel.build(trial.hyperparameters)
        model.fit(*fit_args, **copied_fit_kwargs)
```
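You can then run the search with this tuner as usual. A minimal sketch, assuming the `MyTuner` class and `build_model` function above, `x_train`/`y_train` arrays, and a recent `keras_tuner` where the oracle lives at `kt.oracles.RandomSearchOracle` (the oracle choice and trial budget are just examples):

```
tuner = MyTuner(
    oracle=kt.oracles.RandomSearchOracle(
        objective='val_loss', max_trials=10),
    hypermodel=build_model,
    directory='my_dir',
    project_name='tune_batch_size')

tuner.search(x_train, y_train, epochs=5, validation_split=0.2)
```

Alternatively, you can subclass one of the concrete tuners (e.g. `kt.RandomSearch`) instead of the base `Tuner`; they inherit `run_trial` the same way, and you don't have to construct an oracle yourself.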