I am an AI engineer who works with both TensorFlow and PyTorch as deep learning libraries. One of the tools AI programmers rely on constantly is the learning-rate scheduler. In TensorFlow, "ReduceLROnPlateau" lives under "tf.keras.callbacks" and reduces the learning rate when a monitored metric stops improving, but as a callback it can only be used when training through the "fit" method. In PyTorch, by contrast, "ReduceLROnPlateau" is part of "torch.optim.lr_scheduler", which is more general because it can be driven from a custom training loop; unfortunately, the TensorFlow version is tied to "fit".
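To make the contrast concrete, here is a minimal sketch of PyTorch's "ReduceLROnPlateau" driven from a custom loop. The tiny model and the constant (plateauing) validation loss are illustrative assumptions, not real training:

```python
import torch

# Minimal sketch, assuming a trivial model and a synthetic
# validation loss that has plateaued.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=2
)

for epoch in range(6):
    # ... forward pass, loss.backward(), optimizer.step() would go here ...
    val_loss = 1.0  # pretend the validation loss stopped improving
    scheduler.step(val_loss)  # we feed the metric in ourselves

final_lr = optimizer.param_groups[0]["lr"]
```

Because we call "scheduler.step(val_loss)" explicitly, the scheduler fits any training loop; after "patience" epochs without improvement the learning rate is halved (here, from 0.1 to 0.05). Nothing equivalent is possible with the TensorFlow callback outside of "fit".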
It would be better to expose this functionality under "tf.keras.optimizers.schedules", alongside the other schedules such as "tf.keras.optimizers.schedules.ExponentialDecay", because those schedules can be attached directly to an optimizer and therefore also work in custom training loops. ReduceLROnPlateau is too useful a scheduler during training to be limited to "fit".