GPU memory error at inference time, TF 2.12


giuseppe...@gmail.com

Apr 19, 2024, 11:31:19 AM
to Keras-users
Hi,

I have trained a model, but I cannot use model.predict because I always get a GPU out-of-memory error (ResourceExhaustedError), even when I use the same batch size I used for training:

preds = model.predict(test.forms, batch_size=16)

Other values of batch_size do not work either, but if I limit the number of forms (for example, test.forms[0:100]), it works. Any idea how to solve this problem?
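Since predicting on a 100-form slice works, one workaround (a sketch, not a confirmed fix for this model) is to split the full input into fixed-size chunks, call predict on each chunk, and concatenate the results, so peak GPU memory stays bounded regardless of the total number of forms. Here `predict_fn` is a hypothetical stand-in for `model.predict` and the list of integers stands in for `test.forms`:

```python
def predict_in_chunks(predict_fn, inputs, chunk_size=100):
    """Run predict_fn over `inputs` in fixed-size chunks and
    concatenate the per-chunk results into one list."""
    preds = []
    for start in range(0, len(inputs), chunk_size):
        # Only chunk_size items are in flight at a time,
        # which bounds the memory needed per predict call.
        preds.extend(predict_fn(inputs[start:start + chunk_size]))
    return preds

# Stand-in for model.predict: doubles each input value.
dummy_predict = lambda batch: [x * 2 for x in batch]

data = list(range(250))
preds = predict_in_chunks(dummy_predict, data, chunk_size=100)
```

With the real model this would be something like `predict_in_chunks(lambda xs: model.predict(xs, batch_size=16), test.forms)`. Note that `batch_size` in `model.predict` only controls how many samples go through the network per step; the full input and the full output array still have to fit in memory at once, which may be why reducing `batch_size` alone does not help.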