What is the size of your entire dataset? If it exceeds 8GB, then you will have to split up your dataset into smaller files.
A possible solution is to train your model incrementally. Load the first input file, train on it, and extract the model weights. Then initialize a new model with those weights, train it on the next input file, and repeat this for every remaining file (see the sketch below).
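A rough sketch of what that loop could look like, assuming a Keras model and hypothetical chunk files named `chunk_*.npz` that each hold `X` and `y` arrays small enough to fit in memory (adapt the model and file handling to your own setup):

```python
import glob
import numpy as np
from tensorflow import keras

def build_model(input_dim):
    # Toy binary classifier; replace with your actual architecture.
    model = keras.Sequential([
        keras.layers.Input(shape=(input_dim,)),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model

chunk_files = sorted(glob.glob("chunk_*.npz"))  # hypothetical file names
weights = None

for path in chunk_files:
    data = np.load(path)
    X, y = data["X"], data["y"]

    model = build_model(X.shape[1])   # new model for this chunk
    if weights is not None:
        model.set_weights(weights)    # continue from the previous chunk's weights

    model.fit(X, y, epochs=3, batch_size=256, verbose=0)
    weights = model.get_weights()     # carry the learned weights forward
```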
A potential problem with this approach is that the overall model can end up overfitting to the last input file it saw. You can mitigate this by running multiple training passes, shuffling the input files into a different order on each pass. Afterwards you can either average the weights of the resulting models or ensemble their predictions (see the sketch below).
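Building on the sketch above, one way to run the shuffled passes and then combine the results could look like this; the number of passes (3 here), the file names, and `X_test` are all assumptions for illustration:

```python
import random

def train_pass(file_order, input_dim):
    """One full pass over the chunks in the given order; returns final weights."""
    weights = None
    for path in file_order:
        data = np.load(path)
        X, y = data["X"], data["y"]
        model = build_model(input_dim)
        if weights is not None:
            model.set_weights(weights)
        model.fit(X, y, epochs=3, batch_size=256, verbose=0)
        weights = model.get_weights()
    return weights

input_dim = np.load(chunk_files[0])["X"].shape[1]

all_weights = []
for _ in range(3):                      # arbitrary number of passes
    order = chunk_files[:]
    random.shuffle(order)               # different chunk order each pass
    all_weights.append(train_pass(order, input_dim))

# Option 1: average the weights layer by layer
avg_weights = [np.mean(layer_ws, axis=0) for layer_ws in zip(*all_weights)]
final_model = build_model(input_dim)
final_model.set_weights(avg_weights)

# Option 2: keep one model per pass and ensemble their predictions instead
# models = [build_model(input_dim) for _ in all_weights]
# for m, w in zip(models, all_weights):
#     m.set_weights(w)
# preds = np.mean([m.predict(X_test) for m in models], axis=0)  # X_test assumed
```

Note that naive weight averaging only makes sense when the models share the same architecture and training is stable enough that the weights stay roughly compatible; if that is not the case, ensembling the predictions is the safer option.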