Hello all. Don't know if this is the right place to post this.
I am a beginner in ML and I'm working on a Fake News Classification project on my local machine, a laptop with 8GB of RAM. However, when I try to perform lemmatization and convert the text to a DataFrame, I get a MemoryError saying it cannot allocate 13GB of memory.
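In case it helps, here is a toy, stdlib-only sketch (with made-up example documents, not my actual data or code) of why turning sparse word counts into a dense table can need far more memory than the raw text itself: a dense table stores one cell for every document-word pair, even though almost all of them are zero.

```python
from collections import Counter

# Toy documents standing in for the (much larger) real dataset
docs = [
    "fake news spreads fast",
    "real news is verified",
    "fake claims spread",
]

# Vocabulary: every distinct word across all documents
vocab = sorted({word for doc in docs for word in doc.split()})

# Sparse representation: one Counter per document, storing only
# the words that actually occur in that document
sparse_rows = [Counter(doc.split()) for doc in docs]
sparse_cells = sum(len(counts) for counts in sparse_rows)

# Dense representation (what a DataFrame of counts holds): every
# document gets an entry for EVERY vocabulary word, mostly zeros
dense_rows = [[counts.get(word, 0) for word in vocab] for counts in sparse_rows]
dense_cells = len(docs) * len(vocab)

print(sparse_cells)  # → 11 nonzero entries stored sparsely
print(dense_cells)   # → 27 cells stored densely (3 docs x 9 words)
```

With a realistic corpus (tens of thousands of documents and vocabulary words) that dense grid can easily reach the 13GB the error reports, which is why reducing the sample count a little didn't help much.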
I tried reducing the number of samples in my dataset, but it still gave me the same error. My disk usage also increased by more than 20GB right after running the notebook, and my laptop (which is actually pretty new) has suddenly slowed down.
Kindly help me with this issue. How can I clear those large files from my laptop?
--
You received this message because you are subscribed to the Google Groups "Discuss" group.