How to clean up files and data accumulated from running Jupyter notebooks?


Jahnavi

May 18, 2021, 12:54:54 AM
to Discuss

Hello all. I don't know if this is the right place to post this.

I am a beginner in ML and I'm trying to do a Fake News Classification project on my local machine. My laptop has 8 GB of RAM. However, when I tried to perform lemmatization and convert the text to a DataFrame, I got a MemoryError saying it cannot allocate 13 GB of memory.

I tried reducing the number of samples in my dataset, but I still got the same error. My disk usage also increased by more than 20 GB right after running the notebook, and my laptop has suddenly slowed down (it's actually pretty new).

Kindly help me with this issue. How can I clear those large files from my laptop?

RAADy's

May 18, 2021, 10:38:09 AM
to Jahnavi, Discuss

How are you training the data? I hope you are using a CPU machine and not a GPU one.

The amount of data you are working with exceeds your memory. With a dataset that large you cannot load everything at once, especially when memory is limited (even with 8 GB, run "htop" in the terminal and check how much memory is actually free; that is what really matters). You have to process and train in batches (batch processing).
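
For example, here is a minimal sketch of that kind of batch processing with pandas, assuming the news text sits in a CSV file with a "text" column (the file name "fake_news.csv", the column name, and the chunk size of 5000 are only placeholders to adapt):

import pandas as pd
from nltk.stem import WordNetLemmatizer
from nltk.tokenize import word_tokenize

# Needs the NLTK data once: nltk.download("punkt"); nltk.download("wordnet")
lemmatizer = WordNetLemmatizer()

def lemmatize_doc(text):
    # Lemmatize one document at a time; only the current chunk of rows
    # is ever held in memory.
    return " ".join(lemmatizer.lemmatize(tok) for tok in word_tokenize(str(text)))

# chunksize makes read_csv return an iterator of small DataFrames instead of
# loading the whole file, so peak memory stays around one chunk.
for i, chunk in enumerate(pd.read_csv("fake_news.csv", chunksize=5000)):
    chunk["clean_text"] = chunk["text"].apply(lemmatize_doc)
    # Write each processed chunk straight back to disk rather than building
    # one giant DataFrame in RAM.
    chunk.to_csv("fake_news_clean.csv", mode="a", header=(i == 0), index=False)

The same idea extends to training: feed the model one chunk (batch) at a time instead of materialising the full dataset in memory.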

Thanks,
Dileep Kumar Appana, Ph.D
Data Scientist @ Arya A.I
Gachibowli, Hyderabad, India.

