Need help with out-of-memory error when using fs.readFile on a large log file


Anirban Bhattacharya

Mar 17, 2015, 9:42:13 AM
to nod...@googlegroups.com, Anirban Bhattacharya
Hello All Again,

I know this issue has been discussed multiple times in multiple places by many people, but I haven't quite been able to follow all the responses. I am posting the question here hoping to get the same positive help I got from Aria and others last time.
What I am planning to build is a log watcher that keeps displaying Apache's HTTP error log file on a page. I used socket.io to send the updated file content whenever there is a change in the file.

The problem is that whenever I call fs.readFile on the log file (which is really big), I get an out-of-memory error. I know it's the buffer that is blowing up.

What is the best way of doing this? At the least, is there a way to get just the last 100 lines each time?

Thanks,

Trevor Norris

Mar 17, 2015, 11:47:54 PM
to nod...@googlegroups.com
You'll have to give additional information about usage. Best case would be something that reproduces the issue. I've read files gigabytes in size without problems.

wankdanker

Mar 19, 2015, 10:59:53 AM
to nod...@googlegroups.com, anirbanbhat...@yahoo.co.in
If you are reading really large files, it is best to use a readable stream via `fs.createReadStream(path[, options])`. `fs.readFile()` loads the whole file into memory before calling its callback.

Anirban Bhattacharya

Mar 21, 2015, 10:35:01 AM
to nod...@googlegroups.com
Thanks,
I did use a read stream, and I think I am good now.

I do wonder, though: is there a maximum buffer size that fs.readFile can consume?