How to parse huge log files and read them line by line in Node.js?

Sometimes, we want to parse huge log files and read them line by line in Node.js.

In this article, we’ll look at how to parse huge log files and read them line by line in Node.js.

How to parse huge log files and read them line by line in Node.js?

To parse huge log files and read them line by line in Node.js, we can use the event-stream package.

To install it, we run

npm install event-stream

Then we use it by writing

const fs = require('fs');
const es = require('event-stream');

let lineNr = 0;

// Log the current line number and resident memory so we can see that
// memory use stays flat while the large file is streamed.
const logMemoryUsage = (line) => {
  const mb = Math.round(process.memoryUsage().rss / 1024 / 1024);
  console.log(`line ${line}: ${mb} MB`);
};

const s = fs.createReadStream('very-large-file.csv')
  .pipe(es.split())
  .pipe(es.mapSync((line) => {
      // Pause reading while we work on the current line.
      s.pause();
      lineNr += 1;
      logMemoryUsage(lineNr);
      // Resume reading once this line has been handled.
      s.resume();
    })
    .on('error', (err) => {
      console.log('Error while reading file.', err);
    })
    .on('end', () => {
      console.log('Read entire file.');
    })
  );

to call fs.createReadStream with the path of the file we want to read.

Then we call pipe with es.split to split the file’s contents into lines.
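
To see what es.split does on its own, we can feed a small in-memory stream through it and log each chunk it emits. The sample text below is just for demonstration.

const { Readable } = require('stream');
const es = require('event-stream');

// es.split reassembles incoming chunks so that each chunk emitted
// downstream is one full line, even if a line spans two chunks.
Readable.from(['first line\nsecond line\nthird', ' line'])
  .pipe(es.split())
  .pipe(es.mapSync((line) => {
    console.log('got line:', line);
  }));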

Next, we call es.mapSync with a callback that calls s.pause to pause reading the file.

Then we run the code we want with the line that was read from the file.

Then we call s.resume to resume reading the file.
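
If the work we do for each line is asynchronous, pausing and resuming is what stops the file from being read faster than we can process it. Below is a minimal sketch of that pattern, where saveLineToDatabase is a hypothetical stand-in for whatever async work we do per line.

const fs = require('fs');
const es = require('event-stream');

// Hypothetical async work done for each line, e.g. a database write.
const saveLineToDatabase = (line) =>
  new Promise((resolve) => setTimeout(resolve, 10));

const stream = fs.createReadStream('very-large-file.csv')
  .pipe(es.split())
  .pipe(es.mapSync((line) => {
    // Stop reading until the async work for this line finishes.
    stream.pause();
    saveLineToDatabase(line).then(() => stream.resume());
  }))
  .on('end', () => console.log('Read entire file.'));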

Also, we call on with 'error' and 'end' to listen for those events.

The 'error' event is emitted when there’s an error reading the file.

The 'end' event is emitted when the whole file has been read.
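
One thing to keep in mind is that pipe doesn’t forward errors, so an 'error' handler attached to the downstream stream won’t see a failure from fs.createReadStream itself, such as a missing file. A minimal sketch that listens on both streams looks like this.

const fs = require('fs');
const es = require('event-stream');

const fileStream = fs.createReadStream('very-large-file.csv');

// Errors from opening or reading the file are emitted on fileStream,
// not on the streams it's piped into.
fileStream.on('error', (err) => {
  console.log('Error while reading file.', err);
});

fileStream
  .pipe(es.split())
  .pipe(es.mapSync((line) => {
    // process each line here
  }))
  .on('end', () => {
    console.log('Read entire file.');
  });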

Conclusion

To parse huge log files and read them line by line in Node.js, we can use the event-stream package.