Comma-separated values, more commonly known as CSV, have long served as a standard text-based format for representing and transferring data. There are many ways to read and write CSV files in Node.js. In this post, we will learn how to read and write CSV files efficiently using Node.js. Let’s get rolling.
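As a quick taste of what the post covers, here is a minimal sketch that reads a CSV into rows and writes it back out, using only built-in Node.js modules. The file names `input.csv` and `output.csv` are placeholders, and the naive comma splitting does not handle quoted fields containing commas; that is where a dedicated CSV library or a streaming approach comes in.

```js
const fs = require('fs/promises');

// Read the whole file and split it into rows of columns.
// Naive parsing: quoted fields with embedded commas need a real CSV parser.
async function readCsv(filePath) {
  const text = await fs.readFile(filePath, 'utf8');
  return text
    .trim()
    .split(/\r?\n/)
    .map((row) => row.split(','));
}

// Serialize rows of columns back into CSV text and write it out.
async function writeCsv(filePath, rows) {
  const text = rows.map((row) => row.join(',')).join('\n') + '\n';
  await fs.writeFile(filePath, text, 'utf8');
}

// Usage with hypothetical file names.
(async () => {
  const rows = await readCsv('input.csv');
  console.log(rows[0]); // header row
  await writeCsv('output.csv', rows);
})().catch(console.error);
```

Note that this sketch loads the entire file into memory, which is fine for small files but not for the larger ones the streaming approaches in the post are meant for.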

If you have been building software for some years, your work has likely involved either making "some changes" to an existing project or starting a completely new greenfield project. Existing, a.k.a. brownfield, products already have active users, whereas new projects don't see that volume of users until they are fully in production. In this post, we will evaluate the differences in mindset software engineers need for a stable software product versus a new greenfield project. Let's get started!

There are multiple ways to read a file line by line with Node.js. Files can be read synchronously or asynchronously, and with the async approach it is possible to read large files without loading all of their content into memory.

Reading the whole file at once makes the process memory intensive. Reading a file line by line lets us stop processing at any point as needed. In this post, we will look into three ways to read a file line by line using Node.js, along with a memory usage comparison.
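As a preview, here is a minimal sketch of one such approach: the built-in readline module wrapped around a read stream, which feeds the file in chunks instead of loading it whole. The file name `big-file.txt` is a placeholder, and the post compares this approach with others.

```js
const fs = require('fs');
const readline = require('readline');

async function processLineByLine(filePath) {
  // createReadStream streams the file in chunks, so the whole file
  // is never held in memory at once.
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  let lineNo = 0;
  for await (const line of rl) {
    lineNo++;
    console.log(`${lineNo}: ${line}`);
    // We could break out of the loop here if a condition is met,
    // stopping the read without consuming the rest of the file.
  }
}

processLineByLine('big-file.txt').catch(console.error);
```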

Web scraping is the process of extracting data from a website in an automated way, and Node.js can be used for it. Even though other languages and frameworks are more popular for web scraping, Node.js can handle the job well too. In this post, we will learn how to do web scraping with Node.js, both for websites that don’t need JavaScript to load and for those that do. Let’s get started!
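As a preview of the no-JavaScript case, here is a minimal sketch assuming Node 18+ (for the global `fetch`) and the cheerio package for HTML parsing; the URL and selectors are placeholders, and pages that need JavaScript to render would call for a headless browser such as Puppeteer instead.

```js
// Assumes Node 18+ and the cheerio package (npm install cheerio).
const cheerio = require('cheerio');

async function scrapeHeadings(url) {
  // Fetch the raw HTML of a server-rendered (no client-side JS) page.
  const response = await fetch(url);
  const html = await response.text();

  // Load the static HTML and query it with CSS selectors, jQuery-style.
  const $ = cheerio.load(html);
  const headings = [];
  $('h1, h2').each((_, el) => {
    headings.push($(el).text().trim());
  });
  return headings;
}

scrapeHeadings('https://example.com')
  .then((headings) => console.log(headings))
  .catch(console.error);
```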

More posts can be found in the archive.

Stay Connected

Follow me on LinkedIn for new posts, engineering insights, and tech takes — straight from the trenches.
