
Essentials of Node.js

Part-7 Node.js Buffers and Streams

What is a Buffer?
In computer science and electronics terminology, a buffer is a region of memory used to store data temporarily. Buffers have traditionally been placed between devices with mismatched speeds so that each can keep operating at its own pace without loss of data. In Node.js, Buffers are used when dealing with file streams or TCP streams, which carry octets of binary data.
Buffer operations in Node.js
Buffer is a global module, so we don't need to use the require function in order to use Buffer functionality. The following code snippet shows how to initialize a Buffer in Node.js, along with some other manipulation functions.

/*
	@name:        HandlingBuffers.js
	@description: A simple program to understand different Buffer operations in Node.js.
	@author:      @rishabhio
*/

// Create a zero-filled buffer of 256 bytes
// (Buffer.alloc replaces the deprecated `new Buffer(size)` constructor)
var databuff = Buffer.alloc(256);

// write some data to the newly created buffer
var data = 'Me';
var wlen = databuff.write(data);
console.log(wlen);      // the number of bytes written is returned by write()
console.log(databuff);  // Buffer stores raw data, displayed in hexadecimal form

// reading data from our buffer
var result = databuff.toString('ascii', 0, wlen);  // read back only the bytes we actually wrote, as 'ascii'
console.log(result);

Execution of the above program:

you@yourpc ~
$ node HandlingBuffers.js
2
<Buffer 4d 65 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... >
Me
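
Beyond write() and toString(), Buffers offer a few more common manipulation helpers. Here is a minimal sketch using the current Buffer.from and Buffer.concat API (the string values are just examples):

```javascript
// Buffer.from builds a buffer from existing data (the modern, safe
// replacement for the deprecated `new Buffer(string)` constructor)
var greet = Buffer.from('Hello');
var name = Buffer.from(' Node');

// Buffer.concat joins several buffers into a single new buffer
var joined = Buffer.concat([greet, name]);
console.log(joined.toString()); // Hello Node

// slice returns a view over the same underlying memory (no copy is made)
var part = joined.slice(0, 5);
console.log(part.toString()); // Hello

// length is measured in bytes, not characters
console.log(joined.length); // 10
```

Note that slice does not copy: writing into `part` would also change `joined`, since both share the same underlying memory.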

What are Streams?
Streams are channels on which data can be sent or from which data can be received. Technically this means we can have readable streams, writable streams, or duplex streams which can be used for both reading and writing. In some cases we can pipe the data of one stream into another. Here we'll look at simple examples of readable and writable streams.
How do Streams work in Node.js?
Streams in Node.js emit different events at different points in their life cycle. To get value out of those events, we write event handler functions which get executed whenever the stream emits the corresponding event. Common events for readable streams are close, data, end, error, and readable. Similarly, events for writable streams are close, drain, error, finish, pipe, and unpipe.
The following code snippet reads and writes file streams, and shows how and when the different events are used.

/*
	@name:        HandlingStreams.js
	@description: A simple program to understand readable and writable streams.
	@author:      @rishabhio
*/

// require the fs module for file-system access
var fs = require('fs');

// write data to a file using a writable stream
var wdata = "I am working with streams for the first time";

var myWriteStream = fs.createWriteStream('aboutMe.txt');

// write data
myWriteStream.write(wdata);

// done writing
myWriteStream.end();

// handler for the error event
myWriteStream.on('error', function(err) {
    console.log('write error: ' + err.message);
});

// the finish event fires once all data has been flushed to the file
myWriteStream.on('finish', function() {
    console.log("data written successfully using streams.");
    console.log("Now trying to read the same file using read streams ");
    var myReadStream = fs.createReadStream('aboutMe.txt');
    // add handlers for our read stream
    var rContents = ''; // to hold the read contents
    myReadStream.on('data', function(chunk) {
        rContents += chunk;
    });
    myReadStream.on('error', function(err) {
        console.log('read error: ' + err.message);
    });
    // the end event fires when there is no more data to read
    myReadStream.on('end', function() {
        console.log('read: ' + rContents);
    });
    console.log('performed write and read using streams');
});


you@yourpc ~
$ node HandlingStreams.js
data written successfully using streams.
Now trying to read the same file using read streams
performed write and read using streams
read: I am working with streams for the first time
What we take ahead
We learned about buffers and streams, which will prove very useful when we go on to explore the net module and the http module of Node.js.

developed & nourished by rishabh.io