Sunday, September 15, 2013

closing mongodb connection in node.js while inserting lot of data

I am trying to write a program to parse IIS log data and insert it into
MongoDB. The files aren't that huge; each is around 6,000 lines. I'm trying
to convince my management that Node.js and MongoDB are better for this than
.NET and SQL Server :).
Have a look at the Node.js code below. The logic: I parse every line,
convert it into a JSON document, and save it to the DB. I am using the
MongoDB native driver.
Issue: the DB connection gets closed even before all lines are inserted
into the DB.
The log file has 6,000 lines, but the number of records in the DB is only
around 4,000. I understand this is Node.js's asynchronous nature; how can I
close the connection in a more deterministic way (after checking that all
lines got inserted)?
var MongoClient = require('mongodb').MongoClient;
var mongoServer = require('mongodb').Server;
var fs = require('fs');
var readline = require('readline');

var serverOptions = {
    'auto_reconnect': true,
    'poolSize': 5
};

var rd = readline.createInterface({
    input: fs.createReadStream('C:/logs/Advisor_Metrics/UI/P20VMADVSRUI01/u_ex130904.log'),
    output: process.stdout,
    terminal: false
});

var mongoClient = new MongoClient(new mongoServer('localhost', 27017, serverOptions));
var db = mongoClient.db('test');
var collection = db.collection('new_file_test');
var cntr = 0;

mongoClient.open(function (err, mongoClient) {
    console.log(err);
    if (mongoClient) {
        rd.on('line', function (line) {
            // Skip IIS header/comment lines, which start with '#'.
            if (line.indexOf('#') == -1) {
                var lineSplit = line.split(' ');
                var data = {
                    d: lineSplit[0],      // date
                    t: lineSplit[1],      // time
                    sip: lineSplit[2],    // server IP
                    met: lineSplit[3],    // method
                    uri: lineSplit[4],    // URI stem
                    cip: lineSplit[8],    // client IP
                    cua: lineSplit[9],    // client user agent
                    stat: lineSplit[10],  // status code
                    tt: lineSplit[13]     // time taken
                };
                collection.insert(data, function (err, docs) {
                    console.log('closing connection');
                    //db.close();
                });
            }
        });
    }
});

rd.on('close', function () {
    // Fires as soon as the file has been read -- NOT when the async
    // inserts have completed, so the connection closes too early.
    db.close();
});
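
For reference, here is a minimal sketch of the deterministic-close idea
(untested, and it assumes the same native-driver API as above): count the
inserts that are still in flight and close the connection only once the
stream has ended AND that counter is back at zero. parseLine() is a
hypothetical stand-in for the split-and-build-object logic in the code
above.

var pending = 0;        // inserts issued but not yet acknowledged
var streamDone = false; // set once the whole file has been read

function maybeClose() {
    if (streamDone && pending === 0) {
        db.close(); // every insert has completed by now
    }
}

rd.on('line', function (line) {
    if (line.indexOf('#') === -1) {
        pending++;
        collection.insert(parseLine(line), function (err, docs) {
            if (err) console.log(err);
            pending--;
            maybeClose(); // closes if this was the last outstanding insert
        });
    }
});

rd.on('close', function () {
    streamDone = true;
    maybeClose(); // covers the case where all inserts already finished
});

With this, db.close() runs after whichever of the two events (end of file,
last insert callback) happens second.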
Sol 1: One solution would be to parse the lines into JSON objects, collect
them in an array, and insert the array into MongoDB in one go. I wouldn't
like to do that, since it would mean parsing the entire huge log file into
memory! Any other solution?
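
One middle ground worth sketching (again untested, with the same
hypothetical parseLine() helper as above): instead of buffering the whole
file, buffer lines in small chunks and flush each chunk as a single insert.
Memory stays bounded while the number of round-trips drops sharply.

var BATCH_SIZE = 500; // tuning knob, picked arbitrarily here
var batch = [];
var pending = 0;
var streamDone = false;

function flush() {
    if (batch.length === 0) return;
    var docs = batch;
    batch = []; // start a fresh buffer before the async insert
    pending++;
    collection.insert(docs, function (err) {
        if (err) console.log(err);
        pending--;
        if (streamDone && pending === 0) db.close();
    });
}

rd.on('line', function (line) {
    if (line.indexOf('#') === -1) {
        batch.push(parseLine(line));
        if (batch.length >= BATCH_SIZE) flush();
    }
});

rd.on('close', function () {
    streamDone = true;
    flush(); // push out the final partial batch
    if (pending === 0) db.close(); // nothing buffered, nothing in flight
});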
