javascript - Node large data processing


I have 500 MongoDB databases. I need to connect to each database, check for records in a collection based on a condition, and collect the matching records.

One approach is to use a loop and process the databases one at a time:

for (var i = 0; i < databases.length; i++) {
    // connect to the database
    // run the find query
    // add the records to an array
}
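Fleshed out with the official mongodb driver, the loop is roughly the following sketch (the databaseUris list, the 'items' collection, and the { status: 'active' } condition are placeholders for my real setup):

const { MongoClient } = require('mongodb');

async function collectAll(databaseUris) {
    const results = [];
    for (const uri of databaseUris) {
        const client = new MongoClient(uri);   // one client per database
        try {
            await client.connect();            // connect to this database
            const docs = await client.db()     // db name comes from the URI
                .collection('items')           // placeholder collection
                .find({ status: 'active' })    // placeholder condition
                .toArray();
            results.push(...docs);             // add the records to the array
        } finally {
            await client.close();              // release the connection either way
        }
    }
    return results;   // every database is awaited one after another
}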

It works fine but takes a long time. Is there another, faster way to process this in an optimized fashion?

At a bare minimum, run more than one query at the same time.

Take a look at this:

http://www.sebastianseilund.com/nodejs-async-in-practice

I need to iterate over a collection, perform an asynchronous task for each item, let x tasks run at the same time, and when they're all done, do something else.

So your code becomes something like:

var results = [];

async.forEachLimit(databases, 5, function (database, callback) {
    // connect to the database
    // run the find query
    // add the records to the array: results.push(result)
    callback();   // tell async this item is done
}, function (err) {
    // handle any error here; all databases have been processed
});

This runs 5 concurrently; I'm not sure what the maximum number of outbound connections should be.
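Here is a sketch of how that could look end to end, combining async.forEachLimit with the mongodb driver (the databaseUris list, the 'items' collection, and the { status: 'active' } condition are placeholder assumptions, same as in the question):

var async = require('async');
var { MongoClient } = require('mongodb');

var results = [];

// query a single database and resolve with its matching records
async function queryOne(uri) {
    var client = new MongoClient(uri);   // placeholder: one URI per database
    try {
        await client.connect();
        return await client.db().collection('items').find({ status: 'active' }).toArray();
    } finally {
        await client.close();   // close even if the query fails
    }
}

async.forEachLimit(databaseUris, 5, function (uri, done) {
    queryOne(uri)
        .then(function (docs) {
            results = results.concat(docs);   // Node is single-threaded, so this is safe
            done();
        })
        .catch(done);                         // report the failure to async
}, function (err) {
    if (err) return console.error('one of the databases failed:', err);
    console.log('collected', results.length, 'records');
});

Raise or lower the limit of 5 until you find out what your MongoDB servers and network will tolerate.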

