Issue
I'm trying to understand how to maintain the mongo client in a Node application. My first thought was to create a client on every collection retrieval. Something like this:
const { MongoClient } = require('mongodb');

// url and databaseName are defined elsewhere in the application.
const getCollection = (collectionName) => {
  return MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true })
    .then((client) => {
      const database = client.db(databaseName);
      return database.collection(collectionName);
    })
    .catch((err) => {
      console.log(err);
    });
};
And then use the returned promise for queries. Like this:
const executeFind = (collectionName, query, projection, skip, limit) => {
  return getCollection(collectionName)
    .then((collection) => {
      return collection.find(query, { projection: projection })
        .skip(skip)
        .limit(limit)
        .toArray();
    })
    .catch((err) => {
      console.log(err);
    });
};
The problem with this approach is that the number of open connections to mongo increases rapidly when the application runs, resulting in problems with database operations and a lot of alerts.
Possible causes of the connection increase I considered (see the sketch after this list):
- Large pool size: I tried adding maxPoolSize=5 to the URL, and also passing poolSize: 5 in options (the second parameter of MongoClient's connect function). The number of connections still bursts.
- Missing connection close: I can't find the doc now, but I read somewhere that connections are managed by the client itself, so there is no need to think about close()ing them. In any case, I tried rewriting the code to close() the client after collection.find() returns the result, but then I get a "Cannot use a session that has ended" error.
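For reference, a minimal sketch of the two pool-size settings described above (the host in pooledUrl is hypothetical; poolSize is the option name in the 3.x driver):

// Via the connection string:
const pooledUrl = 'mongodb://localhost:27017/?maxPoolSize=5'; // hypothetical host
// Via the options object (second parameter of connect):
MongoClient.connect(url, { poolSize: 5, useNewUrlParser: true, useUnifiedTopology: true })
  .then((client) => { /* same usage as above */ });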
Other than these, I don't have any other ideas for maintaining the mongo client in a way that is efficient in terms of resource allocation. I'd like to hear answers to both:
1. What exactly can be done in this approach to avoid the increase in open connections?
2. What is the more general/optimal/best-practice way of maintaining the mongo client?
Solution
I can partially answer my question.
I was able to solve the connection increase problem with client.close(). The main problem seemed to be that the promises were missing await, so close() resulted in unexpected behavior: close() sometimes ran before the query was actually invoked, producing the "session already closed" error. A corrected sketch follows below.
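As a sketch of that corrected per-call pattern (executeFindOnce is a hypothetical name; url and databaseName as above), awaiting the query before calling close() avoids the ended-session error:

const executeFindOnce = async (collectionName, query, projection, skip, limit) => {
  const client = await MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true });
  try {
    const collection = client.db(databaseName).collection(collectionName);
    // Await the full result before closing; closing first ends the session mid-query.
    return await collection.find(query, { projection })
      .skip(skip)
      .limit(limit)
      .toArray();
  } finally {
    await client.close();
  }
};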
The problem with the above approach (opening and closing a connection on every call) is that it's quite slow.
Still looking for a general answer about optimally maintaining the client.
Update
I was finally able to find a solution that doesn't open an absurdly high number of connections and also performs well.
The trick is to declare and immediately invoke an async function in the data access layer (or wherever the database code lives) that sets a shared database object. Something like this:
let database;

// Connect once when the module loads; the driver maintains the pool internally.
(async () => {
  const client = await MongoClient.connect(url, { poolSize: 150, useNewUrlParser: true, useUnifiedTopology: true });
  database = client.db(databaseName);
})();
And then just reuse the database object everywhere, like this:
database.collection(collectionName).insertMany(documents);
Pooling and connection open/close are handled by the client. In terms of performance it's much faster, as expected. I'm not sure whether a global database object or an immediately invoked function in the DAO is best practice (though, since the file is imported with require, it shouldn't be invoked more than once), but it definitely does the trick.
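One caveat with the immediately invoked function above is that database can still be undefined if a query runs before the connection finishes. A common variant (a sketch with hypothetical names, not the code from the answer) caches the connection promise instead, so every caller awaits the same single connect():

// db.js (hypothetical module name)
const { MongoClient } = require('mongodb');

let clientPromise;

// Lazily start a single connect() and reuse its promise for all callers.
const getDatabase = () => {
  if (!clientPromise) {
    clientPromise = MongoClient.connect(url, { poolSize: 150, useNewUrlParser: true, useUnifiedTopology: true });
  }
  return clientPromise.then((client) => client.db(databaseName));
};

module.exports = { getDatabase };

Callers then do getDatabase().then((db) => ...) before each query, which is safe even during startup because every caller waits on the same connection promise.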