Error: sorry, too many clients already


I am using PostgreSQL in my NodeJS backend service. All of a sudden, when I start the service, I get the error below:

 connection error error: sorry, too many clients already. 

PostgreSQL connection config:

    const pg = require("pg");

    const client = new pg.Client({
      host: "txslmxxxda6z",
      user: "mom",
      password: "mom",
      db: "mom",
      port: 5025
    });

I am unable to query the database because of this error, and I have not been able to fix it. Can you please suggest a solution?


4 Answers

The query below will help you see how many connections are currently in use, how many are reserved for superusers, and how many remain for normal users:

    select max_conn, used, res_for_super, max_conn - used - res_for_super as res_for_normal
    from (select count(*) used from pg_stat_activity) t1,
         (select setting::int res_for_super from pg_settings where name = $$superuser_reserved_connections$$) t2,
         (select setting::int max_conn from pg_settings where name = $$max_connections$$) t3;
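If you would rather run this check from Node.js than from psql, a minimal sketch using the same pg package as in the question could look like the following (the connection settings are placeholders, not your real config):

    // Diagnostic sketch: report max_connections vs. connections currently in use.
    const { Client } = require("pg");

    async function reportConnectionUsage() {
      // Placeholder connection settings - replace with your own.
      const client = new Client({ host: "localhost", user: "mom", password: "mom", database: "mom", port: 5025 });
      await client.connect();
      try {
        const { rows } = await client.query(`
          select max_conn, used, res_for_super, max_conn - used - res_for_super as res_for_normal
          from (select count(*) used from pg_stat_activity) t1,
               (select setting::int res_for_super from pg_settings where name = $$superuser_reserved_connections$$) t2,
               (select setting::int max_conn from pg_settings where name = $$max_connections$$) t3
        `);
        console.log(rows[0]); // e.g. { max_conn: 100, used: 97, res_for_super: 3, res_for_normal: 0 }
      } finally {
        await client.end(); // close this diagnostic connection
      }
    }

    reportConnectionUsage().catch(console.error);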

I experienced the same issue ("sorry, too many clients already. Request timeout") and managed to solve it. There are a few things you can try.

1) Increase the maximum number of connections and tune the other pool settings in Sequelize, for example:

    {
      max: 95,        // maximum number of connections the pool will open
      min: 0,         // minimum number of connections the pool keeps open
      acquire: 20000, // maximum time (ms) the pool will try to get a connection before throwing an error
      idle: 20000,    // maximum time (ms) a connection may sit idle before being released
      evict: 10000    // interval (ms) at which idle connections are checked and removed
    }
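For context, this pool object is passed to the Sequelize constructor; a minimal sketch might look like the following (the database name, user, password and host are placeholders, not the asker's values):

    const { Sequelize } = require("sequelize");

    // Placeholder credentials - substitute your own.
    const sequelize = new Sequelize("mydb", "myuser", "mypassword", {
      host: "localhost",
      dialect: "postgres",
      pool: {
        max: 95,        // same pool settings as above
        min: 0,
        acquire: 20000,
        idle: 20000,
        evict: 10000
      }
    });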

2) The second way is to use TimescaleDB: with complex queries on time-series data you can sometimes get timeout errors, and TimescaleDB lets you run such complex database queries. You can also set the maximum number of clients for Postgres according to your server's resources, for example with timescaledb-tune:

https://github.com/timescale/timescaledb-tune


If you would like to use PgAdmin (which I find more than convenient), you could follow these simple steps. On top of that, use its dashboard, which will show you all of your connections.

In the node-postgres documentation (https://node-postgres.com/api/pool) you can see the "max" and "idleTimeoutMillis" options, so experiment with your config, starting from something like this:

    const poolCredentials = {
      user: process.env.DB_USER,
      host: process.env.DB_HOST,
      database: process.env.DB_NAME,
      password: process.env.DB_PW,
      port: process.env.DB_PORT,
      idleTimeoutMillis: 1,
      max: 10,
      connectionTimeoutMillis: 2000,
      ...ssl
    };
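To show where these credentials go, here is a minimal sketch of creating one shared Pool for the whole service and querying through it, instead of opening a new Client per request (the ssl object is a placeholder assumption):

    const { Pool } = require("pg");

    // Placeholder SSL settings - adjust or remove for your environment.
    const ssl = { ssl: { rejectUnauthorized: false } };

    const poolCredentials = {
      user: process.env.DB_USER,
      host: process.env.DB_HOST,
      database: process.env.DB_NAME,
      password: process.env.DB_PW,
      port: process.env.DB_PORT,
      idleTimeoutMillis: 1,
      max: 10,
      connectionTimeoutMillis: 2000,
      ...ssl
    };

    const pool = new Pool(poolCredentials);

    // pool.query checks a client out, runs the query and returns the client to the pool automatically.
    pool.query("SELECT now()")
      .then(res => console.log(res.rows[0]))
      .catch(console.error);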

Simply look at your server's constraints and adjust the config accordingly. Good luck, and I am glad if this helps.

To see all current connections, for example:

    SELECT * FROM pg_stat_activity;

Do you have an idea how your program managed to open so many connections to the DB? What is your connection-management strategy:

  • Open a short-lived connection for each query/command/batch/call, or
  • Open a single connection (or a few) with the lifespan of the entire service?
  • Are you sure you are closing every connection that you open?

To avoid concurrency issues, I use the first strategy in my NodeJS applications: I open a dedicated connection to handle each request and make absolutely sure to close it when the request has been handled. That way, the maximum number of open connections equals the maximum number of simultaneously processed requests (see the sketch below).
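As a sketch of that first strategy (assuming the pg package and an Express-style route handler, both of which are assumptions on my part, not the asker's setup), checking a client out of a pool per request and always releasing it looks roughly like this:

    const express = require("express"); // assumed web framework for the example
    const { Pool } = require("pg");

    const pool = new Pool({ max: 10 }); // connection settings are read from the PG* environment variables here
    const app = express();

    app.get("/users", async (req, res) => {
      const client = await pool.connect(); // dedicated connection for this request
      try {
        const result = await client.query("SELECT * FROM users");
        res.json(result.rows);
      } catch (err) {
        res.status(500).json({ error: err.message });
      } finally {
        client.release(); // always return the connection to the pool, even on errors
      }
    });

    app.listen(3000);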

