Re: 100 simultaneous connections, critical limit?
From: Jón Ragnarsson
Subject: Re: 100 simultaneous connections, critical limit?
Date:
Msg-id: 400547B1.7000202@physicallink.com
In reply to: Re: 100 simultaneous connections, critical limit? (Christopher Browne <cbbrowne@acm.org>)
List: pgsql-performance
Ok, connection pooling was the thing that I thought of first, but I
haven't found any docs regarding pooling with PHP+Postgres.
OTOH, I designed the application to be as independent from the DB as
possible. (No stored procedures or other Postgres specific stuff)
Thanks,
J.

Christopher Browne wrote:
> Clinging to sanity, jonr@physicallink.com (Jón Ragnarsson) mumbled into her beard:
>
>> I am writing a website that will probably have some traffic.
>> Right now I wrap every .php page in pg_connect() and pg_close().
>> Then I read somewhere that Postgres only supports 100 simultaneous
>> connections (default). Is that a limitation? Should I use some other
>> method when writing code for high-traffic website?
>
> I thought the out-of-the-box default was 32.
>
> If you honestly need a LOT of connections, you can configure the
> database to support more. I "upped the limit" on one system to have
> 512 the other week; certainly supportable, if you have the RAM for it.
>
> It is, however, quite likely that the connect()/close() cuts down on
> the efficiency of your application. If PHP supports some form of
> "connection pooling," you should consider using that, as it will cut
> down _dramatically_ on the amount of work done establishing/closing
> connections, and should let your apps use somewhat fewer connections
> more effectively.
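For reference, PHP's pgsql extension does provide a simple form of pooling through persistent connections: pg_pconnect() keeps the backend connection open across requests served by the same PHP worker process, which avoids the per-page connect/close overhead described above. A minimal sketch follows; the connection parameters are placeholders, and it assumes PHP runs under a long-lived server process (e.g. Apache with mod_php), since persistent connections gain nothing under plain CGI:

<?php
// Persistent connection: reused across requests handled by the same
// PHP worker process instead of being re-established on every page.
// Host, database, user, and password below are placeholder values.
$db = pg_pconnect('host=localhost dbname=mydb user=webuser password=secret');
if ($db === false) {
    die('Could not connect to Postgres');
}

// Queries work exactly as they would with pg_connect().
$result = pg_query($db, 'SELECT count(*) FROM visitors');
$row = pg_fetch_row($result);
echo "Visitors: {$row[0]}\n";

// pg_close() does not close persistent links, so the connection
// stays open and is reused by the next request in this worker.
?>

Each persistent connection still counts against max_connections in postgresql.conf, so with many PHP worker processes the server-side limit may need to be raised as Christopher describes.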