"Out of memory" errors..
From | Lim Berger |
---|---|
Subject | "Out of memory" errors.. |
Date | |
Msg-id | 69d2538f0708130409m46237b11w72da8b61238517ab@mail.gmail.com |
Responses |
Re: "Out of memory" errors..
Re: "Out of memory" errors.. |
List | pgsql-general |
Hi, I am getting the following error while running queries such as "VACUUM ANALYZE table", even on small tables with a piddly 35,000 rows! The error message:

--
ERROR: out of memory
DETAIL: Failed on request of size 67108860.
--

My postgresql.conf is below. I am on a dual-core server with 4GB of RAM, which runs MySQL as well (whose key_buffer is at around 800M), so I allocated shared_buffers for PostgreSQL based on that number. The server also runs Apache and other stuff, but I have never had any problem running MySQL's vacuum equivalent, "REPAIR TABLE". Thanks in advance for any input!

------POSTGRESQL.CONF-------
#--- Some tuning
#--- http://www.opennms.org/index.php/Performance_tuning
max_connections = 250
shared_buffers = 21000
effective_cache_size = 21000
max_fsm_relations = 1500
max_fsm_pages = 80000
sort_mem = 16348
work_mem = 16348
vacuum_mem = 16348
temp_buffers = 4096
authentication_timeout = 10s
ssl = off
autovacuum = on
vacuum_cost_delay = 50
stats_start_collector = on
stats_row_level = on
#--- For COPY performance
wal_buffers = 64
checkpoint_segments = 64
checkpoint_timeout = 900
fsync = on
maintenance_work_mem = 64MB
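[A back-of-the-envelope check of these numbers, not part of the original post: the failed request size in the error, 67108860 bytes, is almost exactly 64 MB, which matches the maintenance_work_mem = 64MB setting; VACUUM typically tries to allocate up to maintenance_work_mem in one chunk. A quick sketch of the memory the posted settings imply, assuming 8 kB pages for shared_buffers and kB units for work_mem, as was the default in that era:]

```python
# Rough memory arithmetic for the posted postgresql.conf values.
# Assumptions: shared_buffers is counted in 8 kB pages, and
# work_mem / sort_mem / vacuum_mem are in kB.

PAGE_KB = 8

shared_buffers_mb = 21000 * PAGE_KB / 1024      # shared cache, in MB
work_mem_mb = 16348 / 1024                      # per sort/hash operation

# The failed allocation reported in the error message:
failed_request_mb = 67108860 / 1024 / 1024

print(f"shared_buffers        ~ {shared_buffers_mb:.0f} MB")
print(f"work_mem              ~ {work_mem_mb:.0f} MB per operation")
print(f"failed request        ~ {failed_request_mb:.0f} MB "
      f"(matches maintenance_work_mem = 64MB)")
```

[If that reading is right, the server simply could not hand PostgreSQL a contiguous 64 MB at that moment; lowering maintenance_work_mem would be one thing to try.]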