Re: pg_dump out of memory
| From | Adrian Klaver |
|---|---|
| Subject | Re: pg_dump out of memory |
| Date | |
| Msg-id | d526cfbf-6331-761f-8ad5-b4bb0b5fc2ef@aklaver.com |
| In response to | Re: pg_dump out of memory (Andy Colson <andy@squeakycode.net>) |
| List | pgsql-general |
On 07/03/2018 08:28 PM, Andy Colson wrote:
> On 07/03/2018 10:21 PM, Adrian Klaver wrote:
>> On 07/03/2018 07:43 PM, Andy Colson wrote:
>>> Hi All,
>>>
>>> I moved a physical box to a VM, and set its memory to 1Gig. Everything
>>> runs fine except one backup:
>>>
>>> /pub/backup# pg_dump -Fc -U postgres -f wildfire.backup wildfire
>>>
>>> pg_dump: Dumping the contents of table "ofrrds" failed: PQgetResult() failed.
>>> pg_dump: Error message from server: ERROR: out of memory
>>> DETAIL: Failed on request of size 1073741823.
>>> pg_dump: The command was: COPY public.ofrrds (id, updateddate, bytes) TO stdout;
>>>
>>> I'm not sure how to get this backup to run. Any hints would be appreciated.
>>
>> Maybe:
>>
>> 1) Try:
>> pg_dump -t ofrrds
>> to dump only that table.
>
> It didn't work. I get the same error.

Well, all I can think of is to give the VM more memory.

> Also, I'm running Slackware 14.2, and PG 9.5.11
>
> -Andy

--
Adrian Klaver
adrian.klaver@aklaver.com
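[Editor's note: the failed request of 1073741823 bytes (about 1 GB) suggests the server is trying to serialize one very large bytea value from ofrrds. A sketch of a diagnostic query, assuming only the schema implied by the COPY command above (public.ofrrds with columns id, updateddate, bytes); octet_length() and pg_column_size() are built-in PostgreSQL functions:

-- Sketch: list the largest bytes values in public.ofrrds.
-- octet_length() reports the uncompressed size of a bytea value;
-- pg_column_size() reports the size as stored (possibly TOAST-compressed).
SELECT id,
       octet_length(bytes)   AS value_bytes,
       pg_column_size(bytes) AS stored_bytes
FROM   public.ofrrds
ORDER  BY octet_length(bytes) DESC
LIMIT  5;

If a single value is close to the failed request size, the problem is the row itself rather than anything pg_dump-specific, and no dump option is likely to work around it without more memory.]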