Problem w/ dumping huge table and no disk space
From      | David Ford
Subject   | Problem w/ dumping huge table and no disk space
Date      |
Msg-id    | 3B993392.1000809@blue-labs.org
Responses | Re: Problem w/ dumping huge table and no disk space
          | Re: Problem w/ dumping huge table and no disk space
          | Re: Problem w/ dumping huge table and no disk space
List      | pgsql-general
Help if you would please :)  I have a 10 million+ row table and I've only got a couple hundred megs of disk left. I can't delete any rows; pg runs out of disk space and crashes. I can't pg_dump with compression either: the output file gets started and contains the schema plus a bit of other info, about 650 bytes in all, then after running for 30 minutes pg runs out of disk space and crashes. My pg_dump command is: "pg_dump -d -f syslog.tar.gz -F c -t syslog -Z 9 syslog". I want to dump this database (the entire pgsql dir is just over two gigs) and move it to another, larger machine. I can't afford to lose this information, so are there any helpful hints? I'll be happy to provide more information if desired.

David
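As a rough sketch of one way around the local disk shortage (not from the original message): stream a plain-format pg_dump through gzip and ssh straight onto the larger machine, so nothing is written to the full local disk. The host "user@bighost" and the remote path below are hypothetical placeholders, not details from this post.

    # Sketch only: stream the dump to the bigger machine instead of writing it locally.
    # "user@bighost" and /backups/syslog.sql.gz are placeholders for illustration.
    pg_dump -t syslog syslog | gzip -9 | ssh user@bighost 'cat > /backups/syslog.sql.gz'

Plain-format output streams cleanly through a pipe, which is what lets this run without any intermediate file on the machine that is out of space.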