Re: Backup Large Tables
| From | Casey Duncan |
|---|---|
| Subject | Re: Backup Large Tables |
| Date | |
| Msg-id | DFBEF250-DD70-4987-9443-660ABAC141A7@pandora.com |
| In reply to | Re: Backup Large Tables ("Charles Ambrose" <jamjam360@gmail.com>) |
| List | pgsql-general |
Are you dumping the whole database or just a single table? If it's the former, try the latter and see if you still get errors. If pg_dump is not working, maybe some system table is hosed. What errors are you getting?

If you can get in via psql, log in as a superuser and execute:

COPY mytable TO 'mytable.txt';

That will dump the table data to a text file, which can be re-imported into a new database using the COPY FROM command. Basically you're just doing part of what pg_dump does for you, by hand.

-Casey

On Sep 21, 2006, at 9:19 PM, Charles Ambrose wrote:

> Hi!
>
> I encounter errors when dumping the database using pg_dump. The database, I think, is corrupt: it was looking for triggers and stored procedures that are no longer in the database. This is also the reason why I opted to create a program to dump the database.
>
> On 9/22/06, Michael Nolan <htfoot@gmail.com> wrote:
> I have a table with over 6 million rows in it that I do a dump on every night. It takes less than 2 minutes to create a file that is around 650 MB.
>
> Are you maybe dumping this file in 'insert' mode?
> --
> Mike Nolan
>
> On 9/21/06, Charles Ambrose <jamjam360@gmail.com> wrote:
> Hi!
>
> I have fairly large database tables (say an average of 3 million to 4 million records). Using the pg_dump utility takes forever to dump the database tables. As an alternative, I created a program that gets all the data from the table and puts it into a text file. I was also unsuccessful with this alternative way of dumping the database.
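A minimal sketch of the COPY round trip Casey describes. Note that server-side COPY resolves relative paths against the server's data directory and needs an absolute path the postgres backend can read and write; the `/tmp/mytable.txt` path and the `mytable` name here are only illustrative assumptions:

```sql
-- Dump the table's rows to a text file on the *server's* filesystem.
-- The path must be absolute and writable by the postgres server process.
COPY mytable TO '/tmp/mytable.txt';

-- In the new database, create a table with the same columns first,
-- then re-import the rows from that file:
COPY mytable FROM '/tmp/mytable.txt';
```

If you only have client-side access, psql's `\copy` meta-command does the same thing but reads and writes files on the client machine instead. For dumping a single table rather than the whole database, `pg_dump -t mytable dbname` is another option, which may sidestep corruption in unrelated objects.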