Re: Backup Large Tables
From | Charles Ambrose
---|---
Subject | Re: Backup Large Tables
Date |
Msg-id | 61ca079e0609212119u5d44b14dh39bf28f76374942d@mail.gmail.com
In response to | Re: Backup Large Tables ("Michael Nolan" <htfoot@gmail.com>)
Responses | Re: Backup Large Tables
List | pgsql-general
Hi!
I encounter errors when dumping the database using pg_dump. I think the database is corrupt: pg_dump was looking for triggers and stored procedures that are no longer in the database. This is also the reason why I opted to write a program to dump the database myself.
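Roughly, the program does something like the following minimal sketch (this assumes Python with psycopg2; the connection string, table name, and output path are placeholders, not the actual ones I use). It streams a table to a text file with COPY instead of reading rows one by one:

    # Sketch: stream one table to a text file with COPY,
    # sidestepping pg_dump's catalog lookups.
    # Connection string, table name, and output path are placeholders.
    import psycopg2

    TABLE = "big_table"          # hypothetical table name
    OUTFILE = "big_table.copy"   # tab-separated text output

    conn = psycopg2.connect("dbname=mydb user=postgres")
    cur = conn.cursor()
    out = open(OUTFILE, "w")
    # COPY ... TO STDOUT streams rows server-side, which is much
    # faster than fetching them individually with SELECT.
    cur.copy_expert("COPY %s TO STDOUT" % TABLE, out)
    out.close()
    conn.close()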
On 9/22/06, Michael Nolan <htfoot@gmail.com> wrote:
I have a table with over 6 million rows in it that I do a dump on every night. It takes less than 2 minutes to create a file that is around 650 MB.
Are you maybe dumping this file in 'insert' mode?
--
Mike Nolan

On 9/21/06, Charles Ambrose <jamjam360@gmail.com> wrote:
Hi!
I have some fairly large database tables (say an average of 3 million to 4 million records each). Using the pg_dump utility takes forever to dump them. As an alternative, I have created a program that gets all the data from each table and puts it into a text file. I was also unsuccessful with this alternative to dump the database.