Large file support needed? Trying to identify root of error.
From | Kris Kiger |
---|---|
Subject | Large file support needed? Trying to identify root of error. |
Date | |
Msg-id | 40FC20F0.1000402@musicrebellion.com |
In reply to | Connection pooling/sharing software help (Kris Kiger <kris@musicrebellion.com>) |
Responses | Re: Large file support needed? Trying to identify root of error. |
| Re: Large file support needed? Trying to identify root of |
List | pgsql-admin |
I've got a database that is a single table with 5 integers, a timestamp with time zone, and a boolean. The table is 170 million rows in length. The dump was produced using:

    pg_dump -U postgres -Ft test > test_backup.tar

The contents of the tar'd dump file it produced are: 8.dat (approximately 8GB), a toc, and restore.sql. No errors are reported on dump; however, when a restore is attempted I get:

    ERROR: unexpected message type 0x58 during COPY from stdin
    CONTEXT: COPY test_table, line 86077128: ""
    ERROR: could not send data to client: Broken pipe
    CONTEXT: COPY test_table, line 86077128: ""

I am doing the dump & restore on the same machine. Any ideas? If the file is too large, is there any way postgres could break it up into smaller chunks for the tar when backing up?

Thanks for the help!
Kris
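For reference, a minimal sketch of one way a dump like this could be broken into smaller pieces, assuming a plain-text (default format) pg_dump and GNU split are acceptable; the 1GB chunk size and the test_backup.sql.gz. prefix are arbitrary examples, not anything from this message:

    # Dump in plain SQL format, compress, and split the stream into ~1GB pieces
    pg_dump -U postgres test | gzip | split -b 1000m - test_backup.sql.gz.

    # Restore by concatenating the pieces (split names them .aa, .ab, ...) back into psql
    cat test_backup.sql.gz.* | gunzip | psql -U postgres test

This sidesteps any single-file size limit in the archive format, at the cost of losing pg_restore's selective-restore features that the tar/custom formats provide.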