error updating a very large table
From | Brian Cox
---|---
Subject | error updating a very large table
Date |
Msg-id | 49E52D34.4080200@ca.com
Responses | Re: error updating a very large table
 | Re: error updating a very large table
List | pgsql-performance
ts_defect_meta_values has 460M rows. The following query, in retrospect not too surprisingly, runs out of memory on a 32 bit postgres:

    update ts_defect_meta_values set
      ts_defect_date=(select ts_occur_date from ts_defects where ts_id=ts_defect_id)

I changed the logic to update the table in 1M row batches. However, after 159M rows, I get:

    ERROR: could not extend relation 1663/16385/19505: wrote only 4096 of 8192 bytes at block 7621407

A df run on this machine shows plenty of space:

    [root@rql32xeoall03 tmp]# df
    Filesystem           1K-blocks      Used Available Use% Mounted on
    /dev/sda2            276860796 152777744 110019352  59% /
    /dev/sda1               101086     11283     84584  12% /boot
    none                   4155276         0   4155276   0% /dev/shm

The updates are done inside of a single transaction.

postgres 8.3.5.

Ideas on what is going on appreciated.

Thanks,
Brian
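For illustration, a minimal sketch of what one pass of the 1M-row batching could look like; the choice of ts_defect_id as the driving key and the :lo/:hi bounds (advanced by 1000000 between passes from the client side) are assumptions here, not necessarily the exact logic that was run:

    -- One batch of roughly 1M rows; :lo and :hi are supplied by the client
    -- and advanced by 1000000 between passes (assumed key: ts_defect_id).
    UPDATE ts_defect_meta_values
       SET ts_defect_date = (SELECT ts_occur_date
                               FROM ts_defects
                              WHERE ts_id = ts_defect_id)
     WHERE ts_defect_id >= :lo
       AND ts_defect_id <  :hi;

As a side note on the error path, 1663/16385/19505 is tablespace OID / database OID / relfilenode, so the relation being extended when the write failed can be looked up with:

    SELECT oid::regclass FROM pg_class WHERE relfilenode = 19505;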