large numbers of inserts out of memory strategy
From | Ted Toth |
---|---|
Subject | large numbers of inserts out of memory strategy |
Date | |
Msg-id | CAFPpqQGux3uT=CNN-z=zXr4qSmfBw9tDCURimELqEWUBrBpL7A@mail.gmail.com |
Responses | Re: large numbers of inserts out of memory strategy
Re: large numbers of inserts out of memory strategy
Re: large numbers of inserts out of memory strategy |
List | pgsql-general |
I'm writing a migration utility to move data from a non-RDBMS data source to a Postgres db. Currently I'm generating SQL INSERT statements involving 6 related tables for each 'thing'. With 100k or more 'things' to migrate, I'm generating a lot of statements, and when I try to import them using psql, Postgres fails with 'out of memory' when running on a Linux VM with 4G of memory. If I break the input into smaller chunks of, say, ~50k statements, the import succeeds. I can change my migration utility to generate multiple files, each with a limited number of INSERTs, to get around this issue, but maybe there's another/better way? Ted
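[For concreteness, a minimal sketch of the chunking workaround described above, assuming the utility is in Python; `generate_insert_statements()` and the file-name prefix are hypothetical stand-ins for whatever the migration utility actually produces:]

    import itertools

    CHUNK_SIZE = 50_000  # statements per file; ~50k succeeded in the test above

    def write_chunked_sql(statements, prefix="migration"):
        """Write INSERT statements to numbered .sql files, CHUNK_SIZE per file.

        Each file is wrapped in one explicit transaction so psql commits
        once per chunk rather than autocommitting every statement.
        """
        it = iter(statements)
        for file_no in itertools.count(1):
            chunk = list(itertools.islice(it, CHUNK_SIZE))
            if not chunk:
                break
            with open(f"{prefix}_{file_no:04d}.sql", "w") as f:
                f.write("BEGIN;\n")
                f.write("\n".join(chunk))
                f.write("\nCOMMIT;\n")

    # Hypothetical usage: any iterable of INSERT strings for the 6 tables.
    # write_chunked_sql(generate_insert_statements(), prefix="things")

[Each resulting file can then be imported separately, e.g. `psql -f things_0001.sql`, keeping each psql run within the VM's memory budget.]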