Re: large numbers of inserts out of memory strategy
From:        Tomas Vondra
Subject:     Re: large numbers of inserts out of memory strategy
Date:
Msg-id:      ad918f32-4913-3dee-281a-5a3fee576a14@2ndquadrant.com
In reply to: large numbers of inserts out of memory strategy (Ted Toth <txtoth@gmail.com>)
Responses:   Re: large numbers of inserts out of memory strategy
List:        pgsql-general
Hi,

On 11/28/2017 06:17 PM, Ted Toth wrote:
> I'm writing a migration utility to move data from a non-rdbms data
> source to a postgres db. Currently I'm generating SQL INSERT
> statements involving 6 related tables for each 'thing'. With 100k or
> more 'things' to migrate I'm generating a lot of statements and when I
> try to import using psql postgres fails with 'out of memory' when
> running on a Linux VM with 4G of memory. If I break into smaller
> chunks, say ~50K statements, then the import succeeds. I can change my
> migration utility to generate multiple files each with a limited
> number of INSERTs to get around this issue but maybe there's
> another/better way?

The question is what exactly runs out of memory, and how did you modify
the configuration (particularly related to memory)?

regards

--
Tomas Vondra                  http://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services
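[Editor's note: for illustration only, here is a minimal sketch of the
chunking workaround Ted describes: splitting the generated INSERT
statements across multiple files, each wrapped in its own transaction,
so psql never has to process one enormous script. The function names,
file naming, and chunk size are assumptions for the sketch, not taken
from the thread.]

    # Split generated INSERT statements into numbered .sql files of
    # CHUNK_SIZE statements each, one transaction per file.
    # write_chunked_sql() and the "migration" prefix are hypothetical.

    CHUNK_SIZE = 50_000  # roughly the chunk size that worked for Ted

    def _flush(statements, path):
        # Write one chunk wrapped in BEGIN/COMMIT so each file is
        # applied atomically and memory use stays bounded per chunk.
        with open(path, "w") as f:
            f.write("BEGIN;\n")
            f.write("\n".join(statements))
            f.write("\nCOMMIT;\n")

    def write_chunked_sql(statements, prefix="migration"):
        chunk, file_no = [], 0
        for stmt in statements:
            chunk.append(stmt)
            if len(chunk) >= CHUNK_SIZE:
                _flush(chunk, f"{prefix}_{file_no:04d}.sql")
                chunk, file_no = [], file_no + 1
        if chunk:  # write any remaining statements
            _flush(chunk, f"{prefix}_{file_no:04d}.sql")

Each file can then be loaded separately, e.g. psql -d mydb -f
migration_0000.sql, so a failure in one chunk does not discard the
rest. For bulk loads of this size, PostgreSQL's COPY (or psql's \copy)
is generally much faster and lighter than individual INSERT statements.]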