Re: large numbers of inserts out of memory strategy
From | Ted Toth
Subject | Re: large numbers of inserts out of memory strategy
Date |
Msg-id | CAFPpqQHY39wDEQm7JeBD6JSN_UgNi-19kdGdZKmhQ2gxTpw8Qw@mail.gmail.com
In reply to | Re: large numbers of inserts out of memory strategy (Tomas Vondra <tomas.vondra@2ndquadrant.com>)
Responses | Re: large numbers of inserts out of memory strategy
| Re: large numbers of inserts out of memory strategy
List | pgsql-general
On Tue, Nov 28, 2017 at 11:22 AM, Tomas Vondra
<tomas.vondra@2ndquadrant.com> wrote:
> Hi,
>
> On 11/28/2017 06:17 PM, Ted Toth wrote:
>> I'm writing a migration utility to move data from a non-rdbms data
>> source to a postgres db. Currently I'm generating SQL INSERT
>> statements involving 6 related tables for each 'thing'. With 100k or
>> more 'things' to migrate I'm generating a lot of statements, and when I
>> try to import using psql, postgres fails with 'out of memory' when
>> running on a Linux VM with 4G of memory. If I break it into smaller
>> chunks of, say, ~50K statements, then the import succeeds. I can change
>> my migration utility to generate multiple files, each with a limited
>> number of INSERTs, to get around this issue, but maybe there's
>> another/better way?
>>
>
> The question is what exactly runs out of memory, and how did you modify
> the configuration (particularly related to memory).
>
> regards
>
> --
> Tomas Vondra                  http://www.2ndQuadrant.com
> PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

I'm pretty new to postgres, so I haven't changed any configuration
settings, and the log is a bit hard for me to make sense of :(
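For illustration, here is a minimal sketch of what one of the chunked files described above could look like; the table names (things, thing_parts) and the database name (mydb) are hypothetical placeholders, not from the original thread. Each file is self-contained, wraps its statements in a single transaction, and stays under the chunk limit so psql can load it on its own:

    -- chunk_0001.sql: one self-contained batch, loaded separately with e.g.
    --   psql -d mydb -f chunk_0001.sql
    -- (mydb, things, and thing_parts are hypothetical names for this sketch)
    BEGIN;
    INSERT INTO things (id, name) VALUES (1, 'first thing');
    INSERT INTO thing_parts (thing_id, part) VALUES (1, 'part A');
    -- ... remaining related-table INSERTs, up to the chunk limit (~50K statements) ...
    COMMIT;

As a side note, for straight bulk loading PostgreSQL's COPY (or psql's \copy) is generally much cheaper in time and memory than large numbers of individual INSERT statements, so it may be worth considering as the "better way" the poster is asking about.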