Re: Importing a Large .ndjson file

From: Michael Lewis
Subject: Re: Importing a Large .ndjson file
Date:
Msg-id: CAHOFxGotx8i1U+B6bFyw_zviqN0sFk6xKFA_MHtwr-3G8ScyRg@mail.gmail.com
In reply to: Re: Importing a Large .ndjson file (Sankar P <sankar.curiosity@gmail.com>)
List: pgsql-general
I spoke too soon. While this worked fine when there were no indexes
and finished within 10 minutes, with GIN index on the jsonb column, it
is taking hours and still not completing.

It is generally recommended to create indexes AFTER loading data. It can sometimes be faster to drop all indexes on the table, load the bulk data, and re-create the indexes, but there is no hard-and-fast rule. If you are adding 100k records to an empty or nearly empty table, I would drop all indexes and create them afterward. Also be sure maintenance_work_mem is set high enough for the index builds.
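A minimal sketch of that workflow in psql, assuming a hypothetical table `docs` with a jsonb column `payload` and a GIN index named `docs_payload_gin` (your names and file path will differ):

```sql
-- Give index builds more memory for this session (value is illustrative)
SET maintenance_work_mem = '1GB';

-- Drop the GIN index before the bulk load
DROP INDEX IF EXISTS docs_payload_gin;

-- Load the .ndjson file, one JSON document per line (psql meta-command)
\copy docs (payload) FROM 'data.ndjson'

-- Re-create the index once, after all rows are in
CREATE INDEX docs_payload_gin ON docs USING GIN (payload);
```

Building the index once over the loaded table avoids maintaining the GIN structure row by row during the load, which is where the hours were going.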

Browse pgsql-general by date:

Previous
From: Sankar P
Date:
Message: Re: Importing a Large .ndjson file
Next
From: Tom Lane
Date:
Message: Re: Importing a Large .ndjson file