Skipping duplicate records?
From | Steve Micallef
---|---
Subject | Skipping duplicate records?
Date | |
Msg-id | 20010607094751.S20209-100000@toaster.syd.ot
Replies | Re: Skipping duplicate records?
List | pgsql-general
Hi,

I've recently migrated from MySQL to PostgreSQL, and as impressed as I am with Postgres, I have found one seemingly missing feature a little bothersome. 'mysqlimport' has the ability to skip duplicate records when doing bulk imports from non-binary files. PostgreSQL doesn't seem to have this feature, and that causes a problem for me: I import extremely large amounts of data into Postgres using 'copy', and it rejects the whole file if even one record violates the primary key.

I have managed to work around this by hacking src/backend/access/nbtree/nbtinsert.c to call elog with NOTICE instead of ERROR, which makes it skip the duplicate record and continue importing. Is there a way to get around this without changing the code? If not, will a future release of Postgres optionally implement this?

Thanks in advance,
Steve Micallef
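[Editor's note: one way to avoid patching the server is to deduplicate the dump file on the client before feeding it to COPY. The sketch below is an assumption-laden illustration, not anything from the original thread: it assumes a tab-delimited text dump (COPY's default text format, without embedded newlines or escapes) and that the primary key is a single column at a known index.]

```python
import csv

def dedupe_rows(rows, key_index=0):
    """Yield rows whose key column has not been seen before.

    Later duplicates are silently dropped, mimicking a
    'skip duplicate records' import mode on the client side.
    """
    seen = set()
    for row in rows:
        key = row[key_index]
        if key not in seen:
            seen.add(key)
            yield row

def dedupe_file(src_path, dst_path, key_index=0, delimiter="\t"):
    """Copy a delimited dump file, dropping rows with duplicate keys.

    src_path/dst_path and the delimiter are assumptions; adjust to
    match the file actually handed to COPY.
    """
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter=delimiter)
        writer = csv.writer(dst, delimiter=delimiter)
        writer.writerows(dedupe_rows(reader, key_index))
```

The cleaned file can then be loaded with a plain `COPY tablename FROM '/path/to/deduped.txt'`, which no longer aborts on a duplicate key because the duplicates never reach the server.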