Re: COPY from file to table containing unique index
From | Joel Burton
---|---
Subject | Re: COPY from file to table containing unique index
Date |
Msg-id | Pine.LNX.4.21.0104102216270.31213-100000@olympus.scw.org
In reply to | COPY from file to table containing unique index ("Joe Johnson" <joej@generalsearch.net>)
List | pgsql-general
On Tue, 10 Apr 2001, Joe Johnson wrote:

> I have a table with over 1,000,000 records in it containing names and phone
> numbers, and one of the indexes on the table is a unique index on the phone
> number. I am trying to copy about 100,000 more records to the table from a
> text file, but I get an error on copying because of duplicate phone numbers
> in the text file, which kills the COPY command without copying anything to
> the table. Is there some way that I can get Postgres to copy the records
> from the file and just skip records that contain duplicates to the unique
> index? I found that using PHP scripts to do inserts for a file of this size
> take MUCH longer than I'd like, so I'd like to avoid having to do it that
> way if I can. Any help is appreciated. Thanks!

There are a few options. This was discussed yesterday, in the thread
'problem with copy command'.

--
Joel Burton <jburton@scw.org>
Director of Information Systems, Support Center of Washington
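One option of the kind discussed in that thread is to COPY the file into a temporary staging table (which has no unique index) and then insert only the rows whose key is not already present. This is a sketch, not the exact solution from the referenced thread; the table and column names (`phonebook`, `name`, `phone`) and the file path are hypothetical stand-ins for the poster's actual schema:

```sql
BEGIN;

-- Staging table with no unique constraint: duplicates load without error.
CREATE TEMP TABLE phonebook_load (name text, phone text);

-- Bulk-load the raw file; COPY is fast because nothing is checked yet.
COPY phonebook_load FROM '/path/to/newrecords.txt';

-- Insert one row per new phone number: DISTINCT ON (phone) collapses
-- duplicates within the file, and NOT EXISTS skips numbers already in
-- the main table, so the unique index is never violated.
INSERT INTO phonebook (name, phone)
SELECT DISTINCT ON (phone) name, phone
FROM phonebook_load l
WHERE NOT EXISTS (SELECT 1 FROM phonebook p WHERE p.phone = l.phone);

COMMIT;
```

This keeps the speed of COPY for the bulk load while pushing the duplicate handling into a single set-based INSERT rather than 100,000 per-row checks from a script.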