Re: prevent duplicate entries
From | David G Johnston |
---|---|
Subject | Re: prevent duplicate entries |
Date | |
Msg-id | 1401372219275-5805419.post@n5.nabble.com |
In reply to | Re: prevent duplicate entries (amul sul <sul_amul@yahoo.co.in>) |
List | pgsql-novice |
amulsul wrote:
> On Thursday, 29 May 2014 3:20 PM, Thomas Drebert <drebert@> wrote:
>
>> Does PostgreSQL have a separate function to prevent duplicate records?
>> At the moment I filter records in PHP.
>
> You can load CSV file data directly into a Postgres database using
> pg_bulkload, which has functionality to avoid duplication.
>
> pg_bulkload: http://pgbulkload.projects.pgfoundry.org/pg_bulkload.html
>
> Does this answer your question?
>
> Regards,
> Amul Sul

You might find it better to just load the CSV data into a staging table, then perform the necessary "INSERT INTO live ... SELECT ... FROM staging" query to migrate only the new data. It likely will not make much sense to have (say) 90% of your data eat resources generating duplicate-key errors.

David J.

--
View this message in context: http://postgresql.1045698.n5.nabble.com/prevent-duplicate-entries-tp5805373p5805419.html
Sent from the PostgreSQL - novice mailing list archive at Nabble.com.
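[Editor's note: the staging-table approach described above could be sketched roughly as follows. The table names `live` and `staging`, the key column `id`, and the file name `data.csv` are hypothetical; the anti-join assumes `id` is the uniqueness criterion.]

```sql
-- Hypothetical setup: a temporary staging table shaped like the live
-- table, but without its constraints, so the bulk load cannot fail.
CREATE TEMP TABLE staging (LIKE live INCLUDING DEFAULTS);

-- Bulk-load the CSV into staging (psql meta-command; file path is
-- an assumption).
\copy staging FROM 'data.csv' WITH (FORMAT csv, HEADER true)

-- Migrate only rows whose key is not already present in live,
-- so no duplicate-key errors are ever raised.
INSERT INTO live
SELECT s.*
FROM staging s
WHERE NOT EXISTS (
    SELECT 1 FROM live l WHERE l.id = s.id
);
```

A `LEFT JOIN ... WHERE l.id IS NULL` anti-join would work equally well; on PostgreSQL 9.5 or later, `INSERT ... ON CONFLICT DO NOTHING` is another option.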