Re: how to continue after error in batch mode with psql
| From | Jasen Betts |
|---|---|
| Subject | Re: how to continue after error in batch mode with psql |
| Date | |
| Msg-id | hrbiu2$tt4$2@reversiblemaps.ath.cx |
| In reply to | how to continue after error in batch mode with psql (Urs Rau <urs.rau@uk.om.org>) |
| Replies | Re: how to continue after error in batch mode with psql |
| List | pgsql-novice |
On 2010-04-28, Urs Rau <urs.rau@uk.om.org> wrote:
> I have a nightly process that pulls data over into postgresql 8.3.10
> from a Progress server running under 8.x. Unfortunately the Progress db
> does not enforce record size/length limitations. It does not care if a
> record of 20 characters contains 21, or 100 for that matter.
>
> We have a simple pipe that takes the Progress data export dump and
> imports it into postgresql:
>
> su - postgres -c "cat test.csv | psql -X -q test -c \"COPY t_test FROM
> stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"' ESCAPE AS
> '\\\"'\""
>
> If any of the fields are over-length, we do get an error message that
> tells us which row needs fixing:
>
> ERROR: value too long for type character varying(20)
> CONTEXT: COPY t_test, line 2, column t_comment_c: "'comment 3 that is
> too long'"
>
> But how do we get psql to run through and continue after an error,
> import as much as possible of the rest of the import data, and give us
> error messages about all lines with errors?

Why not just copy the whole lot into a temp table (like the target table, but with text fields instead of fixed-width varchar), then do one select on it to find the over-length lines and another to insert the good data into the real target table?
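A minimal sketch of that staging-table approach, run from psql. Only `t_test` and the `t_comment_c varchar(20)` column appear in the original thread; the staging-table name and the assumption that `t_comment_c` is the only length-limited column are illustrative:

```sql
-- Staging table shaped like the target, then widen the
-- constrained column to unbounded text so COPY cannot fail on length.
CREATE TEMP TABLE t_test_staging (LIKE t_test);
ALTER TABLE t_test_staging ALTER COLUMN t_comment_c TYPE text;

-- Load the whole dump; no row is rejected for being over-length.
\copy t_test_staging FROM 'test.csv' WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '"' ESCAPE AS '"'

-- Report every bad line (repeat the predicate for each
-- length-limited column in the real table).
SELECT * FROM t_test_staging WHERE length(t_comment_c) > 20;

-- Move only the rows that fit into the real target table.
INSERT INTO t_test
SELECT * FROM t_test_staging WHERE length(t_comment_c) <= 20;
```

Because the staging table holds everything, the bad rows can be fixed and re-inserted later instead of being lost mid-COPY.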