Re: [HACKERS] Question about Scripting in Postgresql.
From | Richard Huxton
---|---
Subject | Re: [HACKERS] Question about Scripting in Postgresql.
Date |
Msg-id | 200309050949.51070.dev@archonet.com
List | pgsql-general
On Friday 05 September 2003 00:24, Nico King wrote:

[moving this to pgsql-general]

> The reason that I have to write a script to enter the
> data into the tables is that what if I have to enter
> 1000 lines of data into 200 rows??
> Here is a piece of my script that works, but not when I
> enter, let's say, a char instead of an integer.
> =========================================================
> copy accounts from stdin using delimiters ',';
> 1,pass,mac,,,
> 2,pass2,mac2,ip,test
> 0,pass2,mac2,ip,test2
> \.
> =======================================================

Sorry - I don't understand. Assuming your values are the right type for the columns, that looks OK to me.

> I have written a script to import some data into
> my database tables, with the delimiter ','. Now my
> question is: sometimes the data being sent to my tables
> might not match the data type or be corrupted, and I
> receive an error message.
> One: how could I prevent that?

Don't put bad data into the batch in the first place. COPY is designed so that if you automate importing batches of data, a failure doesn't leave the operation in some half-done state.

> Two: how can I proceed with importing the rest of the
> data into the next record even though some are
> corrupted, because I get interrupted as soon as there is
> an error in inserting the data?

Sounds like you want to write a small Perl script to take your data, strip out anything obviously bad, and then insert it in batches. If you have a lot of bad data you can do it one row at a time; if not, transactions of, say, 100 rows at a time might be better.

--
Richard Huxton
Archonet Ltd
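The filter-and-batch idea above can be sketched in a few lines. This is a hypothetical Python illustration (not the Perl script the reply suggests): it validates comma-delimited rows against the shape of the `accounts` example (five fields, integer first column - both assumptions about the schema), sets aside the bad rows, and groups the good ones into fixed-size batches so each batch can be loaded in its own transaction without one bad row aborting the whole import.

```python
import csv

def split_batches(lines, batch_size=100):
    """Validate comma-delimited rows and group the good ones into batches.

    Assumes each row has 5 fields and the first field must parse as an
    integer (hypothetical schema, mirroring the `accounts` example).
    Rows failing validation are returned separately instead of being
    allowed to abort a COPY/INSERT mid-import.
    """
    good, bad = [], []
    for row in csv.reader(lines):
        if len(row) == 5 and row[0].strip().lstrip('-').isdigit():
            good.append(row)
        else:
            bad.append(row)
    # Chunk the valid rows: each chunk would become one transaction.
    batches = [good[i:i + batch_size] for i in range(0, len(good), batch_size)]
    return batches, bad

lines = [
    "1,pass,mac,,",
    "2,pass2,mac2,ip,test",
    "oops,pass2,mac2,ip,test2",   # first field is not an integer
    "0,pass2,mac2,ip,test2",
]
batches, rejected = split_batches(lines, batch_size=2)
```

Each batch could then be fed to a single `COPY accounts FROM STDIN` (or a multi-row `INSERT`) inside one transaction; the `rejected` rows can be logged for manual repair rather than interrupting the import.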