Re: [GENERAL] import CSV file to a table
From:        Rob Sargent
Subject:     Re: [GENERAL] import CSV file to a table
Date:
Msg-id:      82d63d16-6b56-44b3-738f-2b663c5a2c54@gmail.com
In reply to: Re: [GENERAL] import CSV file to a table  (Karl Czajkowski <karlcz@isi.edu>)
Responses:   Re: [GENERAL] import CSV file to a table
List:        pgsql-general
On 03/08/2017 09:52 AM, Karl Czajkowski wrote:
> On Mar 08, Rob Sargent modulated:
>
>> Yes Karl, I agree. I admitted as much. But if it's clean, as in
>> free of quoted commas, life is much simpler. I've lost sight of
>> whether or not the OP knows his situation w.r.t. this. The awk
>> line will tell him, and for a one-off load this can make a world of
>> difference in complexity - two bash lines and a COPY.
>>
> Maybe I didn't understand your awk? I thought it was counting commas
> in lines. This isn't the same as counting commas in records.
>
> this,is,record,one
> "this,,","is
> ,,record","two
> ,,,"
>
> This has three commas on each line and definitely is not suitable
> for naive CSV handling.
>
> Karl

In essence it does count commas, but plus one :). NF is the number of fields
as delimited by commas, so one more field than the number of commas. If you
think or hope the file is simple and well formatted, this is a pretty quick
check. But if you're looking for a general solution, you need a real CSV
parser. I recall being quite surprised and amused to learn there is an actual
standard for CSV format. (Naturally, if you have one to hand, you don't need
the awk line.)
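For anyone following along, a minimal sketch of the kind of check being
described (the exact awk line was in an earlier message; the filename, table
name, and HEADER option below are assumptions, not the OP's actual setup):

    # Report the distinct per-line field counts, splitting on commas.
    # A clean CSV with no quoted commas or embedded newlines should
    # print exactly one number.
    awk -F',' '{ print NF }' data.csv | sort -nu

    # If the count is uniform, a straight COPY via psql's \copy should do
    # (hypothetical table name; drop HEADER if the file has no header row).
    psql -c "\copy mytable FROM 'data.csv' WITH (FORMAT csv, HEADER)"

As Karl points out, this check can be fooled by quoted fields that contain
commas or span lines, so it only rules in the simple case rather than proving
the file is well formed.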