Re: Finding Errors in .csv Input Data
From | Dimitri Fontaine
Subject | Re: Finding Errors in .csv Input Data
Date |
Msg-id | m239n0gf0v.fsf@2ndQuadrant.fr
In reply to | Finding Errors in .csv Input Data (Rich Shepard <rshepard@appl-ecosys.com>)
Responses | Re: Finding Errors in .csv Input Data
List | pgsql-general
Rich Shepard <rshepard@appl-ecosys.com> writes:
> I'm sure many of you have solved this problem in the past and can offer
> solutions that will work for me. The context is a 73-column postgres table
> of data that was originally in an Access .mdb file. A colleague loaded the
> file into Access and wrote a .csv file for me to use since we have nothing
> Microsoft here. There are 110,752 rows in the file/table. After a lot of
> cleaning with emacs and sed, the copy command accepted all but 80 rows of
> data. Now I need to figure out why postgres reports them as having too many
> columns.

Did you try pgloader yet?

  http://pgloader.projects.postgresql.org/
  http://pgfoundry.org/projects/pgloader/
  https://github.com/dimitri/pgloader
  http://packages.debian.org/sid/pgloader

Regards,
--
Dimitri Fontaine
http://2ndQuadrant.fr   PostgreSQL : Expertise, Formation et Support
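A quick way to see why those 80 rows come out over-wide is to count fields per line with a real CSV parser, which handles quoted commas and embedded newlines that line-oriented emacs/sed cleanup can miss. Below is a minimal sketch; the file name data.csv, the comma delimiter, and the 73-column expectation are assumptions for illustration, not details from the original thread.

#!/usr/bin/env python3
# Report CSV rows whose parsed field count differs from the expected
# column count, so the offending lines can be inspected by hand.
# Assumes a comma-delimited, double-quoted file named data.csv (hypothetical)
# and 73 expected columns, matching the table described above.
import csv

EXPECTED = 73

with open('data.csv', newline='') as f:
    for lineno, row in enumerate(csv.reader(f), start=1):
        if len(row) != EXPECTED:
            print(f"line {lineno}: {len(row)} fields")

Rows flagged this way usually contain an unescaped delimiter or a stray quote character that splits one field into several, which is what makes COPY report too many columns.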