Re: Export tab delimited from mysql to postgres.
From | Christopher Browne
---|---
Subject | Re: Export tab delimited from mysql to postgres.
Date |
Msg-id | m3d5zoyd9x.fsf@knuth.knuth.cbbrowne.com
In reply to | Re: Export tab delimited from mysql to postgres. (Theo Galanakis <Theo.Galanakis@lonelyplanet.com.au>)
List | pgsql-sql
Quoth Theo.Galanakis@lonelyplanet.com.au (Theo Galanakis):
> Could you provide an example of how to do this?
>
> I actually ended up exporting the data as INSERT statements,
> which strips out cr/lf within varchars. However, it takes an eternity
> to import 200,000 records... 24 hours, in fact! Is this normal?

I expect that this results from each INSERT being a separate
transaction. If you put a BEGIN at the start and a COMMIT at the end,
you'd doubtless see an ENORMOUS improvement.

That's not even the _big_ improvement, either. The _big_ improvement
would involve reformatting the data so that you could use the COPY
statement, which is _way_ faster than a bunch of INSERTs.

Take a look at the documentation to see the formatting that is needed:

http://techdocs.postgresql.org/techdocs/usingcopy.php
http://www.faqs.org/docs/ppbook/x5504.htm
http://www.postgresql.org/docs/7.4/static/sql-copy.html

(A short sketch of both approaches is appended below the signature.)

--
output = ("cbbrowne" "@" "ntlug.org")
http://www3.sympatico.ca/cbbrowne/lsf.html
Question: How many surrealists does it take to change a light bulb?
Answer: Two, one to hold the giraffe, and the other to fill the
bathtub with brightly colored machine tools.
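For reference, here is a minimal sketch of both approaches, assuming a
hypothetical table import_target(id integer, body varchar); substitute
your own table and column names:

    -- Approach 1: wrap the generated INSERTs in a single transaction
    BEGIN;
    INSERT INTO import_target (id, body) VALUES (1, 'first row');
    INSERT INTO import_target (id, body) VALUES (2, 'second row');
    -- ... the remaining statements ...
    COMMIT;

    -- Approach 2: load a tab-delimited file with COPY (tab is the
    -- default delimiter); newlines inside a value must appear in the
    -- file as \n, tabs as \t, and backslashes as \\
    COPY import_target FROM '/tmp/import_target.tab';

    -- or, from psql, so the file doesn't have to live on the server:
    -- \copy import_target from '/tmp/import_target.tab'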