Re: Importing *huge* mysql database into pgsql
From: Chris
Subject: Re: Importing *huge* mysql database into pgsql
Date:
Msg-id: 45EE124E.2030007@gmail.com
In reply to: Importing *huge* mysql database into pgsql (".ep" <erick.papa@gmail.com>)
List: pgsql-general
.ep wrote:
> Hello,
>
> I would like to convert a mysql database with 5 million records and
> growing, to a pgsql database.
>
> All the stuff I have come across on the net has things like
> "mysqldump" and "psql -f", which sounds like I will be sitting forever
> getting this to work.

If you can convert the database schema, then in mysql do a dump of each table like this:

  select * from table into outfile '/tmp/filename';

(see http://dev.mysql.com/doc/refman/4.1/en/select.html)

and then import it into postgres like this:

  \copy table from '/tmp/filename'

(see http://www.postgresql.org/docs/8.2/interactive/sql-copy.html)

That's much faster because it produces a tab-delimited text file which postgres can load in a single pass. Replaying complete inserts instead is horribly slow, because postgres runs each standalone insert in its own transaction; so either wrap batches of inserts inside a single transaction, as sketched below, or use copy as above (copy is best).
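For concreteness, here is roughly what the export/import pair could look like for a hypothetical table called "users". The defaults of the two tools happen to line up (tab-delimited fields, \N for NULL), though values containing tabs, newlines or backslashes deserve a test run first:

  -- in mysql: writes tab-delimited text (the INTO OUTFILE default);
  -- "users" and the file path are just examples
  select * from users into outfile '/tmp/users.txt';

  -- in psql: COPY's default text format is also tab-delimited
  -- with \N for NULL, so the file loads as-is
  \copy users from '/tmp/users.txt'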
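And if you do end up replaying plain INSERT statements, a minimal sketch of the transaction wrapping (table and rows are made up):

  begin;
  insert into users values (1, 'alice');
  insert into users values (2, 'bob');
  -- ... many more rows, committed in one shot instead of one
  -- transaction per insert ...
  commit;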
--
Postgresql & php tutorials
http://www.designmagick.com/