Re: performance while importing a very large data set in to database
From: Ing. Marcos Luís Ortíz Valmaseda
Subject: Re: performance while importing a very large data set in to database
Date:
Msg-id: 4B1A6B46.2080404@uci.cu
In reply to: performance while importing a very large data set in to database ("Ashish Kumar Singh" <ashishkumar.singh@altair.com>)
Responses: Re: performance while importing a very large data set in to database
List: pgsql-performance
Ashish Kumar Singh wrote:
>
> Hello Everyone,
>
> I have a very big database, around 15 million in size, and the dump
> file is around 12 GB.
>
> While importing this dump into the database I have noticed that
> initially the query response time is very slow, but it does improve
> with time.
>
> Any suggestions to improve performance after the dump is imported into
> the database will be highly appreciated!
>
> Regards,
>
> Ashish

My suggestion is:

1- After the db restore, you can run a VACUUM ANALYZE manually on your
   big tables to remove all dead rows.
2- Then you can REINDEX your big tables in any case where you use them.
3- Then apply a CLUSTER command on the right tables that have these
   indexes (a sketch of these commands appears after this message).

Regards

--
-------------------------------------
"TIP 4: Don't 'kill -9' the postmaster"

Ing. Marcos Luís Ortíz Valmaseda
PostgreSQL System DBA
Centro de Tecnologías de Almacenamiento y Análisis de Datos (CENTALAD)
Universidad de las Ciencias Informáticas

Linux User # 418229
http://www.postgresql-es.org
http://www.postgresql.org
http://www.planetpostgresql.org
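A minimal sketch of the three post-restore steps suggested above, in SQL. The table name big_table and the index name big_table_pkey are placeholders (assumptions, not taken from the thread), and the CLUSTER ... USING form assumes PostgreSQL 8.4 or later:

    -- 1. Reclaim dead rows and refresh planner statistics
    VACUUM ANALYZE big_table;

    -- 2. Rebuild every index on the table
    REINDEX TABLE big_table;

    -- 3. Physically reorder the table along one of its indexes
    --    (big_table_pkey is a placeholder index name)
    CLUSTER big_table USING big_table_pkey;

    -- CLUSTER rewrites the table, so refresh statistics afterwards
    ANALYZE big_table;

Repeat the same sequence for each large table; CLUSTER takes an exclusive lock while it rewrites the table, so it is best run during a maintenance window.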