Large object insert performance.
From | Peter Haight |
---|---|
Subject | Large object insert performance. |
Date | |
Msg-id | 200008232117.OAA34910@wartch.sapros.com |
Replies | Re: Large object insert performance. |
List | pgsql-general |
I'm populating a new database from some text files. I'm using large objects to store the body of the text files. I have a little thing set up to monitor how fast the inserts are going. They started out at about 20/sec and have been slowly dropping. I'm about 6% through my data and I'm already down to 2/sec and still dropping. All I'm doing is inserting the large objects. No other action is happening.

Here's the portion of the script that is populating my database:

    self.db.query('begin')
    # create a new large object and write the message body into it
    body_lo = self.db.locreate(pg.INV_READ | pg.INV_WRITE)
    body_lo.open(pg.INV_WRITE)
    body_lo.write(puff.get('message/body', ''))
    # remember the OID so the body can be looked up later
    body_oid = body_lo.oid
    body_lo.close()
    self.db.query('end')

That is the full extent of my queries to the database. There are no tables or indexes defined. The average size of a body is about 300 bytes, but it goes as high as 30k.

Is there any way to speed this up? If the handling of large objects is this bad, I think I might just store these guys on the file system.
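For reference, here is a minimal sketch of what the filesystem alternative might look like, assuming the same 'puff' object from the script above; the spool directory and the content-hash naming scheme are placeholders for illustration, not part of the actual script:

    import hashlib
    import os

    SPOOL_DIR = '/var/spool/bodies'   # hypothetical storage location

    def store_body(puff):
        # Pull the body exactly as the insert loop does.
        body = puff.get('message/body', '')
        data = body if isinstance(body, bytes) else body.encode('utf-8')
        # Name the file by a hash of its contents so names stay unique.
        name = hashlib.sha1(data).hexdigest()
        path = os.path.join(SPOOL_DIR, name)
        os.makedirs(SPOOL_DIR, exist_ok=True)
        with open(path, 'wb') as f:
            f.write(data)
        # The path (or hash) would go in a table column instead of an OID.
        return path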