Re: BUG #2167: Performance degradation
| From | Qingqing Zhou |
| --- | --- |
| Subject | Re: BUG #2167: Performance degradation |
| Date | |
| Msg-id | dq8scb$2egu$1@news.hub.org |
| In reply to | BUG #2167: Performance degradation ("Sunil Basu" <sunil.basu@esspl.com>) |
| List | pgsql-bugs |
""Sunil Basu"" <sunil.basu@esspl.com> wrote > > Previously I used to insert records into the postgresql database > unconditionally. That is everytime a data comes it is stored in the > postgresql. So I land up with some redundant data always. > But the operation was smooth and near about 600 records could be inserted > per second. > > Now I have made a check in the postgresql database that whether a record > exists depending on criteria which is set as per the index order defined > for > my postgre table. I used a sql "Select 1 from ... where ..." statement for > checking in the postgresql. > Depending on the record count from the select query, I decide whether to > insert or to update. > Now I have noticed a considerable degradation in performance. Now near > about > 60-75 records can be updated/inserted per second. > A performance degradation is expected because your new program do two more things: query the index and maintain the index. But I am not sure how much. Your method will not work if you do the insertition concurrently. There are alternatives to prevent duplicates: (1) build a unique index on the attributes and let PostgreSQL to prevent duplicates; (2) do it later in a batch since your PostgreSQL is a backup - after you insert a lot of data with some duplates, do a "INSERT INTO another_table SELECT DISTINCT FROM this_table ...". Regards, Qingqing