Performance issues when the number of records is around 10 million

From: venu madhav
Subject: Performance issues when the number of records is around 10 million
Date:
Msg-id: AANLkTinFR92eAjVJcukjHV8Nrb4CyjKRwayUTLYTcjMv@mail.gmail.com
Responses: Re: Performance issues when the number of records is around 10 million (Guillaume Lelarge <guillaume@lelarge.info>)
Lists: pgadmin-support
Hi all,

In my database application, I have a table whose record count can reach 10 million, and insertions can arrive at rates of up to 100 per second at peak times. I have configured Postgres to run autovacuum hourly. A frontend GUI application written in CGI displays the data from the database. When I try to get the last twenty records from the database, the operation takes around 10-15 minutes to complete. This is the query that is used:

    SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
           e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
           e.wifi_addr_2, e.view_status, bssid
    FROM event e, signature s
    WHERE s.sig_id = e.signature
      AND e.timestamp >= '1270449180'
      AND e.timestamp < '1273473180'
    ORDER BY e.cid DESC, e.cid DESC
    LIMIT 21 OFFSET 10539780;

Can anyone suggest a better way to improve the performance?

Please let me know if you have any further questions.

Thank you,
Venu
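The dominant cost in the query above is almost certainly the OFFSET clause: to return rows 10,539,781 through 10,539,801, PostgreSQL has to fetch and throw away the 10,539,780 rows that precede them, on every request. A common alternative is keyset pagination: remember the last cid returned and filter on it, so each page starts where the previous one ended. A minimal sketch, assuming cid is an indexed, monotonically increasing primary key of event (the index name and the remembered-cid value below are hypothetical):

    -- Hypothetical index to support the timestamp range filter.
    CREATE INDEX event_timestamp_idx ON event (timestamp);

    -- First page: the 20 newest rows in the time window.
    SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
           e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
           e.wifi_addr_2, e.view_status, bssid
    FROM event e
    JOIN signature s ON s.sig_id = e.signature
    WHERE e.timestamp >= '1270449180'
      AND e.timestamp <  '1273473180'
    ORDER BY e.cid DESC
    LIMIT 20;

    -- Later pages: pass the smallest cid from the previous page
    -- instead of an OFFSET, so no rows are fetched and discarded.
    -- 10539760 stands in for that remembered value.
    SELECT e.cid, e.timestamp, s.sig_class, s.sig_priority, s.sig_name,
           e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
           e.wifi_addr_2, e.view_status, bssid
    FROM event e
    JOIN signature s ON s.sig_id = e.signature
    WHERE e.timestamp >= '1270449180'
      AND e.timestamp <  '1273473180'
      AND e.cid < 10539760
    ORDER BY e.cid DESC
    LIMIT 20;

With this shape, each page can be served by an index scan that stops after 20 matching rows, so the cost no longer grows with the page number; comparing EXPLAIN ANALYZE output for both forms should make the difference visible.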
