Re: measuring most-expensive queries
From | Gourish Singbal |
---|---|
Subject | Re: measuring most-expensive queries |
Date | |
Msg-id | 674d1f8a0505012211d858cb6@mail.gmail.com |
In reply to | measuring most-expensive queries (Enrico Weigelt <weigelt@metux.de>) |
List | pgsql-admin |
You could make the following change in your conf file and reload the PostgreSQL server:

log_min_duration_statement = 10000

This will log every query that takes more than 10000 milliseconds to execute to your PostgreSQL server log file.

regards
Gourish Singbal

On 5/2/05, Enrico Weigelt <weigelt@metux.de> wrote:
>
> Hi folks,
>
> I'd like to find out which queries are most expensive (taking very
> long or producing high load) in a running system, to see what
> requires further optimization. (The application is quite large and
> several other people are involved, so I can't check everything manually.)
>
> Well, the postmaster can log every single statement, but the log
> files are really too much for a human to read.
>
> Is there any tool for that?
>
> thx
> --
> ---------------------------------------------------------------------
> Enrico Weigelt    ==   metux IT service
>  phone:     +49 36207 519931         www:       http://www.metux.de/
>  fax:       +49 36207 519932         email:     contact@metux.de
> ---------------------------------------------------------------------
>  Realtime Forex/Stock Exchange trading powered by postgresSQL :))
>    http://www.fxignal.net/
> ---------------------------------------------------------------------
>
> ---------------------------(end of broadcast)---------------------------
> TIP 2: you can get off all lists at once with the unregister command
>   (send "unregister YourEmailAddressHere" to majordomo@postgresql.org)

--
Best,
Gourish Singbal
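As a sketch of the steps above (paths are assumptions and vary by install; log_min_duration_statement is available in PostgreSQL 7.4 and later):

```shell
# In postgresql.conf (location depends on your distribution/initdb data dir):
#
#   log_min_duration_statement = 10000   # log any statement running >= 10000 ms
#
# A setting of 0 logs every statement with its duration; -1 (default) disables.
# Reload the config without restarting the server, e.g.:
pg_ctl reload -D /var/lib/pgsql/data
```

Slow statements then appear in the server log with their execution time, so you can grep for the worst offenders instead of reading every logged statement.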