Re: random_page_cost = 2.0 on Heroku Postgres

From: Joshua Berkus
Subject: Re: random_page_cost = 2.0 on Heroku Postgres
Date:
Msg-id: 873667728.6754.1329076919645.JavaMail.root@mail-1.01.com
In reply to: Re: random_page_cost = 2.0 on Heroku Postgres  (Jeff Janes <jeff.janes@gmail.com>)
Responses: Re: random_page_cost = 2.0 on Heroku Postgres  (Peter van Hardenberg <pvh@pvh.ca>)
List: pgsql-performance
> Is there an easy and unintrusive way to get such a metric as the
> aggregated query times?  And to normalize it for how much work
> happens to have been going on at the time?

You'd pretty much need to do large-scale log harvesting combined with samples of query concurrency taken several times per minute.  Even that won't "normalize" things the way you want, though, since all queries are not equal in terms of the amount of data they hit.

Given that, I'd personally take a statistical approach.  Sample query execution times across a large population of servers and over a moderate amount of time.  Then apply common tests of statistical significance.  This is why Heroku has the opportunity to do this in a way that smaller sites could not; they have enough servers to (probably) cancel out any random activity effects.
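The approach above can be sketched with a minimal, stdlib-only example. This is not from the thread: the latency numbers are hypothetical, and the choice of a permutation test on the difference of means is just one of the "common tests of statistical significance" one might apply to samples gathered from two groups of servers running different random_page_cost settings.

```python
import random
import statistics

def permutation_test(sample_a, sample_b, n_resamples=10_000, seed=42):
    """Two-sided permutation test on the difference of means.

    Returns the estimated p-value: the fraction of random label
    shufflings that produce a mean difference at least as large
    as the one actually observed.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    extreme = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:]))
        if diff >= observed:
            extreme += 1
    return extreme / n_resamples

# Hypothetical per-query latencies (ms) harvested from logs on two
# groups of servers with different random_page_cost settings.
times_rpc4 = [12.1, 15.3, 11.8, 14.9, 13.2, 16.0, 12.7, 14.4]
times_rpc2 = [10.2, 11.9, 10.8, 12.3, 11.1, 13.0, 10.5, 11.7]

p = permutation_test(times_rpc4, times_rpc2)
print(f"estimated p-value: {p:.4f}")
```

A permutation test is attractive here because it makes no assumption about the shape of the latency distribution, which for query times is typically heavy-tailed rather than normal.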

-- Josh Berkus

In the pgsql-performance list, by date sent:

Previous message
From: Jeff Janes
Date:
Subject: Re: random_page_cost = 2.0 on Heroku Postgres
Next message
From: Peter van Hardenberg
Date:
Subject: Re: random_page_cost = 2.0 on Heroku Postgres