Re: poor performance when recreating constraints on large tables
From | Samuel Gendler
---|---
Subject | Re: poor performance when recreating constraints on large tables
Date | 
Msg-id | BANLkTim5GO2k0m7E2=KevdZrMPwH-9aCDg@mail.gmail.com
In reply to | Re: poor performance when recreating constraints on large tables ("Kevin Grittner" <Kevin.Grittner@wicourts.gov>)
Responses | Re: poor performance when recreating constraints on large tables
List | pgsql-performance
On Wed, Jun 8, 2011 at 12:53 PM, Kevin Grittner <Kevin.Grittner@wicourts.gov> wrote:
> Samuel Gendler <sgendler@ideasculptor.com> wrote:
>> The planner knows how many rows are expected for each step of the
>> query plan, so it would be theoretically possible to compute how
>> far along it is in processing a query based on those estimates,
>> wouldn't it?
>
> And it is sometimes off by orders of magnitude. How much remaining
> time do you report when the number of rows actually processed so far
> is five times the estimated rows that the step would process? How
> about after it chugs on from there to 20 times the estimated row
> count? Of course, on your next query it might finish after
> processing only 5% of the estimated rows....
>
> -Kevin
Sure, but if a query is slow enough for a time estimate to be useful, odds are good that statistics that far out of whack would themselves be interesting to whoever is watching the estimate, so showing some kind of 'N/A' response once things have gotten out of whack wouldn't be unwarranted. Not that I'm suggesting any of this is a particularly useful exercise; I'm just playing with the original thought-experiment suggestion.
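To make the thought experiment concrete, here is a minimal sketch of the heuristic being discussed, including the 'N/A' fallback once actual row counts blow past the estimate. This is plain Python invented purely for illustration; the names PlanStep and estimate_progress are hypothetical, and PostgreSQL does not actually expose per-step progress this way.

```python
# Hypothetical sketch of the progress heuristic from the thread above.
# Not a PostgreSQL API; names and structure are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlanStep:
    estimated_rows: int   # planner's row estimate for this plan step
    processed_rows: int   # rows actually processed so far

def estimate_progress(step: PlanStep) -> Optional[float]:
    """Return a 0..1 progress fraction, or None ('N/A') once the
    actual row count has exceeded the planner's estimate."""
    if step.processed_rows > step.estimated_rows:
        # The estimate is already proven wrong; any remaining-time
        # figure would be a guess, so report N/A instead.
        return None
    if step.estimated_rows == 0:
        return 1.0
    return step.processed_rows / step.estimated_rows

# Estimate said 1,000 rows but 5,000 were processed -- the "five times
# the estimated rows" case Kevin describes -- so this yields None (N/A).
print(estimate_progress(PlanStep(estimated_rows=1000, processed_rows=5000)))  # None
print(estimate_progress(PlanStep(estimated_rows=1000, processed_rows=250)))   # 0.25
```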