Re: Inefficient queryplan for query with
From | Ron |
---|---|
Subject | Re: Inefficient queryplan for query with |
Date | |
Msg-id | 6.2.3.4.0.20050827104222.05e36150@pop.earthlink.net |
In reply to | Re: Inefficient queryplan for query with intersectable (Tom Lane <tgl@sss.pgh.pa.us>) |
List | pgsql-performance |
At 10:27 AM 8/27/2005, Tom Lane wrote:

>Arjen van der Meijden <acmmailing@tweakers.net> writes:
>> But apparently there is a bug in the explain mechanism of the 8.1devel
>> I'm using (I downloaded a nightly 25 august somewhere in the morning
>> (CEST)), since it returned:
>> ERROR: bogus varno: 9
>
>Yeah, someone else sent in a test case for this failure (or at least one
>with a similar symptom) yesterday. I'll try to fix it today.
>
>> Is a nested loop normally so much (3x) more costly than a hash join? Or
>> is it just this query that gets estimated wrongly?
>
>There's been some discussion that we are overestimating the cost of
>nestloops in general, because we don't take into account that successive
>scans of the inner relation are likely to find many pages already in
>cache from the earlier scans. So far no one's come up with a good cost
>model to use for this, though.
>
> regards, tom lane

In the EXPLAIN ANALYZE output I see, it certainly seems common that the (estimated) cost of a Nested Loop is far higher than the actual time measured.

What happened when someone tried the naive approach of telling the planner to estimate the cost of a nested loop by assuming that whatever entities are involved in the nested loop fit in RAM as much as possible? When there are multiple such mappings, use whichever one results in the lowest cost for the NL in question.

Clearly this should underestimate the constant factor of the operation involved, but since nested loops have the only polynomial growth function among the planner's choices, NLs should still have a decent chance of being more expensive than the other choices under most circumstances.

In addition, if those costs are based on actual measurements of how long it takes to do such scans, then the estimated cost has a decent chance of being fairly accurate under such circumstances.

It might not work well, but it seems like a reasonable first attempt at a solution?

Ron Peacetree
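[Editorial sketch, for illustration only.] To make the idea above concrete, here is a minimal, hypothetical sketch in plain Python (not PostgreSQL planner code) of a nested-loop cost estimate that charges full I/O only for the first scan of an inner relation small enough to stay in cache, and a much cheaper cost for every rescan. All function names, constants, and the cache-size parameter are illustrative assumptions, not actual planner parameters.

```python
# Hypothetical sketch, not PostgreSQL source: compares a textbook nested-loop
# cost estimate against a "fits in RAM" variant in which repeated scans of the
# inner relation are assumed to hit cache after the first pass.

def nestloop_cost_classic(outer_rows, inner_pages, inner_rows,
                          seq_page_cost=1.0, cpu_tuple_cost=0.01):
    """Charge a full physical scan of the inner relation per outer row."""
    io = outer_rows * inner_pages * seq_page_cost
    cpu = outer_rows * inner_rows * cpu_tuple_cost
    return io + cpu

def nestloop_cost_cached(outer_rows, inner_pages, inner_rows,
                         cache_pages,
                         seq_page_cost=1.0, cached_page_cost=0.01,
                         cpu_tuple_cost=0.01):
    """If the inner relation fits in cache, pay full I/O once and a cheap
    'cached page' cost for every rescan (the deliberate underestimate of the
    constant factor); otherwise fall back to the classic estimate."""
    if inner_pages <= cache_pages:
        first_scan = inner_pages * seq_page_cost
        rescans = max(outer_rows - 1, 0) * inner_pages * cached_page_cost
        cpu = outer_rows * inner_rows * cpu_tuple_cost
        return first_scan + rescans + cpu
    return nestloop_cost_classic(outer_rows, inner_pages, inner_rows,
                                 seq_page_cost, cpu_tuple_cost)

if __name__ == "__main__":
    # 10,000 outer rows against a 100-page / 5,000-row inner relation that
    # fits comfortably in a hypothetical 10,000-page cache.
    args = (10_000, 100, 5_000)
    print("classic:", nestloop_cost_classic(*args))
    print("cached :", nestloop_cost_cached(*args, cache_pages=10_000))
```

Note that the per-tuple CPU term still grows with outer_rows × inner_rows, which is the polynomial growth mentioned above, so even with the discounted I/O a nested loop can still come out more expensive than a hash join on large inputs.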