Re: DBT-3 with SF=20 got failed
| From | Tom Lane |
|---|---|
| Subject | Re: DBT-3 with SF=20 got failed |
| Msg-id | 12848.1443114295@sss.pgh.pa.us |
| In reply to | Re: DBT-3 with SF=20 got failed (Tomas Vondra <tomas.vondra@2ndquadrant.com>) |
| Responses | Re: DBT-3 with SF=20 got failed |
| List | pgsql-hackers |
Tomas Vondra <tomas.vondra@2ndquadrant.com> writes:
> But what about computing the expected number of batches, but always
> starting execution assuming no batching? And only if we actually fill
> work_mem, we start batching and use the expected number of batches?
Hmm. You would likely be doing the initial data load with a "too small"
numbuckets for single-batch behavior, but if you successfully loaded all
the data then you could resize the table at little penalty. So yeah,
that sounds like a promising approach for cases where the initial rowcount
estimate is far above reality.
But I kinda thought we did this already, actually.
regards, tom lane