Re: n_distinct off by a factor of 1000
From: Michael Lewis
Subject: Re: n_distinct off by a factor of 1000
Date:
Msg-id: CAHOFxGqo9biZf0knrZyhjFxFye9fS1Ea-BaU2rwcG3Mm8j-Bwg@mail.gmail.com
In reply to: Re: n_distinct off by a factor of 1000 (Pavel Luzanov <p.luzanov@postgrespro.ru>)
Responses: Re: n_distinct off by a factor of 1000
List: pgsql-general
On Thu, Jun 25, 2020 at 7:27 AM Pavel Luzanov <p.luzanov@postgrespro.ru> wrote:
> > I have tried to increase the statistics target to 5000, and it helps, but it reduces the error to 100X. Still crazy high.
>
> As far as I know, increasing default_statistics_target will not help. [1]
>
> > I have considered these fixes:
> > - hardcode the statistics to a particular ratio of the total number of rows
>
> You can hardcode the percentage of distinct values:
>
> ALTER TABLE bigtable ALTER COLUMN instrument_ref SET ( n_distinct = -0.06 ); /* -1 * (33385922 / 500000000) */
>
> [1] https://www.postgresql.org/message-id/4136ffa0812111823u645b6ec9wdca60b3da4b00499%40mail.gmail.com
Thanks for sharing. Very interesting read. If anyone has a reference to the papers alluded to, that would be appreciated. I had forgotten about the option to set negative values.
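
For reference, here is a minimal sketch of how that negative-value override would be applied and verified. The bigtable/instrument_ref names and the -0.06 ratio are the ones from the quoted example above, so treat them as placeholders for your own table. A negative n_distinct is interpreted as a fraction of the row count (so the estimate scales as the table grows), and the override only takes effect at the next ANALYZE:

    -- Store n_distinct as a fraction of the row count; -0.06 is roughly
    -- 33385922 / 500000000 from the figures quoted upthread.
    ALTER TABLE bigtable
        ALTER COLUMN instrument_ref
        SET (n_distinct = -0.06);

    -- The override is only picked up when statistics are next gathered.
    ANALYZE bigtable;

    -- Check what the planner now sees for the column.
    SELECT attname, n_distinct
    FROM pg_stats
    WHERE tablename = 'bigtable'
      AND attname = 'instrument_ref';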