Re: Slow query with a lot of data
From | Moritz Onken
---|---
Subject | Re: Slow query with a lot of data
Date |
Msg-id | 24E3FBFF-E0B8-435E-ACE1-652E93FE2213@houseofdesign.de
In response to | Re: Slow query with a lot of data (Matthew Wakeling <matthew@flymine.org>)
Responses | Re: Slow query with a lot of data
List | pgsql-performance
> As far as I can tell, it should. If it is clustered on an index on
> domain, and then analysed, it should no longer have to sort on domain.
>
> Could you post here the results of running:
>
> select * from pg_stats where attname = 'domain';

 schemaname | tablename | attname | null_frac | avg_width | n_distinct | most_common_vals | most_common_freqs | histogram_bounds | correlation
 public | result | domain | 0 | 4 | 1642 | {3491378,3213829,3316634,3013831,3062500,3242775,3290846,3171997,3412018,3454092} | {0.352333,0.021,0.01,0.00766667,0.00566667,0.00533333,0.00533333,0.005,0.00266667,0.00266667} | {3001780,3031753,3075043,3129688,3176566,3230067,3286784,3341445,3386233,3444374,3491203} | 1

No idea what that means :)

> Sounds like an awfully long time to me. Also, I think restricting it
> to 280 users is probably not making it any faster.

If I hadn't restricted it to 280 users, it would have run for ~350 days...

Thanks for your help!

moritz
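For context on the pg_stats row above: the `correlation` of 1 for the `domain` column means the rows on disk are stored in exactly the same order as an index on `domain`, which is what CLUSTER followed by ANALYZE is expected to produce. A minimal sketch of that sequence (the index name `idx_result_domain` is a hypothetical placeholder, not taken from the thread):

```sql
-- Physically reorder the "result" table along a hypothetical index on domain
-- (older PostgreSQL releases use the form: CLUSTER idx_result_domain ON result;)
CLUSTER result USING idx_result_domain;

-- Refresh the planner statistics so pg_stats reflects the new ordering
ANALYZE result;

-- A correlation of 1 here indicates the heap is fully in index order,
-- so a plan ordered by domain should not need an explicit sort step
SELECT correlation
  FROM pg_stats
 WHERE tablename = 'result'
   AND attname  = 'domain';
```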