Re: Are 50 million rows a problem for postgres ?

From: Ron Mayer
Subject: Re: Are 50 million rows a problem for postgres ?
Date:
Msg-id: POEDIPIPKGJJLDNIEMBEAEDFDJAA.ron@intervideo.com
In response to: Are 50 million rows a problem for postgres ?  (Vasilis Ventirozos <vendi@cosmoline.com>)
List: pgsql-admin
> Hi all, I work in a telco and I have a huge amount of data (50 million rows),
> but I see a lack of performance with huge tables in postgres.
> Are 50 million rows the "limit" of postgres (with good performance)?

I have worked on a data warehouse (PostgreSQL 7.3) with a
pretty standard star schema: over 250 million rows in
the central 'fact' table, and anywhere from 100 to 10+ million
records in the surrounding 'dimension' tables.
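For readers unfamiliar with the layout, a star schema like the one described might be sketched as follows. All table and column names here are illustrative assumptions, not the actual schema from that warehouse:

```sql
-- Hypothetical star schema: one large fact table surrounded by
-- small-to-medium dimension tables it references.

CREATE TABLE dim_customer (
    customer_id  integer PRIMARY KEY,
    name         text
);

CREATE TABLE dim_date (
    date_id        integer PRIMARY KEY,
    calendar_date  date
);

-- The central fact table (the 250M+ row table in this case).
CREATE TABLE fact_calls (
    call_id        bigint PRIMARY KEY,
    customer_id    integer REFERENCES dim_customer,
    date_id        integer REFERENCES dim_date,
    duration_secs  integer
);

-- Indexes on the fact table's join keys are what keep lookups
-- by a single id from scanning the whole table.
CREATE INDEX fact_calls_customer_idx ON fact_calls (customer_id);
CREATE INDEX fact_calls_date_idx     ON fact_calls (date_id);
```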

The most common queries were simple joins between 3 tables, with
selects on one of the ids.  These took a few (1-60) seconds.
About 500,000 new records were loaded each night; the ETL
processing and building of some aggregates took about 11 hours/night
with 7.3, and 9 hours/night with 7.4beta.
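A "simple join between 3 tables with a select on one of the ids" could look like the sketch below. It reuses the hypothetical table names from the schema sketch above; the real queries and schema were not given in the original post:

```sql
-- Join the fact table to two dimensions, filtered on a single id.
-- With an index on fact_calls(customer_id), the planner can use an
-- index scan instead of reading all 250M+ fact rows.
SELECT d.calendar_date,
       c.name,
       SUM(f.duration_secs) AS total_secs
FROM   fact_calls   f
JOIN   dim_customer c ON c.customer_id = f.customer_id
JOIN   dim_date     d ON d.date_id     = f.date_id
WHERE  f.customer_id = 12345          -- the "select on one of the ids"
GROUP  BY d.calendar_date, c.name;
```

Running the query under `EXPLAIN ANALYZE` is the usual way to confirm the index is actually being used on a table this size.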

Hope this helps.

