Re: Data Warehousing
From | Scott Marlowe
---|---
Subject | Re: Data Warehousing
Date |
Msg-id | dcc563d10709030052w16d02d8esce2be0adfd129388@mail.gmail.com
In reply to | Data Warehousing ("Rob Kirkbride" <rob.kirkbride@gmail.com>)
Responses | Re: Data Warehousing
List | pgsql-general
On 9/3/07, Rob Kirkbride <rob.kirkbride@gmail.com> wrote:
> Hi,
>
> I've got a postgres database collecting logged data. This data I have to keep
> for at least 3 years. The data in the first instance is being recorded in a
> postgres cluster. It then needs to be moved to a reports database server for
> analysis. Therefore I'd like a job to dump data on the cluster, say every
> hour, and record it in the reports database. The clustered database
> could then be purged of, say, data more than a week old.
>
> So basically I need a dump/restore that only appends new data to the reports
> server database.
>
> I've googled but can't find anything, can anyone help?

You might find an answer in partitioning your data. There's a section in the docs on it. Then you can just dump the data from the newest couple of partitions if you're partitioning by week, and purge anything older with a simple delete where date < now() - interval '1 week' or something like that.
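For what the partitioning setup might look like, here is a minimal sketch using inheritance-based partitioning (the mechanism the docs describe for PostgreSQL of this era). All table and column names here are hypothetical, not taken from the original poster's schema:

```sql
-- Parent table; all queries go against this one.
CREATE TABLE logged_data (
    logged_at   timestamptz NOT NULL,
    payload     text
);

-- One child table per week; the CHECK constraint lets the planner
-- skip irrelevant partitions (constraint exclusion).
CREATE TABLE logged_data_2007w36 (
    CHECK (logged_at >= '2007-09-03' AND logged_at < '2007-09-10')
) INHERITS (logged_data);

-- Purge on the collecting cluster: drop or delete old partitions.
-- Dropping a whole child table is much cheaper than a bulk DELETE:
DROP TABLE logged_data_2007w35;

-- Or, without partitions, the simple delete mentioned above:
DELETE FROM logged_data WHERE logged_at < now() - interval '1 week';
```

The hourly transfer to the reports server could then be a `pg_dump --table` of the current week's child table (or a `COPY` of rows newer than the last transferred timestamp), restored with `psql` on the reports side.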