Re: optimized counting of web statistics
From: Rudi Starcevic
Subject: Re: optimized counting of web statistics
Date:
Msg-id: 42C2D7B5.4070707@wildcash.com
In reply to: Re: optimized counting of web statistics (Matthew Nuzum <mattnuzum@gmail.com>)
Responses: Re: optimized counting of web statistics
List: pgsql-performance
Hi,

> I do my batch processing daily using a python script I've written. I
> found that trying to do it with pl/pgsql took more than 24 hours to
> process 24 hours worth of logs. I then used C# and in memory hash
> tables to drop the time to 2 hours, but I couldn't get mono installed
> on some of my older servers. Python proved the fastest and I can
> process 24 hours worth of logs in about 15 minutes. Common reports run
> in < 1 sec and custom reports run in < 15 seconds (usually).

When you say you do your batch processing in a Python script, do you mean
you are using 'plpython' inside PostgreSQL, or using Python to execute
select statements and crunch the data 'outside' PostgreSQL?

Your reply is very interesting.

Thanks.
Regards,
Rudi.
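For reference, a minimal sketch of the "outside PostgreSQL" variant being asked
about: aggregate raw access-log lines into an in-memory hash table (a plain
Python dict), then load the pre-counted rows into the database in one batch.
The log format, table name, columns, and connection string below are
illustrative assumptions, not the actual script or schema from the quoted post.

    from collections import defaultdict
    import psycopg2
    from psycopg2.extras import execute_values

    def summarize(log_path):
        """Count hits per (day, url) with a dict instead of row-by-row SQL."""
        counts = defaultdict(int)
        with open(log_path) as logfile:
            for line in logfile:
                parts = line.split()
                if len(parts) < 7:
                    continue  # skip malformed lines
                # Common Log Format: field 3 holds "[day/Mon/year:...", field 6 the URL
                day = parts[3].lstrip('[').split(':', 1)[0]
                url = parts[6]
                counts[(day, url)] += 1
        return counts

    def load(counts, dsn="dbname=stats"):
        """Insert the aggregated counts in a single batch."""
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            execute_values(
                cur,
                "INSERT INTO daily_hits (day, url, hits) VALUES %s",
                [(day, url, n) for (day, url), n in counts.items()],
            )

    if __name__ == "__main__":
        load(summarize("access.log"))

The point of the design is that counting happens in memory, so the database
only sees a small number of already-aggregated rows instead of one operation
per log line.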