Re: Large Result and Memory Limit
From: Scott Marlowe
Subject: Re: Large Result and Memory Limit
Date:
Msg-id: dcc563d10710041347y4a38c943g58dea986d61917e5@mail.gmail.com
In reply to: Re: Large Result and Memory Limit (Mike Ginsburg <mginsburg@collaborativefusion.com>)
List: pgsql-general
On 10/4/07, Mike Ginsburg <mginsburg@collaborativefusion.com> wrote:
> This is for the export only. Since it is an export of ~50,000
> registrants, it takes some time to process. We also have load
> balanced web servers, so unless I want to create identical processes
> on all webservers, or write some crazy script to scp it across the
> board, storing it as a text file is not an option. I realize that my
> way of doing it is flawed, which is the reason I came here for
> advice. The CSV contains data from approximately 15 tables, several
> of which are many-to-ones, making joins a little tricky. My thought
> was to do all of the processing in the background, store the results
> in the DB, and allow the requester to download it at their
> convenience.
>
> Would it be a good idea to create a temporary table that stored all
> of the export data in it, broken out by rows and columns, and when
> download time comes, query from there?

Yeah, I tend to think that would be better. Then you could use a
cursor to retrieve the rows and serve them one line at a time, and
not have to worry about overloading your php server.
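For what it's worth, here is a minimal sketch of that cursor approach,
written in Python with psycopg2 rather than PHP (the same
server-side-cursor idea carries over); the connection string, the
export_rows table, and its columns are hypothetical stand-ins for
whatever the background job actually builds:

    import csv
    import sys
    import psycopg2

    # Hypothetical connection and schema -- the thread never shows the
    # real ones; substitute the pre-built export table from the
    # background job.
    conn = psycopg2.connect("dbname=registrants")

    # A named cursor is a server-side cursor: PostgreSQL holds the
    # result set and hands it over in batches, so the client never
    # materializes all ~50,000 rows in memory at once.
    cur = conn.cursor(name="export_cursor")
    cur.itersize = 1000  # rows fetched per round trip to the server

    cur.execute("SELECT col_a, col_b, col_c FROM export_rows ORDER BY id")

    writer = csv.writer(sys.stdout)
    for row in cur:           # iterates batch by batch behind the scenes
        writer.writerow(row)  # emit one CSV line at a time

    cur.close()
    conn.close()

The key design point is the batching: neither the database client nor
the web process ever holds the whole export in memory, which is exactly
what keeps the PHP side from blowing its memory limit.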