Re: [GENERAL] BIG Data and Perl
From | Charles Tassell |
---|---|
Subject | Re: [GENERAL] BIG Data and Perl |
Date | |
Msg-id | 4.1.19991018002646.00992850@mailer.isn.net |
In reply to | Re: [GENERAL] BIG Data and Perl (Lincoln Yeoh <lylyeoh@mecomb.com>) |
List | pgsql-general |
This is slightly unrelated (well, maybe more than slightly), but what is the advantage of using cursors over normal SELECT statements? I know from experience that just using an execute("SELECT...") and fetchrow_array doesn't go wild with memory usage, as long as you remember to close your statement handles.

At 11:48 PM 10/17/99, Lincoln Yeoh wrote:
>At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>>I've got a fairly good-sized database that has around 50,000
>>records in one table.
>>
>>It starts off and processes the first 300-400 rows fast, then gets
>>slower over time and eventually just quits. It'll run for about 4-6 hours
>>before it quits.
>>
>>Any idea what may be going on here?
>
>Maybe you're running out of memory. Your perl script may be reading too
>much into memory.
>
>When using the perl DBI module, I get the impression that the perl script
>reads in all the results when you do
>$cursor->execute
>
>I don't know if there are any ways around this. It can be a bit
>inconvenient if the result is large ;).
>
>Cheerio,
>
>Link.
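For what it's worth, here is a rough sketch of how the cursor approach might look with DBI and DBD::Pg; the DSN, credentials, the table name big_table, and the cursor name bigcur are all placeholders, and the cursor has to live inside a transaction, hence AutoCommit => 0:

#!/usr/bin/perl
use strict;
use DBI;

# Placeholder DSN and credentials; substitute your own.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "user", "pass",
                       { RaiseError => 1, AutoCommit => 0 })
    or die $DBI::errstr;

# Declare a cursor instead of running the SELECT directly, so rows
# can be pulled from the backend in batches rather than all at once.
$dbh->do("DECLARE bigcur CURSOR FOR SELECT * FROM big_table");

while (1) {
    my $sth = $dbh->prepare("FETCH 100 FROM bigcur");
    $sth->execute;
    last if $sth->rows == 0;    # cursor exhausted
    while (my @row = $sth->fetchrow_array) {
        # Process one row at a time; only 100 rows are ever
        # buffered client-side.
        print join("\t", @row), "\n";
    }
    $sth->finish;
}

$dbh->do("CLOSE bigcur");
$dbh->commit;
$dbh->disconnect;

If Lincoln is right that execute() on a plain SELECT buffers the whole result set client-side before the first fetchrow_array returns, then the cursor version should keep memory flat no matter how many rows the table has, since each FETCH only brings 100 rows across.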