On Mon, Mar 28, 2005 at 11:32:04AM -0600, Yudie Gunawan wrote:
> I have table with more than 4 millions records and when I do select
> query it gives me "out of memory" error.
What's the query, and how are you issuing it? Where are you seeing
the error? This could be a client problem: the client might be
trying to fetch the entire result set into memory before doing
anything with it, thereby exhausting available memory. If that's
the case then a cursor might be useful.
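
If that turns out to be the cause, a server-side cursor lets the
client fetch the result set in batches instead of all at once. A
minimal sketch in psql, assuming a table named bigtable (a
placeholder; substitute your own table and query):

```sql
BEGIN;                            -- cursors must run inside a transaction
DECLARE big_cur CURSOR FOR
    SELECT * FROM bigtable;       -- bigtable is a placeholder name
FETCH 1000 FROM big_cur;          -- repeat until FETCH returns no rows
CLOSE big_cur;
COMMIT;
```

Many client libraries wrap this for you (e.g., named/server-side
cursors), so check your driver's documentation before hand-rolling
the FETCH loop.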
> Does postgres has feature like table partition to handle table with
> very large records.
Let's identify the problem before guessing how to fix it.
--
Michael Fuhr
http://www.fuhr.org/~mfuhr/