Re: process large tables
From | Nelson Arapé |
---|---|
Subject | Re: process large tables |
Date | |
Msg-id | 200504141650.21673.narape@ica.luz.ve |
In reply to | process large tables (Kristina Magwood <kristina@nanometrics.ca>) |
List | pgsql-jdbc |
From the documentation (http://jdbc.postgresql.org/documentation/80/query.html#query-with-cursor):

"By default the driver collects all the results for the query at once. This can be inconvenient for large data sets so the JDBC driver provides a means of basing a ResultSet on a database cursor and only fetching a small number of rows. ..."

With a cursor you fetch rows in pieces. It is well explained in the documentation.

Bye,
Nelson Arapé

On Thu 14 Apr 2005 16:37, Kristina Magwood wrote:
> Hi,
> I am trying to process a large table. Unfortunately, using select * from
> table gives me a ResultSet that is too large.
> The Java process runs out of memory even if I boost the VM memory.
> Is there any way I can programmatically (in Java) retrieve, say, 10,000
> records at a time without knowing anything specific about the table? Then,
> when I am done with those records, retrieve the next 10,000, etc?
>
> Thank you in advance for any help you can spare.
> Kristina
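As a minimal sketch of the cursor-based approach the documentation describes (the connection URL, credentials, and table name below are placeholders, not anything from the original question):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class CursorFetchExample {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; adjust URL, user, and password.
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/mydb", "user", "password");

        // Cursors are only used when autocommit is off.
        conn.setAutoCommit(false);

        // With a fetch size set, the driver reads rows from a database
        // cursor in batches instead of loading the whole result at once.
        Statement stmt = conn.createStatement();
        stmt.setFetchSize(10000);

        ResultSet rs = stmt.executeQuery("SELECT * FROM mytable");
        while (rs.next()) {
            // Process one row at a time; only about 10,000 rows
            // are held in memory at any moment.
        }

        rs.close();
        stmt.close();
        conn.close();
    }
}
```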