Re: Working with very large datasets
From        | Stephan Szabo
Subject     | Re: Working with very large datasets
Date        |
Msg-id      | 20030211160155.O19527-100000@megazone23.bigpanda.com
In reply to | Working with very large datasets (Wilkinson Charlie E <Charlie.E.Wilkinson@irs.gov>)
List        | pgsql-sql
On Tue, 11 Feb 2003, Wilkinson Charlie E wrote:

> Greetings,
> Can anyone enlighten me or point me at resources concerning use of pgsql
> with very large datasets?
>
> My specific problem is this:
>
> I have two tables, one with about 100 million rows and one with about
> 22,000 rows. My plan was to inner join the two tables on an integer key
> and output the 4 significant columns, excluding the keys. (Those with a
> better understanding of pgsql internals, feel free to laugh.) The result
> was a big angry psql that grew to 800+MB before I had to kill it.

Was it psql that grew to 800MB or a backend?  If the former, how many rows
do you expect that to return?  You probably want to look into using
cursors rather than returning the entire result set at once.
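Something along these lines, as a rough sketch (the table and column names
here are made up for illustration -- substitute your own):

    BEGIN;
    -- Declare a cursor over the join instead of materializing the whole
    -- result set in the client.
    DECLARE big_join CURSOR FOR
        SELECT b.col1, b.col2, s.col3, s.col4
          FROM big_table b
          JOIN small_table s ON s.key_id = b.key_id;
    -- Pull rows in manageable batches; repeat the FETCH until it returns
    -- no more rows.
    FETCH 1000 FROM big_join;
    CLOSE big_join;
    COMMIT;

That way psql only ever holds one batch of rows in memory at a time instead
of the full 100-million-row join.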