Processing database query-results piecemeal
From | Stephen R. van den Berg |
---|---|
Subject | Processing database query-results piecemeal |
Date | |
Msg-id | 20080630111742.GA19746@cuci.nl |
Responses | Re: Processing database query-results piecemeal |
List | pgsql-hackers |
I'm looking at the most efficient and lean way to interface with the DB in a least-overhead scenario to process large(r) amounts of binary data. For simplicity, I want to avoid using the Large-Object facility.

It seems that the most efficient way to communicate with the DB would be through PQexecParams(), which avoids the whole bytea-encoding issue. However, two questions spring to mind:

- The docs say that you can use $1, $2, etc. to reference parameters. What happens if you have more than 9 parameters? Does it become $10 or ${10} or $(10), or is it simply not possible to reference more than nine parameters this way?
- Say the SELECT returns 1000 rows of 100MB each: is there a way to keep PQexecParams() from wanting to allocate 1000 * 100MB = 100GB at once, and somehow extract the rows in smaller chunks? (Incidentally, MySQL has such a facility.) I.e. we call libpq several times and get a few rows at a time, which are read from the DB stream as needed.

--
Sincerely,
   Stephen R. van den Berg.
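Below is a minimal sketch (not part of the original message) of the kind of access pattern being discussed: it passes raw bytes through PQexecParams() as a binary parameter so no bytea escaping is needed, and reads a large result back a few rows per round trip with an explicit cursor instead of materializing everything in one PQexecParams() result. The connection string, the big_table/payload names and the batch size of 10 are placeholder assumptions.

```c
/*
 * Hypothetical sketch, not from the original mail.  Table "big_table",
 * column "payload", the conninfo string and the FETCH batch size are
 * placeholders chosen for illustration.
 */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=test");          /* placeholder conninfo */

    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* Pass raw bytes as a binary parameter ($1); no bytea escaping needed. */
    const char blob[] = {0x00, 0x01, 0x02, 0x03};
    const char *values[1] = { blob };
    int lengths[1] = { (int) sizeof(blob) };
    int formats[1] = { 1 };                              /* 1 = binary */

    PGresult *res = PQexecParams(conn,
        "INSERT INTO big_table(payload) VALUES ($1::bytea)",
        1, NULL, values, lengths, formats, 0);
    if (PQresultStatus(res) != PGRES_COMMAND_OK)
        fprintf(stderr, "insert failed: %s", PQerrorMessage(conn));
    PQclear(res);

    /* Read the data back a few rows per round trip through a cursor, so
     * the client never holds the whole result set in memory at once. */
    PQclear(PQexec(conn, "BEGIN"));
    PQclear(PQexec(conn,
        "DECLARE blobs NO SCROLL CURSOR FOR SELECT payload FROM big_table"));

    for (;;)
    {
        /* resultFormat = 1 requests binary output, i.e. raw bytea bytes. */
        res = PQexecParams(conn, "FETCH 10 FROM blobs",
                           0, NULL, NULL, NULL, NULL, 1);
        if (PQresultStatus(res) != PGRES_TUPLES_OK || PQntuples(res) == 0)
        {
            PQclear(res);
            break;
        }
        for (int i = 0; i < PQntuples(res); i++)
            printf("row payload: %d bytes\n", PQgetlength(res, i, 0));
        PQclear(res);
    }

    PQclear(PQexec(conn, "CLOSE blobs"));
    PQclear(PQexec(conn, "COMMIT"));
    PQfinish(conn);
    return 0;
}
```

For completeness: later libpq versions (PostgreSQL 9.2 and up) added PQsetSingleRowMode(), which lets the client retrieve a large result one row at a time without a cursor; at the time of this message, a cursor plus FETCH was the usual way to read a big result set in chunks.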