Re: BUG #4527: Prepare of large multirow insert fails without error
From | Vincent Kessler
Subject | Re: BUG #4527: Prepare of large multirow insert fails without error
Date |
Msg-id | 491C5DDA.7090909@quantec-networks.de
In reply to | Re: BUG #4527: Prepare of large multirow insert fails without error (Tom Lane <tgl@sss.pgh.pa.us>)
List | pgsql-bugs
On 13.11.2008 16:28, Tom Lane wrote:
> "Vincent Kessler" <vincent.kessler@quantec-networks.de> writes:
>> I am trying to do large multirow inserts using PQsendPrepare. I have not
>> found a limit on the number of parameters or the size of the query string,
>> so I assume memory is the limit.
>> When executing the PQsendPrepare function with a query string of about
>> 100 kB and about 10000 parameters, the function returns after a timeout.
>> A tcpdump shows a "parse" message with a length of 100 kB, but the
>> transfer stops after roughly 30 kB.
>
> With such a large statement it's unlikely that the PQsendPrepare call
> would have been able to push all the data out immediately. Since it's
> intended to not block the application, it would return with some data
> still unsent. You need to call PQflush periodically until the data is
> all transmitted, if you want to run in nonblocking mode.

Thank you very much, that was exactly the problem. Everything works perfectly now.

Regards,
Vincent
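For anyone who lands on this thread later, here is a minimal sketch of the flush pattern Tom describes, assuming libpq headers are available and a POSIX select(); the connection string, statement name, and INSERT text are placeholders, not the original application's code:

/*
 * Sketch: queue a large Parse message with PQsendPrepare on a
 * nonblocking connection, then call PQflush repeatedly until all
 * buffered data has been sent, waiting on the socket in between.
 */
#include <stdio.h>
#include <stdlib.h>
#include <sys/select.h>
#include <libpq-fe.h>

static int flush_outgoing(PGconn *conn)
{
    int sock = PQsocket(conn);
    int rc;

    /* PQflush returns 0 when done, 1 if data remains unsent, -1 on error */
    while ((rc = PQflush(conn)) == 1)
    {
        fd_set rfds, wfds;
        FD_ZERO(&rfds);
        FD_ZERO(&wfds);
        FD_SET(sock, &rfds);
        FD_SET(sock, &wfds);

        if (select(sock + 1, &rfds, &wfds, NULL, NULL) < 0)
            return -1;

        /* absorb any server data so the connection does not deadlock */
        if (FD_ISSET(sock, &rfds) && !PQconsumeInput(conn))
            return -1;
        /* then loop and try to push more of the buffered Parse message */
    }
    return rc;                  /* 0 on success, -1 on failure */
}

int main(void)
{
    PGconn *conn = PQconnectdb("");      /* connection parameters assumed */
    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return EXIT_FAILURE;
    }
    PQsetnonblocking(conn, 1);

    /* "big_insert" and the query are placeholders for the real statement */
    const char *query = "INSERT INTO t (a, b) VALUES ($1, $2), ($3, $4)";
    if (!PQsendPrepare(conn, "big_insert", query, 4, NULL) ||
        flush_outgoing(conn) != 0)
    {
        fprintf(stderr, "prepare failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return EXIT_FAILURE;
    }

    /* drain the Prepare result(s); PQgetResult waits for the server here */
    PGresult *res;
    while ((res = PQgetResult(conn)) != NULL)
        PQclear(res);

    PQfinish(conn);
    return EXIT_SUCCESS;
}

Nonblocking mode only affects sending, which is why the loop above watches the socket for both readability and writability: if the server replies early (for example with an error), PQconsumeInput must be called so the flush can complete.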