Re: JDBC driver's (non-)handling of InputStream:s
From | Dave Cramer
---|---
Subject | Re: JDBC driver's (non-)handling of InputStream:s
Date |
Msg-id | 1080611745.550.85.camel@localhost.localdomain
In reply to | JDBC driver's (non-)handling of InputStream:s (Peter Schuller <peter.schuller@infidyne.com>)
Responses | Re: JDBC driver's (non-)handling of InputStream:s; Re: JDBC driver's (non-)handling of InputStream:s
List | pgsql-jdbc
Peter,

It would be great if you could supply a test case which exhibits this behaviour.

Dave

On Mon, 2004-03-29 at 20:10, Peter Schuller wrote:
> Hello,
>
> Tonight I rewrote a part of an application that deals with http uploads,
> because it turned out it has to handle larger files than originally intended
> - and one was getting OutOfMemory errors.
>
> So I rewrote everything so that an InputStream is passed to the JDBC driver
> and the files are never completely loaded into memory. However, I am still
> getting an OutOfMemory error for large files. While it is difficult to
> pinpoint exactly where due to the lack of a stack trace, it does look like
> the driver is causing it.
>
> Does the JDBC driver handle InputStreams intelligently at all? If so, does it
> do so under all circumstances? In this case I am putting data into a column
> of type 'bytea' and am using PreparedStatement.setBinaryStream().
>
> The backend is PostgreSQL 7.4.1, and I am using the driver for 7.4.1
> (pg74.1jdbc3.jar), running under JDK 1.4.2.
>
> Do I need to use some other type in the database in order for input streams to
> be handled properly? Do I have to use some PostgreSQL-specific API? Does the
> JDBC driver need to be changed to support this?
>
> I can always fall back to using files on the filesystem, but then I will lose
> all the niceties that come with ACID transactions, which I automatically get
> if I keep it all in the database.
>
> Thanks!

--
Dave Cramer
519 939 0336
ICQ # 14675561
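For reference, a minimal standalone test case of the kind Dave asks for might look like the sketch below. It is only a sketch against the JDK 1.4 / JDBC 3 API mentioned in the thread: the table name `images`, the connection URL, and the credentials are assumptions, and the table is assumed to already exist as `CREATE TABLE images (id int PRIMARY KEY, data bytea)`.

```java
import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class ByteaStreamTest {
    public static void main(String[] args) throws Exception {
        // JDBC URL and credentials are placeholders; adjust for your setup.
        Class.forName("org.postgresql.Driver");
        Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost/test", "test", "test");
        conn.setAutoCommit(false);

        // Assumes: CREATE TABLE images (id int PRIMARY KEY, data bytea);
        File file = new File(args[0]); // point this at a large file
        FileInputStream in = new FileInputStream(file);
        try {
            PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO images (id, data) VALUES (?, ?)");
            ps.setInt(1, 1);
            // Hand the driver the stream and its length (the JDBC 3
            // overload takes an int, so this caps out below 2 GB). If
            // the driver buffers the whole stream in memory instead of
            // streaming it to the backend, a file larger than the heap
            // triggers OutOfMemoryError here or at executeUpdate().
            ps.setBinaryStream(2, in, (int) file.length());
            ps.executeUpdate();
            ps.close();
            conn.commit();
        } finally {
            in.close();
            conn.close();
        }
    }
}
```

Running this against a file larger than the JVM heap, e.g. `java -Xmx64m ByteaStreamTest big.bin`, should reproduce the reported OutOfMemory error if the driver materializes the stream in memory rather than streaming it.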