JDBC Blob API bug?
From:    David Wall
Subject: JDBC Blob API bug?
Date:
Msg-id:  002801c24f98$a217fea0$3201a8c0@expertrade.com
List:    pgsql-jdbc
It's hard to fault the PG JDBC library for this, but it does appear to be a problem with the java.sql.Blob API (or at least it's not documented well). I'm running 7.2.2.

If you retrieve a Blob and then use Blob.getBytes(1, (int)blob.length()) to suck the entire blob into a byte array, there is no mechanism to "close" the Blob. So, with PG JDBC, the routine does a seek and read against the LargeObject, but there's no way to close it, so the stream stays open. This results in strange errors on subsequent calls (like the SQLException "No results were returned by the query.").

The only workaround I've seen is to use Blob.getBinaryStream(), suck in the data, then close the stream, which closes the underlying LargeObject.

Here's a utility routine I use for converting a Blob into a byte[] when doing a SELECT:

public byte[] blobToBytes(java.sql.Blob b)
{
    java.io.InputStream is = null;
    try
    {
        is = b.getBinaryStream();
        byte[] bytes = new byte[(int)b.length()];
        // InputStream.read() may return fewer bytes than requested,
        // so keep reading until the array is full.
        int off = 0;
        while ( off < bytes.length )
        {
            int n = is.read(bytes, off, bytes.length - off);
            if ( n < 0 )
                break;  // premature end of stream
            off += n;
        }
        return bytes;
    }
    catch( java.sql.SQLException e )
    {
        return null;
    }
    catch( java.io.IOException e )
    {
        return null;
    }
    finally
    {
        // Closing the stream is what closes the underlying LargeObject.
        try { if ( is != null ) is.close(); } catch( Exception e ) {}
    }
}

David
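P.S. For illustration, here's a rough sketch of how such a helper might be called from a SELECT; the fetchImage wrapper, the images table, and its data column are just made-up placeholders. Note that PostgreSQL large objects have to be accessed inside a transaction, so autocommit is turned off first.

// Hypothetical caller: reads one blob column via the helper above.
public byte[] fetchImage(java.sql.Connection conn, int id)
    throws java.sql.SQLException
{
    conn.setAutoCommit(false);  // large objects require a transaction
    java.sql.Statement st = conn.createStatement();
    try
    {
        java.sql.ResultSet rs =
            st.executeQuery("SELECT data FROM images WHERE id = " + id);
        byte[] bytes = null;
        if ( rs.next() )
            bytes = blobToBytes(rs.getBlob(1));  // helper closes the LargeObject
        rs.close();
        conn.commit();
        return bytes;
    }
    finally
    {
        st.close();
    }
}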