JDBC + large objects problem
From | Patrick Goodwill |
---|---|
Subject | JDBC + large objects problem |
Date | |
Msg-id | Pine.BSF.4.21.0008251424240.1475-100000@cheese.stanford.edu |
Responses | Re: JDBC + large objects problem |
| Re: JDBC + large objects problem |
List | pgsql-interfaces |
I'm trying to use JDBC and BLOBs to store large amounts of text in a database. I get a strange error when I try to use the conventional JDBC interface... it comes out with an SQLException of "InputStream as parameter not supported" for the code:

    Connection conn = pool.getConnection();
    PreparedStatement pstmt = conn.prepareStatement(
        "INSERT INTO t" + book + "_data (author_id, title, text, type) VALUES (?, ?, ?, ?)");
    pstmt.setInt(1, userId);
    pstmt.setString(2, title);
    InputStream textStream = stringToStream(text);
    pstmt.setBinaryStream(3, textStream, text.length());
    pstmt.setInt(4, type);
    pstmt.executeUpdate();
    pstmt.close();

... with some helper functions ...

    private InputStream stringToStream(String string) {
        byte[] bytes = string.getBytes();
        ByteArrayInputStream stream = new ByteArrayInputStream(bytes);
        return (InputStream) stream;
    }

    private String streamToString(InputStream stream) {
        try {
            int length = stream.available();
            byte[] bytes = new byte[length];
            stream.read(bytes);
            return new String(bytes);
        } catch (IOException e) {
            System.out.println("No Stream");
        }
        return null;
    }

with an abbreviated schema of ...

    >> \d t1_data
                Table "t1_data"
    Attribute  | Type    | Modifier
    -----------+---------+-------------------------------------------------------
    data_id    | integer | not null default nextval('t1_data_data_id_seq'::text)
    author_id  | integer |
    title      | text    |
    text       | oid     |
    type       | integer |
    time       | time    |
    Index: t1_data_pkey

... using PostgreSQL 7.0 and the newest JDBC driver from retep.org.uk. If y'all have any ideas why it does what it does, I just might kiss your feet. =)

-Patrick.
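As an aside on the helpers quoted above: `streamToString` trusts `available()` and a single `read()` call, but `available()` only reports how many bytes can be read without blocking, and `read()` is not guaranteed to fill the buffer. A more defensive pair of helpers (a sketch, not from the original post; the class name `StreamHelpers` is illustrative) would read the stream to exhaustion:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamHelpers {

    // Wrap a String's bytes in an InputStream, as in the original post.
    static InputStream stringToStream(String s) {
        return new ByteArrayInputStream(s.getBytes());
    }

    // Read the stream until EOF rather than trusting available(),
    // which only reports what can be read without blocking.
    static String streamToString(InputStream in) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        try {
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } catch (IOException e) {
            throw new RuntimeException("stream read failed", e);
        }
        return out.toString();
    }
}
```

This doesn't address the "InputStream as parameter not supported" error itself, which comes from the driver rather than the helpers, but it avoids silently truncating text on the read-back side.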