Re: Inserting large BLOBs via JDBC - OutOfMemoryError

From: hhaag@gmx.de
Subject: Re: Inserting large BLOBs via JDBC - OutOfMemoryError
Date:
Msg-id: 19217.1029485671@www9.gmx.net
In reply to: Inserting large BLOBs via JDBC - OutOfMemoryError (hhaag@gmx.de)
List: pgsql-jdbc
There might be a workaround:

From JDBC 2.0 on, the java.sql.Blob interface allows BLOBs to be manipulated.
AbstractJdbc2Statement.setBlob() (code posted below) appears to create the
large object up front and store it in the database. The actual INSERT command
then contains only the OID, avoiding the memory problems entirely.

I'll let you know how it goes.



    public void setBlob(int i, Blob x) throws SQLException
    {
        InputStream l_inStream = x.getBinaryStream();
        int l_length = (int) x.length();
        LargeObjectManager lom = connection.getLargeObjectAPI();
        int oid = lom.create();
        LargeObject lob = lom.open(oid);
        OutputStream los = lob.getOutputStream();
        try
        {
            // The copy could be buffered here, but the OutputStream returned
            // by LargeObject is already buffered internally, so an extra
            // buffer would gain nothing - if anything it would be slower.
            int c = l_inStream.read();
            int p = 0;
            while (c > -1 && p < l_length)
            {
                los.write(c);
                c = l_inStream.read();
                p++;
            }
            los.close();
        }
        catch (IOException se)
        {
            throw new PSQLException("postgresql.unusual", se);
        }
        // lob is closed by the stream so don't call lob.close()
        setInt(i, oid);
    }
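As an aside, the comment in the driver code above concerns buffering the copy loop. A minimal sketch of what a chunked copy would look like - this is a hypothetical standalone helper for illustration, not part of the driver - is:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UncheckedIOException;
import java.util.Arrays;

// Hypothetical helper (not driver code): copies a stream in fixed-size
// chunks rather than one byte at a time, so each write hands the
// underlying layer a full buffer.
public class StreamCopy {
    static long copy(InputStream in, OutputStream out) {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        try {
            // read() returns the number of bytes read, or -1 at end of stream
            while ((n = in.read(buf)) > -1) {
                out.write(buf, 0, n);
                total += n;
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return total;
    }

    public static void main(String[] args) {
        byte[] data = new byte[100_000];
        Arrays.fill(data, (byte) 7);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), sink);
        System.out.println(copied); // 100000
    }
}
```

Whether chunking helps here depends on how the LargeObject output stream buffers internally, as the comment in the driver source notes.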



--
GMX - The communication platform on the Internet.
http://www.gmx.net

