fyi: reading large BYTEA

From: Vance Maverick
Subject: fyi: reading large BYTEA
Date:
Msg-id: DAA9CBC6D4A7584ABA0B6BEA7EC6FC0B5D31FE@hq-exch01.corp.pgp.com
List: pgsql-jdbc
I'm working on reading large BYTEA fields from PostgreSQL 8.1.  As
various people have pointed out
<http://archives.postgresql.org/pgsql-jdbc/2005-06/msg00138.php>,
ResultSet.getBinaryStream runs out of memory for large BYTEAs.
(PreparedStatement.setBinaryStream works fine.)

But there's a workaround: use the SUBSTRING function to read only a
chunk at a time, e.g.

  SELECT SUBSTRING(my_field FROM 1 FOR 10000) FROM my_table WHERE ...

This was suggested by Karsten Hilbert
<http://archives.postgresql.org/pgsql-general/2005-01/msg00032.php>.
I've had no trouble getting this to work -- as it happens, the chunkwise
reading strategy is consistent with what I was doing in the surrounding
code anyway.
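
For anyone who wants to do the same, a minimal JDBC sketch of the
chunkwise read might look like the following.  The table, column, and
key names (my_table, my_field, id) and the chunk size are placeholders,
not my actual schema or code.

  import java.io.IOException;
  import java.io.OutputStream;
  import java.sql.Connection;
  import java.sql.PreparedStatement;
  import java.sql.ResultSet;
  import java.sql.SQLException;

  public class ByteaChunkReader {
      private static final int CHUNK_SIZE = 250 * 1024;  // 250 KB per query

      /**
       * Streams a large BYTEA column to 'out' one SUBSTRING chunk at a
       * time, so the client never holds more than CHUNK_SIZE bytes of
       * the value in memory at once.
       */
      public static void copyBytea(Connection conn, long rowId, OutputStream out)
              throws SQLException, IOException {
          String sql =
              "SELECT SUBSTRING(my_field FROM ? FOR ?) FROM my_table WHERE id = ?";
          try (PreparedStatement ps = conn.prepareStatement(sql)) {
              int offset = 1;  // SUBSTRING positions are 1-based
              while (true) {
                  ps.setInt(1, offset);
                  ps.setInt(2, CHUNK_SIZE);
                  ps.setLong(3, rowId);
                  byte[] chunk = null;
                  try (ResultSet rs = ps.executeQuery()) {
                      if (rs.next()) {
                          chunk = rs.getBytes(1);
                      }
                  }
                  if (chunk == null || chunk.length == 0) {
                      break;  // no such row, or we're past the end of the value
                  }
                  out.write(chunk);
                  if (chunk.length < CHUNK_SIZE) {
                      break;  // a short chunk means the whole value has been read
                  }
                  offset += chunk.length;
              }
          }
      }
  }

Each query round-trips one chunk, so a larger chunk size trades client
memory for fewer round trips.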

In discussion today
<http://archives.postgresql.org/pgsql-general/2007-08/msg01127.php>, Tom
Lane pointed out that

  Recent releases will [perform better] if the column has been marked
  SET STORAGE EXTERNAL (before storing anything in it...)  See the
  ALTER TABLE reference page.

I tried this, and with the test sizes I was running (15 MB of data,
250 KB chunks), the difference was not significant.  Reading was a bit
faster, but writing was considerably slower, adding up to a wash.  Your
mileage may vary.
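
For reference, applying that setting from JDBC might look roughly like
the sketch below.  The table and column names are the same placeholders
as above, and note that the setting only affects values stored after
the statement runs.

  import java.sql.Connection;
  import java.sql.SQLException;
  import java.sql.Statement;

  public class MarkExternalStorage {
      /**
       * Marks the column's TOAST storage as EXTERNAL (out of line,
       * uncompressed), which lets SUBSTRING fetch just the slices it
       * needs instead of detoasting the whole value.
       */
      public static void markExternal(Connection conn) throws SQLException {
          try (Statement st = conn.createStatement()) {
              st.execute(
                  "ALTER TABLE my_table ALTER COLUMN my_field SET STORAGE EXTERNAL");
          }
      }
  }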

    Vance
