lo_import for bytea columns
From    | Jonathan Bartlett
Subject | lo_import for bytea columns
Date    |
Msg-id  | Pine.GSU.4.44.0311200812090.5850-100000@eskimo.com
Replies | Re: lo_import for bytea columns
List    | pgsql-general
Is there an equivalent function for bytea columns that works like lo_import? Alternatively, is there a way to copy from a large object to a bytea column from SQL? Or maybe someone has another way of attacking this problem. I've got some Perl code that does this:

    undef $/;
    $data = <FHFOR89MBFILE>;
    $sth = $dbh->prepare("insert into data (bigbyteacolumn) values (?)");
    $sth->bind_param(1, $data, DBI::SQL_BINARY);
    $sth->execute;

This has worked fine for a while with file sizes around 10MB. However, now someone wants to use it for a file that's 89MB, and the process takes up about 500MB of memory before crashing. I'm trying to find a less memory-consuming way of handling this, even if just as a temporary hack for this one file.

I think what's happening is that Perl is reading in the 89MB, and then either Perl or the driver is converting that into a fully-escaped string for transfer, and this is where the problem is occurring. Any ideas?

Thanks,
Jonathan Bartlett
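One possible workaround, sketched below under stated assumptions: instead of slurping the whole file, insert an empty bytea value and then append the file in fixed-size chunks using PostgreSQL's `||` concatenation, so only one chunk is ever held (and escaped) in memory at a time. This is a hedged, untested sketch, not the list's recommended approach: it assumes the `data` table and `bigbyteacolumn` from the post, plus a hypothetical `id` column to address the row, and it uses DBD::Pg's `pg_type => PG_BYTEA` binding. Note the trade-off that each `UPDATE ... || ?` rewrites the growing row on the server, so total server work grows roughly quadratically with file size.

    # Hedged sketch: stream a large file into a bytea column in 1MB
    # chunks to bound client-side memory use. Assumes DBD::Pg and a
    # hypothetical "id" column identifying the target row.
    use DBI;
    use DBD::Pg qw(PG_BYTEA);

    my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "", "",
                           { RaiseError => 1, AutoCommit => 0 });

    open my $fh, '<', 'bigfile.dat' or die "open: $!";
    binmode $fh;

    # Start with an empty bytea value, then append chunk by chunk.
    $dbh->do("INSERT INTO data (id, bigbyteacolumn) VALUES (1, '')");

    my $upd = $dbh->prepare(
        "UPDATE data SET bigbyteacolumn = bigbyteacolumn || ? WHERE id = 1");
    # Bind the placeholder as bytea once; the value is supplied per execute.
    $upd->bind_param(1, undef, { pg_type => PG_BYTEA });

    my $chunk;
    while (read($fh, $chunk, 1024 * 1024)) {   # 1MB at a time
        $upd->execute($chunk);
    }

    $dbh->commit;
    close $fh;

If the quadratic rewrite cost is unacceptable, the large-object route the post already mentions (lo_import on the server, then reading it back with lo_open/loread) may be the better fit, at the price of keeping the data in pg_largeobject rather than a bytea column.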