Re: Using BLOBs with PostgreSQL
From: Martin A. Marques
Subject: Re: Using BLOBs with PostgreSQL
Date:
Msg-id: 00100720110301.24769@math.unl.edu.ar
In reply to: Using BLOBs with PostgreSQL (Tim Kientzle <kientzle@acm.org>)
List: pgsql-general
On Sat, 07 Oct 2000, Tim Kientzle wrote:
> I've been using MySQL for initial development; it has pretty
> clean and easy-to-use BLOB support. You just declare a BLOB
> column type, then read and write arbitrarily large chunks of data.
> In Perl, BLOB columns work just like varchar columns; in JDBC,
> the getBinaryStream()/setBinaryStream() functions provide support
> for streaming large data objects.

If you're talking about BLOBs of text, just declare the column as text and that's all. In the case of binary data, I have no idea; I only work with text data.

> How well-supported is this functionality in PostgreSQL?
> I did some early experimenting with PG, but couldn't
> find any column type that would accept binary data
> (apparently PG's parser chokes on null characters?).
>
> I've heard about TOAST, but have no idea what it really
> is, how to use it, or how well it performs. I'm leery
> of database-specific APIs.

As far as I have heard, it looks like a nice way to optimize searches in BLOBs. I don't know anything else.

Regards... :-)

--
"And I'm happy, because you make me feel good, about me." - Melvin Udall
-----------------------------------------------------------------
Martín Marqués                email: martin@math.unl.edu.ar
Santa Fe - Argentina          http://math.unl.edu.ar/~martin/
Systems administrator at math.unl.edu.ar
-----------------------------------------------------------------
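A minimal sketch of the text-column approach described above, for readers following along (the table and column names here are illustrative, not from the original thread):

```sql
-- Illustrative only: table and column names are made up.
-- For character data, a plain "text" column accepts arbitrarily
-- long strings, so no special BLOB type is needed.
CREATE TABLE documents (
    id   serial PRIMARY KEY,
    body text    -- fine for text; NOT suitable for raw binary
                 -- data containing NUL bytes, as noted above
);

INSERT INTO documents (body) VALUES ('an arbitrarily long string...');
```

This covers only the text case the reply addresses; binary data needs a different mechanism, which the thread leaves open.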