Re: COPY to table with array columns (Longish)
From | Aaron Bono
---|---
Subject | Re: COPY to table with array columns (Longish)
Date |
Msg-id | bf05e51c0606121958i54a7833bt76bb8756dd41aec1@mail.gmail.com
In reply to | Re: COPY to table with array columns (Longish) (Tom Lane <tgl@sss.pgh.pa.us>)
Responses | Re: COPY to table with array columns (Longish)
 | Re: COPY to table with array columns (Longish)
List | pgsql-sql
I agree with Tom. Personally I cannot think of a time I would use an array column over a child table. Maybe someone can enlighten me on when an array column would be a good choice.
What language are you using to do the export if I may ask?
-Aaron
On 6/12/06, Tom Lane <tgl@sss.pgh.pa.us> wrote:
"Phillip Smith" <phillips@weatherbeeta.com.au> writes:
> The whole sys file is variable length records like this - they range from 1 to over 17,000 fields per record.
17000? I think you really need to rethink your schema. While you could
theoretically drop 17000 elements into a PG array column, you wouldn't
like the performance --- it'd be almost unsearchable for instance.
I'd think about two tables, one with a single row for each SYS record
from the original, and one with one row for each detail item (the
invoice numbers in this case). With suitable indexes and a foreign key
constraint, this will perform a lot better than an array-based
translation.
And no, in neither case will you be able to import that file without
massaging it first.
regards, tom lane
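
For anyone reading this in the archives, a minimal sketch of the two-table layout Tom describes might look like the following. All table and column names here are invented for illustration; the actual SYS file layout is not shown in the thread.

```sql
-- Parent table: one row per SYS record from the original file.
-- (Columns beyond the key are placeholders for whatever header
-- fields the record carries.)
CREATE TABLE sys_record (
    sys_id      integer PRIMARY KEY,
    record_type text NOT NULL
);

-- Detail table: one row per detail item (the invoice numbers in this case),
-- tied back to its parent with a foreign key constraint.
CREATE TABLE sys_record_detail (
    sys_id         integer NOT NULL REFERENCES sys_record (sys_id),
    invoice_number text    NOT NULL
);

-- Index the foreign key so joins and lookups by parent record stay fast.
CREATE INDEX sys_record_detail_sys_id_idx ON sys_record_detail (sys_id);

-- Once the flat file has been massaged into one-detail-per-line form,
-- it could be loaded with something like:
-- COPY sys_record_detail (sys_id, invoice_number)
--     FROM '/path/to/details.csv' WITH CSV;
```

Searching or joining on individual invoice numbers then uses an ordinary index lookup instead of scanning the contents of a 17,000-element array.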