Re: out of memory during COPY .. FROM
From | Robert Haas
---|---
Subject | Re: out of memory during COPY .. FROM
Date |
Msg-id | AANLkTikgnZT5Tpct9GiaVka6HgX+pQJSF=ofi6qyjG=Z@mail.gmail.com
In reply to | out of memory during COPY .. FROM (Tom Lanyon <tom+pgsql-hackers@oneshoeco.com>)
List | pgsql-hackers
On Tue, Feb 1, 2011 at 5:32 AM, Tom Lanyon <tom+pgsql-hackers@oneshoeco.com> wrote:
> List,
>
> Can anyone suggest where the below error comes from, given I'm attempting to load HTTP access log data with reasonably small row and column value lengths?
>
> logs=# COPY raw FROM '/path/to/big/log/file' DELIMITER E'\t' CSV;
> ERROR:  out of memory
> DETAIL:  Cannot enlarge string buffer containing 1073712650 bytes by 65536 more bytes.
> CONTEXT:  COPY raw, line 613338983
>
> It was suggested in #postgresql that I'm reaching the 1GB MaxAllocSize - but I would have thought this would only be a constraint against either large values for specific columns or for whole rows. It's worth noting that this is after 613 million rows have already been loaded (somewhere around 100GB of data) and that I'm running this COPY after the "CREATE TABLE raw ..." in a single transaction.
>
> I've looked at line 613338983 in the file being loaded (+/- 10 rows) and can't see anything out of the ordinary.
>
> Disclaimer: I know nothing of PostgreSQL's internals, please be gentle!

Is there by any chance a trigger on this table?  Or any foreign keys?

What version of PostgreSQL?

--
Robert Haas
EnterpriseDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company
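[The questions above can be checked directly from the system catalogs. A minimal sketch, assuming the table is named `raw` as in the COPY command from the thread: user-level triggers and foreign-key constraints both fire per-row after-trigger events, whose pending-event queue can grow for the life of a long single transaction.]

```
-- List user-defined (non-internal) triggers on the table
SELECT tgname
FROM pg_trigger
WHERE tgrelid = 'raw'::regclass
  AND NOT tgisinternal;

-- List foreign-key constraints defined on the table
SELECT conname
FROM pg_constraint
WHERE conrelid = 'raw'::regclass
  AND contype = 'f';
```

[Equivalently, `\d raw` in psql shows both in its footer. If either query returns rows, the per-row event queue is a plausible source of the 1GB allocation failure.]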