Re: My Experiment of PG crash when dealing with huge amount of data
From:        高健
Subject:     Re: My Experiment of PG crash when dealing with huge amount of data
Date:
Msg-id:      CAL454F0wXbvXSmiV7qm0dvGtArbR0jp2yrgjMi1uzqm8AE0eig@mail.gmail.com
In reply to: Re: My Experiment of PG crash when dealing with huge amount of data (Jeff Janes <jeff.janes@gmail.com>)
Responses:   Re: My Experiment of PG crash when dealing with huge amount of data
             Re: My Experiment of PG crash when dealing with huge amount of data
List:        pgsql-general
>To spare memory, you would want to use something like:
>insert into test01 select generate_series,
>repeat(chr(int4(random()*26)+65),1024) from
>generate_series(1,2457600);
Thanks a lot!
What I am still worried about is this: if the data grows rapidly, our customers may end up consuming too much memory. Would the ulimit command be a good way to put an upper bound on PG?
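To make the question concrete, what I have in mind is roughly the following sketch (the 4 GB value and the data directory path are only placeholders, not recommendations):

$ ulimit -v 4194304               # cap per-process virtual memory at 4 GB (value is in KB)
$ pg_ctl -D /path/to/data start   # the postmaster and its backends inherit this limit

Is this kind of OS-level cap safe for PostgreSQL, or could it make backends fail in unexpected ways?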
Best Regards
2013/9/1 Jeff Janes <jeff.janes@gmail.com>
On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao@gmail.com> wrote:
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1, srf2)" will generate its entire result set
in memory up front; it will not "stream" its results to the insert
statement on the fly.
To spare memory, you would want to use something like:
insert into test01 select generate_series,
repeat(chr(int4(random()*26)+65),1024) from
generate_series(1,2457600);
Cheers,
Jeff