Re: Tuning Postgres for single user manipulating large amounts of data

From:        Andy Colson
Subject:     Re: Tuning Postgres for single user manipulating large amounts of data
Date:
Msg-id:      4D00EEE8.70408@squeakycode.net
In reply to: Re: Tuning Postgres for single user manipulating large amounts of data  (Andy Colson <andy@squeakycode.net>)
Responses:   Re: Tuning Postgres for single user manipulating large amounts of data
List:        pgsql-general
On 12/9/2010 8:50 AM, Andy Colson wrote:
> On 12/9/2010 6:25 AM, Paul Taylor wrote:
>> Hi, I'm using Postgres 8.3 on a MacBook Pro laptop.
>> I'm using the database with just one db connection to build a Lucene
>> search index from some of the data, and I'm trying to improve
>> performance. The key thing is that I'm only a single user but
>> manipulating large amounts of data, i.e. processing tables with up to
>> 10 million rows in them, so I think I want to configure Postgres so
>> that it can create large temporary tables in memory.
>>
>> I've tried changing various parameters such as shared_buffers, work_mem
>> and checkpoint_segments, but I don't really understand what the values
>> mean, and the documentation seems to be aimed towards configuring for
>> multiple users, and my changes make things worse. For example, my
>> machine has 2GB of memory and I read that if using it as a dedicated
>> server you should set shared memory to 40% of total memory, but when I
>> increase it to more than 30MB Postgres will not start, complaining
>> about my SHMMAX limit.
>>
>> Paul
>
> You need to bump up your SHMMAX is your OS.

Sorry: SHMMAX _in_ your OS. It's an OS setting, not a PG one.

-Andy
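For reference, a minimal sketch of the fix Andy is pointing at, assuming the
Mac OS X of that era. The kern.sysv.* keys are the standard OS X sysctl names
for the SysV shared-memory limits; the byte values are illustrative figures
for a 2GB machine, not a recommendation:

    # Check the current SysV shared-memory ceiling (OS X sysctl names):
    sysctl kern.sysv.shmmax kern.sysv.shmall

    # Illustrative values: allow up to 1GB of shared memory.
    # shmmax is in bytes and must be a multiple of 4096;
    # shmall is in 4kB pages (1GB / 4096 = 262144).
    sudo sysctl -w kern.sysv.shmmax=1073741824
    sudo sysctl -w kern.sysv.shmall=262144

    # On some OS X versions these can only be set at boot; in that case
    # put the same key=value lines in /etc/sysctl.conf and reboot.

With SHMMAX raised, postgresql.conf settings along these lines should let
the server start with a larger shared_buffers. Again, a sketch for a
single-connection, 2GB box, not tuned numbers:

    # postgresql.conf (illustrative, single user, 2GB RAM, PG 8.3)
    shared_buffers = 512MB        # needs SHMMAX a bit above this size
    work_mem = 64MB               # per sort/hash op; generous is fine
                                  # when only one session runs
    maintenance_work_mem = 256MB  # speeds up index builds
    checkpoint_segments = 16      # fewer checkpoint pauses during bulk work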