Re: Using Postgres to store high volume streams of sensor readings
| From | Ciprian Dorin Craciun |
|---|---|
| Subject | Re: Using Postgres to store high volume streams of sensor readings |
| Date | |
| Msg-id | 8e04b5820811210852k6ce7a7b6ub43b4368de33fc8c@mail.gmail.com |
| In reply to | Re: Using Postgres to store high volume streams of sensor readings (Tom Lane <tgl@sss.pgh.pa.us>) |
| Responses | Re: Using Postgres to store high volume streams of sensor readings |
| List | pgsql-general |
On Fri, Nov 21, 2008 at 6:06 PM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> "Ciprian Dorin Craciun" <ciprian.craciun@gmail.com> writes:
>> In short the data is inserted by using COPY sds_benchmark_data
>> from STDIN, in batches of 500 thousand data points.
>
> Not sure if it applies to your real use-case, but if you can try doing
> the COPY from a local file instead of across the network link, it
> might go faster. Also, as already noted, drop the redundant index.
>
> regards, tom lane
Hi!
It wouldn't be difficult to use a local file (right now the client
and the server are on the same machine), but will it really make a
difference? (I mean, have you actually seen such issues?)
Thanks,
Ciprian Craciun.
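For context, Tom's suggestion is to let the server read the batch directly from a file on its own filesystem rather than streaming it over the connection via `COPY ... FROM STDIN`. A minimal sketch of preparing such a batch file is below; the four-column row layout (client id, sensor id, unix timestamp, value) is a hypothetical stand-in, since the actual schema of `sds_benchmark_data` is not shown in this thread.

```python
import os
import random
import tempfile

def write_copy_batch(path, n_rows):
    """Write n_rows of (hypothetical) sensor readings in COPY text
    format: tab-separated columns, one row per line, newline-terminated,
    so the server can bulk-load the file with a single COPY statement."""
    with open(path, "w") as f:
        for i in range(n_rows):
            # hypothetical columns: client id, sensor id, unix timestamp, value
            f.write(f"{i % 10}\t{i % 100}\t{1227286800 + i}\t{random.random():.6f}\n")

# one batch of 500 thousand data points, matching the batch size in the thread
path = os.path.join(tempfile.gettempdir(), "sds_batch.tsv")
write_copy_batch(path, 500_000)

# server-side load (the file must be readable by the server process):
#   COPY sds_benchmark_data FROM '/tmp/sds_batch.tsv';
```

Whether this beats `COPY ... FROM STDIN` depends on where the bottleneck is: on the same machine the network round trips are already cheap, so the gain, if any, would come from skipping the client-to-backend protocol stream.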