General 'big data' advice....
From | James David Smith
---|---
Subject | General 'big data' advice....
Date |
Msg-id | CAMu32ADDXJcEeW0uH1HTq74R5tZWtDHGFu9P-y2BbbpCn6GjPg@mail.gmail.com
Responses | Re: General 'big data' advice....
 | Re: General 'big data' advice....
List | pgsql-novice
Hi,

Bit of an abstract question, I appreciate, but I thought I'd see what people think. I have an anonymised dataset of the travel behaviour of some people in a major city (I'd rather not go into details, if that's ok). What I intend to do is work out where each of them is for every minute of the day. So that's ~80,000 people x 1440 minutes = 115,200,000 rows of data! A few questions:

1) Is PostgreSQL going to be able to cope with this, in terms of table size? I think so...

2) My columns will be something like person_id integer, person_timestamp timestamp, person_location_geom geometry. Any thoughts on those, or on the column formats?

3) I'll probably create a primary key that is a combination of person_id and person_timestamp. Does this sound like a good idea?

4) Should I use some indexes to improve performance?

Best wishes

James
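For reference, a minimal sketch of the table described in questions 2-4 might look like the following. It assumes the PostGIS extension supplies the geometry type; the table name, the Point subtype, and SRID 4326 are illustrative assumptions rather than anything stated in the question.

    -- A minimal sketch, assuming PostGIS provides the geometry type.
    -- Table name, Point subtype and SRID 4326 are illustrative only.
    CREATE EXTENSION IF NOT EXISTS postgis;

    CREATE TABLE person_location (
        person_id            integer   NOT NULL,
        person_timestamp     timestamp NOT NULL,
        person_location_geom geometry(Point, 4326),
        -- Composite primary key from question 3; it also provides a
        -- b-tree index on (person_id, person_timestamp).
        PRIMARY KEY (person_id, person_timestamp)
    );

    -- A spatial index may help location-based queries (question 4):
    CREATE INDEX person_location_geom_idx
        ON person_location USING GIST (person_location_geom);

The composite primary key already covers lookups by person and time; a separate GiST index on the geometry column only pays off if queries filter or join on location.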