Re: Indexing large table of coordinates with GiST
From | Paul Ramsey |
---|---|
Subject | Re: Indexing large table of coordinates with GiST |
Date | |
Msg-id | etPan.54b7f878.2443a858.5b48@Butterfly.local |
In reply to | Re: Indexing large table of coordinates with GiST (Rémi Cura <remi.cura@gmail.com>) |
Responses | Re: Indexing large table of coordinates with GiST |
List | pgsql-general |
As Rémi notes, going with a pointcloud approach might be wiser, particularly if you aren't storing much more about the points than coordinates and other lidar return information. Since you're only working with points, depending on your spatial distribution (over the poles? across the dateline?) you might just geohash them and index them with a B-tree instead. For points, that index will be more efficient than an R-tree; however, you'll still have a multi-billion-record table, which could cause other slowdowns depending on your plans for accessing this data once you've indexed it.
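A minimal sketch of the geohash-plus-B-tree idea, using the table and column from the thread. This assumes PostGIS's ST_GeoHash function; since `geom` is of type geography here, it needs a cast to geometry, and the coordinates must be lon/lat (EPSG:4326):

```sql
-- B-tree over a geohash expression: nearby points share a hash prefix
-- (precision 10 is illustrative; tune to your cell size)
CREATE INDEX nodes_geohash_idx ON nodes (ST_GeoHash(geom::geometry, 10));

-- Prefix scans then cover a spatial cell; 'u09t' is a made-up example prefix
SELECT * FROM nodes
WHERE ST_GeoHash(geom::geometry, 10) LIKE 'u09t%';
```

The LIKE with a constant prefix can use the B-tree index; the caveat Paul raises (poles, dateline) matters because geohash cells degrade near those regions.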
P.
Paul Ramsey
http://cleverelephant.ca
http://postgis.net
On January 15, 2015 at 8:44:03 AM, Rémi Cura (remi.cura@gmail.com) wrote:
Hey,
You may want to post this on the PostGIS list. If you really want to keep that much geometry, you may want to partition your data on a regular grid.
Cheers,
Rémi-C
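Rémi's regular-grid suggestion could look something like the following sketch. The table layout is assumed, and declarative partitioning is a PostgreSQL 10+ feature (the thread predates it, so older setups would use inheritance instead):

```sql
-- Hypothetical grid partitioning: one partition per 10-degree longitude band.
-- Partitioning by an expression keeps each band's index small enough to
-- build and scan independently.
CREATE TABLE nodes_part (
    id   bigint,
    geom geometry(Point, 4326)
) PARTITION BY RANGE (floor(ST_X(geom)));

-- One band as an example; repeat for the other bands.
CREATE TABLE nodes_band_w180 PARTITION OF nodes_part
    FOR VALUES FROM (-180) TO (-170);
CREATE INDEX ON nodes_band_w180 USING gist (geom);
```

Per-partition GiST indexes can be built (and rebuilt after a crash) one band at a time, rather than as a single five-day operation.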
2015-01-15 15:45 GMT+01:00 Andy Colson <andy@squeakycode.net>:
On 1/15/2015 6:44 AM, Daniel Begin wrote:
Hi, I'm trying to create an index on coordinates (geography type) over a
large table (4.5 billion records) using GiST...
CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
The command ran for 5 days until my computer stopped because of a power outage!
Before restarting the index creation, I am asking the community if there are
ways to shorten the time it took the first time :-)
Any idea?
Daniel
Set maintenance_work_mem as large as you can.
-Andy
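Andy's suggestion as a concrete session-level setting; the value is illustrative and should be tuned to the machine's available RAM:

```sql
-- Give the index build as much sort memory as you can spare;
-- this applies only to this session, not the whole server.
SET maintenance_work_mem = '8GB';
CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
```

A larger maintenance_work_mem lets more of the build's sorting happen in memory instead of spilling to disk, which is where most of the time goes on a table this size.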
--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general