Re: GIN, pg_trgm and large table
From | Max Fomichev |
---|---|
Subject | Re: GIN, pg_trgm and large table |
Date | |
Msg-id | c2c70eec-8d69-504a-bbaa-fd5c4fecc4a3@gmail.com |
In reply to | Re: GIN, pg_trgm and large table (Tom Lane <tgl@sss.pgh.pa.us>) |
List | pgsql-novice |
Hello Tom, thank you for your reply.

On 10/03/2018 19:00, Tom Lane wrote:
> Max Fomichev <max.fomitchev@gmail.com> writes:
>> I have the following table and index with about 15 billion records.
>> ... PostgreSQL 9.5.12 on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu
>> 5.4.0-6ubuntu1~16.04.9) 5.4.0 20160609, 64-bit ... Is there any way
>> to improve GIN/pg_trgm performance on such a large table?
> There was some work done in 9.6 to improve pg_trgm's performance when
> dealing with queries involving very common trigrams. So maybe an
> update to 9.6 or v10 would help you.

I'll try the 10.x version.

> I have a bad feeling though that 15 billion rows is too many for an
> index based on trigrams to be really useful --- there are just not
> enough distinct trigrams. It's too bad we don't have a more general
> N-gram indexing facility.

Could you please advise what the correct approach/index type would be for my case? I have about 15 billion n-gram records (each n-gram contains from 1 to 5 words). I'd like to find all n-grams that contain a given search word.

--
Best regards,
Max Fomichev
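For readers following the thread: the poster's actual DDL was elided from the quoted message, but the setup under discussion looks roughly like the following sketch (table, column, and index names here are hypothetical, not the poster's):

```sql
-- Hypothetical schema; the original table/index definitions were elided.
CREATE EXTENSION IF NOT EXISTS pg_trgm;

CREATE TABLE ngrams (
    id     bigserial PRIMARY KEY,
    phrase text NOT NULL          -- an n-gram of 1 to 5 words
);

-- Trigram GIN index to accelerate substring (LIKE/ILIKE) searches.
CREATE INDEX ngrams_phrase_trgm_idx
    ON ngrams USING gin (phrase gin_trgm_ops);

-- Find all n-grams containing a given search term.
SELECT phrase
FROM   ngrams
WHERE  phrase ILIKE '%postgres%';
```

Since the poster actually wants whole-word containment (not arbitrary substrings), a word-level index, for example a GIN index on `to_tsvector('simple', phrase)` queried with `@@`, may scale better than trigrams at this row count, at the cost of no longer matching partial words.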