pgsql: Avoid parsing catalog data twice during BKI file construction.

From: Tom Lane
Date:
Avoid parsing catalog data twice during BKI file construction.

In the wake of commit 5602265f7, we were doing duplicate-OID detection
quite inefficiently, by invoking duplicate_oids which does all the same
parsing of catalog headers and .dat files as genbki.pl does.  That adds
under half a second on modern machines, but quite a bit more on slow
buildfarm critters, so it seems worth avoiding.  Let's just extend
genbki.pl a little so it can also detect duplicate OIDs, and remove
the duplicate_oids call from the build process.
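
For illustration, a duplicate-OID check of this kind can be little more than a hash keyed by OID over the rows genbki.pl has already parsed. The sketch below is hypothetical (the variable names are illustrative, not genbki.pl's actual data structures), but it shows why the check is cheap once the single parse pass exists:

    # Hypothetical sketch, not the actual genbki.pl code: after the .dat
    # files have been parsed into row hashes, one pass over them is enough
    # to spot OIDs assigned more than once.
    my %seen;
    my @dups;
    foreach my $row (@all_catalog_rows)    # @all_catalog_rows is illustrative
    {
        next unless defined $row->{oid};
        push @dups, $row->{oid} if $seen{ $row->{oid} }++;
    }
    die "duplicate OIDs detected: " . join(', ', sort { $a <=> $b } @dups) . "\n"
        if @dups;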

(This also means that duplicate OID detection will happen during
Windows builds, which AFAICS it didn't before.)

This makes the use-case for duplicate_oids a bit dubious, but it's
possible that people will still want to run that check without doing
a whole build run, so let's keep that script.

In passing, move down genbki.pl's creation of its temp output files
so that it doesn't happen until after we've done parsing and validation
of the input.  This avoids leaving a lot of clutter around after a
failure.
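
A rough sketch of the resulting ordering (the function and file names here are illustrative placeholders, not the real genbki.pl routines):

    # Hypothetical ordering sketch: do all parsing and validation first,
    # so that a die() at that stage leaves no half-written .tmp files behind.
    my $catalogs = parse_catalog_headers_and_data(@ARGV);   # illustrative name
    check_for_duplicate_oids($catalogs);                     # illustrative name
    # Only now create the temporary output files.
    open my $bki, '>', 'postgres.bki.tmp'
        or die "could not open postgres.bki.tmp: $!";
    print_bki($bki, $catalogs);                              # illustrative name
    close $bki;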

John Naylor and Tom Lane

Discussion: https://postgr.es/m/37D774E4-FE1F-437E-B3D2-593F314B7505@postgrespro.ru

Branch
------
master

Details
-------
https://git.postgresql.org/pg/commitdiff/a0854f10722b20a445f5e67a357bd8809b32f540

Modified Files
--------------
doc/src/sgml/bki.sgml          |  4 +--
src/backend/catalog/Catalog.pm |  2 ++
src/backend/catalog/Makefile   |  3 +-
src/backend/catalog/genbki.pl  | 71 ++++++++++++++++++++++++++++++++----------
4 files changed, 60 insertions(+), 20 deletions(-)