Re: robots.txt on git.postgresql.org
| From | Andres Freund |
|---|---|
| Subject | Re: robots.txt on git.postgresql.org |
| Date | |
| Msg-id | 20130709153049.GA4886@alap2.anarazel.de |
| In response to | robots.txt on git.postgresql.org (Greg Stark <stark@mit.edu>) |
| Responses | Re: robots.txt on git.postgresql.org |
| | Re: robots.txt on git.postgresql.org |
| | Re: robots.txt on git.postgresql.org |
| List | pgsql-hackers |
On 2013-07-09 16:24:42 +0100, Greg Stark wrote:
> I note that git.postgresql.org's robot.txt refuses permission to crawl
> the git repository:
>
> http://git.postgresql.org/robots.txt
>
> User-agent: *
> Disallow: /
>
> I'm curious what motivates this. It's certainly useful to be able to
> search for commits.

Gitweb is horribly slow. I don't think anybody with a bigger git repo
using gitweb can afford to let all the crawlers go through it.

Greetings,

Andres Freund

--
Andres Freund                     http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Training & Services
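[As a side note for readers: the effect of the quoted robots.txt can be checked with a standard parser. This is a minimal sketch using Python's stdlib `urllib.robotparser`, with the file contents taken verbatim from the message rather than fetched from the site.]

```python
# Sketch: verify how a standard robots.txt parser interprets the rules
# quoted above. The contents come from the message; no network access.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# "User-agent: *" plus "Disallow: /" blocks every path for every crawler,
# which is why search engines cannot index commits on the gitweb instance.
allowed = parser.can_fetch("Googlebot", "http://git.postgresql.org/gitweb/")
print(allowed)  # False
```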