Christof Meerwald
2013-11-24 22:51:58 UTC
Hi,
as I have now noticed for the second time that Google is trying to
crawl the Perforce web interface, bringing the server to a standstill
under the load, could someone (with the right permissions) please
check in a robots.txt file (directly under //depot) with something
like:
User-agent: *
Disallow: /
This should then hopefully be returned when accessing
http://perforce.openwatcom.org:4000/robots.txt and stop the useless
Google crawler.
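For what it's worth, a quick sketch using Python's standard-library
robots.txt parser confirms that these two lines do disallow every path
for every user agent (the URL below is just the server mentioned above,
used for illustration):

```python
from urllib.robotparser import RobotFileParser

# The proposed robots.txt: block all paths for all crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# No agent may fetch any path under this policy.
print(parser.can_fetch("Googlebot", "http://perforce.openwatcom.org:4000/"))  # → False
print(parser.can_fetch("*", "/any/path"))  # → False
```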
Christof
--
http://cmeerw.org sip:cmeerw at cmeerw.org
mailto:cmeerw at cmeerw.org xmpp:cmeerw at cmeerw.org