Folks:
These days I'm working on HVNaughtieWiki; I sent info for feedback to squeakros and to the Aida list.
Besides some Squeakers looking at this work in progress, this morning I got the attention of Google.
This is the print string of the request:
HttpRequest (URL=/file/squeak.sts; protocol=HTTP/1.1; header=a Dictionary('accept'->'*/*' 'accept-encoding'->'gzip,deflate' 'connection'->'Keep-alive' 'from'->'googlebot(at)googlebot.com' 'host'->'190.193.89.80:8085' 'user-agent'->'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' ); getFields=a Dictionary(); postFields=a Dictionary())
It's all Comanche + HV2, with some of my own code on top.
How should I deal with this?
Refuse the connection? Send an error page?
Any hints are welcome.
Edgar
I think you should define rules for bots in a robots.txt file, listing what you do want robots to index and what you don't. See http://www.robotstxt.org/robotstxt.html
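As a minimal illustration (the /file/ path below simply matches the URL Googlebot requested above; adjust it to whatever you actually want hidden), a robots.txt served from the site root could read:

```
User-agent: *
Disallow: /file/
```

Well-behaved crawlers such as Googlebot fetch /robots.txt before crawling and will skip the disallowed paths; misbehaving bots will ignore it, which is where server-side filtering comes in.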
On 29 January 2010 11:14, Edgar J. De Cleene edgardec2001@yahoo.com.ar wrote:
On 1/29/10 8:17 AM, "Igor Stasenko" siguctua@gmail.com wrote:
Thanks a lot!
That failed, but the info is still useful; thanks, Igor. I attach my way of keeping Google out using Comanche + HV.
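For anyone following along, a user-agent filter in that spirit might look roughly like this Smalltalk sketch. The header access mirrors the Dictionary shown in the request print string above; the HttpResponse constructor and the #dispatch: hook are placeholders, not the real Comanche API, so Edgar's actual attachment will differ:

```smalltalk
"Sketch of a Googlebot guard for a Comanche/HV request handler.
 Assumptions: #header answers the header Dictionary seen in the
 print string above; HttpResponse class>>status:contents: and
 #dispatch: stand in for whatever your application really uses."
handleRequest: aRequest
	| agent |
	agent := aRequest header at: 'user-agent' ifAbsent: [ '' ].
	(agent includesSubstring: 'Googlebot')
		ifTrue: [ ^ HttpResponse status: #forbidden contents: 'No robots, please' ].
	^ self dispatch: aRequest
```

Matching on the User-Agent string is easy to spoof, but it is enough to turn away crawlers that identify themselves honestly, as Googlebot does.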
Edgar