Re: [uruk] Spam from Uruk mailing list
- To: uruk@xxxxxxxxxxxxxxxxxxx
- Subject: Re: [uruk] Spam from Uruk mailing list
- From: alimiracle@xxxxxxxxxx
- Date: Fri, 19 Aug 2016 03:24:17 -0700
Hi,
Roll up your sleeves, clean your guns, lock and load. We're gonna kill 
us some bad crawlers and bots!
I have a good idea:
I will create a link that is invisible to human users and points 
to a subdirectory of the server. We can then treat 
all IPs from which requests to that subdirectory are made as spiders.
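For example (just a sketch; "/secret-trap/" is a made-up path name), the invisible link could be something like `<a href="/secret-trap/" style="display:none"></a>`, and the matching robots.txt rule that warns honest crawlers off would be:

```
User-agent: *
Disallow: /secret-trap/
```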
Then we tell bots not to go there via robots.txt. Bots that go there 
anyway can be considered malicious 
and blocked, e.g. through nginx or iptables.
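As a rough sketch of the blocking step (assuming nginx's default "combined" access-log format; the trap path is a made-up name and you'd read real log lines from your own log file), a small script could pull the offending IPs out of the log and print one iptables DROP rule per IP:

```python
import re

TRAP_PATH = "/secret-trap/"  # hypothetical trap directory name

# Matches the client IP and request path in nginx's default "combined" log format.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

def trap_hits(log_lines):
    """Return the set of client IPs that requested the trap directory."""
    ips = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(2).startswith(TRAP_PATH):
            ips.add(m.group(1))
    return ips

def iptables_rules(ips):
    """Emit one DROP rule per offending IP (to be reviewed before running)."""
    return ["iptables -A INPUT -s %s -j DROP" % ip for ip in sorted(ips)]

if __name__ == "__main__":
    # Two fabricated sample log lines: one bot hitting the trap, one normal visitor.
    sample = [
        '203.0.113.9 - - [19/Aug/2016:03:24:17 -0700] "GET /secret-trap/x HTTP/1.1" 200 12 "-" "BadBot"',
        '198.51.100.4 - - [19/Aug/2016:03:25:00 -0700] "GET /index.html HTTP/1.1" 200 512 "-" "Mozilla"',
    ]
    for rule in iptables_rules(trap_hits(sample)):
        print(rule)
```

You'd probably want to review the list before applying it (or feed it to something like fail2ban) rather than piping it straight into a root shell.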
have fun and be free
ali miracle