About Me

Working as a Technical Lead at CollabNet Software Private Limited.

Thursday, 18 February 2010

Preventing DoS attacks

A DoS (denial-of-service) attack is an attempt to make a computer resource unavailable to its intended users.

On a public site, the most common threat we see is a DoS attack, and most of the time the culprit is a crawler, spider, bot, or web-ripper. The usual way to stop them from hammering the site is to block individual IPs or IP ranges at the network level.
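At the network level, that kind of block is typically done with a firewall such as iptables; a minimal sketch (the addresses below are placeholders, not real offenders):

```shell
# Drop all traffic from a single offending IP (placeholder address)
iptables -A INPUT -s 203.0.113.45 -j DROP

# Drop an entire range using CIDR notation (placeholder range)
iptables -A INPUT -s 203.0.113.0/24 -j DROP

# List the current INPUT rules to verify the blocks are in place
iptables -L INPUT -n
```

Note that rules added this way are not persistent across reboots; they need to be saved with the distribution's own mechanism.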

Some spiders/bots are smart enough to rotate or spoof their IPs, so after some time we see the same threat again from a different IP or IP range.
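One way to spot a returning offender even when its IP changes is to group requests by User-Agent rather than by address. A rough sketch in Python, assuming the standard Apache combined log format (the sample lines and names here are made up for illustration):

```python
import re
from collections import Counter

# The User-Agent is the last quoted field of a combined-log line
UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

def top_user_agents(log_lines, limit=5):
    """Count requests per User-Agent to surface aggressive clients."""
    counts = Counter()
    for line in log_lines:
        match = UA_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts.most_common(limit)

# Illustrative sample lines (addresses and agents are made up)
sample = [
    '10.0.0.1 - - [18/Feb/2010:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "HTTrack"',
    '10.0.0.2 - - [18/Feb/2010:10:00:01 +0000] "GET /a HTTP/1.1" 200 512 "-" "HTTrack"',
    '10.0.0.3 - - [18/Feb/2010:10:00:02 +0000] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
]
print(top_user_agents(sample))  # → [('HTTrack', 2), ('Mozilla/4.0', 1)]
```

An agent that shows up at the top of this list across many different source IPs is a good candidate for the User-Agent-based blocking described next.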

Another way to stop such crawlers/spiders/bots/web-rippers is to block them by their User-Agent header.

# This was achieved using the snippet below in httpd.conf

----------------------------------------
# Block access to robots by User-Agent
BrowserMatch emailsiphon badrobot
BrowserMatch BPFTP badrobot
BrowserMatch MSIECrawler badrobot
BrowserMatch WebStripper badrobot
BrowserMatch Offline badrobot
BrowserMatch Teleport badrobot
BrowserMatch Alkaline badrobot
BrowserMatch DLExpert badrobot
BrowserMatch HTTrack badrobot
BrowserMatch Controller badrobot

# Deny is only valid in a directory context, so wrap it in <Location>
<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=badrobot
</Location>
----------------------------------------


For more information on Apache access control, see http://httpd.apache.org/docs/2.2/howto/access.html
