Fight Against Bots To Protect Your Site

In the last few years, the internet has become such an important part of our lives that it is hard to imagine how our parents lived without it. I mean, sending a letter and waiting weeks for a response, or going to the library just to hear that they are really sorry, but they do not have the information you need. Paying bills, booking hotels, buying and selling things, learning, dating – the modern generation does just about everything online.

Of course, there will always be people trying to take advantage of such a popular thing. As the internet develops, offering more and more options to users, hackers develop bots to exploit sites. According to Brian Hughes, the founder and CEO of Integrity Marketing & Consulting, nowadays the majority of sites are visited more by bots than by humans. That makes the web look like a big rainforest full of wild animals and dangerous insects trying to attack you whenever possible.

So what shall we do? How can we survive on an internet infested with bots?

Well, first of all, not all bots are bad. For instance, Googlebot, also known as a “spider,” was created by Google to search for new pages and add them to Google’s index. It crawls sites and creates a filing system of the whole internet, so it is not harmful at all. When it arrives at your site, the first thing it looks for is your robots.txt file, so make sure it is easy for bots to reach your robots.txt file, and be careful not to block any pages of your site that you should not block. After that, Googlebot needs access to all the important pages of your site, and since it does not crawl DHTML, Flash, Ajax, or JavaScript content well, you should not rely on these technologies for the important pages of your site. Also, in order to crawl your site effectively, Googlebot needs a good, logical internal linking structure, so you had better check yours. If you want to see how efficiently Googlebot is working on your site, go to Webmaster Tools -> Crawl and analyze the statistics on errors, sitemaps and blocked URLs.
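To give an idea of what Googlebot expects to find, here is a minimal robots.txt sketch (the directives themselves are standard; the /private/ path and the sitemap URL are hypothetical placeholders for your own):

    User-agent: *
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml

Be careful with Disallow rules: a stray “Disallow: /” would tell crawlers to stay away from your entire site.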

Unfortunately, not all bots are as harmless as Googlebot. There are four types of harmful bots that may cause you a lot of trouble with your site if you are not careful enough.

Scrapers are bots that steal and duplicate content, such as email addresses or personal information, from sites like message boards, e-commerce sites, airline sites, etc. These bots grab your RSS feed, so they know whenever you post something on your site, and usually you do not even notice it. But duplicate content can get your site penalized and can even make it disappear from search engine rankings.

To find out if your site has been attacked by a scraper, you can use a service like Copyscape to detect duplicate content. Another option is WordPress’s trackback feature, which allows you to create internal links within your own content and see which sites are using it. Thus, if you see that your content is being used by some spam site without your permission, you can file a DMCA complaint with Google.

If you know the IP address of the scraper attacking your site, you can simply block it. You must be very careful here, as a mistake can damage your own site, so if you do not know how to do it, you should request a web developer’s assistance.
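As a rough sketch, assuming your site runs on Apache (where .htaccess rules apply) and using 192.0.2.15 as a placeholder for the scraper’s real address, the block could look like this:

    Order Allow,Deny
    Allow from all
    Deny from 192.0.2.15

This is the classic Apache 2.2 syntax; on Apache 2.4 the equivalent directive would be “Require not ip 192.0.2.15”.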

Hacking Tools. These attack sites and servers that deal with e-commerce. The targets of hacking tools are credit card numbers and other personal information. It goes without saying that if your site is attacked by such a bot, your clients will lose their money, and you will lose clients and face serious problems.

In order to protect your site from being attacked by hacking tools, you need to make some small, basic modifications to your .htaccess file (normally found in the public_html directory). A starter list of common hacking bots is shown below; copy and paste rules like these into the .htaccess file to block such bots from accessing your site. You can add bots to the list, remove them, and modify the list as necessary.
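As a hedged example, assuming Apache with mod_setenvif enabled, the following rules deny any request whose User-Agent header matches a known scraping or attack tool (the user-agent strings here are common examples, not an authoritative list, and bots can fake their user agent):

    # flag requests from known bad tools
    SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
    SetEnvIfNoCase User-Agent "HTTrack" bad_bot
    SetEnvIfNoCase User-Agent "Nikto" bad_bot
    SetEnvIfNoCase User-Agent "sqlmap" bad_bot

    # deny everything flagged above, allow everyone else
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot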

Spammers. Oh, spammers. These bots fill sites with all types of informational garbage, which may not only cause people to stop visiting your site, but may also get your site blocked in search results, destroying all your hard work and everything you have achieved with your site.

In order to protect your site from spambots, it is a good idea for those using WordPress to install Akismet, a spam-protection service, use a good security plugin, and set up automatic backups of your database. Also, you could require legitimate registration with CAPTCHAs for all visitors who want to comment or reply on your site. Finally, follow wordpress.org to learn what’s new in the world of security.
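If you manage your site from the command line, a quick sketch with WP-CLI (assuming it is installed on your server; Wordfence is used here only as an example of a security plugin) might look like this:

    # install and activate Akismet for comment-spam filtering
    wp plugin install akismet --activate

    # install and activate a security plugin (Wordfence as an example)
    wp plugin install wordfence --activate

    # export a quick database backup before making further changes
    wp db export backup.sql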

Click Fraud bots make your advertisements useless by repeatedly “clicking” on them, which means that you waste your entire advertising budget without getting any response from a real customer. Not only will this make you lose money, it will also deprive your site of credibility, making it harder to find customers in the future.

What should you do? Download and install the Google AdSense Click Fraud monitoring plugin, which counts all the clicks on your ad and blocks the IP of the bot or person clicking on the ad when the number of clicks exceeds a certain threshold. This plugin blocks a list of known bot IP addresses as well. It should be said here that this plugin is specifically for AdSense customers; AdWords customers cannot install it on their sites. Unfortunately, this plugin, however useful, cannot protect your site from DDoS attacks, but there are tech security companies offering anti-DDoS tools and services.

Always remember that in order to make your site work successfully, you must pay close attention to it at all times. This is the only way to defend your site on a web swarming with wicked and dangerous bots.