Search engine cloaking and stealth technology
By David Callan

Search engines play a very big part in whether company A or company B gets a visitor and potential customer. Webmasters and Internet marketers know this, so competition for search engine traffic is fierce. These days it's almost impossible to keep up with the search engines: one day your site could be near the top, the next your competition could be there and you could be gone from the results completely.

One particular method, however, is being used by webmasters to enable their sites to rank high and stay high. The method is highly controversial and risky. It's called search engine cloaking: serving one page to human visitors and a different page to search engine spiders requesting the same address. In other words, browsers such as Internet Explorer, Netscape and Opera are served one page, while spiders visiting the same address are served a different one. The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it will be configured exactly the way the search engines want it to be for it to rank high. These 'ghost pages' are never actually seen by any real person, except of course the webmasters that created them. When real people visit a site using cloaking, the cloaking technology, which is usually based on Perl/CGI, sends them to the real page that looks good and is just a regular webpage.

The cloaking script is able to tell the difference between a human and a spider because it knows the spiders' IP addresses. Every request carries the visitor's IP address, so when an IP address visits a site which is using cloaking, the script compares that address with the IP addresses in its list of known search engine IPs. If there's a match, the script knows that it's a search engine visiting and sends out the bare-bones HTML page set up for nothing but high rankings. Once a list of all the search engine spiders' IP addresses has been stored, it's simply a case of writing a script that says something like:

If IP request = google(Spider IP) then show googlepage.html

This means that when the Google spider comes to visit a site, it'll be shown a page that is optimized with keywords, heading tags and optimized content. Since the optimized page is never seen by a casual user, design is not an important issue. When a user comes to the site, the server performs the same check and, finding that the IP address does not match any in its list, shows the standard page.

Search engine cloaking is also a great way of protecting the source code that's enabling you to rank high on the search engines.
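The IP lookup described above can be fleshed out as a minimal sketch. This is illustrative only: the IP addresses, the list itself and the file names (googlepage.html, index.html) are hypothetical stand-ins, not a real spider list.

```python
# Minimal sketch of IP-based cloaking.
# The addresses and file names below are hypothetical placeholders.
SPIDER_IPS = {
    "66.249.66.1",   # placeholder for a Googlebot address
    "157.55.39.10",  # placeholder for a Bingbot address
}

def page_for(ip):
    """Return the optimized page for known spider IPs, else the regular page."""
    if ip in SPIDER_IPS:
        return "googlepage.html"  # bare-bones page built purely for rankings
    return "index.html"           # the normal page human visitors see

page_for("66.249.66.1")  # a listed spider IP gets the optimized page
page_for("203.0.113.7")  # any other visitor gets the regular page
```

In practice such a script would sit behind the web server (the article's Perl/CGI setup works the same way) and would need its IP list kept constantly up to date, since search engines add and change spider addresses.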
Ever read a search engine ranking tutorial that recommends you model your keyword density, layout, etc. on pages that are already ranking high? Well, technically that's stealing, and your competition might want to do it to you some day. With search engine cloaking, however, you can protect your code, because when your competition visits they'll be sent to the regular page and not the page that's giving you those precious good rankings.

Different types of search engine cloaking

User Agent cloaking is similar to IP cloaking, in the sense that the cloaking script compares the User Agent text string which is sent when a page is requested with its list of search engine User Agent names, and then serves the appropriate page.

Conclusion

I would recommend you read my SEO tutorial entitled Search engine optimization guide and try regular search engine optimization first. If after a few months you're still not seeing good results, then you should at least consider using cloaking technology to improve your rankings.
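The user-agent comparison described above can be sketched the same way as the IP check. The bot name substrings are real crawler tokens, but the list is illustrative and the file names are hypothetical; a case-insensitive substring match stands in for whatever comparison a real cloaking script would use.

```python
# Sketch of User Agent cloaking: match the User-Agent request header
# against known spider names instead of IP addresses.
SPIDER_AGENTS = ("Googlebot", "Slurp", "bingbot")  # illustrative list

def page_for_agent(user_agent):
    """Serve the optimized page if the User-Agent looks like a spider."""
    ua = user_agent.lower()
    if any(name.lower() in ua for name in SPIDER_AGENTS):
        return "spiderpage.html"  # hypothetical page built for crawlers
    return "index.html"           # regular page for ordinary browsers

page_for_agent("Mozilla/5.0 (compatible; Googlebot/2.1)")  # spider
page_for_agent("Mozilla/5.0 (Windows NT 10.0) Firefox")    # human visitor
```

Note that user-agent strings are trivially faked (any browser or tool can claim to be Googlebot), which is one reason IP-based cloaking is considered the harder variant to defeat.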