
http://beginlinux.com

A few tips I put together while re-creating the robots.txt file on my Linux web server. The robots.txt file provides crawling instructions to web robots using the Robots Exclusion Protocol. When a web robot visits your site, it checks this file to discover which directories or pages you want excluded from its index. It is an important file for SEO and can help your search engine rankings.
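As a sketch, a minimal robots.txt placed in the web server's document root might look like the example below. The comments explain each directive; the disallowed directories and the sitemap URL are placeholders, not paths from the original article:

    # Apply the following rules to all robots
    User-agent: *

    # Keep robots out of these (hypothetical) directories
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Disallow: /private/

    # Optional: point robots at your sitemap (placeholder URL)
    Sitemap: http://www.example.com/sitemap.xml

The file must be reachable at the root of the site (e.g. http://yourdomain/robots.txt); well-behaved robots fetch it before crawling any other page.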

Created by aweber 12 years 20 weeks ago – Made popular 12 years 20 weeks ago
Category: Beginner
