Whenever we discuss the SEO of a blog, the WordPress robots.txt file plays a vital role in how Google crawls and indexes it. It tells bots which parts of your website and blog content to crawl and index for search engine optimization. A misconfigured robots.txt file can cause your rankings to drop or your site to disappear from search results entirely. So whenever you change your robots.txt file, make sure it is well optimized and does not block access to important parts of your blog.
What is Robots.txt and How To Optimize Robots.txt For WordPress Site?
- Robots.txt is a plain text file that serves as the entry point for any webmaster bot, telling it which pages and directories should be crawled and which should be skipped.
- The robots.txt file tells search engines like Google which parts of your blog to crawl and index, and which parts to skip.
- When search engine bots visit your blog, they first look for robots.txt and then start their process of crawling and indexing.
- If you are running your blog on WordPress, robots.txt lives in the root folder of your site on the server (the public_html directory).
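To see how a crawler interprets these rules, you can test a robots.txt file locally with Python's standard-library `urllib.robotparser`. The rules and the `example.com` URLs below are just placeholders; note that Python's parser applies rules in file order (unlike Google, which uses longest-match), so the `Allow` line comes first here:

```python
from urllib.robotparser import RobotFileParser

# Sample rules similar to what a WordPress site might serve.
# Allow is listed before Disallow because urllib.robotparser
# applies rules in order, not by longest match.
rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Admin pages are blocked, normal posts and the AJAX endpoint are not.
print(parser.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(parser.can_fetch("*", "https://example.com/2020/01/my-post/"))         # True
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
```

This is a quick sanity check before you upload a new robots.txt, so a typo does not silently block your whole blog.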
How to Verify Changes in Your Robots.txt File?
- Go to Google Webmaster Tools: https://www.google.com/webmasters/.
- Click on Crawl > Fetch as Google.
- Click the Fetch button and then request indexing.
- You can see all the posts you have submitted for crawling and indexing.
Default Robots File (robots.txt) For WordPress
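As a reference, a default WordPress install typically serves a virtual robots.txt along these lines (your exact output may differ by WordPress version):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

This blocks the admin area while still letting bots reach the AJAX endpoint that some front-end features depend on.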
SEO Optimized Robots File (robots.txt)
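One common way bloggers optimize this file is to also block internal search results and point bots at the sitemap. A sketch (replace `example.com` with your own domain and sitemap URL):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /?s=

Sitemap: https://example.com/sitemap.xml
```

The `Disallow: /?s=` line keeps WordPress search-result pages out of the index, and the `Sitemap` line helps crawlers discover all your posts.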
Tips You must know about Robots.txt
- It must be placed in the website's root directory, or it will not work.
- The file name is case sensitive: it must be robots.txt, not Robots.txt.
- Each sub-domain needs its own robots.txt file.
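The first and third tips boil down to one rule: crawlers only ever look for robots.txt at the root of each host. A small sketch using Python's standard-library `urllib.parse` (the URLs are hypothetical) shows where a bot would look for any given page:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL a crawler would check for page_url.
    Bots only look at /robots.txt on the root of each (sub-)domain."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# The path is ignored; each sub-domain resolves to its own file.
print(robots_url("https://blog.example.com/2020/01/my-post/"))
print(robots_url("https://shop.example.com/products?id=1"))
```

A file uploaded to a sub-directory such as `/blog/robots.txt` is simply never fetched, which is why misplaced files silently do nothing.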
Hope this helps! If you have any questions, feel free to ask them in the comments.