Index coverage issues
It is very common to see index coverage issues on new websites or blogs. If you have received a message in Search Console that says "New Index coverage issue detected for site yourdomain", then there are a couple of things you need to check.
Check the robots file
If the message says "Indexed, though blocked by robots.txt", you need to check the robots.txt file and see whether it is allowing crawlers to crawl and index the website. If you have specified search engines in the robots file, check whether you have disallowed any URL that you are not supposed to disallow. People commonly disallow code and resource directories to keep things clean for crawlers, but this practice can cause problems with website indexing. Take the example of the robots file on my blog.
I have disallowed a few extensions, but the rest of the URLs are allowed and no extra parameters are added. Try to be specific with the robots file and only use Disallow when it is required. Using the Disallow directive unnecessarily will create problems with website indexing.
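You can sanity-check robots rules before relying on them. Below is a minimal sketch using Python's standard urllib.robotparser; the sample rules and URLs are illustrative assumptions, not the actual robots file from any particular site.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: block an admin directory, allow everything else.
# These rules are an example only, not taken from a real site.
robots_txt = """User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A normal blog post URL should be crawlable...
print(parser.can_fetch("Googlebot", "https://example.com/sample-blog-post.html"))
# ...while anything under the disallowed directory is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))
```

If the second call unexpectedly returns False for a URL you want indexed, that rule is the one to fix before moving on to the next step.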
Use Fetch as Google
This is the next step if everything is fine in your robots file. You can use this feature to request indexing for a particular URL that crawlers are not able to index. Open Search Console, click Crawl in the left-hand menu, and choose "Fetch as Google". Enter the URL in the box and click Fetch. Once the fetch completes, click "Request indexing".
If the problem persists, you can add a 301 redirect to send the URL to another location and then repeat step 2 (Fetch as Google). Adding a redirect is the final step in this situation and should solve your indexing problem. But use the code wisely: any change you make to the .htaccess file affects your entire website/blog. The directive for a 301 redirect is "Redirect 301 /sample-blog-post.html https://example.com".
PS: Here example.com stands for the destination; use your own location.
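For reference, in an .htaccess file the Apache syntax puts the destination directly after the source path, with no "to" in between. The paths and domain below are placeholders; substitute your own:

```apache
# Permanently redirect the old post to a new location.
# /sample-blog-post.html and example.com are placeholders.
Redirect 301 /sample-blog-post.html https://example.com/new-blog-post.html
```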
Once you are done with the redirection, use Fetch as Google, submit your sitemap, and have some patience. Your index coverage issues should be resolved after this.
To check the index status, you can use the new Search Console, which has a handy feature for exactly this.
Open the new Search Console, click "URL inspection", and paste the URL in the box; it will tell you whether the URL is on Google or not. You can also try the old trick.
Open Google, type info:your-website-url, and hit Enter. You will get the index status, and if you open the cached option you can also see the crawl date for that URL.
In the new Search Console you can inspect a URL by entering it in the inspection box. You can test the URL's performance and request indexing. If the live test reports errors, work on them and get rid of as many as possible before requesting indexing.
On May 28, 2020, Google announced the Core Web Vitals update, which is designed to measure URL performance and help you get rid of indexing issues. The short summary of the update is that webmasters can use PageSpeed Insights and Lighthouse to test their URLs and work through the suggestions to improve URL performance.
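PageSpeed Insights can also be queried programmatically via its public v5 API. The sketch below only builds the request URL (no network call is made here); the endpoint is Google's documented one, while the target page and the choice of helper name are illustrative assumptions:

```python
from urllib.parse import urlencode

# Public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile"):
    """Build a PSI API request URL for the given page (no network call)."""
    query = urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

# Print the request URL you would then fetch with curl or a browser.
print(psi_request_url("https://example.com/sample-blog-post.html"))
```

Fetching that URL returns a JSON report with the same performance suggestions you see in the PageSpeed Insights web UI.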
You can check the official announcement here.