Main Website with Subdirectory = 2 Robots & Sitemaps respectively?

I have a website, example.com, and a subdirectory at example.com/blog/ (I treat them as two separate websites).

The main website, example.com, is a service-based site and not a WordPress site, so I created example.com/robots.txt and example.com/sitemap.xml manually, without any plugins, and submitted them in Search Console.
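For reference, a minimal hand-written robots.txt of this kind might look roughly like the following; the rules and URL here are only placeholders, not my exact file:

    User-agent: *
    Disallow:

    Sitemap: https://example.com/sitemap.xml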

I also created a robots.txt and sitemap for the subdirectory, which is a WordPress site, at example.com/blog/robots.txt and example.com/blog/sitemap.xml, using the Yoast plugin, and submitted them in Search Console as well.

Can the main site and its subdirectory have separate robots.txt and sitemap files, or do we need to use only one robots.txt and one sitemap for the main site (or the subdirectory)?

Also, instead of these, can we reference both the main-site sitemap and the subdirectory sitemap from the main site's robots.txt?
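What I have in mind would look roughly like this (placeholder rules and URLs only; I am not sure whether this is valid, which is part of my question):

    User-agent: *
    Disallow:

    # Both sitemaps referenced from the root robots.txt (illustrative only)
    Sitemap: https://example.com/sitemap.xml
    Sitemap: https://example.com/blog/sitemap.xml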

Here are the solutions:

We have many solutions to this problem, but we recommend the first one because it is a tried-and-tested approach that should work for you.

Solution 1

For a construction like yours, it is recommended to have separate Google Search Console (GSC) properties, separate robots.txt files, and separate sitemaps. That way you can configure every part of your site flexibly enough.

But don't leave anything out: a GSC property for the main site AND one for the blog, a robots.txt for each, and a sitemap for each. Each robots.txt and sitemap should be submitted in its respective GSC property. That way you are on the safe side.
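As an aside, if a single submission point is ever wanted on top of the two separate sitemaps, the sitemaps protocol also supports a sitemap index file at the root that references both of them. A sketch with placeholder URLs (the filename sitemap_index.xml is just an example) could look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative sitemap index; the URLs below are placeholders -->
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://example.com/sitemap.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://example.com/blog/sitemap.xml</loc>
      </sitemap>
    </sitemapindex>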

Note: Use and implement Solution 1, because this method has been fully tested on our system.
Thank you 🙂

All methods were sourced from stackoverflow.com or stackexchange.com and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, or CC BY-SA 4.0.
