Sitemap for a site with a large number of dynamic subdomains
I'm running a site that allows users to create their own subdomains. I'd like to submit these user subdomains to search engines via sitemaps. However, according to the Sitemaps protocol (and Google Webmaster Tools), a single sitemap can include URLs from a single host only.
What is the best approach?
At the moment I have the following structure:
- A sitemap index located at example.com/sitemap-index.xml lists the sitemaps for each subdomain (but they are all located on the same host); a minimal XML sketch of this setup follows the list.
- Each subdomain has its own sitemap located at example.com/sitemap-subdomain.xml (this way the sitemap index includes URLs from a single host only).
- A subdomain's sitemap contains URLs of that subdomain only, i.e., subdomain.example.com/*
- Each subdomain has a subdomain.example.com/robots.txt file:
--
user-agent: *
allow: /
sitemap: http://example.com/sitemap-subdomain.xml
--
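For concreteness, here is a minimal sketch of that structure as sitemap XML. The subdomain name and page URL are placeholders; the file names follow the list above.
--
<?xml version="1.0" encoding="UTF-8"?>
<!-- example.com/sitemap-index.xml: one <sitemap> entry per user subdomain -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sitemap-subdomain.xml</loc>
  </sitemap>
</sitemapindex>
--
And a per-subdomain sitemap:
--
<?xml version="1.0" encoding="UTF-8"?>
<!-- example.com/sitemap-subdomain.xml: URLs of that subdomain only -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://subdomain.example.com/</loc>
  </url>
</urlset>
--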
I think this approach complies with the Sitemaps protocol; however, Google Webmaster Tools gives errors for the subdomain sitemaps: "URL not allowed. This URL is not allowed for a Sitemap at this location."
I've checked how other sites do it. Eventbrite, for instance, produces sitemaps that contain URLs from multiple subdomains (e.g., see http://www.eventbrite.com/events01.xml.gz). This, however, does not comply with the Sitemaps protocol.
What approach would you recommend for sitemaps?
I struggled through this and got it working. See the thread for more details.
Summary:
- Use DNS verification to verify the site and all its subdomains in one fell swoop.
- Make robots.txt on the subdomains point to the main sitemap on the www domain (sketched after this list).
- You may need to wait several days for Google to update its cached copies of robots.txt on the subdomains; it will still show errors until then.
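As a rough sketch of those two steps (the verification token, the www host name, and the exact sitemap file name are placeholder assumptions, not values from the answer): DNS verification is a TXT record on the root domain, e.g.
--
; proves ownership of example.com (and thereby its subdomains) to Google;
; the token value is a placeholder issued by Webmaster Tools
example.com.  IN  TXT  "google-site-verification=PLACEHOLDER_TOKEN"
--
and each subdomain serves a robots.txt that points at the main sitemap on the www domain:
--
# subdomain.example.com/robots.txt
User-agent: *
Allow: /
Sitemap: http://www.example.com/sitemap-index.xml
--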