Here's the problem I'm trying to solve: I have a dynamic, PHP-driven website that is constantly being updated with new content, and I want my XML sitemap to stay up to date automatically. I see two options:

  1. Write a PHP script that queries my database for all my content and writes the result to a static file at http://mysite.com/sitemap.xml, then execute the script regularly via a cron job.
  2. Simply create my sitemap as a PHP file (sitemap.php) that queries the db and outputs the XML directly, and use an .htaccess rewrite rule, RewriteRule ^sitemap.xml$ sitemap.php (shown just below), so that whenever someone requests sitemap.xml the request is handled by the PHP script and they get a fresh sitemap.
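
For reference, the .htaccess for option #2 would contain just the following (the RewriteEngine line, the [L] flag, and the escaped dot are small additions of mine):

    RewriteEngine On
    RewriteRule ^sitemap\.xml$ sitemap.php [L]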

I'd much rather go with option #2, since it's simpler and doesn't require setting up a cron job, but I'm wondering whether Googlebot will fail to recognize sitemap.xml as valid if it's actually a PHP file.

Does anyone know if option #2 would work, and if not, whether there's some better way to automatically keep a sitemap.xml file up to date? I'm really surprised how much trouble I've had with this... Thanks!

Comments

While you can easily generate it with a script, as the others have said, I would cache the result for a limited amount of time to avoid unnecessary overhead (see the sketch below).

Written by Wrikken
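
A minimal sketch of that caching idea, assuming a writable cache/ directory next to the script; the path, the one-hour TTL, and the build_sitemap() helper (which would do the actual db work) are all hypothetical:

    <?php
    // sitemap.php -- serve a cached copy if it is fresh enough,
    // otherwise rebuild it from the database and cache the result.
    $cacheFile = __DIR__ . '/cache/sitemap.xml'; // assumed writable location
    $ttl       = 3600;                           // arbitrary one-hour lifetime

    header('Content-Type: application/xml; charset=utf-8');

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        readfile($cacheFile); // still fresh: serve the cached XML as-is
        exit;
    }

    $xml = build_sitemap(); // hypothetical helper that queries the db
    file_put_contents($cacheFile, $xml, LOCK_EX);
    echo $xml;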

Accepted Answer

Just make sure your script sends the appropriate Content-Type header (e.g. application/xml). You can do so with header().
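
For example, a bare-bones sitemap.php along those lines; the PDO credentials and the pages table with slug and updated_at columns are placeholders for whatever your schema actually looks like:

    <?php
    // Send the XML Content-Type before any output, so Googlebot
    // sees a proper XML document rather than text/html.
    header('Content-Type: application/xml; charset=utf-8');

    // Placeholder connection details -- substitute your own.
    $pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

    echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

    // Assumed schema: a "pages" table with "slug" and "updated_at" columns.
    foreach ($pdo->query('SELECT slug, updated_at FROM pages') as $row) {
        printf("  <url><loc>%s</loc><lastmod>%s</lastmod></url>\n",
            htmlspecialchars('http://mysite.com/' . $row['slug'], ENT_XML1),
            date('Y-m-d', strtotime($row['updated_at'])));
    }

    echo '</urlset>' . "\n";

Because the rewrite rule serves this under the sitemap.xml URL, Googlebot never sees the .php extension; the Content-Type header is what tells it it's getting XML.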
