To help those Web site owners out, I've written a fairly simple PHP script that generates dynamic Google XML Sitemaps as well as pseudo-static HTML site maps from a single set of page data. Both the XML sitemap and the viewable version pull their data from a plain text file, to which the site owner or Web designer adds one line per page after updates.
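For illustration, the data file could be as simple as one fully qualified URL per line (the exact format here is my assumption, not necessarily the script's actual layout):

```
http://www.example.com/
http://www.example.com/about.html
http://www.example.com/blog/simple-sitemaps.html
```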
The Google XML Sitemap is a PHP script that reflects the current content of the text file on each request. As a side effect, it writes a static HTML site map page to disk. Since Googlebot downloads XML sitemaps every 12 hours like clockwork, the renderable site map gets refreshed at least twice per day.
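Here is a rough sketch of how such a script might work; this is not the actual Simple Sitemaps code, and the file names `pages.txt` and `sitemap.html` are my assumptions:

```php
<?php
// Hypothetical sketch: build an XML sitemap and an HTML site map
// from one list of URLs (one URL per line in a plain text file).

// Turn the lines of the data file into a Google XML sitemap string.
function build_xml_sitemap(array $urls): string {
    $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($urls as $url) {
        $xml .= '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
    }
    return $xml . '</urlset>' . "\n";
}

// Build the viewable HTML version as a plain list of links.
function build_html_sitemap(array $urls): string {
    $html = "<ul>\n";
    foreach ($urls as $url) {
        $e = htmlspecialchars($url);
        $html .= '  <li><a href="' . $e . '">' . $e . "</a></li>\n";
    }
    return $html . "</ul>\n";
}

// On each request: read the data file, serve the XML to the crawler,
// and refresh the static HTML site map on disk as a side effect.
$urls = file('pages.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: array();
header('Content-Type: application/xml');
echo build_xml_sitemap($urls);
file_put_contents('sitemap.html', build_html_sitemap($urls));
?>
```

This is why the HTML version refreshes whenever Googlebot fetches the XML: the HTML file is rewritten on every request for the sitemap script.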
The site owner or Web designer just needs to change a simple text file on updates, and after the upload the next Googlebot visit recreates both site maps. Ain't that cute?
Curious? Here is the link: Simple Sitemaps 1.0 BETA
Although this free script provides a pretty simple sitemap solution, I wouldn't use it with Web sites containing more than 100 pages. Why not? Site map pages carrying more than 100 links may devalue those links. On the average Web server my script will work with hundreds of pages, but from an SEO's point of view that's counterproductive.
Please download the script and tell me what you think. Thanks!
Tags: Search Engine Optimization (SEO) Google Sitemaps