Google has added some features to Sitemaps.
First off, you can now check your robots.txt file in Sitemaps. You can see whether your robots.txt file is blocking specific URLs, and why. Danny Sullivan points out that this feature mainly exposes how the same robots.txt syntax is interpreted in different ways by different search engines, and that we need to standardize things.
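You can approximate the same kind of check locally. As a rough sketch (not Google's tool, and using a hypothetical example.com domain), Python's standard-library `urllib.robotparser` will tell you whether a given robots.txt blocks a specific URL for a specific user agent:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt that blocks one directory for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot falls back to the "User-agent: *" rules here.
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

Note that this reflects Python's interpretation of the syntax, which, as Danny Sullivan's point suggests, may differ from how individual search engines read the same file.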
From the Sitemaps blog, here's the analysis of the robots.txt:
And the URL testing:
They've also released a feature that shows the most common words used on your site and in anchor text pointing at it. Here are mine: