It has tools that let webmasters:

* Submit and check a sitemap (a minimal sitemap file is sketched after this list)
* Check and set the crawl rate, and view statistics about how Googlebot accesses a particular site
* Generate and check a robots.txt file; it also helps discover pages that are inadvertently blocked by robots.txt (see the sample robots.txt below)
* List internal and external pages that link to the site
* See which keyword searches on Google led to the site being listed in the SERPs, and the click-through rates of those listings
* View statistics about how Google indexes the site, and whether it found any errors while doing so
* Set a preferred domain (e.g. prefer example.com over www.example.com or vice versa), which determines how the site URL is displayed in SERPs
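For illustration, here is a minimal robots.txt of the kind the tool generates and checks. The domain example.com comes from the text above; the /private/ path is a hypothetical placeholder. The `Sitemap:` directive is standard robots.txt syntax that tells crawlers where the sitemap lives:

```
# Hypothetical robots.txt for example.com
User-agent: *
Disallow: /private/

# Advertise the sitemap so crawlers can discover it automatically
Sitemap: https://example.com/sitemap.xml
```

And a minimal sitemap file in the standard sitemaps.org XML format, again using example.com and a placeholder date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```

Submitting such a file through the tool lets webmasters confirm that Google can parse it and see how many of the listed URLs were indexed.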