What is Webmaster Tool?
In simple terms, Google Webmaster Tools is an interface between the website owner and the Google crawler/bot. The Google bot crawls the website and reports, in the Webmaster Tools, any activity it notices on the website that goes against Google's search norms, so that the website owner can take action. Apart from this, the tool lets the owner control and communicate several aspects of the site to the search bots, and it gives data and insights into search activity related to the business/web pages. Webmaster Tools is explained below in detail.
Tool: google.com/webmasters/tools, a free tool from Google.
To have a Webmaster Tools account for any website, the website owner has to sign up and submit the website; Google then provides a verification file. The verification file is uploaded to the website's root directory on the hosting server, so that ownership is confirmed and tracking of all the webpages becomes possible.
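Besides the file-upload method described above, Google also supports verifying ownership by placing a meta tag on the home page. A minimal sketch (the content value below is a placeholder; Google generates a unique token per site and owner):

```html
<!-- Placed inside the <head> of the site's home page. -->
<!-- The content value is a placeholder; Google issues the real token. -->
<head>
  <meta name="google-site-verification" content="your-unique-token-here" />
</head>
```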
Put simply, if a text or image is unclear on the website, anybody can notice it and the website owner can fix it. But the components that crawlers/bots see on a website are not visible to anybody except the crawler itself. The crawler maintains a record of each and everything, and by creating an account we can access all the crawl details. The great feature here is that we can control and communicate with the crawl/bot behaviour.
Features or components of Webmaster Tool:
Messages: Whenever something goes wrong with the website, the Google team sends us a message to let us know.
Note: We receive messages only if the bots crawl the website and notice the issue.
Webmaster Tools does not send messages in real time; there will be a delay of at least two to three days.
For example, if your server is down for 5 minutes and during that time no bot reached any of the webpages, you will not receive any message.
If a bot does crawl and notices the webpages are not accessible, you will be intimated with a message, but only after two or three days.
Messages are sent only for prolonged or high-impact issues that can definitely harm the website's performance. The alerts usually sent are about external spam links, too many status errors (including all server issues), phishing notifications, etc.
Example of a webmaster message:
Dear site owner or webmaster of http://example.com/,
We recently discovered that some pages on your site look like a possible phishing attack, in which users are encouraged to give up sensitive information such as login credentials or banking information. We have removed the suspicious URLs from Google.com search results and have begun showing a warning page to users who visit these URLs in certain browsers that receive anti-phishing data from Google.
Below are one or more example URLs on your site which may be part of a phishing attack:
Here is a link to a sample warning page: http://www.google.com/interstitial?url=http://www.example.com/~alldata2/datass/contact/bank.barclays.co.uk/
We strongly encourage you to investigate this immediately to protect users who are being directed to a suspected phishing attack being hosted on your web site. Although some sites intentionally host such attacks, in many cases the webmaster is unaware because:
1) the site was compromised
2) the site doesn’t monitor for malicious user-contributed content
If your site was compromised, it’s important to not only remove the content involved in the phishing attack, but to also identify and fix the vulnerability that enabled such content to be placed on your site. We suggest contacting your hosting provider if you are unsure of how to proceed.
Once you’ve secured your site, and removed the content involved in the suspected phishing attack, or if you believe we have made an error and this is not actually a phishing attack, you can request that the warning be removed by visiting this page, and reporting an “incorrect forgery alert.” We will review this request and take the appropriate actions.
Apart from messages, Webmaster Tools notifies you to check on several aspects when a major change is noticed. The change can be related to organic search traffic, indexed pages, external links, and sometimes a technical error impacting SEO.
Search Appearance: The appearance of our website on the search results page can be customized here, through the different components described below.
Structured Data: The data types implemented on the website, along with their status and errors, are found here. In the image below we can see there were no data types on the website up to April 2014; in April, two data types were implemented, one being Breadcrumb and the other Image Object (image schema).
The bots observe and record these pages, as can be seen in the graph below; currently the number of pages with structured data found is 445851.
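For reference, a breadcrumb like the one mentioned above can be expressed as schema.org markup. A minimal JSON-LD sketch with hypothetical URLs (the same information can also be marked up as microdata or RDFa):

```json
{
  "@context": "http://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": { "@id": "http://example.com/clothing", "name": "Clothing" }
    },
    {
      "@type": "ListItem",
      "position": 2,
      "item": { "@id": "http://example.com/clothing/shirts", "name": "Shirts" }
    }
  ]
}
```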
Data Highlighter: An interface which allows the website owner to highlight, i.e. communicate to the search engines, specific data types to show to users, such as ratings, reviews, author name, price, event timing, etc.
In the example below, the website owner has highlighted the user rating, the number of reviews, the price, and the product availability.
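The Data Highlighter itself is a point-and-click tool, so no code is required, but the information it communicates corresponds to schema.org product markup. A sketch in JSON-LD with made-up values, covering the rating, review count, price, and availability mentioned above:

```json
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "49.99",
    "priceCurrency": "USD",
    "availability": "http://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.3",
    "reviewCount": "27"
  }
}
```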
HTML Improvements: An interface where duplicate content on the website, and content whose length in characters doesn't follow the norms, is recorded, so that the website owner can explore and fix the issue.
Sitelinks: Any user searching for a website will see some related links/webpages below the main result, and these can be controlled. Let's say we don't want Google to show the 4th sitelink, "Images"; it can be blocked under the Sitelinks section.
Note: Sitelinks which are not required can be blocked, but which ones show up is purely a search engine decision, based on traffic to those webpages in relation to user interest/activity on the page (time spent), CTRs for the page, user satisfaction, season of the product search, etc.
Search Traffic: This is a data-rich section where you can see all the search terms/queries for which the website is shown on search engines; from where else the website is accessible (external links), whether reached by bots or by users visiting the website; and how bots navigate internally once they reach any of the webpages.
Search terms: Available by device type (mobile and web) and result type (mobile, web, images).
Here you can see data such as the number of times the website showed up for a search, the number of visitors from that search term, the average position, and calculated data like the CTR.
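The calculated columns are simple arithmetic on the raw counts. A small Python sketch (the query data below is hypothetical) showing how CTR is derived from impressions and clicks:

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks as a percentage of impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# impressions = times the site appeared for the query,
# clicks = visitors who clicked through, position = average rank.
# All values below are made up for illustration.
queries = [
    {"query": "buy widgets", "impressions": 2000, "clicks": 150, "position": 3.2},
    {"query": "widget price", "impressions": 500, "clicks": 60, "position": 1.8},
]

for q in queries:
    print(f'{q["query"]}: CTR {ctr(q["clicks"], q["impressions"]):.1f}%')
# buy widgets: CTR 7.5%
# widget price: CTR 12.0%
```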
Sitemaps: The valid webpages which we intend the user to access through search engines, and their status, are shown here.
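Those intended pages are communicated to Google through a sitemap file submitted in this section. A minimal XML sitemap sketch with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/products/widget</loc>
    <lastmod>2014-04-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```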
Index Status: The pages that have been crawled by search engines and indexed to serve internet searches, and their number, are seen here.
Content Keywords: The top variations of words used most often in the content, as crawled by the bots, are shown here as the content keywords.
Remove URLs: Webpages which have to be removed from the index are submitted here as a request to the Google team.
Crawl: This section allows you to take action on the crawl rate/frequency of the website.
One can ask the bots to crawl a specific set of webpages/URLs (Fetch as Google).
A robots.txt file informs the bots not to crawl a specific set of webpages.
You can also check the URLs blocked by robots.txt.
There is also a URL Parameters section where we can specify a parameter to block the URLs containing it.
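Crawl-blocking directives of this kind live in a single robots.txt file at the site root. A minimal sketch with a hypothetical path and a hypothetical parameter name (the `*` wildcard on the last line is a Googlebot-supported extension, not part of the original robots exclusion standard):

```
User-agent: *
Disallow: /private/
Disallow: /*?sessionid=
```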
Other Resources: A section where you can look for more support tools to enrich your website's appearance in internet searches.