SEOToolSet Tool Review: Check Server Tool
SEO is not a profession for the weak-kneed. The job requires constant attention, continuous monitoring, on-demand creativity and a passion for problem solving. Thankfully, there are more than a few tools available that make the tasks of SEO go by a little quicker, if not easier.
As you may know, at Bruce Clay, Inc. we’re happy to offer our SEOToolSet to search marketers looking for solutions to necessary but tedious day-to-day tasks. At SEOToolSet.com, you can learn about the various tools available, including brief descriptions of what each does. However, our tools continue to grow and mature. We’re always developing new tools and improving our current tools, so we thought it would be useful to break down the use case, functionality and how-to for each tool in a series here on the blog.
First up: the Check Server Tool.
Why Check Your Server
Every Web site relies on a Web server to deliver pages to the users and spiders traversing the site. When a page is requested, the server receives that request and responds with the content of that page; that response is what your browser displays. A site couldn't function without a Web server dedicated to serving it up on request, so it's very important to the health of an online business that the Web server does what the webmaster expects it to do.
There are several tools a webmaster can use to tell a search engine spider how to behave when it requests pages on your site. For instance, a robots.txt file indicates specific pages that are not intended for search engine indexing. Also, redirect commands in place for individual pages or whole domains tell the server to direct the requestor to a different location to get the content they want. If either of these tactics is implemented incorrectly, or without the webmaster's knowledge, a webmaster or SEO could see some unexpected results. Another problem that can be traced back to a site's server is IP blocking. Some sites reside on shared servers, and if a site sharing the server is caught performing spam tactics, the entire IP, and all the sites that share it, could be put on a blocklist.
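Before reaching for a tool, it helps to see how simple some of these checks are to reproduce. As a rough illustration (not how the Check Server Tool itself works), the short Python sketch below uses the standard library's robots.txt parser to ask whether a given crawler is allowed to fetch a given page; the site URLs and the Googlebot user agent are placeholders you'd swap for your own.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs and user agent for illustration only.
robots_url = "https://www.example.com/robots.txt"
page_url = "https://www.example.com/private/report.html"
user_agent = "Googlebot"

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt file

if parser.can_fetch(user_agent, page_url):
    print(f"{user_agent} is allowed to crawl {page_url}")
else:
    print(f"{user_agent} is disallowed from crawling {page_url}")
```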
What the Check Server Tool Checks
Understanding the behavior of a site’s Web server is critical to the greater health and efficiency of the site. That’s why a tool like the SEOToolSet Check Server Tool, which analyzes the status of any potential server issues, is a great help. The Check Server Tool:
- Checks a site’s server for a shared IP and provides a way for you to check blocklists (a rough do-it-yourself version of this check is sketched after this list).
- Shows a site’s robots.txt file, which reports any pages or agents that have been disallowed.
- Reports a variety of information found in the server header returned for the page, including the page status, the content type and, if the status is a redirect, the new location.
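For the first item on that list, a bare-bones approximation looks something like the sketch below. It resolves the site's IP address and queries one well-known DNS blocklist; the zen.spamhaus.org zone is just an example of such a list, and the Check Server Tool may consult different sources.

```python
import socket

# Hypothetical hostname for illustration.
hostname = "www.example.com"

# Resolve the site's IP address; other sites resolving to the same IP share the server.
ip = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip}")

# DNS blocklists are typically queried by reversing the IP's octets and appending
# the blocklist zone. zen.spamhaus.org is one well-known list, used here only as
# an example.
reversed_ip = ".".join(reversed(ip.split(".")))
query = f"{reversed_ip}.zen.spamhaus.org"

try:
    socket.gethostbyname(query)  # a successful lookup means the IP is listed
    print(f"{ip} appears on the blocklist")
except socket.gaierror:
    print(f"{ip} is not listed")
```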
What to Look for in the Check Server Tool Report
After you input the address of the page you’d like to analyze, you’ll receive a report. The top of the report has a table that offers a quick overview of important results.
You can see which page has been analyzed, as well as the overall domain and the specific file where the page resides. Below that, the IP address is listed, along with the result of pinging the page; a successful ping indicates the site is active and online. Under that you'll see whether the page has been disallowed from search engine indexing and whether any user agent has been disallowed from visiting the site. In the case seen above, this page is closed to anyone and everyone for indexing.
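If you want a quick reachability check of your own, something like the following sketch will do. It isn't a true ICMP ping, just a TCP connection attempt against a hypothetical host, but it answers the same basic question: is the server up and responding?

```python
import socket

# Hypothetical host for illustration; substitute your own domain.
host = "www.example.com"

try:
    # Attempt a TCP connection to the HTTPS port with a short timeout.
    with socket.create_connection((host, 443), timeout=5):
        print(f"{host} is reachable")
except OSError:
    print(f"{host} did not respond")
```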
The next part of the report is the server response header. When a user or spider requests a Web page, the server sends back a response containing the page contents, and the header of that response tells the browser or spider how to handle the page. Understandably, it's important to know what your server is telling browsers and spiders to do with your page.
In this example, you can see that the server returned a “200 OK” status for the page, which means the request for the page succeeded. There are many other Server Response Codes (SRC) that can be returned, and some of them will alert you to a problem. For instance, a 404 means that a page cannot be found. A 403 means the server is refusing to serve the page (if a login and password are required first, you'll see a 401 instead). A 500 signals that the site's Web server has encountered an internal error. An SRC in the 300s means that a redirection is in place.
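If you'd like to peek at a response header outside the tool, a few lines of Python against a placeholder host will show the same status and redirect information the report summarizes. This is only a minimal sketch using the standard library:

```python
import http.client

# Hypothetical host and path for illustration.
conn = http.client.HTTPSConnection("www.example.com")
conn.request("HEAD", "/some-page.html")  # HEAD asks for the header only, not the body
response = conn.getresponse()

print(response.status, response.reason)  # e.g. "200 OK", "404 Not Found", "301 Moved Permanently"

# For a status in the 300s, the Location header holds the new address the server points to.
if 300 <= response.status < 400:
    print("Redirects to:", response.getheader("Location"))

conn.close()
```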
At the end of the Check Server Tool report are three sections. The first is called “Contents from Spider Page Read”, the next is “Contents from Request Page Read”, and the last is “Contents from Browser Page Get”. Here is what the first section looks like in my report:
The two sections that follow look exactly alike except for their titles, and that is the ideal result. These three sections show how your server responds to different request types: the Check Server Tool requests the page as a spider, as a raw HTTP request and as a browser. If one of the three sections differs from the others, the search engine spider may be indexing the wrong content.
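A rough way to reproduce that comparison yourself is to request the same page with different User-Agent strings and compare what comes back. The sketch below is only illustrative: the page URL and user-agent strings are made up for the example, and the Check Server Tool's actual requests may differ.

```python
import hashlib
import urllib.request

# Hypothetical page and User-Agent strings for illustration.
page_url = "https://www.example.com/"
agents = {
    "spider":  "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "raw":     "Python-urllib/3.11",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

# Fetch the page once per user agent and fingerprint each response body.
digests = {}
for name, agent in agents.items():
    req = urllib.request.Request(page_url, headers={"User-Agent": agent})
    with urllib.request.urlopen(req) as resp:
        digests[name] = hashlib.sha256(resp.read()).hexdigest()

# Matching fingerprints mean the server returns the same content to spiders,
# raw requests and browsers, which is the ideal result described above.
if len(set(digests.values())) == 1:
    print("All three request types received identical content")
else:
    print("Responses differ:", digests)
```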
So that’s the Check Server Tool, along with how and why you’d use it. We’ll be adding to this series of tool guides with more explanations of current tools, as well as the how-tos of new tools as they’re launched.