An Inside Look at the New ‘Fetch as Google’ Feature [VIDEO]
On Tuesday, Google released an update to its Fetch as Google feature within Webmaster Tools. This valuable tool, which has been around for years, lets webmasters view the source code and HTTP headers of their pages the way Googlebot sees them. Tuesday’s update enhanced the tool’s capability, enabling webmasters to also see the page rendered in a browser-style view, just as Googlebot sees it.
[EDITOR’S NOTE: The Fetch as Google tool is located under the Crawl menu in Google Search Console, which is the new name for Google Webmaster Tools as of spring 2015.]
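You can roughly approximate the tool’s “source code and headers” view from your own machine by requesting a page while identifying as Googlebot. Below is a minimal Python sketch; the URL is a placeholder, and note that real Googlebot requests also originate from Google-owned IP addresses, which a script can’t imitate, so servers that vary their responses by IP may answer differently.

```python
import urllib.request

# Rough approximation of the "source code and headers" view:
# fetch a page while identifying as Googlebot, then print the
# HTTP status, the response headers, and the start of the raw
# HTML. The URL is a placeholder.
URL = "https://www.example.com/"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

request = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(request) as response:
    print(response.status, response.reason)
    for name, value in response.getheaders():
        print(f"{name}: {value}")
    html = response.read().decode("utf-8", errors="replace")
    print(html[:500])  # first 500 characters of the source
```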
What Fetch as Google’s Render Option Shows Webmasters
Since Tuesday’s update, we’ve fielded questions from clients such as:
- Will Google render all the JavaScript on my site?
- Does this negate the need for Ajax-enabled crawling? (The hash-bang scheme in question is sketched after this list.)
- Google stated in its announcement that “Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.” What exactly does that mean?
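For context on the second question: “Ajax-enabled crawling” refers to Google’s hash-bang scheme, in which a crawler rewrites a URL containing #! into an _escaped_fragment_ URL and requests a pre-rendered HTML snapshot from the server. Here is a simplified Python sketch of that rewrite, with a made-up URL and the scheme’s character-escaping rules omitted for brevity:

```python
# Simplified illustration of the AJAX crawling scheme: a crawler
# that encounters a "hash-bang" URL requests the page's pre-rendered
# HTML snapshot at an _escaped_fragment_ URL instead. The example
# URL is made up; character escaping is omitted for brevity.
def escaped_fragment_url(url: str) -> str:
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # not a hash-bang URL; crawl it as-is
    joiner = "&" if "?" in base else "?"
    return f"{base}{joiner}_escaped_fragment_={fragment}"

print(escaped_fragment_url("https://www.example.com/products#!color=red"))
# https://www.example.com/products?_escaped_fragment_=color=red
```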
Knowing that the SEO industry at large has a vested interest in these answers, I interviewed Bruce Clay, Inc.’s Director of Software Development, Aaron Landerkin. In the video below, Landerkin answers these questions and more.
It’s only been two days since the tool’s update, and Google’s Head of Webspam Matt Cutts has already touted it several times on Twitter:
Our “Fetch as Google” feature is getting *much* better: http://t.co/RQlNhRb3hn Now renders page–useful for debugging!
— Matt Cutts (@mattcutts) May 27, 2014
Tweeting the new Fetch as Google feature again because I love it that much: http://t.co/XPvmLkCTks Have you tried it yet?
— Matt Cutts (@mattcutts) May 27, 2014
Have more questions about Fetch as Google? Ask us in the comments.
2 Replies to “An Inside Look at the New ‘Fetch as Google’ Feature [VIDEO]”
When I do fetch and render on one of my product pages, the tool takes a while, around a minute and sometimes even more. Eventually it does render correctly. Should I be concerned about the time it takes?
My next question is related to the results of the render. When the result says “partial,” it gives a list of CSS and JS resources that Googlebot was unable to load. Some are font resources on Google Fonts; others are CSS resources on my own site. The reason given is “temporarily unavailable.” Nothing is being blocked by my robots.txt.
Any ideas?
Hi Amit! Aaron Landerkin shared the following thoughts:
There’s not a “single fix” for temporarily unavailable items. The reasons the server may not be able to fulfill requests could be anything: the host, the server hardware, the website itself, etc. That being said, I would do the following:
1. Run some page speed analysis tools and make sure all of the CSS/JS/HTML is loading quickly. If it’s not, those tests should give good feedback on how to make the page load faster or reduce the number of requests needed to render the page.
2. Run some load testing on the site and make sure the server can handle multiple simultaneous requests (a rough sketch follows this reply). If your site can only handle a handful of requests per second, then you’ll need to either upgrade your server or fix your site.
3. If the site seems OK based on the other tests, then it may be the host. If you’re on cheap, shared hosting, you may have problems with either server hardware or bandwidth, and you should look at upgrading.
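To make step 2 of that reply concrete, here is a crude Python load-test sketch that fires concurrent requests at a single URL and reports throughput. The URL and request counts are placeholders, and a purpose-built tool such as ApacheBench will give more reliable numbers; only run something like this against a site you own.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Crude concurrent load test: request one URL from several threads
# at once and report throughput and failures. The URL and numbers
# are placeholders; only point this at a site you own.
URL = "https://www.example.com/"
TOTAL_REQUESTS = 100
CONCURRENCY = 10

def fetch(_):
    try:
        with urllib.request.urlopen(URL, timeout=10) as response:
            return response.status == 200
    except Exception:
        return False

start = time.time()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(fetch, range(TOTAL_REQUESTS)))
elapsed = time.time() - start

successes = sum(results)
print(f"{successes}/{TOTAL_REQUESTS} succeeded in {elapsed:.1f}s "
      f"({TOTAL_REQUESTS / elapsed:.1f} requests/second)")
```

If most requests fail or throughput is very low at modest concurrency, that points to the server or host rather than the page itself.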