Super Session: Search Engines and Webmasters – aka: The Search Engine Smackdown
I could not be more excited for this session. Not because it’s an easy session to recap (it’s totally not), but because it’s the very last one, and after it’s over I get to go home.
Brett Tabke has a whole group of search engine reps on the stage right now: Matt Cutts, Software Engineer, Google Inc.; Sean Suchter, VP of Search Technology Engineering, Yahoo!; and Nathan Buggia, Lead Program Manager, Live Search Webmaster Central, Microsoft. Brett’s getting ready to torture them into giving up all their secrets.
Brett just called Matt the last man standing. Then the last engine standing. Ouch. He means that everyone else is a new face on this panel, but the truth, she hurts.
Nathan Buggia is up first to give us a Live Search State of the Union. He’s going to reframe Satya’s keynote from this morning in a more technical way. Oh dear.
Relevance is key. They measure it internally by asking “how good is it?” for a representative sample of query terms across all three engines. Are they in the ballpark? They’ve found the engines are very similar: some things they do well, some not so well. Freshness is a factor for them.
- Improved crawling performance
- Standardization of Robots Exclusion Protocol rules — MSNbot supports wildcard patterns in robots.txt (see the robots.txt sketch after this list)
- Sitemaps anywhere — they don’t have to be hosted under your root domain anymore. They use them for canonicalization issues as well as for page discovery.
- “Significant” increase in crawling activity
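To make that concrete, here’s a minimal robots.txt sketch of what the standardized rules and the “sitemaps anywhere” change allow. All paths and hostnames here are made up for illustration:

```
# Wildcard rules from the standardized Robots Exclusion Protocol:
# block any URL carrying a session-id parameter...
User-agent: msnbot
Disallow: /*sessionid=
# ...and any URL ending in .pdf ($ anchors the pattern to the end)
Disallow: /*.pdf$

# "Sitemaps anywhere": the sitemap can now live on a different host,
# as long as it's declared here in your robots.txt
Sitemap: http://cdn.example.com/mysite/sitemap.xml
```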
Their webmaster tools are useful for troubleshooting. You can find reports on:
- 404 errors
- Too many parameters
- Blocked by REP
- Unsupported content
- Malware infected — they won’t allow clicks on malware from Live Search
They can crawl up to two sub-domains and two directories down.
You can do an audit to find all your URLs and all the pages linked to those pages. Once you fix a malware issue, you can request a re-crawl and they’ll get it done in a couple of days.
Ranking:
- Static ranking
- Dynamic rank within site
- Backlinks
- Penalties — and steps to resolution
They provide direct support through their forums. They’ll get back to you within three days.
The adCenter Excel Keyword research tool pulls data from adCenter and from Passport and you should use it.
They’ve studied usage patterns and found there are several: targeted, exploratory to targeted, exploratory to multi-targeted, etc. As a result, they’re providing more rich media to give more information upfront for the exploratory patterns. This gives you more ways to reach users: products, reviews, links and videos. And it’s showing up not just in search results but in other products like Hotmail.
Back to Project Silk Road:
- Increase engagement
- Enhance your site with Live Search Web results
- Customize 404 error pages with the Web Error Toolkit (a sketch of the idea follows this list)
- Create rich user experiences with Virtual Earth and Silverlight
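I haven’t seen the Web Error Toolkit itself, so here’s the plain ASP.NET equivalent of that custom-404 bullet, just to show the shape of the idea (the page names are invented):

```xml
<!-- web.config: route 404s to a friendly page instead of the default error.
     This is stock ASP.NET, not the Web Error Toolkit; illustration only. -->
<configuration>
  <system.web>
    <customErrors mode="On" defaultRedirect="~/Error.aspx">
      <error statusCode="404" redirect="~/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>
```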
Generate traffic:
- Optimize your site with webmaster tools
- Deep content partnerships
[Missed the last pillar]
Live Search API: Based on feedback from publishers, they’re giving back control.
- Reorder the results
- Skin results and ads
- Filter out 300 ad providers
Maximum flexibility:
- Unlimited queries (unless you’re a scraper)
- Rich query language (advanced queries like site:)
- Many types of content
  - Web
  - News
  - Images
  - Encarta Answers
  - Spelling
- Implement all standard protocols (REST, JSON, RSS, SOAP)
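For flavor, here’s roughly what a REST call against the Live Search API looks like. The endpoint and parameter names are my reconstruction from the public docs, so double-check them before building anything:

```
# One GET, JSON back: web and news results for an advanced site: query
http://api.search.live.net/json.aspx
    ?Appid=YOUR_APP_ID
    &query=site%3Aexample.com+widgets
    &sources=web+news
    &web.count=10
```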
Check out: APIs, Webmaster Center, adCenter
Sean Suchter is next for Yahoo!
The main thing they’re working on is getting past the ten blue links, as well as getting past the limited choice.
They’ve gotten a lot of response on their search assist features.
They’re trying to get from “to do” to “done”. They’re trying to get people to the answer, to reduce frustration, and to bring out structured information from the Web. He thinks the music integration feature was cool — full songs off of artist searches.
They’re focused on more information in one search. Drawing out deep links, news information, rich media.
The ecosystem work is about building a richer, more relevant and more personal search experience. SearchMonkey brings the outside in, giving site owners control from outside of core search. What does opening up search mean? It’s a clear win for developers, site owners, users and Yahoo!. Before, you had ten simple blue links. Afterwards, you get an engaging look at more information that gets users straight to the answers, increases the quality of site traffic, increases usability and fosters loyalty.
They’re continually testing and revising the presentation and testing out different ways to present structured information to provide a better experience.
They have a developer tool to create the applications, pulling data from publishers and sending it to users who can opt in or out.
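If you’re wondering what “pulling data from publishers” looks like on the page, SearchMonkey reads structured markup such as microformats. A minimal sketch using hReview, with all values invented:

```html
<!-- A review marked up with the hReview microformat, one of the
     structured formats a SearchMonkey app can surface in results -->
<div class="hreview">
  <span class="item"><span class="fn">Some Taco Place</span></span>
  rated <span class="rating">4.5</span> out of 5 by
  <span class="reviewer">A. Blogger</span> on
  <abbr class="dtreviewed" title="2008-11-13">Nov 13, 2008</abbr>
  <p class="summary">Great tacos, slow service.</p>
</div>
```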
Some publishers using the application:
- People
- Rotten Tomatoes
- Flickr
- Yelp
- Yahoo! properties
The reverse of this is BOSS. Where SearchMonkey pulls data in, BOSS sends Yahoo!’s data out, opening up the indexing and the crawling so that, no matter where you are on the Web, search experiences can be relevant, comprehensive and fresh.
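Concretely, BOSS is a plain REST API. This is the shape of a query as I understand the docs (the app id and query are invented):

```
# Web results for "pubcon", ten at a time, as JSON
http://boss.yahooapis.com/ysearch/web/v1/pubcon
    ?appid=YOUR_APP_ID
    &format=json
    &count=10
```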
Sites using BOSS:
- 4hoursearch: it is very simple and open
- PlayerSearch: focused on sports
- Newsline: timeline-based news information pulled out of the search engine
- Tianamo: …I don’t even know. It’s topographical. Neat.
Last up is Matt Cutts. His presentation is “State of the Index: What’s going on with Google?”
Google Chrome: wicked fast browser — competition makes everyone better
Android: open-source operating system (He has a Google phone. Jealous.)
- Better machine translation
- Better voice translation
- Google Suggest
- Improving personalization and Universal/blended search
Lots of small things:
- The 2001 search index (resurrected for Google’s tenth birthday)
- Video and voice chat in Gmail
- Ability to track the flu (Google Flu Trends)
Google Trends: You can use this to figure out how to target your keywords and to compare interest across websites.
Google Ad Planner — doesn’t have to be Google ads
2008 Webmaster launches:
- Optical character recognition in PDF docs
- Better crawling of Flash — this is NOT license to build your pages entirely in Flash
  - Mobile will still break
They’re getting better at keyword spam and gibberish. He brings up the 404 link finder in Webmaster Tools and jokes that it’s free links when you get them fixed. He also points out the 404 help page.
- Advanced segmentation on Google Analytics
- On-demand indexing for Google Custom Search Engines — get ten pages re-indexed free immediately
- Webmaster APIs for hosters and GData
- Translation gadget for your Web site
Google held three webmaster chats in the last year (700+ people on the last call). They’re blogging more and doing more videos, and they’ve added new languages for their blogs. Yesterday they released a 30-page beginner’s SEO guide. Google does not hate SEO.
2009 black hat trends
Jeevesretirement.com — DO NOT GO THERE — was bought by Ask when Jeeves retired, but they didn’t get it renewed. Now it’s a porn site.
He thinks hacking is actually going to get worse — like real, illegal hacking. A Googler actually got hacked. Sites are getting hacked and then linking over to other hacked sites. Black hat is moving toward the outright illegal.
Matt has a complicated example of hacking. You’ll just have to imagine it until he posts it somewhere for me to link to. Did you hear me, Matt? Post the slide.
- SEOs need to decide on risk tolerance.
- Google will keep communicating with webmasters.
- Google will provide tools to help webmasters. They’re working on a tool for canonicalization or preferred URLs.
Q&A
Are you going to change what you present to people in terms of intent?
Sean: Yes.
Matt: There are three types of searches: navigational, informational and transactional. Sitelinks are for the navigational ones.
Can you send out acknowledgements to reconsideration requests?
Matt: Part of the problem is that you don’t want to tell a spammer that you did or did not catch a problem, so Google doesn’t want to do that. But he says he thought it might be nice to just say “a computer looked at it at such and such time”. They’re looking at that.
Can we get a default SearchMonkey format so that we don’t have to wait for users to opt in?
Sean: They’re trying to test and determine how to auto-on the right things. You need to create the most useful possible added-value apps with your clients so that users can get a better experience. The better that happens, the closer it gets to auto-on.
Matt wants to talk about Ripoff Report. He doesn’t think they’re spam. Google only removes results for spam or a court order, and there’s not enough spam on Ripoff Report for him to remove it. If you really hate it that much, get a court order and then they can take action on it. There are free speech and First Amendment issues there. They will take strong action when there is spam, but it’s not there yet. They already get enough “Google is just taking them out because they don’t like them” criticism.
How about a negative link?
Matt says it’s a thought.
Are you guys going to be looking at a way to tell Google News something different than Googlebot?
Matt: You have to prioritize by feedback. [They’re going to talk later.]
The cheapest product seems to win in Live Search. Doesn’t that compete with my actually good PPC/SEO links?
Nathan: Those products get in through productupload.live.com. They’re free and he thinks it’s a good feature. [He really did answer the question but I didn’t catch it.]
I hear First Click Free is in jeopardy? How do we stay in compliance with that?
Matt: [He explains briefly how First Click Free works.] He thinks they’re at a good balance right now, but the challenge is how do they regulate it. They’re still doing it. They’ve just about finalized their policies for it.
I paid for links. [The entire audience turns to look at him.] All our sites were penalized in Google Webmaster Tools. How much privacy is there? Second question, when are you going to share revenue from BOSS with publishers?
Matt: They have tools where they can pretty much always find the related sites.
Sean: He doesn’t know the details but there are people actively working on the monetization plan. It’s definitely being worked on.
Nathan jumps in to talk about monetization opportunities at Live Search. Poor Nathan, no one cares.
Why is there no way to authenticate that a bot is a bot?
Matt: You can. It’s kind of a hassle, but it’s a two-step process. Search for “verifying googlebot”. The same approach works for all the engines.
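For reference, the two-step check Matt is describing looks like this in Python. The helper name and example IP are mine; the domain suffixes below are Googlebot’s, so swap in the other engines’ crawler domains to verify their bots:

```python
import socket

def is_real_googlebot(ip):
    """Verify a claimed Googlebot IP with the documented two-step check."""
    try:
        # Step 1: reverse DNS on the IP; a genuine Googlebot resolves
        # to a hostname under googlebot.com or google.com
        host, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Step 2: forward DNS on that hostname must map back to the
        # same IP, so a spoofed reverse record doesn't fool us
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

# e.g. is_real_googlebot("66.249.66.1") should be True for a live Googlebot
```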