Better Backlink Data for Site Owners
Yinnon Haviv explains that Google has taken webmaster feedback into account and now gives us a better sample overview of sites’ backlink data. Previously we were given an alphabetical, top-heavy sample of links, but this has given way to a more uniform sample drawn across TLDs and domain names. This should provide a broader, more diverse selection of links, whether you want to target consumers more accurately or simply want to do a little backlink spring cleaning.
Relevance Web Marketing’s latest blog post highlights the key points from the summer roundup of Google’s Webmaster Central Blog.
rel=”author” Frequently Asked (Advanced) Questions
The premise of authorship is to point search engine users towards authors who can be considered authorities in a particular industry, as opposed to your average Joe blog writer. Maile Ohye’s article aims to answer a few of the more technical rel=”author” questions on the subject.
• Ensure articles are written from one, ‘real’ person’s point of view, with translated versions all linking back to the original author
• For those who do not want to appear in SERPs, simply make your Google+ profile non-discoverable or remove the linked websites
• rel=author is for the person who wrote the article, while rel=publisher lets a group denote that they publish articles – e.g. a company blog (a brief markup sketch follows this list)
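By way of illustration, here is a minimal markup sketch of the two attributes; the Google+ profile URLs and the author name are placeholders for this example rather than anything taken from the original post.

<!-- On the article page: link the byline to the author's Google+ profile.
     The profile URL below is a hypothetical placeholder. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Doe</a>

<!-- Alternatively, the link can sit in the document head. -->
<link rel="author" href="https://plus.google.com/112345678901234567890">

<!-- Site-wide (typically on the homepage): rel="publisher" points to the
     organisation's Google+ page rather than to an individual author. -->
<link rel="publisher" href="https://plus.google.com/109876543210987654321">

For the authorship link to be recognised, the Google+ profile also needs to link back to the site in its “Contributor to” section, so the relationship is confirmed from both ends.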
Making Smartphone Sites Load Fast
Bryan McQuade presents new guidelines and insights tools for mobile load times, aimed at better render performance. He explains ways to decrease the loading time of above-the-fold content so that the first screenful renders quickly, while below-the-fold content loads by the time users have scrolled down to it.
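As a rough illustration of the general approach, the sketch below inlines the small amount of CSS needed to render above-the-fold content and defers the rest of the styles and scripts so they don’t block the first render; the file names and styles are hypothetical, not taken from the guidelines themselves.

<!doctype html>
<html>
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <!-- Inline only the small amount of CSS needed for above-the-fold content,
       so the first paint does not wait on an external stylesheet. -->
  <style>
    .hero { font: 16px/1.4 sans-serif; padding: 1em; }
  </style>
</head>
<body>
  <div class="hero">Above-the-fold content renders with the inlined styles.</div>

  <!-- Load the full stylesheet and non-critical script without blocking the
       initial render; /site.css and /site.js are placeholder file names. -->
  <script>
    var css = document.createElement('link');
    css.rel = 'stylesheet';
    css.href = '/site.css';
    document.head.appendChild(css);
  </script>
  <script src="/site.js" async></script>
</body>
</html>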
View Manual Webspam Actions in Webmaster Tools
Google predominantly uses algorithms to combat spam; however, one of Webmaster Tools’ newest features lets users see whether their site has been affected by a manual webspam action, where a Google employee has flagged spam by hand. An action can target either a whole site or just some of its pages, although this only concerns a very small minority of websites (less than 2%). Once corrections adhering to the quality guidelines have been made, a reconsideration request can be sent in order to have the manual action reviewed and lifted.
In-Depth Articles in Search Results
We may often turn to the internet for a quick answer; however, research shows that more often than not it’s like opening up a can of worms on a much broader topic. Google’s answer is a new feature that surfaces high-quality, in-depth articles in search results, and it recommends adding markup, such as schema.org Article markup, to our websites to help this content be identified and to enhance the whole user experience.
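For anyone wondering what that markup might look like, below is a minimal sketch using schema.org’s Article vocabulary in microdata form; the headline, author, image path and dates are placeholder values rather than anything from Google’s post.

<!-- A minimal schema.org Article sketch in microdata; all values are placeholders. -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">A Long-Form Look at Backlink Auditing</h1>
  <img itemprop="image" src="/images/lead.jpg" alt="Lead image">
  <p>By <span itemprop="author">Jane Doe</span>,
     <time itemprop="datePublished" datetime="2013-08-06">6 August 2013</time></p>
  <div itemprop="articleBody">
    <p>The full text of the in-depth article goes here…</p>
  </div>
</article>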
A Reminder About Manipulative or Deceptive Behavior
This short blog post has Michael Wyszomierski giving us a quick reminder about adhering to Google’s quality guidelines. The most recent ‘trick’ is inserting a page that appears when users press the ‘back’ button and looks like a search results page, but is actually an advertising page.