SEO Quotes of the Month - October 2012

October saw a major feature release from Google that helped some, but didn't prove a panacea for others. More erosion of search data was noted, with increasing amounts of organic traffic hidden from view by secure search, and a host of updates to popular social networks caused one very vocal writer and early adopter to reassess just how valuable they were . . .


Google Webmaster Central team [source] on the new disavow links tool:

"Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on 'unnatural links' pointing to your site, this tool can help you address the issue."

For those responsible for cleaning up backlink profiles containing low-quality links, one of the most frustrating aspects was the difficulty of communicating with websites where links needed to be removed or altered. The disavow tool was a rare case of Google following Bing's lead, providing a way to ask Google to ignore links you wanted rid of but, despite repeated attempts, hadn't been able to remove.

Some saw the disavow tool as a quick-fix solution, enabling them to carpet-bomb links without any documented attempt at, or success in, clean-up. Numerous reports of rejected reconsideration requests soon confirmed this approach to be ineffective. Although the tool is a great way of getting rid of those last stubborn stains, it's no replacement for giving a backlink profile a good, hard scrubbing.

Matt Cutts has provided numerous interviews and guidelines on its use: like the canonical tag, the disavow tool has the potential to seriously mess up a site if applied incorrectly. It will also hinder a reconsideration request if good links are misidentified and included in the disavow file, so it's definitely no replacement for using some SEO elbow grease to track down problematic links and get them removed. For those who have exhausted all attempts at link removal, however, it's a handy way of at least indicating which links you would like Google to ignore.
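
For those unfamiliar with the mechanics, the tool takes a plain text file uploaded through Webmaster Tools, with one entry per line: either a full URL to disavow an individual link, or a domain: prefix to disavow every link from that domain, with lines beginning with # treated as comments. A minimal sketch of such a file (the domains and dates below are purely illustrative):

    # Contacted the owner of spam-directory-example.com on 01/10/2012
    # asking for link removal; no response received
    domain:spam-directory-example.com
    # Individual pages where removal requests were refused
    http://www.blog-network-example.net/paid-links.html
    http://www.blog-network-example.net/sponsored-post.html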


Virginia Nussey from Bruce Clay Inc. [source] on tactics to address hidden search intent:

"The best way to direct your marketing efforts is to understand your consumer and use that to guide your content strategy."

Virginia's comments were initially aimed at social media traffic that was unaccounted for and unattributed, with certain platforms offering an abundance of reporting and tracking and the rest falling into 'dark social'. Her article rightly went on to mention the segment of traffic displaying as '(not provided)' within Google Analytics: initially predicted to affect only a single-digit percentage of web visitors, this can now be 40% or even higher across some profiles.

With up to half of search intent now masked, it is more important than ever to understand the web user: who they are, what their journey is and what they expect of a site. It is no longer as easy to pull keyword trends from Analytics without serious misgivings about their significance, given the percentage of data hidden. Attribution across multiple channels, even offline ones, seems to be the aspirational goal for most web marketers.
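
As a rough way of putting a number on the problem for a given profile, the share of organic visits arriving with the keyword masked can be estimated from an exported keyword report. The sketch below assumes a CSV with 'Keyword' and 'Visits' columns containing plain integer counts; the file name and column headings are illustrative, not a fixed Analytics export format.

    import csv

    total_visits = 0
    masked_visits = 0

    # Tally visits from an exported organic keyword report,
    # counting those where the keyword is hidden as "(not provided)".
    with open("organic_keywords.csv", newline="") as report:
        for row in csv.DictReader(report):
            visits = int(row["Visits"])
            total_visits += visits
            if row["Keyword"].strip().lower() == "(not provided)":
                masked_visits += visits

    share = (100.0 * masked_visits / total_visits) if total_visits else 0.0
    print("(not provided) accounts for {:.1f}% of organic visits".format(share))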


Warren Ellis, writer and futurist [source], with a brutally frank observation on the development of G+ and other social media platforms:

"Of the 150+ people I had in circles, precisely three of them posted content I could see. When I posted content, only a thin fraction of those 11,000 people could see it, because at some point I got tuned out by the system. G+ is therefore useless to me, and I just nuked my circles."

He continues, in his idiomatically colourful language, to dispense with Facebook too, summarising that unless brands are big enough to pay for promotion in order to be seen, the drive for monetisation has “killed engagement”.

Like Google, Facebook needs cash to survive, and even pre-flotation there were furious debates as to how it could manage this (with some raising eyebrows at the initial share price). I believe it would be fair to say that Google has restricted certain areas of search in ways that, intentionally or not, have favoured its paid model. Facebook now seems to be doing likewise, with content being hidden from users who have liked, or are fans of, a given page.

All this is contrary to a democratic model of web search, where distribution networks gain value from the depth of the content they index and present. Google knocked AltaVista off its perch through both the relevance of its search results and the extent of the web content it had access to. Is this the start of a trend and, if so, will it open the door to other, less well-known search platforms?



