Monthly Archives: January 2015

Why SEOs Need to Care About Correlation as Much (or More) than Causation

correlation does not equal causation

Posted by randfish

Today I'm going to make a crazy claim—that in SEO today, there are times, situations, and types of analyses where correlation is actually MORE interesting and useful than causality. I know that sounds insane, but stick with me until the end and at least give the argument a chance. And for those of you who like visuals, our friend AJ Ghergich and his intrepid team of designers created some nifty graphics to accompany the piece.

Once upon a time, SEO professionals had a reasonable sense of many (or perhaps even most) of the inputs into the search engine's ranking systems. We leveraged our knowledge of how Google interpreted various modifications to keywords, links, content, and technical aspects to hammer on the signals that produced results.

But today, there can be little argument—Google's ranking algorithm has become so incredibly complex, nuanced, powerful, and full-featured, that modern SEOs have all but given up on hammering away at individual signals. Instead, we're becoming more complete marketers, with greater influence on all of the elements of our organizations' online presence.

Web marketers operate in a world where Google:

  • Uses machine learning to identify editorial endorsements vs. spam (e.g. Penguin)
  • Measures and rewards engagement (e.g. pogo-sticking)
  • Rewards signals that correlate with brands (and attempts to remove/punish non-brand entities)
  • Applies thousands of immensely powerful and surprisingly accurate ways to analyze content (e.g. Hummingbird)
  • Punishes sites that produce mediocre content (intentionally or accidentally) even if the site has good content, too (e.g. Panda)
  • Rapidly recognizes and accounts for patterns of queries and clicks as rank boosting signals (e.g. this recent test)
  • Makes 600+ algorithmic updates each year, the vast majority of which are neither announced nor known by the marketing/SEO community

how Google works

Given this frenetic ecosystem, the best path forward isn't to exclusively build to the signals that are recognized and accepted as having a direct impact on rankings (keyword-matching, links, etc). Those who've previously pursued such a strategy have mostly failed to deliver on long-term results. Many have found their sites in serious trouble due to penalization, more future-focused competitors, and/or a devaluing of their tactics.

Instead, successful marketers have been engaging in the tactics that Google's own algorithms are chasing—popularity, relevance, trust, and a great overall experience for visitors. Very frequently, that means looking at correlation rather than causation.

Google ranking factors

[Via Moz's 2013 Ranking Factors - the new 2015 version is coming this summer!]

We'll engage in a thought experiment to help highlight the issue:

Let's say you discover, as a signal of quality, Google directly measures the time a given searcher spends on a page visited from the SERPs. Sites with pages searchers spend more time on get a rankings boost, while those with quick abandonment find their pages falling in the rankings. You decide to press your advantage with this knowledge by using some clever hacks to keep visitors on your page longer and to make clicking the back button more difficult. Sure, it may suck for some visitors, but those are the ones you would have lost anyway (and they would have hurt your rankings!), so you figure they're not worth worrying about. You've identified a metric that directly impacts Google's algorithm, and you're going to make the most of it.

Meanwhile, your competitor (who has no idea about the algorithmic impact of this factor) has been working on a new design that makes their website content easier, faster, and more pleasurable to consume. When the new design launches, they initially see a fall in rankings, and don't understand why. But you're pretty sure you know what's happened. Google's use of the time-on-site metric is hurting them because visitors are now getting the information they want from your competitor's new design faster than before, and thus, they're leaving more quickly, hurting the site's rankings. You cackle with delight as your fortune swells.

But what happens long term? Google's quality testers see diminished happiness among searchers. They rework their algorithms to reward sites that successfully deliver great experiences more quickly. At the same time, competitors gain more links, amplification, social sharing, and word of mouth because real users are deriving more positive experiences from their site than yours. You found an algorithmic loophole and exploited it briefly, but by playing the "where's Google weak?" game rather than the "where's Google going?" game, you've ultimately lost.

Over the last decade, in case after case of marketers optimizing for the causal elements of Google's algorithm, this pattern of short-term gain leading to long-term loss has played out again and again. That's why, today, I suggest marketers think about what correlates with rankings as much as what actually causes them.

If many high-ranking sites in your field are offering mobile apps for Android and iOS, you may be tempted to think there's no point in considering an app strategy just for SEO because, obviously, having an app doesn't make Google rank your site any higher. But what if those mobile apps are leading to more press coverage for those competitors, and more links to their site, and more direct visits to their webpages from those apps, and more search queries that include their brand names, and a hundred other things that Google maybe IS counting directly in their algorithm?

And, if many high-ranking sites in your field engage in TV ads, you may be tempted to think that it's useless to investigate TV as a channel because there's no way Google would reward advertising as a signal for SEO. But what if those TV ads drive searches and clicks, which could lead directly to rankings? What if those TV ads create brand-biasing behaviors through psychological nudges that make searchers more likely to click on, link to, share, talk about, write about, and buy from your TV-advertising competitor?

Thousands of hard-to-identify, individual signals, mashed together through machine learning, are most likely directly responsible for your competitor's website outranking yours on a particular search query. But even if you had a list of the potential inputs and the mathematical formulas Google's process considers most valuable for that query's ranking evaluation, you'd be little closer to competently beating them. You may feel smugly satisfied that your own SEO knowledge exceeded that of your competitor, or of their SEO consultants, but smug satisfaction does not raise rankings. In fact, I think some of the SEO field's historic obsession with knowing precisely how Google works and which signals matter is, at times, costing us a broader, deeper understanding of big-picture marketing*.

Time and again, I've seen SEO professionals whom I admire, respect, and find to be brilliant analysts of Google's algorithms lose out to less-hyper-SEO-aware marketers who combine that big-picture knowledge with more basic, fundamental SEO tactics. While I certainly wouldn't advise anyone to learn less about their field or give up their investigation of Google's inner workings, I am and will continue to strongly advise marketers of all specialties to think about all the elements that might have a second-order or purely correlated effect on Google's rankings, rather than concentrating only on what we know to be directly causal.

-----------------

* No one's guiltier than I am of obsessing over discovering and sharing Google's operations. And I'll probably keep being that way because that's how obsession works. But, I'm trying to recognize that this obsession isn't necessarily connected to being the most successful marketer or SEO I can be.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

By |January 20th, 2015|MOZ|0 Comments

10 homepages that teach something new every time you open a tab


If your homepage just loads Google, you're missing a trick. You could start every day by learning something new, getting motivated by the creative arts, or enjoying an inspiring visual.

We've found some awesome options to save as your homepage, so hit up your browser's settings and get galvanized each and every time you go online.

Take a look through our suggestions below. Do you Internet with something even more awesome? Share it in the comments.

1. Merriam-Webster's "Word of the Day"



Language lovers might want to consider a "Word of the Day" option. Increase your vocab by checking in with Merriam-Webster every morning ...

More about Web, Internet, Utility, Features, and Tech

By |January 19th, 2015|Apps and Software|0 Comments

Uber looks to start fresh in Europe


The rise of ride-sharing service Uber has been nothing short of stellar in the past couple of years, with the service currently available in more than 200 cities. In Europe, however, it's been banned in several major cities, and some EU countries have slapped it with court injunctions for violating taxi licensing regulations.

In 2015, the company's CEO, Travis Kalanick, wants to change that.

"We want to make 2015 the year when we create new partnerships with European cities. If we can make those partnerships happen, we could create 50,000 new jobs," Kalanick said at the ...

More about Europe, Uber, Tech, Apps Software, and World

By |January 19th, 2015|Apps and Software|0 Comments

Technical Site Audit Checklist: 2015 Edition


Posted by GeoffKenyon

Back in 2011, I wrote a technical site audit checklist, and while it was thorough at the time, a lot has since been added to what a site audit encompasses. I have gone through and updated that old checklist for 2015. Some of the biggest changes were the addition of sections for mobile, international, and site speed.

This checklist should help you put together a thorough site audit and determine what is holding back the organic performance of your site. At the end of your audit, don't write a document that says what's wrong with the website. Instead, create a document that says what needs to be done. Then explain why these actions need to be taken and why they are important. What I've found to be really helpful is to provide, along with your document, a prioritized list of all the actions that you would like them to implement. This list can be handed off to a dev or content team to be implemented easily. These teams can refer to your more thorough document as needed.


Quick overview

Check indexed pages
  • Do a site: search.
  • How many pages are returned? (This number can be way off, so don't put too much stock in it.)
  • Is the homepage showing up as the first result?
  • If the homepage isn't showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google's John Mueller recently said that your homepage doesn't need to be listed first.

Review the number of organic landing pages in Google Analytics

  • Does this match with the number of results in a site: search?
  • This is often the best view of how many of the pages in a search engine's index the engine actually finds valuable.

Search for the brand and branded terms

  • Is the homepage showing up at the top, or are the correct pages showing up?
  • If the proper pages aren't showing up as the first result, there could be issues, like a penalty, in play.
Check Google's cache for key pages
  • Is the content showing up?
  • Are navigation links present?
  • Are there links that aren't visible on the site?
Pro Tip:
Don't forget to check the text-only version of the cached page. Here is a bookmarklet to help you do that.

Do a mobile search for your brand and key landing pages

  • Does your listing have the "mobile friendly" label?
  • Are your landing pages mobile friendly?
  • If the answer is no to either of these, it may be costing you organic visits.

On-page optimization

Title tags are optimized
  • Title tags should be optimized and unique.
  • Your brand name should be included in your title tag to improve click-through rates.
  • Title tags should be about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.
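A quick way to audit title lengths at scale is a short script over your crawl data. This is a minimal sketch assuming you've exported URL-to-title pairs (the URLs here are illustrative); pixel width is the real constraint, so treat the character count as a rough proxy:

```python
# Flag page titles that risk truncation in Google's SERPs.
# Assumes URL -> title pairs exported from a crawl; the URLs
# and the 60-character cutoff are illustrative.

TITLE_CHAR_LIMIT = 60  # rough character proxy for the ~512px display limit

def flag_long_titles(titles, limit=TITLE_CHAR_LIMIT):
    """Return {url: title_length} for titles longer than the limit."""
    return {url: len(t) for url, t in titles.items() if len(t) > limit}

crawl = {
    "https://example.com/": "Example Widgets | Example Co",
    "https://example.com/blue-widgets": (
        "Blue Widgets, Discount Blue Widgets, Cheap Blue Widgets "
        "and Blue Widget Deals | Example Co"
    ),
}

print(flag_long_titles(crawl))
```

Anything flagged here is worth re-checking in a pixel-width tool before rewriting.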
Important pages have click-through rate optimized titles and meta descriptions
  • This will help improve your organic traffic independent of your rankings.
  • You can use SERP Turkey for this.
Check for pages missing page titles and meta descriptions
The on-page content includes the primary keyword phrase multiple times as well as variations and alternate keyword phrases
There is a significant amount of optimized, unique content on key pages
The primary keyword phrase is contained in the H1 tag
Images' file names and alt text are optimized to include the primary keyword phrase associated with the page.
URLs are descriptive and optimized
  • While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic even when you do a 301. As such, I typically recommend optimizing URLs only when the current ones are really bad, or when you can avoid changing URLs that have existing external links.
Clean URLs
  • No excessive parameters or session IDs.
  • URLs exposed to search engines should be static.
Short URLs
  • 115 characters or shorter – this character limit isn't set in stone, but shorter URLs are better for usability.

Content

Homepage content is optimized
  • Does the homepage have at least one paragraph?
  • There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.
Landing pages are optimized
  • Do these pages have at least a few paragraphs of content? Is it enough to give search engines an understanding of what the page is about?
  • Is it template text or is it completely unique?
Site contains real and substantial content
  • Is there real content on the site or is the "content" simply a list of links?
Proper keyword targeting
  • Does the intent behind the keyword match the intent of the landing page?
  • Are there pages targeting head terms, mid-tail, and long-tail keywords?
Keyword cannibalization
  • Do a site: search in Google for important keyword phrases.
  • Check for duplicate content/page titles using the Moz Pro Crawl Test.
Content to help users convert exists and is easily accessible to users
  • In addition to search engine driven content, there should be content to help educate users about the product or service.
Content formatting
  • Is the content formatted well and easy to read quickly?
  • Are H tags used?
  • Are images used?
  • Is the text broken down into easy to read paragraphs?
Good headlines on blog posts
  • Good headlines go a long way. Make sure the headlines are well written and draw users in.
Amount of content versus ads
  • Since the implementation of Panda, the amount of ad-space on a page has become important to evaluate.
  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

Duplicate content

There should be one URL for each piece of content
  • Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  • Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Pro Tip:
Exclude common parameters, such as those used to designate tracking code, in Google Webmaster Tools. Read more at Search Engine Land.
Do a search to check for duplicate content
  • Take a content snippet, put it in quotes and search for it.
  • Does the content show up elsewhere on the domain?
  • Has it been scraped? If the content has been scraped, you should file a content removal request with Google.
Sub-domain duplicate content
  • Does the same content exist on different sub-domains?
Check for a secure version of the site
  • Does the content exist on a secure version of the site?
Check other sites owned by the company
  • Is the content replicated on other domains owned by the company?
Check for "print" pages
  • If there are "printer friendly" versions of pages, they may be causing duplicate content.

Site architecture and internal linking

Number of links on a page
Vertical linking structures are in place
  • Homepage links to category pages.
  • Category pages link to sub-category and product pages as appropriate.
  • Product pages link to relevant category pages.
Horizontal linking structures are in place
  • Category pages link to other relevant category pages.
  • Product pages link to other relevant product pages.
Links are in content
  • Does not utilize massive blocks of links stuck in the content to do internal linking.
Footer links
  • Does not use a block of footer links instead of proper navigation.
  • Does not link to landing pages with optimized anchors.
Good internal anchor text
Check for broken links
  • Link Checker and Xenu are good tools for this.
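If you'd rather script this step, extracting the links to check is straightforward with Python's standard library. A minimal sketch (the HTML snippet is illustrative); you'd then request each extracted URL and record any non-200 responses:

```python
# Pull href values out of page HTML so their status codes can be
# checked. Standard library only; the sample HTML is illustrative.

from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag fed to the parser."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page_html = '<a href="/products">Products</a> <a href="/about">About</a>'
parser = LinkExtractor()
parser.feed(page_html)
print(parser.links)
```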

Technical issues

Proper use of 301s
  • Are 301s being used for all redirects?
  • If the root is being redirected to a landing page, are they using a 301 instead of a 302?
  • Use Live HTTP Headers Firefox plugin to check 301s.
"Bad" redirects are avoided
  • These include 302s, 307s, meta refresh, and JavaScript redirects as they pass little to no value.
  • These redirects can easily be identified with a tool like Screaming Frog.
Redirects point directly to the final URL and do not leverage redirect chains
  • Redirect chains significantly diminish the amount of link equity associated with the final URL.
  • Google has said that they will stop following a redirect chain after several redirects.
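Redirect chains are easy to spot programmatically once you have a source-to-target redirect map (e.g., exported from Screaming Frog's redirect report). A minimal sketch with hypothetical URLs; any chain longer than one hop is worth flattening so each redirect points straight at the final URL:

```python
# Follow a URL through a crawl-exported redirect map to expose
# chains. URLs and the hop cap are illustrative.

def redirect_chain(url, redirects, max_hops=5):
    """Return the list of hops starting at url, stopping at loops."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:  # redirect loop detected
            break
        chain.append(url)
    return chain

redirects = {
    "http://example.com/old": "http://example.com/interim",
    "http://example.com/interim": "http://example.com/final",
}

chain = redirect_chain("http://example.com/old", redirects)
if len(chain) > 2:
    print("Chain detected:", " -> ".join(chain))
```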
Use of JavaScript
  • Is content being served in JavaScript?
  • Are links being served in JavaScript? Is this to do PR sculpting or is it accidental?
Use of iFrames
  • Is content being pulled in via iFrames?
Use of Flash
  • Is the entire site done in Flash, or is Flash used sparingly in a way that doesn't hinder crawling?
Check for errors in Google Webmaster Tools
  • Google WMT will give you a good list of technical problems that they are encountering on your site (such as: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s)
XML Sitemaps
  • Are XML sitemaps in place?
  • Are XML sitemaps covering for poor site architecture?
  • Are XML sitemaps structured to show indexation problems?
  • Do the sitemaps follow proper XML protocols?
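You can sanity-check sitemap structure with a few lines of Python's standard library. This minimal sketch verifies the sitemaps.org namespace and that every <url> entry carries a <loc>; the sample sitemap is illustrative:

```python
# Basic structural check of an XML sitemap: correct root element,
# sitemaps.org namespace, and a <loc> inside every <url>.

import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Raise AssertionError on structural problems; return the URLs."""
    root = ET.fromstring(xml_text)
    assert root.tag == NS + "urlset", "root element must be <urlset>"
    locs = [u.findtext(NS + "loc") for u in root.findall(NS + "url")]
    assert all(locs), "every <url> entry needs a <loc>"
    return locs

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/widgets</loc></url>
</urlset>"""

print(check_sitemap(sample))
```

Comparing the returned URL count against your indexed-page counts is one way to surface indexation problems.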
Canonical version of the site established through 301s
Canonical version of site is specified in Google Webmaster Tools
Rel canonical link tag is properly implemented across the site
Uses absolute URLs instead of relative URLs
  • This can cause a lot of problems if you have a root domain with secure sections.

Site speed

Review page load time for key pages

Make sure compression is enabled

Enable caching

Optimize your images for the web

Minify your CSS/JS/HTML

Use a good, fast host
  • Consider using a CDN for your images.

Mobile

Review the mobile experience
  • Is there a mobile site set up?
  • If there is, is it a separate mobile site, responsive design, or dynamic serving?

Make sure analytics are set up if separate mobile content exists

If dynamic serving is being used, make sure the Vary HTTP header is being used

Review how the mobile experience matches up with the intent of mobile visitors
  • Do your mobile visitors have a different intent than desktop based visitors?
Ensure faulty mobile redirects do not exist
  • If your site redirects mobile visitors away from their intended URL (typically to the homepage), you're likely going to run into issues impacting your mobile organic performance.
Ensure that the relationship between the mobile site and desktop site is established with proper markup
  • If a mobile site (m.) exists, does the desktop equivalent URL point to the mobile version with rel="alternate"?
  • Does the mobile version canonical to the desktop version?
  • Official documentation.
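For reference, the separate-URLs pattern Google documents uses a rel="alternate" annotation on the desktop page pointing at the mobile URL, and a rel="canonical" on the mobile page pointing back. A sketch with hypothetical URLs:

```html
<!-- In the <head> of the desktop page (www.example.com/page): -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- In the <head> of the mobile page (m.example.com/page): -->
<link rel="canonical" href="http://www.example.com/page">
```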

International

Review international versions indicated in the URL
  • e.g., site.com/uk/ or uk.site.com
Enable country based targeting in webmaster tools
  • If the site is targeted to one specific country, is this specified in webmaster tools?
  • If the site has international sections, are they targeted in webmaster tools?
Implement hreflang / rel alternate if relevant
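If you do implement hreflang, each variant page should list all variants, including itself. A sketch with hypothetical URLs and an x-default fallback for unmatched visitors:

```html
<!-- Placed in the <head> of every language/region variant: -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/">
<link rel="alternate" hreflang="en-gb" href="https://site.com/uk/">
<link rel="alternate" hreflang="x-default" href="https://site.com/">
```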
If there are multiple versions of a site in the same language (such as /us/ and /uk/, both in English), make sure the copy has been updated so that each version is unique.

Make sure the currency reflects the country targeted
Ensure the URL structure is in the native language
  • Try to avoid having all URLs in the default language

Analytics

Analytics tracking code is on every page
  • You can check this using the "custom" filter in a Screaming Frog crawl or by looking for self-referrals.
  • Are there pages that should be blocked?
There is only one instance of a GA property on a page
  • Having the same Google Analytics property loaded twice on a page will skew pageview-related metrics, inflating page views and pages per visit and reducing the bounce rate.
  • It is OK to have multiple different GA properties listed on a page; this won't cause a problem.
Analytics is properly tracking and capturing internal searches

Demographics tracking is set up

AdWords and AdSense are properly linked if you are using these platforms
Internal IP addresses are excluded
UTM Campaign Parameters are used for other marketing efforts
Meta refresh and JavaScript redirects are avoided
  • These can artificially lower bounce rates.
Event tracking is set up for key user interactions

This audit covers the main technical elements of a site and should help you uncover any issues that are holding a site back. As with any project, the deliverable is critical. I've found that focusing on the solution and impact (the business case) is the best approach for site audit reports. While it is important to outline the problems, too much detail here can take away from the recommendations. If you're looking for more resources on site audits, I recommend the following:

Helpful tools for doing a site audit:

Annie Cushing's Site Audit
Web Developer Toolbar
User Agent Add-on
Firebug
Link Checker
SEObook Toolbar
MozBar (Moz's SEO toolbar)
Xenu
Screaming Frog
Your own scraper
Inflow's technical mobile best practices



By |January 19th, 2015|MOZ|0 Comments

Hacker’s List allows you to hire a hacker anonymously and quickly


Hacker's List, a website that offers to connect customers and "professional" hackers for hire, would have you believe that just about everyone, at one point or another, needs to hack into something. And it wants to help.

The website — which shows listings as far back as November, when it launched — includes more than 400 posts from users seeking hackers. There are around 70 hacker profiles displayed on the site, but many of them don't appear to be active.

"Hiring a hacker shouldn't be a difficult process, we believe that finding a trustworthy professional hacker for hire should be a worry free and painless experience," reads the website ...

More about Hacking, Tech, and Apps Software

By |January 18th, 2015|Apps and Software|0 Comments