The Future of PageRank: 13 Experts on the Dwindling Value of the Link

In a recent Webmaster video, Matt Cutts confirmed that Google has tried internal versions of its search engine that work entirely without links. The results are low-quality – “for now,” he said. But this suggests that the value of the almighty link has come into question at Google, and they may be working on a version of the PageRank algorithm that doesn’t depend so heavily on the link graph – which means PageRank as we know it might be on the chopping block. But when?
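For readers who want the math behind the debate: classic PageRank scores a page purely from the link graph – each page passes a share of its score to the pages it links to, damped toward a uniform baseline. A minimal power-iteration sketch over a toy graph (the 0.85 damping factor follows Brin and Page's original paper; the graph itself is invented for illustration):

```python
# Minimal PageRank power iteration over a toy link graph.
# Damping factor 0.85 follows Brin & Page's original paper.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # Dangling page: distribute its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Toy graph: every page links to "hub", so it should rank highest.
graph = {"a": ["hub"], "b": ["hub"], "c": ["hub", "a"], "hub": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "hub"
```

Note that nothing in this computation looks at content, traffic, or user behavior – which is exactly the property the experts below are debating.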

Future of PageRank

Don LaFontaine, Master of the “In a World” Movie Trailer Voiceover

In light of this, we asked some of our favorite SEO and inbound marketing experts to answer the following three questions:

1. Do you see the link losing value over time? Do you foresee a future where backlinks lose some or all of their weight in the PageRank algorithm? How far off would it be?

2. Thought experiment: If Google search did work without links (as Russian search engine Yandex is attempting), what metrics would replace it?

3. How – if at all – should SEOs change their content marketing and link building strategies in the coming years, given inevitable changes to the algorithm?

We got great insights into the future of PageRank, links, and SEO from industry experts Aaron Wall, Rae Hoffman, Brett Tabke, Michelle Robbins, Julie Joyce, Rand Fishkin, Glenn Gabe, Barry Adams, Alan Bleiweiss, Larry Kim, Pete Meyers, Eric Enge, and Dharmesh Shah (click their names to jump ahead). Prepare to have your mind blown!

Here are their answers, in no particular order:

Aaron Wall, SEO Book

Almost all individual signals lose value over time as more variables get added into the mix. 

One could perhaps say that variables that are on the way up/gaining importance are an exception to this, but even in these cases as the search results become more ad-heavy it offsets some of those alleged gains. 

Links have been losing weight for about a half-decade now due to the folding in of other metrics and increasing algorithmic and manual penalties. How far off x level of decline is really depends on loads of factors which are query and vertical dependent. Some queries are localized, some have paid vertical ads from Google, some have lots of usage data which can be folded in, some queries are mobile-centric, some queries have Google scraping-n-replacing the results with their knowledge graph, etc. All these variations on search impact different areas to different degrees. In some areas SEO might still be profitable even for small businesses for another half-decade or decade to come. In other areas SEO will have close to a zero percent chance of being profitable unless it is done by one of the players which is already favored algorithmically before they consider investment into SEO. 

And even in some of those cases which look great, Google can arbitrarily shift outcomes overnight on a vertical-wide basis. Look at the historical algorithmic performance on SEMRush for and BizRate. One entity is an extension of the home team, while the other clearly is not.

As mentioned above, there are some vertical-based metrics Google can use for things like location or similar. And then there are a nearly limitless number of ways Google could count their passive tracking of users from logged in user accounts, Google Chrome, Google Android, Google Fiber, etc. Google can further use things like credit card registration, YouTube usage data, location, search history, Google+ activity, etc. to determine which user accounts to trust more than others. 

The differential and deferential policing of link-based activities is in effect a way of removing links, selectively. One way Google is “uncounting” links is through massive amounts of algorithmic and manual penalties. That in effect is a way of having links not count for some while allowing them to count for others. The broad distribution (and even amplification) of fear-based propaganda around links by various link analysis tool vendors only further removes some types of links from the link graph. 

Anything which is scalable and widely scaled will eventually be promoted as a form of spam, unless it is done by the home team. Thus the more differentiated one’s efforts are and the harder + more expensive they are to reverse-engineer and duplicate, the better. 

One can focus aggressively on brand building and raise funds from venture capitalists tied in with Google ventures, such that they become exempt from algorithmic and manual review issues. Give Google a taste of the revenues and your chances of success increases dramatically. In terms of outcomes, it’s the difference between CustomMade or RetailMeNot versus TeachStreet or a small mom and pop e-commerce store. The search engine advertising driven biases the Google founders warned about in their early research were not so much a warning as a roadmap for Google. When Google buys MediaOcean we can expect TV ads to bleed even more directly into driving “organic” rankings.

Aaron Wall is the owner of SEO Book.

Rae Hoffman, Sugarrae

I think that Google would love to find a way to make links less of a component in search engine rankings, but I don’t see that being truly viable anytime soon. Links are essentially the currency of the web. I don’t think Google can change that at its core in the near future – especially since links are valuable outside of search engine rankings. 

What I think is plausible in the next few years is having a better “checks and balances” system for links – where they can look at outside factors in correlation with a link to give that link more or less value. Right now, they look at the quality of the linking site. But does the link get shared socially (and if so, who is sharing it)? Does a link bring traffic to the linked site (and if so, what’s the bounce rate, time on site, etc. of traffic from that link)? And how is a site’s traffic profile affected after link bursts? For example, if you average 100 visitors a day and then get an inbound traffic / link spike, does your traffic – after the initial spike wears off – increase to 120 visitors per day? 
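Rae's last example – checking whether baseline traffic settles higher after a link burst – is easy to make concrete. A toy sketch (the traffic numbers and the one-week window are invented for illustration):

```python
# Toy illustration of the "post-spike lift" signal: did baseline
# daily traffic settle higher after a link burst than it was before?
# All numbers here are invented for the example.

def baseline_lift(daily_visits, spike_day, window=7):
    """Average daily visits in the week after the spike wears off,
    minus the average in the week before the spike."""
    before = daily_visits[spike_day - window:spike_day]
    after = daily_visits[spike_day + window:spike_day + 2 * window]
    return sum(after) / len(after) - sum(before) / len(before)

# A site averaging 100 visits/day gets a link spike on day 7,
# then settles at 120/day once the spike wears off:
visits = [100] * 7 + [900, 400, 250, 180, 150, 130, 125] + [120] * 7
print(baseline_lift(visits, spike_day=7))  # 20.0
```

A positive lift suggests the link brought lasting audience, not just a one-day burst – exactly the kind of corroboration a "checks and balances" system could look for.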

Then you get into what they’re hoping to do with Google+ –  who wrote the content containing the link? And what is that person’s area of expertise? For example, in a perfect world, Google would know that I am authoritative on SEO and affiliate marketing – and articles I write on those topics, and people I link out to, should be given more weight. But if I wrote an article on gardening, should that content / outbound links also receive more weight because I as an individual am considered “authoritative” in a different topic? What I believe they’re aiming for is to answer that question with a “no.” 

I think that what Google is aiming to do is kill off the value of the “scalable” link. Everything they’ve been doing – from both a penalty and “where we want to go in the future” perspective appears to be targeting that.

I think they could use many of the aspects I mentioned above in that scenario. But I also don’t believe they’re looking to replace the link. If I were them, I’d be looking to have better ways to “validate” the link. 

I’ve been talking about a link building strategy that contains a heavy focus on traffic development vs. merely “link development” for almost a decade now. I think sites that build content to solve problems vs. “get search traffic” have had the advantage in regards to obtaining defensible search engine rankings for several years now and will continue to increase their advantage in the future by staying focused on that same strategy. “Content marketing” has existed long before it was called content marketing.

Typically, an SEO looks at analytics reports with a filter to look at only search traffic, and I think that’s a mistake. You need to be looking at reports filtered down to direct and referral traffic as well, because if those aren’t increasing with your link building and content marketing strategies, then you’re not building defensible links. And more importantly, you’re not building a defensible online business. It seems counterintuitive, but by putting a larger focus on increasing the slices of your traffic pie outside of search engine rankings than most businesses currently do, you’ll end up with better – and more defensible – search engine rankings at the end of the day.

Rae Hoffman, AKA Sugarrae, is the CEO of PushFire.

Brett Tabke

Yes. The web has been rewritten in the image of PageRank. The link’s value is questionable. Google has used a lot of band-aids over the years on the PageRank-based algorithm in an attempt to keep it valid. However, it is clear that the value of the link as a metric is questionable in almost all occurrences today.

I think it is clear it has *already* done that. Google is trying everything they can to “devalue” the link as a scoring metric. A link from a PR9 page used to mean instant top-page rankings under the appropriate keywords. Today, that same link means very little by itself. The value of the link is going to continue to decline.

You remember the story of how Google figured out spelling suggestions? They looked at all the ways people misspelled “britney,” and then let the users tell them which one is right. They do that for all their spelling suggestions. In effect, they are using user-powered intelligence to direct their spelling algo. They can do something similar for search results.

So there are a couple of ways I think Google could eliminate the page rank algo:

(a) human-powered reviewers scoring pages

(b) a user behavior derived algo

(c) a hybrid of a/b

Consider if you have 1,000 people reviewing pages. Each person could visit 1 page per minute for 6-8 hours a day. That is 360,000 pages per day. That means in 1 week’s time, you could score the top results for (guessing) 75-80% of the links in SERPs people actually see and click on. To refine it down even further, you could take the top clicked links and do a reverse QA check every week to get scoring from multiple reviewers and give it a group score. That means you could basically “hand score” the entire set of top clicked SERP links in a couple of months. Now imagine you have been doing that for, say, the last 10 years. Why would you need page rank when you have people rank?
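Brett's throughput estimate checks out as straight arithmetic – a quick sketch using his assumed figures (these are his guesses, not Google's actual numbers):

```python
# Back-of-envelope check of the human-review throughput above.
# All figures are assumptions from the text, not Google's numbers.
reviewers = 1_000
pages_per_minute = 1
hours_per_day = 6            # low end of the 6-8 hour estimate

pages_per_day = reviewers * pages_per_minute * hours_per_day * 60
pages_per_week = pages_per_day * 7

print(pages_per_day)   # 360000
print(pages_per_week)  # 2520000
```

At the 8-hour end of the estimate the daily figure rises to 480,000 pages, so 360,000 is the conservative case.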

Think about all the click and traffic data Google has to work with:

Google Analytics (the leading site metric on the web). Google knows where click paths come from, go to, and where they dead-end in a happy camper, or a “back button and try again.”

AdSense. Ads that people like and click on. Also gives them page view data.

Google Chrome browser.

That gives Google a huge set of data to score pages with for any query. They know what links people follow to find successful answers. With all that, why would they need a PageRank algo? They could be almost to the point of eliminating both on-the-page and off-the-page criteria. They can just follow user behavior. They could let users train the algo the same way they let users identify spelling mistakes: by offering them two choices and seeing which ones people pick most.

I would start to focus on all traffic sources that don’t require search traffic. I would use those efforts as content fodder for the search engines, and allow the SE’s to send whatever traffic they send. I would pretend that search engines don’t exist and focus on everything else: full stop (yes, I know and I’m not happy about it either).

Brett Tabke is the founder of Pubcon and WebmasterWorld Inc.

Michelle Robbins

I think links were a necessary part of the PageRank algo in the beginning, and for a very long time, when there were no other signals available. But that pool was (and still is) very easily manipulated – and it’s become a terrible signal at best. Google can keep playing link whack-a-mole, but I believe that if they want to present truly relevant and valuable results to users, using links as a heavily weighted factor (or even at all) has to go. They know this, and we’re probably no more than a few years away from this happening.

Everyone is keen to believe social signals are replacing or will replace links in importance, but that’s just trading one unreliable signal for another. I believe the real key to relevant results lies in tying offline behavioral data to the online data they crawl. Google’s in an interesting fix – they always try to tie a result to a person based on what they know about you via your search history, social signals, etc. – but that’s not getting the job done. And it’s not even really necessary. Just because I order pizza online regularly doesn’t mean I’m the one ever buying or eating that pizza. However, knowing that I physically walk into the same pizza place a few times a month, and am there for 45 minutes to an hour – well, that actually tells a story, about both my behavior and the quality/popularity of that restaurant. 

So for local business search results, what’s the best indicator of a business’ relevance in a community? Joe’s Pizza has 1,000 backlinks and a 4-star rating on Yelp, but only about 2,000 people through the doors each month. Paul’s Pizza has maybe 100 links, is not even reviewed on Yelp, but has 5,000 people through the doors each month. Which pizza place should rank higher? Google understands this – that getting IRL behavioral data is necessary, which is why they are all in on the Android OS. They don’t care about making nice phones. They need a reliable tracking device, and not just for a maps button or default search – but for the data that can be obtained via that device and the apps, where it lives, where it shops, where it eats, what it buys. 

The iOS platform presents a challenge for this kind of data acquisition – but they’re scrappy in Mountain View. They’ll just roll their own wifi hotspots and give you an app that supplies immediate authentication: you’re happy to get free wifi, and they can aggregate that very valuable “where” data. Mostly, I remain surprised that they don’t simply partner with Nielsen. Nielsen already has all of this data (see below).

Nielsen Data

They know so much more about the IRL habits of people – and it’s valid, powerful, anonymized data – all of which provides real, authentic brand/business signals. Google has a data bias, though: they prefer to acquire it themselves. We are seeing some movement in this regard, however – notably, the comScore partnership.

Keep an eye on Google’s partnerships, and especially their acquisitions; this is usually the best indicator of where they are headed. 

SEOs need to market as if Google isn’t watching. 

The acceptance that content actually is king means SEOs are now catching up to where large brands have been all along. This turns the tables, because for so long the big brands “didn’t get it” – they had minimal or nonexistent web presences. This was a boon for online-only businesses and gave the “little guy” a shot at the top – whether or not that small brand was dominant in the real world. 

The past few years (and algorithm shifts) have brought cries from SEOs that “Google has a big brand bias” – I don’t think this is true at all. Google has a data bias – and that means a content bias. And big brands have content. Stacks of content, decades of content, that they have finally gotten around to putting online. Brands don’t even have to do much to promote their own content – consumers happily do it for them. In natural ways and in varied places across the ecosystem that Google crawls. Nike is a perfect example of this – if Nike dominates an athletic shoe SERP, it’s not brand bias, it just makes sense. 

As to the power of traditional marketing and branding, and how doing a thorough job of that can translate to winning in the SERPs, there is a very large brand, with decades of content, that had a site lie dormant for 2-3 years (no updates, no changes, no content being added). In less than three years, after relaunching, here’s where they were as of the 3rd quarter of last year:

Monthly unique visitors: 1 million

Monthly unique visits: 1.7 million

Monthly pageviews: 3.4 million

SEO budget: $0

They have a 2-person marketing team, and 1 full-time writer. They have no SEO team, they hire no SEO consultants. They have truly terrible title tags and URL structure. Yet for about 20 terms I polled, they rank in the top 5 for them all (usually #1 – #3).


The real TL;DR version:

Google wants their results to be valid and relevant – to mirror user expectations in the real world, and they will continue to evolve their system to get there. Links aren’t doing it, so they are working to adapt. Offline signals are valid, not easily manipulated, and can be captured. Thus I believe they will be used in the algo.

To continue being effective in Google results, SEOs need to focus on marketing. Brands have learned (or acquired or hired) what they need from SEOs – SEOs now need to catch up and learn solid brand marketing fundamentals.

The biggest challenges will be faced by online-only brands – especially small brands in competitive niches. But in this case, I wouldn’t focus my efforts on Google; I’d focus on finding where my customers are online, where else they go, and what their affinity brands are – and invest in a lot of co-marketing and advertising. Just like they would need to do in the offline world 😉 

Michelle Robbins is the Vice President of Technology for Search Engine Land’s parent company, Third Door Media.

Julie Joyce on PageRank

I think that links could lose some value, but it’s not a doomsday scenario in my opinion. Considering links are how we move around on the web, I cannot imagine a successful search engine that doesn’t take their importance into account. Unless someone wants to totally rebuild Google, I don’t see it happening soon. I’m sure someone will do it, but do it well without links? I shudder to imagine that world, and not just because I’d have to find something else to spam all up on.

Interaction, hopefully. I could see them looking at things like how much time does a user spend on a page, how many pages on a site does he or she visit, how many of a site’s posts get social love, how many legitimate comments are happening, etc.

I think they should start thinking more like human beings and less like money-making machines. I know it’s marketing and that’s the goal, but when you stop thinking like a human you take the kind of shortcuts that get you into trouble and ruin it for the rest of us.  

Julie Joyce is the owner of Link Fish Media, the co-founder of SEO Chicks, and a regular link contributor to Search Engine Land and Search Engine Watch.

Rand Fishkin on PageRank Changes

Relative to other elements in Google’s arsenal of algorithmic components, I’d say that yes, the link has been losing value for almost a decade. That said, I don’t think that in the next decade we’ll see a time when links are completely removed from ranking features. They provide a lot of context and value to search engines, and since the engines keep getting better at removing the influence of non-editorial links, the usefulness of link measurement will remain high.

Likely a lot of things already in the index like user and usage data, history and personalization, content and context analysis, semantic analysis, brand signals, etc. I think Google would also find great value in social signals, but they’ve chosen to forego these (at least the direct ones) due to competitive issues.

If you’ve been building links without thought to whether a search engineer would ideally want to count that link’s value, you’re likely in for a nasty surprise at some point. Producing things (content, branding, product, services, features, etc.) that naturally earn the types of links engines are trying to find and count is the very best way to stay ahead of whatever shifts may occur in the future.

Rand Fishkin is the Founder of Moz.

Glenn Gabe

Yes, I absolutely believe links will lose value over time, especially as technology evolves and the engines have access to even more data. Imagine Google Glass-like technology in a contact lens, a car run by Android, and eventually a chip in your head that can retrieve information in milliseconds. And if you just laughed, you should know Google’s AI expert Ray Kurzweil recently explained that by 2029, robots will be smarter than humans. Think about ranking factors when that arrives. 🙂 But that’s 2029, not the next 3-5 years.

For now, links aren’t dead. Not even close, actually. I still think a strong link profile is a valuable asset that will be assessed by Google and Bing when determining rankings. And Google’s internal testing of “link-less” rankings backs that up: Matt Cutts explained last week that Google’s test produced low-quality results. It shows that Google is obviously interested in factors beyond links, but it also shows the complexity in getting that right.

The fact of the matter is that a historical view of a link profile still provides solid insight into how popular and relevant a website is for a particular query (when taking volume, quality, relevance, temporal factors, etc. into account). Sure, links have been gamed heavily over the years, but as someone who does a boatload of Penguin work, I can tell you that Google has launched an all-out assault on unnatural links.

So, I still think links will be important over the next 2-3 years. That said, a rounded combination of factors could absolutely diminish the power of links as time goes on. And that’s been somewhat happening already…which brings me to your next question.

I wrote a post before Facebook Graph Search rolled out titled “BeastRank – 12 Potential Ranking Factors for the Upcoming Facebook Search Engine.” That post holds a lot of information about how Social could influence search rankings (albeit for Facebook’s purposes). But Google also has G+, and it does have access to a lot of social data of its own. When you combine other factors, you can see how an algorithm that relies less on links, and more on engagement, could be used by Google.

PageRank Social Data

Beyond links, the following factors could all possibly be used to influence rankings:

Brand mentions across Google properties and services (Search, YouTube, Google Plus, Gmail, etc.). Brands are trusted. People like brands. And Google loves brands.

AuthorRank (or some form of it) once it officially rolls out. The model of ranking people versus websites is extremely intriguing. It makes a lot of sense and can tie rankings to authors versus the sites their content is written on.

Social engagement, including +1s, shares, mentions, participation, etc. This is where something like BeastRank could shine. See my link above for more information about the potential of Facebook’s search algorithm.

Influencer engagement – connecting the dots between experts and thought leaders and the content they view, share, etc. It would take the previous bullet to another level. 

Content engagement – dwell time, actual bounce rate (by vertical and niche), downstream traffic, etc. Panda is already taking some of this into account, so it wouldn’t be a stretch to think it can become even more prominent.

Long-term – when Google Glass and Android-run cars take over, other interesting factors could come into play. For example, what you have seen, where you have been, where you are headed, what you are thinking about, who you are looking at, how long you stayed somewhere, the history of the people you are with, your physical condition, your mental condition, etc. could impact rankings. But that’s a bit into the future. 

In the short-term (2-3 years), I still believe the combination of producing killer content (based on data) and utilizing Social to connect with target audiences is a recipe for success. Effectively executing that combination can impact a wide range of ranking factors (beyond just links). And that can help companies rank as links begin to lose their value.  For example, it can impact brand mentions, sharing, influencer engagement, AuthorRank, etc. 

And for the long-term, I highly recommend becoming extremely familiar with the next phase of technology (like wearables, automotive technology, etc.) For example, how will people share from cars, how will they retrieve information from wearables, what’s the mechanism for attribution from the next generation of gadgets, etc.? That might just lead you down the path towards new ranking factors. For example, the number of (Android) cars that have bookmarked a certain piece of content. 

From a content generation standpoint, I still believe companies need to think about the pain points their target market is dealing with. Then understand how those people are searching for solutions.  And think about where those people will be, how they will be searching, what types of devices they will be using, how they will be asking questions (Hummingbird), etc. That might lead to new versions of content that once didn’t make sense to create and publish.

By the way, if you do everything right (based on what I listed above), those efforts can lead to strong metrics across potential rankings factors (as mentioned earlier). And guess what? One of the byproducts would be more links. So I guess we’re back to square one.  🙂

Glenn Gabe is President of G-Squared Interactive.

Barry Adams

Yes, I do see links losing value over time, but I don’t believe links will disappear entirely from the ranking algorithms. I think Google is already moving to a scenario where links will have different weights depending on the specific query space. For knowledge graph entities, links might not mean much for the top 10 results, but for competitive commercial terms Google might still want to take a strong cue from websites’ link graphs to determine which one to rank. But as a search engine based on links, Google will never entirely abandon the link graph. Links are the essential foundation of the world wide web, and Google as a search engine would be foolish to discard links entirely – even if it were possible for them to do so.

I suspect sentiment analysis, once Google cracks it (and it’s a very hard nut to crack), will take up a lot of the slack from links in the ranking algos. Google is likely to start looking at brand mentions online and assessing if those mentions were positive or negative. In the end Google wants to rank the most relevant results for a query that gives its users the best experience, and that means ranking websites that deliver quality service and added value. That can be reflected in positive brand mentions online, so a sturdy sentiment analysis algorithm is Google’s holy grail to replace links with.

The trend in SEO over the past year – if not longer – has been to build a strong brand presence online and use it to accumulate links naturally. Couple that with outreach activities that focus on adding value to third-party websites through engaging content, and you already have the foundation for where Google and, as a result, SEO are heading: strong online brands with abundant positive mentions. The focus will need to shift from using content to build links to using content to build a positive online brand, and that’s a very small step – one which some SEOs have already taken.

Barry Adams is the Digital Director of The Tomorrow Lab and Editor at State of Digital.

Alan Bleiweiss

Link value is destined to become lower in the overall search algorithm framework as Google discovers more signals that can help reinforce, confirm, or challenge link signals. The critical factor here, though, is how difficult it is to determine the trust strength of those other signals. How long it takes to wean themselves off of link signals is a crapshoot guess. The biggest obstacle to such a transition is directly based on that difficulty of determining trust strength elsewhere.

The obvious metrics would be social media and on-site factors.

Social is a multi-headed beast that is ever-changing. User behavior within social can be gamed in many ways, just as the link landscape has been gamed in many ways over the years. On-site gaming of the system is already more difficult than gaming the social sphere, and this is why we’ve continually seen more and more efforts to drive improved on-site signal trust through the years. It’s why Schema markup, crawl efficiency and page processing considerations are now baked into the signal evaluation process. And it’s why Google just recently refined their “above the fold” algorithm.  

All of these can help Google better understand sites and quality/relevance signals…  

Since Google+ isn’t considered a pure social channel, Google’s already been reshaping off-site signal considerations by using G+ for author and publisher trust signal clarification and I expect they’ll continue to do so with more effort.

If they can get enough adoption of Google Fiber across a big enough swath of the United States (and eventually elsewhere around the globe), I would not be surprised if they tap that data stream to help gauge consumer intent and preference signals, just as they are already likely tapping Gmail. Not just for paid advertising, but also for an overall understanding of trust signals that can (and may, to some degree, already) be used to help influence organic results.  

From there, it’s all about societal behavior digitally. So as new digital channels come into existence, it’s all fair game for tapping.

I drive this message consistently across all the interviews I’m in, all my audit work, and client training I perform. Content marketing and link building are tactics. They’re not strategies. The strategy most people group them in is “what can we do to improve our SEO?” The correct, sustainable and proper strategy is bigger than that: Brand building. Whatever the channel, method or opportunity is that comes along, does it contribute to real, trustworthy brand building? If so, great. If not (if it’s a “trick,” “shortcut” or “way to fool search engines”), it’s toxic.  

Alan Bleiweiss is the owner of Bleiweiss Consulting, a specialized SEO company.

Larry Kim on PageRank

Yes. I see a future where links are far less valuable, and I think that the golden age of the SEO link is behind us. Links are too easily manipulated, and there are many better signals available today that weren’t available when Google first started.

Quality Score for organic search. Google treats paid search ads as content, too. Google only gets paid if people click on the ads, so they give more prominent positioning to ads that are more relevant. An algorithm called Quality Score determines ad relevancy without using any links at all!

Our internal research has revealed that Quality Score is mostly based on user engagement – in particular, the click-through rate of your ad compared to the expected click-through rate for your ad’s given position. This kind of data is harder to fake, since there are way more clicks happening than links on the web, and Google can figure out which clicks are real vs. fake: they have Google Analytics/Google AdWords/AdSense tracking codes on most websites, and huge market share for Chrome on both desktop and mobile. They already have systems in place for detecting click fraud, and I’d also note that Google acquired a click-fraud detection company on Friday.
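A hypothetical sketch of the position-normalized CTR signal Larry describes – the expected-CTR baseline table here is invented for illustration, since Google's actual Quality Score model is not public:

```python
# Hypothetical sketch of position-normalized CTR, the kind of
# engagement signal described above. The expected-CTR table is
# made up for illustration; Google's actual model is not public.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07}  # by SERP/ad position

def normalized_ctr(clicks, impressions, position):
    """Return observed CTR relative to what's expected at that position."""
    observed = clicks / impressions
    return observed / EXPECTED_CTR[position]

# An ad in position 3 with a 12% CTR beats the 10% baseline:
score = normalized_ctr(clicks=120, impressions=1000, position=3)
print(round(score, 2))  # 1.2
```

Normalizing by position matters because a top slot earns clicks simply by being first; scores above 1.0 indicate engagement beyond what placement alone explains.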

Additionally, signals like authorship, social media, and your browsing history could be used, too.

Here’s what we’re investing in:

Content quality: long-form, in-depth articles with original research and a non-obvious point of view.

Remarketing: amplify the effectiveness of our content marketing and SEO efforts with remarketing to increase brand recall and user engagement metrics like repeat visitor rate, time on site, and conversion rates.

Social media: improve social engagement metrics.

Diversification: non-search marketing stuff, like partner and event marketing, contests, etc.

Larry Kim is the Founder and Chief Technology Officer of WordStream.

Dr. Pete Meyers

I think Google is definitely adding layers, so in that sense – yes. If you add non-link signals then the math says that, overall, links must count for less. I guess the question is – how much less? Right now, in 2014, I’d say not much less. Links are still the core of Google’s engine, and it’s hard to imagine that’s going to change overnight.

The PageRank algorithm is elegant and was impressively effective when it launched, especially compared to crude on-page factors, but I don’t think that any modern search engine can operate on just one type of ranking factor. Anything can be gamed. I suspect that Google will move more toward corroborating signals. Let’s say you have a ton of links, but no traffic, no social, no CTR. That’s just not normal. Likewise, if you have thousands of Likes, but no Tweets, no +1s, no links, etc., something isn’t right. The real power, down the road, is in being able to look across signals. Links will still matter, but you won’t be able to build them in a vacuum.
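Pete’s corroboration idea can be sketched as a simple consistency check: a profile where one signal is extreme while every other signal sits near zero fails to corroborate. All field names and thresholds below are invented for illustration; this is not a real ranking algorithm:

```python
# Illustrative sketch of cross-signal corroboration: a site with a huge
# link count but near-zero engagement everywhere else looks anomalous.
# Signal names and thresholds are invented for this example.

def looks_anomalous(signals, high=1000, low=10):
    """True when some signal is very strong but no other signal backs it up."""
    strong = [v for v in signals.values() if v >= high]
    weak = [v for v in signals.values() if v <= low]
    # Suspicious when every signal is either very strong or near zero,
    # with at least one of each: the signals fail to corroborate.
    return len(strong) >= 1 and len(weak) >= 1 and \
        len(strong) + len(weak) == len(signals)

# Site A: tons of links, nothing else -- fails corroboration.
site_a = {"links": 50000, "tweets": 2, "likes": 0, "monthly_visits": 5}
# Site B: a balanced profile -- passes.
site_b = {"links": 50000, "tweets": 800, "likes": 1200, "monthly_visits": 40000}
print(looks_anomalous(site_a))  # prints True
print(looks_anomalous(site_b))  # prints False
```

The point is not the thresholds but the shape of the check: each signal is cheap to fake on its own, while faking a consistent profile across all of them is far more expensive.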

If they were willing to cross the data barrier, I think user signals – including traffic data from Google Analytics – would be the obvious replacement. Forget social – dig in and see what sites people are actually visiting. Beyond privacy and gaps in the data, though, there’s a chicken-and-egg issue. For example, let’s say you used CTR and dwell time (the time people spend between clicking a result and returning) as primary signals. How do you know what to rank in the first place in order to measure CTR and dwell time? Again, this is where I think corroborative signals are important.

There are hundreds of answers, if you want to dig into the details, but let me boil it down to one main idea that I think is critical. Make sure that you diversify your tactics and that any tactic has more than one purpose.

Here’s an example – let’s say that Site (A) builds links purely for SEO, and Site (B) builds links that get traffic and those links just happen to also help SEO. What happens if Google changes the rules and discounts these links? Site (A) is dead in the water, but Site (B) still has traffic. Put aside white-hat/black-hat, Google’s guidelines, etc. and ask this: “Is my SEO strategy also building my business?” Editorial links aren’t just good for SEO; they’re good for your brand and they can drive traffic. Even if the rules change, they’ll still have value.

Pete Meyers is a Marketing Scientist at Moz.

Eric Enge

On February 3, 2014 I published an article in Search Engine Land called Google is Not Broken. The point of that article is that Google is doing just fine. Yes, you can point to bad search results, but their stock price is soaring, and they are effectively a global monopoly in their market space (China, Korea, and the Czech Republic are notable exceptions).

For this reason, I don’t see Google as having great urgency to make some great overhaul of PageRank or their search results. But, even if there were such urgency, let’s take a look at the nature of the available signals on the web:

a. Links are a signal that requires the person to own a web site. Once you place a link, it stays there, visible for all to see, until you remove it. As a result, links require a fair amount of effort and commitment.

b. Social shares (or tweets) are another potential signal. Bear in mind though that there is little effort or commitment involved in a share or a tweet. I know many people who reshare the articles of others without even visiting the article page (just based on the title). This is a very common practice! In addition, that share disappears from people’s streams or feeds pretty quickly.

c. Likes, +1s, and Favorites are even weaker signals. Google cannot even tell what you have Liked – it is entirely invisible to them. They can crawl the Like counter on a web page, but they can’t tell if it was generated entirely by Fiverr accounts or by genuinely authoritative people. While Google can see +1s and Favorites, these signals require even less effort and commitment than a share or a tweet.

d. User interaction with content and the search results could also be used. I believe that Google has already incorporated this to some degree in the search results. Duane Forrester confirmed to me that Bing is using click-through rate as a ranking factor in an interview I did with him in 2011. However, I believe that links are still a stronger ranking factor.

e. Content analysis? No, we gave that one up back in 1998 when Google came out with their link-based algorithm.

Ever since the first Moz correlation study people have been adamant that Google is using social signals as a ranking factor, and Google remains adamant that they are not. Based on extensive studies I have done on the topic, I believe that Google is NOT using them:

a. Direct Measurement of Google Plus Impact on Rankings

b. Does Facebook Activity Impact SEO?

Important note: Of course Google is using Google Plus in personalized results. You can read more about that in this article.

So in summary, I don’t think that link-based signals will lose all of their weight until there is a fundamentally different set of signals available that are not web-based.

I think there are likely changes that will come in terms of increasing the weight on authoritative sites and continuing to decrease the weight on sites with little or no authority.

I think that Google would use user interaction with the search results as the most important signal. This could involve simply testing content in the SERPs, seeing how it does, and adjusting accordingly. This is what Duane Forrester described to me in terms of what Bing is doing.

They could then supplement this with social signals. I believe the use of social signals would be very heavily slanted in two directions: personalization and endorsements by people with highly authoritative profiles.

Another big question! I have been saying for many years that the big need is to go holistic. You have to look at the big picture. Think about what you would like to have in place if Google and Bing went away entirely – imagine if Congress passed a law saying that operating a search engine was illegal.

What are the things you would wish you had in place if that happened? You would probably want the major sites that cover your market space to regularly publish your content. You would probably want the top influencers in your market space to regularly share your stuff in social media. You would want people to recognize your brand.

You know what is great about this strategy? I will leave you with a quote from my July 2012 interview of Matt Cutts:

“By doing things that help build your own reputation, you are focusing on the right types of activity. Those are the signals we want to find and value the most anyway.”

Isn’t that cool? Pursuing the best strategy for search engines is actually the best strategy if there are no search engines!

Eric Enge is the president of Stone Temple Consulting.

Dharmesh Shah

I think it’s losing some value – but not as much as some experts believe. Links have always acted as a form of “endorsement” and a signal of quality to the target page. I think that’s still true today. What’s different is the sheer volume of other signals of quality – namely social signals. Since it is much easier to tweet than blog, more people tweet than blog. So in a way, it creates a more “democratic” and diverse set of data. I don’t think Google or the other search engines are going to eliminate links as a signal from the algorithm anytime soon. They may just reduce the weighting, but it doesn’t make sense to drop the signal completely.

Honestly, I don’t know. The first thing that jumps to mind is social data. Signals like tweets, shares, follows etc.

I’ve long advocated that people focus more on the content, experience and brand than “tactical” SEO. It seems that the long-term winners in the SEO race are those that deliver a positive experience to users, create useful content and get the “basics” right in terms of crawlability and such. Everything else always seems to be a short-term boost, because the algorithm keeps changing so much. The thing I like to keep in mind is that Google (and the other engines) are simply trying to calculate what content searchers want to see (i.e. what they consider high value). That’s what the algorithm is trying to proxy. In my mind, SEO is basically HHO (Human Happiness Optimization).

Dharmesh Shah is Chief Technology Officer and Co-Founder of HubSpot.
