Types of Spammy Backlinks to Avoid in SEO

If the last few months of ranking changes have shown me anything, it’s that a poorly executed link building strategy that many of us would call white hat can be more dangerous than black-hat strategies like buying links. As a result of well-intentioned but short-sighted link building, many sites have seen significant drops in rankings and traffic. Whether you employ link building tactics that are black, white, or any shade of grey, you can do yourself a favor by avoiding the appearance of link spam.

It’s become very obvious that recent updates hit sites with overly aggressive link profiles – almost exclusively the types of sites that fell within what I called the “danger zone” in a post written about a month before Penguin hit. Highly unnatural anchor text and low-quality links are highly correlated, but anchor text appears to have been the focus.

I was only partially correct, as the majority of cases appear to be devalued links rather than penalties. Going forward, the wise SEO will want to take note of the types of link spam below and make sure that nothing they’re doing looks like one of them. Google’s response to and attitude toward each type of link spam varies, but every link building method becomes riskier the closer you move to the danger zone.

1. Cleansing Domains

While not technically a form of link building, 301 “cleansing” domains are a dynamic of link manipulation that every SEO should understand. When you play the black hat game, you know the chance of getting burned is very real. Building links to a domain that redirects to a main domain has traditionally been a safe way to quickly recover from Google actions like Penguin. While everyone else toils away attempting to remove scores of exact-match anchor text, the spammers just cut the troubled redirect domains loose like anchors, and float on into the night with whatever treasure they’ve gathered.

A cleansing domain for NFL jerseys

When Penguin hit, this linkfarm cleansing domain changed from a 301 to a 404 almost overnight.

Link building through redirects should be easy to catch, as new links pointing to a domain that is currently redirecting are hardly natural behavior. To anyone watching, it’s like shooting up a flare that says, “I’m probably manipulating links.” The fact that search engines aren’t watching closely right now is no guarantee of future success, so I’d avoid this and similar behavior if future success is a goal.

2. Blog Networks & Poorly Executed Guest Blogs

I’ve already covered the potential risks of blog networks in depth here. Google hates blog networks – fake blogs that members pay or contribute content to in order to get links back to their own or their clients’ sites. Guest blogging and other forms of contributing content to legitimate sites are a much whiter tactic, but consider that a strategy that relies heavily on low-quality guest blogging looks a lot like blog network spam.

With blog networks, each blog carries content with a constant ratio of words to links. Each blog links out to random external sites multiple times, with a lot of “inorganic” anchor text for commercially valuable terms. Almost all of the backlinks pointing at blog networks are spam as well.

I cringe when I see low-quality blogs with questionable backlinks accepting guest blog posts that meet rigid word length and external link guidelines. Quality blogs tend not to care if the post is 400-500 words with two links in the bio, and quality writers tend not to ruin the post with excessive linking. Most of us see guest blogging as a white-hat tactic, but a backlink profile filled with low-quality guest posts looks remarkably similar to the profile of a site using automated blog networks.

I’d obviously steer clear of blog networks, but I’d be just as wary of low-quality inorganic guest blogs that look unnatural. Guest blog on sites with high quality standards and legitimate backlink profiles of their own.

3. Article Marketing Spam

Article link addiction is still a real thing for new SEOs. You get one or two links with anchor text of your choice, and your rankings rise. You’re not on the first page, but you do it again and get closer. The articles are easy and cheap, and they take no creativity or mental effort. You realize that you’re reaching diminishing returns on the articles, but your solution isn’t to stop – you just need to do more articles. Before you know it, you’re searching for lists of the top article sites that give followed links and looking for automated solutions to build low-quality links to your low-quality links.

Most articles are made for the sole purpose of getting a link, and essentially all followed links are self-generated rather than endorsements. Google has accordingly made article links count for very little, and has hammered article sites for their low-quality content.

Ezine Articles SEO visibility

Maybe you’re wondering how to get a piece of that awesome trend, but hopefully you’ll join me in accepting that article directories aren’t coming back. Because they can theoretically be legitimate, article links are generally devalued rather than penalized. As with all link spam, your risk of harsher punishment rises in proportion to the percentage of similar links in your profile.

4. Single-Post Blogs

Ironically named “Web 2.0 Blogs” by some spam peddlers, these two-page blogs on Tumblr and WordPress sub-domains never see the light of day. After setting up the free content hub with an article or two, the site is then “infused” with link juice, generally from social bookmarking links (discussed below).

Despite their prevalence, these sites don’t do much for rankings. Links with no weight come in, and links with no impact go out. They persist because with a decent free template, clients can be shown a link on a page that doesn’t look bad. Google doesn’t need to do much to weed these out, because they’re already doing nothing.

5. (Paid) Site-Wide Links

Site-wide footer links used to be all the rage. Google crippled their link-juice-passing power because most footer links pointing to external sites are either Google Bombs or paid links. Where else would you put a site-wide link that you don’t want your users to click?

To my point of avoiding the appearance of spam, Penguin slammed a number of sites with a high proportion of site-wide (footer) links that many would not have considered manipulative. Almost every free WordPress theme that I’ve seen links back to the creator’s page with choice anchor text, and now a lot of WordPress themes are desperately pushing updates to alter or remove the link. Penguin didn’t care if you got crazy with a plugin link, designed a website, or hacked a template; the overuse of anchor text hit everyone. This goes to show that widespread industry practices aren’t inherently safe.

6. Paid Links in Content

There will never be a foolproof way to detect every paid link. That said, it’s easier than you think to leave a footprint when you do it in bulk. You have to trust your sellers not to make it obvious, and the other buyers to keep unwanted attention off their own sites. If one buyer with whom you have no relationship buys links recklessly, the scrutiny can trickle down through the sites they’re buying from and eventually back to you.

If you do buy links, knowing what you’re doing isn’t enough. Make sure everyone involved knows what they’re doing. Google is not forgiving when it comes to buying links.

7. Link Exchanges, Wheels, etc.

Speaking of footprints, I believe it’s possible to build a machine learning model that starts with a profile of known guideline-violating links, which you can acquire from paid link sites and link wheel middlemen with nothing more than an email address. You could then assess the probability of any site being linked to in that manner, corroborating potential buyers and sellers against a link graph of similar profiles. I have no idea what kind of computing/programming power this would take, but the footprint is anomalous enough that it should be possible.
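Purely to illustrate the idea, here is a minimal sketch of such a model in Python (assuming scikit-learn is available); the features, numbers, and labels are all hypothetical examples of a “footprint,” not anything the engines have confirmed using.

  # Hypothetical sketch: score how much a site's link profile resembles
  # known paid-link / link-wheel footprints. All data below is invented.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  # Features per site: [% exact-match anchors, % links from known sellers,
  #                     average authority of linking pages, links per linking domain]
  X_train = np.array([
      [0.72, 0.40, 18.0, 9.5],   # known guideline-violating profile
      [0.65, 0.35, 22.0, 7.0],   # known guideline-violating profile
      [0.08, 0.00, 45.0, 1.3],   # known clean profile
      [0.12, 0.01, 38.0, 1.8],   # known clean profile
  ])
  y_train = np.array([1, 1, 0, 0])  # 1 = looks bought, 0 = looks natural

  model = LogisticRegression().fit(X_train, y_train)

  # Probability that an unseen site's profile matches the paid-link pattern
  suspect = np.array([[0.58, 0.22, 20.0, 6.1]])
  print(model.predict_proba(suspect)[0][1])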

Exchanging links through link schemes requires a lot more faith in a bunch of strangers than I can muster. In a link wheel, you’re only as strong and subtle as your “weakest links.” My opinion is that if you’re smart enough to avoid getting caught, you’re probably smart enough to build or write something awesome that will have superior results and lower risk than link wheels.

8. Low-Quality Press Release Syndication

High-quality syndication and wire services possess a few unattractive attributes for spammers: there are editorial guidelines, costs, and even fact checking. Low-quality syndication services will send almost anything through to any site that will take it. You’ll end up with a bunch of links, but not many that get indexed, and even fewer that get counted.

My experience has been that press releases see rapidly diminishing returns from syndication alone, and the only way to see ROI is to generate actual, real coverage. I still see link-packed press releases all over the web that don’t have a chance of getting coverage – really, your site redesign is not newsworthy. I’m not sure whether to attribute this to bad PR, bad SEO, or both.

9. Linkbait and Switch

In this context, we’re talking about creating a real piece of linkbait for credible links, and later replacing the content with something more financially beneficial. Tricking people into linking to content is clearly not something Google would be ok with. I don’t see linkbait and switch done very often, but I die a little every time I see it. If you’re able to create and spread viral content, there’s no need to risk upsetting link partners and search engines. Instead, make the best of it with smart links on the viral URL, repeat success, and become a known source for great content.

10. Directories

Directories have been discussed to death. The summary is that Google wants to devalue links from directories with no true standards. Here’s a Matt Cutts video and blog post on the topic. Directory links often suffer from a high out/in linking ratio, but the ones worth getting are those that are actually used: local business directories (think Yelp) and trafficked industry directories. Before submitting, ask yourself three questions:

  1. Would I pay money for a listing here?
  2. Are the majority of current listings quality sites?
  3. Do listings link with the business or site name?

If the answer to any of these questions is no, don’t bother with a link. This immediately excludes all but a handful of RSS or blog feed directories, which are mostly used to pad link counts in reports. When I was trained as an SEO, I was taught that directories would never hurt, but they might help a tiny bit, so I should go get thousands of them in the cheapest way possible. Recent experience has taught us that poor directory links can be a liability.

Even as I was in the process of writing this post, it appears that Google began deindexing low-quality directories. The effect seems small so far – perhaps testifying to their minimal impact on improving rankings in the first place – but we’ll have to wait and see.

11. Link Farms and Networks

I honestly can’t speak as an authority on link farms, having never used them personally or seen them in action.

“I’m telling you right now, the engines are very very smart about this kind of thing, and they’ve seen link farming over and over and over again in every different permutation. Granted, you might find the one permutation – the one system – that works for you today, but guess what? It’s not going to work tomorrow; it’s not going to work in the long run.” – Rand in 2009

My sense is that this prediction came true over and over again. I’d love to hear your thoughts.

12. Social Bookmarking & Sharing Sites

Links from the majority of social bookmarking sites carry no value. Pointing a dozen of them at a page might not even be enough to get the page crawled. Any quality links that go in have their equity immediately torn a million different directions if links are followed. The prevalence of spam-filled and abandoned social bookmarking sites tells me that site builders seriously over-estimated how much we would care about other people’s bookmarks.

Sites focusing on user-generated links and content have their own ways of handling trash. Active sites with good spam control and user involvement will filter spam on their own while placing the best content prominently. If you’d like to test this, just submit a commercial link to any front-page sub-Reddit and time how long it takes to get the link banned. Social sites with low spam control stop getting visitors and incoming links while being overrun by low quality external links. Just ask Digg.

13. Forum Spam

Forum spam may never die, though it is already dead. About a year ago, we faced a question about a forum signature link that appeared in literally thousands of posts on a popular online forum. When we removed the signature links, the change was similar to the effect of most forum links: zero. It doesn’t even matter if you nofollow all the links. Much like social sites, forums that can’t manage the spam quickly turn into a cesspool of garbled phrases and anchor text links. Bing’s webmaster forums are a depressing example.

14. Unintended Followed Link Spam

From time to time you’ll hear of a new way someone found to get a link on an authoritative site. Examples I have seen include links in bios, “workout journals” that the site let users keep, wish lists, and uploaded files. Sometimes these exploits (for lack of a better term) go viral, and everyone can’t wait to fill out their bio on a DA 90+ site.

In rare instances, this kind of link spam works – until the hole is plugged. I can’t help but shake my head when I see someone talking about how you can upload a random file or fill out a bio somewhere. This isn’t the sort of thing to base your SEO strategy around. It’s not long-term, and it’s not high-impact.

15. Profile Spam

While similar to unintended followed links on authority domains, profile spam deserves its own discussion due to its sheer abundance. It would be difficult for Google to take harsh action on profiles, as there is a legitimate reason for reserving massive numbers of profiles: preventing squatters and imitators from using a brand name.

What will hurt you is when your profile name and/or anchor text doesn’t match your site or brand name.

car-insurance-spam-profile

“The name’s Insurance. Car Insurance”

When profile links are followed and indexed, Google usually interprets the page as a user page and values it accordingly. Obviously Google’s system for devaluing profile links is not perfect right now. I know it’s sometimes satisfying just to get an easy link somewhere, but profile link spam is a great example of running without moving.

16. Comment Spam

If I were an engineer on a team tasked with combating web spam, the very first thing I would do would be to add a classifier to blog comments. I would then devalue every last one. Only then would I create exceptions where blog comments could count for anything.

I have no idea if it works that way, but it probably doesn’t. I do know that blogs with unfiltered followed links are generally old and unread, and they often look like this:

Followed blog comments

Let’s pretend that Google counts every link equally, regardless of where it is on the page. How much do you think 1/1809th of the link juice on a low-authority page is worth to you? Maybe I’m missing something here, because I can’t imagine spam commenting being worth anything at any price. Let’s just hope you didn’t build anchor text into those comments.
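To make that arithmetic concrete, here is a trivial sketch using the numbers above; the “link equity” unit is arbitrary and purely illustrative.

  # Toy calculation: the share one comment link gets if a page's link equity
  # is split evenly across every outbound link on it.
  page_link_equity = 1.0   # arbitrary units for a low-authority page
  links_on_page = 1809     # followed links, mostly spam comments
  share_per_link = page_link_equity / links_on_page
  print(round(share_per_link, 6))  # ~0.000553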

17. Domain Purchase and Redirect/Canonical

Buying domains for their link juice is an old classic, but I don’t think I have anything to add beyond what Danny Sullivan wrote on the matter. I’m also a fan of Rand’s suggestion to buy blogs and run them rather than pulling out the fangs and sucking every ounce of life out of a once-thriving blog.

Domain buying still works disgustingly well in the (rare) cases where it’s done correctly. I would imagine that dozens of redirected domains will eventually bring some unwelcome attention to your site directly from Mountain View, but fighting spam has historically been much easier in my imagination than in reality.

This list is not meant to be comprehensive, but it should paint a picture of the types of spam that are out there, which ones are working, and what kinds of behaviors could get you in trouble.

Spam Links: Not Worth It

I have very deliberately written about what spam links “look like.” If you believe that black hat SEO is wrong, immoral, or in any way unsavory, that’s fine – just make sure your white hat links don’t look like black hat links. If you think that white hat SEOs are sheep, or pawns of Google, the same still applies: your links shouldn’t look manipulative.

I’m advising against the tactics above because the potential benefits don’t outweigh the risks. If your questionable link building does fall apart and your links are devalued, there’s a significant cost in time wasted building links that don’t count. There’s also the opportunity cost – what could you have been doing instead? Finally, clearing up a manual penalty can take insane amounts of effort, and your Google revenue stream disappears in the meantime.


Top 5 Link Building Strategies 2012

The popularity of any site is not just determined by the number of incoming links pointing to it, but also by the quality of those links. Link building helps increase the popularity of your site, and so your rankings on the search engines. A link building strategy needs careful planning and implementation, so you can reap slow yet steady results. I’ve put together a list of my top link building strategies to help you increase your rankings:

1. Links from Multiple Domains

While your link building strategies should focus on volume, your plan should not be to get most links from a single site. Instead, work towards just a few links from many sites for best results.

As search engines rank sites, they consider how relevant the content is to the keywords and search terms. When you get links from multiple domains, you demonstrate that many sources find your content useful.

This is critically important when you develop additional backlinks from authoritative domains.

2. Deep (Ingrained) Links

Instead of directing links to your site’s main page, use links directly to specific pages of content or images. These are known as deep or ingrained links, and they show that you have important information throughout your site, not just isolated on the home page.

Using deep or ingrained links adds to your ability to use important keywords for greater ranking in search engine results. You can spread these words throughout your site as appropriate, which will ensure a user-friendly overall appearance.

3. Links to Local Sites

If your link building strategies don’t include neighboring customers, you are missing out on a large opportunity. It is exciting to develop international backlinks, but it is the locals that will keep you in business.

Obtaining authoritative links from local sources is far easier than from nationwide or internationally renowned experts.

Nearby specialists enjoy building their communities by working with neighbouring businesses. SEO rankings will climb when you include local authorities such as the Chamber of Commerce or the Better Business Bureau for your state. Consider local organizations, libraries, museums, and similar sites as well as your local government agencies. Directories like yellowpages.com and superpages.com are also excellent options. You might have to pay a fee for the service, but this tends to be very low. Some of those you approach will jump at the chance for a quid pro quo linking relationship.

4. Links to Authoritative Domains

Sites classified as authoritative on a subject are those that offer expertise in a given industry or specific field. For example, nytimes.com has extensive, reliable, and accurate information on events relevant to New York. Its strong reputation as a source of specialized knowledge lends it authority on New York news.

Link building strategies that include government and university websites lead to high ratings, as these are accepted as authorities on their subject matter. They can be easily recognized by their .gov and .edu domains, for example IRS and California State University.

Authoritative domain links lead visitors to extensive amounts of quality, well-researched information, and they often contain additional links that lead to good sources for further exploration.

5. Linking through Anchor Text

The strongest and most popular tool available today is linking through anchor text. For best SEO ranking results, be sure to make this part of your link building strategies.

They work by turning your text into hyperlinks, so as readers go along, they can click on an interesting keyword. The link directs them to the page you specify, which makes your keywords even stronger, improving your site’s ultimate ranking.

good luck!

6 Ways to Recover from Bad Links

It’s a story we hear too often: someone hires a bad SEO, that SEO builds a bunch of spammy links, he/she cashes their check, and then bam – penalty! Whether you got bad advice, “your friend” built those links, or you’ve got the guts to admit you did it yourself, undoing the damage isn’t easy. If you’ve sincerely repented, I’d like to offer you 6 ways to recover and hopefully get back on Google’s Nice list in time for the holidays.

This is a diagram of a theoretical situation that I’ll use throughout the post. Here’s a page that has tipped the balance and has too many bad (B) links – of course, each (B) and (G) could represent 100s or 1000s of links, and the 50/50 split is just for the visual:

Hypothetical link graph

Be Sure It’s Links

Before you do anything radical (one of these solutions is last-ditch), make sure it’s bad links that got you into trouble. Separating out a link-based penalty from a devaluation, technical issue, Panda “penalty”, etc. isn’t easy. I created a 10 minute audit a while back, but that’s only the tip of the iceberg. In most cases, Google will only devalue bad links, essentially turning down the volume knob on their ability to pass link-juice. Here are some other potential culprits:

  1. You’ve got severe down-time or latency issues.
  2. You’re blocking your site (Robots.txt, Meta Robots, etc.).
  3. You’ve set up bad canonicals or redirects.
  4. Your site has massive duplicate content.
  5. You’ve been hacked or hit with malware.

Diagnosing these issues is beyond the scope of this post, but just make sure the links are the problem before you start taking a machete to your site. Let’s assume you’ve done your homework, though, and you know you’ve got link problems…

1. Wait It Out

In some cases, you could just wait it out. Let’s say, for example, that someone launched an SQL injection attack on multiple sites, pointing 1000s of spammy links at you. In many cases, those links will be quickly removed by webmasters, and/or Google will spot the problem. If it’s obvious the links aren’t your fault, Google will often resolve it (if not, see #5).

Even if the links are your responsibility (whether you built them or hired someone who did), links tend to devalue over time. If the problem isn’t too severe and if the penalty is algorithmic, a small percentage of bad links falling off the link graph could tip the balance back in your favor:

Link graph with bad links removed

That’s not to say that old links have no power, but just that low-value links naturally fall off the link-graph over time. For example, if someone builds a ton of spammy blog comment links to your site, those blog posts will eventually be archived and may even drop out of the index. That cuts both ways – if those links are harming you, their ability to harm will fade over time, too.

2. Cut the Links

Unfortunately, you can’t usually afford to wait. So, why not just remove the bad links?

Link graph with all bad links cut

Well, that’s the obvious solution, but there are two major, practical issues:

(a) What if you can’t?

This is the usual problem. In many cases, you won’t have control over the sites in question or won’t have login credentials (because your SEO didn’t give them to you). You could contact the webmasters, but if you’re talking about 100s of bad links, that’s just not practical. The kind of site that’s easy to spam isn’t typically the kind of site that’s going to hand remove a link, either.

(b) Which links do you cut?

If you thought (a) was annoying, there’s an even bigger problem. What if some of those bad links are actually helping you? Google penalizes links based on patterns, in most cases, and it’s the behavior as a whole that got you into trouble. That doesn’t mean that every spammy link is hurting you. Unfortunately, separating the bad from the merely suspicious is incredibly tough.

For the rest of this post, let’s assume that you’re primarily dealing with (a) – you have a pretty good idea which links are the worst offenders, but you just can’t get access to remove them. Sadly, there’s no way to surgically remove the link from the receiving end (this is actually a bit of an obsession of mine), but you do have a couple of options.

3. Cut the Page

If the links are all (or mostly) targeted at deep, low-value pages, you could pull a disappearing act:

Link graph with page removed

In most cases, you’ll need to remove the page completely (and return a 404). This can neuter the links at the target. In some cases, if the penalty isn’t too severe, you may be able to 301-redirect the page to another, relevant page and shake the bad links loose.
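As a rough sketch of those two options, assuming an Apache server with mod_alias enabled (adjust for your own stack; the paths and domain are placeholders), the .htaccess rules might look like this:

  # Option 1: remove the spammed page entirely and return a 404
  Redirect 404 /old-spammed-page.html

  # Option 2: if the penalty isn't severe, 301-redirect it to a relevant page
  Redirect 301 /old-spammed-page.html http://www.example.com/relevant-page/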

If all of your bad links are hitting a deep page, count yourself lucky. In most cases, the majority of bad links are targeted at a site’s home-page (like the majority of any links), so the situation gets a bit uglier.

4. Build Good Links

In some sense, this is the active version of #1. Instead of waiting for bad links to fade, build up more good links to tip the balance back in your favor:

Link graph with good links added

By “good”, I mean relevant, high-authority links – if your link profile is borderline, focus on quality over quantity for a while. Rand has a great post on link valuation that I highly recommend – it’s not nearly as simple as we sometimes try to make it.

This approach is for cases where you may be on the border of a penalty or the penalty isn’t very severe. Fair warning: it will take time. If you can’t afford that time, have been hit hard, or suspect a manual penalty, you may have to resort to one of the next two options…

5. Appeal to Google

If you’ve done your best to address the bad links, but either hit a wall or don’t see your rankings improve, you may have to appeal to Google directly. Specifically, this means filing a reconsideration request through Google Webmaster Tools. Rhea at Outspoken had an excellent post recently on how to file for reconsideration, but a couple of key points:

  • Be honest, specific and detailed.
  • Show that you’ve made an effort.
  • Act like you mean it (better yet: mean it).

If Google determines that your situation is relevant for reconsideration (a process which is probably semi-automated), then it’s going to fall into the hands of a Google employee. They have to review 1000s of these requests, so if you rant, provide no details, or don’t do your homework, they’ll toss your request and move on. No matter how wronged you may feel, suck it up and play nice.

6. Find a New Home

If all else fails, and you’ve really burned your home to the ground and salted the earth around it, you may have to move:

Link graph with site moved

Of course, you could just buy a new domain, move the site, and start over, but then you’ll lose all of your inbound links and off-page ranking factors, at least until you can rebuild some of them. The other option is to 301-redirect to a new domain. It’s not risk-free, but in many cases a site-to-site redirect does seem to neuter bad links. Of course, it will very likely also devalue some of your good links.

I’d recommend the 301-redirect if the bad links are old and spammy. In other words, if you engaged in low-value tactics in the past but have moved on, a 301 to a new domain may very well lift the penalty. If you’ve got a ton of paid links or you’ve obviously built an active link farm (that’s still in play), you may find the penalty comes back and all your efforts were pointless.
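For the site-to-site move, a minimal sketch of the 301 (again assuming Apache, this time with mod_rewrite, and with placeholder domain names) could sit in the old domain’s .htaccess:

  # On the old, penalized domain: send every URL to the same path on the new domain
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
  RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]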

A Modest Proposal

I’d like to end this by making a suggestion to Google. Sometimes, people inherit a bad situation (like a former SEO’s black-hat tactics) or are targeted with bad links maliciously. Currently, there is no mechanism to remove a link from the target side. If you point a link at me, I can’t say: “No, I don’t want it.” Search engines understand this and adjust for it to a point, but I really believe that there should be an equivalent of nofollow for the receiving end of a link.

Of course, a link-based attribute is impossible from the receiving end, and a page-based directive (like Meta Robots) is probably impractical. My proposal is to create a new Robots.txt directive called “Disconnect”. I imagine it looking something like this:

Disconnect: www.badsite.com

Essentially, this would tell search engines to block any links to the target site coming from “www.badsite.com” and not consider them as part of the link-graph. I’d also recommend a wild-card version to cover all sub-domains:

Disconnect: *.badsite.com
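To be clear, this is purely hypothetical syntax that no engine supports today, but here is how it might sit alongside the directives robots.txt already understands (the paths and domains are placeholders):

  User-agent: *
  Disallow: /private/

  # Hypothetical: ignore all links pointing to this site from these sources
  Disconnect: www.badsite.com
  Disconnect: *.badsite.com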

Is this computationally possible, given the way Google and Bing process the link-graph? I honestly don’t know. I believe, though, that the Robots.txt level would probably be the easiest to implement and would cover most cases I’ve encountered.

While I recognize that Google and Bing treat bad links with wide latitude and recognize that site owners can’t fully control incoming links, I’ve seen too many cases at this point of people who have been harmed by links they don’t have control over (sometimes, through no fault of their own). If links are going to continue to be the primary currency of ranking (and that is debatable), then I think it’s time the search engines gave us a way to cut links from both ends.

Update (December 15th)

From the comments, I wanted to clarify a couple of things regarding the “Disconnect” directive. First off, this is NOT an existing Robots.txt option. This is just my suggestion (apparently, a few people got the wrong idea). Second, I really did intend this as more of a platform for discussion. I don’t believe Google or Bing are likely to support the change.

One common argument in the comments was that adding a “Disconnect” option would allow black-hats to game the system by placing risky links, knowing they could be easily cut. While this is a good point, theoretically, I don’t think it’s a big practical concern. The reality is that black-hats can already do this. It’s easy to create paid links, link farms, etc. that you control, and then cut them if you run into trouble. Some SEO firms have even built up spammy links to get a short-term boost, and then cut them before Google catches on (I think that was part of the JC Penney scheme, actually).

Almost by definition, the “Disconnect” directive (or any similar tool) would be more for people who can’t control the links. In some cases, these may be malicious links, but most of the time, it would be links that other people created on their behalf that they no longer have control over.

Cheers!!!

Wake Up SEOs, the New Google is Here

I must admit that lately Google has been the cause of my headaches.

No, not just because it decided that useful keyword information about my sites was now going to be “(not provided).” And not just because it is changing practically every tool I’ve gotten used to since my first days as an SEO (Google Analytics, Webmaster Tools, Gmail…). And, honestly, not only because it released a ravenous Panda.

No, the real question causing my headaches is: where the hell does Google want to go with all these changes?

Let me start by quoting the definition of SEO that Google gives in its guidelines:

Search engine optimization is about putting your site’s best foot forward when it comes to visibility in search engines, but your ultimate consumers are your users, not search engines.

Technical SEO still matters, a lot!

If you want to put your site’s best foot forward and make it as visible as possible in search engines, then you have to be a master of technical SEO.

We all know that if we don’t pay attention to the navigation architecture of our site, if we don’t care about on-page optimization, if we mess up the rel=”canonical” tag, the pagination, and the faceted navigation of our site, and if we ignore internal content duplication, and so on, well, we are not going to get very far in search.

Is all this obvious? Yes, it is. But people in our circle tend to pay attention only to the latest bright shiny object and forget one of the basic pillars of our discipline: making a site optimized to be visible in the search engines.

The next time you hear someone saying “Content is King” or “Social is the new link building,” give them a (figurative) slap and ask when was the last time they logged in to Google Webmaster Tools.

Go fix your site, make it indexable, and solve all the technical problems it may have. Only after you’ve done that should you start doing everything else.

User is king

Technical SEO still matters, but that does not mean it is synonymous with SEO. So if you hear someone claiming that it is, feel free to give them a slap too.

No... content is not the only King. User is the King! Image by Jeff Gregory

User and useful share the same root: use. A user finds a website useful when it offers an answer to her needs, and when it is easy and fast to use.

From Google’s point of view of the user, that means that for a site to rank, it:

  1. must be fast;
  2. must have useful content, related to what it claims to be about;
  3. must be presented to Google in a way that lets Google understand, as well as possible, what it is about.

The first point explains the emphasis Google gives to site speed, because speed really is highly correlated with a better user experience.

The second is related to the quality of a site’s content, and it is substantially what Panda is all about. Panda, reduced to its minimal terms, is Google’s attempt to clean its SERPs of any content it does not consider useful to end users.

The third explains the Schema.org adoption and why Google (and the other search engines) are definitely moving toward the Semantic Web: it helps search engines organize the bazillion pieces of content they index every second. And the better they understand what your content is really about, the better they can deliver it in the SERPs.
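As a small example of what that structured markup can look like, here is a minimal sketch using Schema.org microdata on a generic article (the author name and date are placeholders):

  <article itemscope itemtype="http://schema.org/Article">
    <h1 itemprop="headline">Wake Up SEOs, the New Google is Here</h1>
    <span itemprop="author">Jane Doe</span>
    <time itemprop="datePublished" datetime="2012-01-01">January 1, 2012</time>
    <div itemprop="articleBody">
      ...article content...
    </div>
  </article>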

The link graph mapped

The decline of the link graph

We all know that just with on-site optimization we cannot win the SERPs war, and that we need links to our site to make it authoritative. But we all know how much the link graph can be gamed.

Even though we still have tons of reasons to complain to Google about the quality of its SERPs, especially because of sites that rank thanks to manipulative link building tactics, it is hard for me to believe that Google is doing nothing to counteract this situation. What I believe is that Google has decided to solve the problem not with patches but with a totally new kind of graph.

That does not mean that links are no longer needed, not at all, as link-related factors still represent (and will continue to represent) a great portion of all the ranking factors, but other factors are now being cooked into the ranking pot.

Be Social and become a trusted seed

In a social, Caffeinated era, the fastest way to understand whether a piece of content is popular is to check its “relative” popularity in the social media environment. I say “relative” because not all content is the same: a meme may need many tweets, +1s, and likes/shares to be considered more popular than others, while more niche kinds of content need far fewer. Combining social signals with the traditional link graph, Google can understand the real popularity of a page.

The problem, as many have been saying for almost a year, is that it is quite easy to spam on social media.

The Facebook Social Graph from Silicon Angle

For this reason Google introduced the concepts of Author and Publisher and, even more importantly, linked them to Google Profiles and is pushing Google+, which is not just another social network but what Google aims to be in the future: a social search engine.

Rel=”author” and rel=”publisher” are the solution Google is adopting in order to better control, among other things, the spam pollution of the SERPs.

If you are a blogger, you are incentivized to mark your content with rel=”author” and link it to your Google+ profile; as a site, you are incentivized to create your Google+ business page and to promote it with a badge on your site that carries rel=”publisher” in its code.
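In practice, that markup looks roughly like this (the Google+ profile URLs below are placeholders):

  <!-- On an article page: link the byline to the writer's Google+ profile -->
  <a href="https://plus.google.com/000000000000000000000" rel="author">Jane Doe</a>

  <!-- Site-wide, e.g. in the <head>: link the site to its Google+ page -->
  <link rel="publisher" href="https://plus.google.com/111111111111111111111" />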

Trusted seeds are no longer only sites; they can also be persons (e.g., Rand or Danny Sullivan) or the social facets of an entity. So the closer I am in the social graph to those persons/entities, the more trusted I am in Google’s eyes.

The new Google graph

As we can see, Google is not trying to rely only on the link graph, as it is quite easy to game, but neither is it simply adding social signals to the link graph, because those can be gamed too. What Google is doing is creating and refining a new graph in which the link graph, the social graph, and the trust graph cooperate, and which is possibly harder to game. It can still be gamed, but – hopefully – only with so much effort that gaming it stops being a viable practice.

Wake up SEOs, the new Google is here

As a conclusion, let me borrow what Larry Page wrote on Google+ (bold is mine):

Our ultimate ambition is to transform the overall Google experience […] because we understand what you want and can deliver it instantly.

This means baking identity and sharing into all of our products so that we build a real relationship with our users. Sharing on the web will be like sharing in real life across all your stuff. You’ll have better, more relevant search results and ads.

Think about it this way … last quarter, we’ve shipped the +, and now we’re going to ship the Google part.

I think it says it all: what we have lived through this past year is explained clearly by Larry Page’s words.

What can we do as SEOs? Evolve, because SEO is not dying, but SEOs can die off if they don’t accept that winter – oops – the new Google is coming.

The New SEO graph

 

The 4 Critically Essential Off-The-Page Search Engine Optimization Factors

In our last lesson we talked about the things you can do on your website to help it rank well in the search engines — in other words, the “on the page” factors. In this lesson we’re going to talk about the external factors that can influence your rankings — the “off the page” factors.

Your Google PageRank

Before we get into the “hows”, it’s important that you understand a little bit about Google’s PageRank. PageRank is Google’s way of scoring content and websites based on their importance in the internet community. It’s an important factor in Google’s ranking algorithm, and by understanding a little of how it works, you’ll have a better idea of how to boost your rankings in the world’s most popular search engine.

To establish the “importance” of your page, Google looks at how many other websites are linking to your page. These links are like “votes”, and the more “votes” you have, the greater your online “importance” and the higher your PageRank.

And higher PageRank is an important contributor to higher search engine rankings.

It’s not as democratic as it sounds, however: Not every page that links to you is given equal “voting power”. Pages that have a high PageRank have more voting power than pages with low PageRank. This means that the “boost” a link gives to your own PageRank is closely related to the PageRank of the site that’s linking to you.

For instance… receiving just ONE link from a PR5 page might well give you more benefit than receiving 20 links from PR0 pages. It’s quality not quantity that’s important.

The equation for working out how much PR value you’ll get from a link looks something like this:

  • PR = 0.15 + 0.85 x (your share of the link PR)
  • By “your share of the link PR” I mean that every page only has a certain amount of PR “juice” to give out. Let’s say a page has 100 votes and 20 outgoing links; then each link sends 5 votes to the site it points to (100 / 20 = 5). That is a simplified way of looking at a link’s share of PR. In reality, the higher-placed links get more voting power (e.g. 10 votes each) while the lower-placed ones get less (e.g. 2 votes each). The short sketch below runs these toy numbers through the formula.
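Here is that toy example run through the simplified formula above; the numbers are the illustrative ones from the bullet, not real PageRank values.

  # Toy PageRank contribution from a single link, using the simplified formula
  base, damping = 0.15, 0.85
  page_votes = 100        # PR "juice" the linking page has to give out
  outgoing_links = 20     # links on that page
  share = page_votes / outgoing_links   # 5 votes flow through each link
  pr_contribution = base + damping * share
  print(pr_contribution)  # 4.4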

There are many other factors at play that determine the PageRank of a page:

  1. The amount of PageRank flowing in to your page. PageRank can come from other sites linking to your page, but also from other pages on your website linking to your page.
  2. Your internal linking: As I just mentioned, PageRank can also come from other pages on your website, trickling from one page to another through your internal linking, menus and such. The trick is to “sculpt” the flow of your PageRank so that it “pools” in your most important pages. (In other words, don’t waste your PageRank by linking to your “contact us” page and site-map all over the show… add rel=”nofollow” to those links to stop the PageRank leaking through to them; see the markup sketch after this list.)
  3. The number of pages in your website: the more pages your website has, the higher your PageRank will be.
  4. The number of external sites you link to. Again, think of PageRank as being something that “flows”. By linking to lots of other websites you’re letting your PageRank flow out of your page, rather than allowing it to pool. Try to have reciprocal links wherever possible, so that the PageRank flows back to you.
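Here is a minimal sketch of the markup described in point 2 above; it simply comes down to which internal links carry rel="nofollow" (how effective sculpting remains today is a separate debate):

  <!-- Utility pages we don't want PageRank flowing into -->
  <a href="/contact-us" rel="nofollow">Contact us</a>
  <a href="/site-map" rel="nofollow">Site map</a>

  <!-- A normal internal link, which passes PageRank as usual -->
  <a href="/best-selling-widgets">Our best-selling widgets</a>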

The best piece of advice is to keep these points in mind when building your site and try to avoid any on-page factors which might be detrimental to the “flow” of your PageRank through your site. Once you’ve done that, work on getting quality links from quality websites. The easiest way to do this is to fill your website with useful, relevant information that makes people want to link to you!

And remember: PageRank is just part of Google’s ranking algorithm. You’ll often see pages with high PageRank being outranked by pages with lower PageRank, which shows that there’s much more at play here!

#1: Build lots of 1-way incoming links

Do this through article submissions, directory submissions, submitting articles to blog networks (such as the PLRPro blog network), buying links (e.g. from digital point forums), and so on.

But be careful…

Purchased links can sometimes be more powerful than links you get by more natural methods… but Google will penalize you if they know that you are buying links. One way they’ll nab you is if you buy a link on a monthly lease and then end up canceling it. One link might not be enough to send up the red flags, but some people buy and cancel hundreds of links in this manner.

A better idea is to buy lifetime links from places like forums.digitalpoint.com, and to try to find links from websites that are on topics relevant to your own.

#2: Get some links from good sites

By “good sites” I mean websites that have a high PageRank, or sites with a high “trust” factor (such as Yahoo, Dmoz or sites with a .edu suffix). If you can get good links to the pages on your site that generate the most income for you, even better — if you can improve the ranking of these pages you’ll get more traffic, more conversions, and more money!

#3: Make sure that pages you gain links from are, in fact, indexed.

A link to your site won’t count for anything if the page that is linking to you hasn’t actually been indexed by the search engines. The search engines won’t see the link, and they won’t give you any credit for it. I see a lot of people submitting their sites to article directories and search directories, and then ending up on a page that the search engines don’t visit. This is pointless!

The good news is that it’s pretty simple to get all these pages indexed. All you have to do is let the search engines know about the page yourself. To do this you need to set up a webpage outside of your main site, such as a free blog or a Twitter.com profile. Make sure that the search engines are indexing this page, of course, and then every time you get a new link to your main site, write about it in your blog or Twitter profile! The search engines will see this and visit the other site — hey presto! The page is now indexed, and you’ll get credit for your link.

Important: Don’t link to this blog or Twitter profile from your main money website. Doing this will create a reciprocal link loop…

#4: Don’t loop your links

Reciprocal links aren’t as powerful as one-way links. This is why you want to receive one-way links from other websites wherever possible.

But there are also things called “reciprocal link loops” which are like bigger versions of this. I mentioned one in the last tip… A links to B, B links to C and C links to A. That’s a loop… it eventually comes full circle back to the first site. A “link loop” can get pretty large, but if it eventually ends up back at the start, it’s still a loop, and all links within the loop become less powerful. Small loops are the worst, but try to avoid loops wherever possible.

That brings us to the end of our critical off-page factors for search engine optimization. In part three of this five-part mini-course I’ll talk about link building strategies: keep an eye out for it!

How to Establish QUALITY Backlinks?

When starting your own website or blog, earning a good position in the search engine results pages (SERPs) is usually one of the main goals. Achieving a high ranking for relevant keywords is essential for the exposure you need to drive traffic and visitors to your content. The sheer number of online competitors is enough to overwhelm even the most determined webmaster. However, if you plan things out properly, you can overcome even stiff competition with a bit of effort. You can even do this with fewer posts and a much younger site.

The way to reach the top spots in searches is through properly targeted quality content and backlinks. You target your posts and backlinks correctly based on your niche’s relevant keywords. “Relevant keywords” is shorthand for the words and phrases that people naturally type into search engines when they are looking for the kind of content, products, and/or services that you offer.

Quality content is an important key. Your posts, articles, videos, and other content must appeal to human readers. It is easy to adjust well-written features to include relevant keywords for the search engines to see. However, you will only drive away human readers if you make poorly written blogs, spammy sales text, and other “junk” content. It is important to make sure things are tuned properly for search engines, but do not forget it is human beings that open their minds and wallets.

Building good backlinks to quality content is one of the most effective and surefire ways to drive your site to the top of the SERPs. Good backlinks use your relevant keywords as the anchor text. Anchor text is simply a web link wrapped around a word or set of words, just like a website menu link. 🙂
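In plain HTML, the anchor text is simply the visible words inside the link tag (the URL and keyword below are placeholders):

  <!-- "blue widgets" is the anchor text for this backlink -->
  <a href="http://www.example.com/blue-widgets">blue widgets</a>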

Knowing where to build your backlinks is a key element. You should place your links naturally in places that have many visitors and/or high search engine trust. Trust is usually measured by PageRank, which is a measure of website authority by Google. Alexa Rank is the normal tool to measure visitor volume.

Blog backlinks are one of the most common and powerful tools. Bloggers form a large network and community. You can include your website and one or two backlinks with a blog comment. Auto-approve blogs are considered highly desirable, because comments are not screened and do not wait on a human being to approve them. Dofollow relevant blogs and forums are also greatly sought after because backlinks from them count the most, as the search engines see the referring site as explicitly endorsing the links.

When leaving blog backlinks, be sure to include substantive comments. Just spamming your links everywhere will get you targeted by spam detection programs and leave a wake of annoyed blog owners. When using auto-approve blogs, be sure to check the other links present. You do not want your site featured in a “bad neighborhood” alongside male enhancement and gambling spammers. Auto-approve blogs make for easy, decent links, as long as the comment sections are not overflowing and the other links look good.

It can seem impossible when first starting out to reach the top of the search engine rankings. However, you can dominate the search results with quality content and good backlinks. Blog backlinks are among the most popular options and auto-approve blogs provide a major venue to promote your site.

Cheers !!! 🙂

Easy Ways to Optimize Your Site for SEO: A Quick Overview

Are you losing your mind and pulling your hair out trying to crack the code of search engine optimization? Look, I have been there so let me enlighten you on a few easy techniques I have picked up along the way to help you get more traffic and make more sales through your websites and web 2.0 properties.

First off, there is no magic formula for SEO, no matter what any guru says, nor is it as complicated a process as some so-called SEO experts make it out to be. Basically, it comes down to common sense about how things work on the technical side of the web.

First up: keywords. If you are driving traffic to your websites via keywords, you ought to know how to get the most out of them. Using the keyword in your domain, description, title, and tags is very important, but do not overuse it in the body text. One instance in bold is enough; support your main keyword with related keywords in your text, and most importantly, write naturally, using the slang and proper language associated with your topic. There are SEO tools to help point this out if you are using a blogging platform, in case your eyes miss it. 🙂

Next is competition. Can you compete against a giant keyword term? You could, but it will take a very long time to catch up, so it’s best to stick with terms you can compete on. Usually staying under 100,000 competing pages is key to ranking on the first page. Look at your competition and take note of their domain age, PageRank, and backlinks. This will determine what it will take, and how long, to outrank them.

And then there are backlinks. This is a very important step to ensuring you rank in the top 10 or even #1! It is not how many backlinks you have but where they are coming from. An example: 25 links from PR5 to PR9 sites will outrank 10,000 links from junk, low-PageRank sites, period. I should also point out that these links should be relevant to your niche. If you have a site on “SEO”, then most of your incoming links should be from SEO-type sites and not “basket weaving”, for instance. This way you can easily become untouchable in the rankings.

Good luck!!! 🙂