
The Professional Guide to Link Building

Published by Todd Herman on July 19th, 2011 - in Guest Post SEO


©2010 SEOmoz

This article serves as a guide to defining and understanding what natural link building is, why links are important for a website and its rankings, and how to obtain links to one's website. It covers various link building strategies and provides information on tools that help with the link building process. Concluding this article is an appendix of tools for easy access.

Section I: The Theory and Goals of Link Building for SEO

Link building is the practice of getting other sites to implement hyperlinks back to a web site. This can often be done using a wide variety of strategies, such as:

  • Asking for a link
  • Giving away content or tools
  • Widget development
  • Social media campaigns
  • PR
  • Purchasing links
  • And more…

This guide will focus on all of these strategies as well as provide additional link building resources and tools.

Why Link Building Matters

In broad terms, whenever any user enters a query into a search engine, the search engine needs to determine how to return the best results. One key step is to figure out which pages on the web are related in any way to the query. Another key step is to evaluate which of the pages that relate to the query are the most important, or the most authoritative in regard to that query. One of the biggest factors in deciding that is the link profile of the site containing the web page being evaluated, and the link profile of that page itself.

In principle, each link to a web page is seen as a vote for that web page. In simple terms, if there are two pages that are equally relevant to a given search query, the page with the better inbound link profile will rank higher than the other page. However, the world does not reward those who sit back and wait for links to come to them. It is incumbent on the site publisher to go out and tell the world about their site and get people to link to it. Publishers who do not pursue link building are at high risk of losing their search engine traffic, or never building up their web sites to the point where the traffic they are getting meets their goals.

A related concept to think about is whether or not the link is something that will help the site with its rankings for the long term. To illustrate, there are types of links that publishers can obtain to their sites that are against the search engine’s terms of service. For example, Google has taken a strong stance against the practice of buying links for the purposes of influencing rankings on the Google index. Yet many people still purchase links and it can and does work for many of them, at least in the short term.

However, as Google actively invests time in finding paid links (and other link schemes they deem to be against their terms of service), even if the links work in the short term for the publisher, there is no guarantee that they will work in the long term. This leads to a choice that every publisher must make: whether to pursue short term strategies, such as buying links, that may bring faster results with less effort, or to pursue longer term strategies that have much lower risk. For the record, we do not recommend pursuing paid links as a strategy. The risk is high, and the consequences of getting caught are high.

How Search Engines Evaluate Links

The notion of using links as a way of measuring a site's importance was first made popular by Google, with the implementation of their PageRank algorithm. In simple terms, each link to a web page is a vote for that page. However, votes do not have equal weight. A detailed explanation of how PageRank works can be found here.
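To make the voting analogy concrete, here is a minimal sketch of a PageRank-style calculation over a tiny, made-up link graph. This is only an illustration of the idea of weighted votes, not Google's actual implementation; the page names and the 0.85 damping factor are assumptions.

    # Minimal PageRank-style sketch (illustrative only, not Google's implementation).
    # Each page splits its score among the pages it links to; the damping factor
    # models a surfer who usually follows links but sometimes jumps to a random page.
    links = {                                    # hypothetical link graph
        "home": ["used-cars", "new-cars"],
        "used-cars": ["home"],
        "new-cars": ["home", "used-cars"],
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal votes

    for _ in range(50):                          # iterate until scores settle
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)   # a page splits its vote evenly
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank

    print(sorted(rank.items(), key=lambda kv: -kv[1]))

The output simply orders the three hypothetical pages by how much weighted "voting" value flows to them, which is the intuition behind the paragraph above.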

Sticking to the letter of the original PageRank algorithm (which has evolved significantly), links from pages with higher PageRank are more valuable. To some degree this is still true. However, the picture has been complicated by the introduction of three additional factors:

1. Relevance. If the link comes from a site that is on the same topic as the publisher's site (or a closely related topic), that link is worth more than a link that comes from a site with an unrelated topic. For example, think of the relevance of each link being evaluated in the specific context of the search query just entered by the user. If the user enters "used cars in Phoenix" and a publisher has a link to its Phoenix used cars page from the Phoenix Chamber of Commerce, that link will reinforce the search engine's belief that the page really does relate to Phoenix.

Similarly, if a publisher has another link from a magazine site that has done a review of used car websites, that will reinforce the notion that the site should be considered a used car site. Taken in combination, these two links will be quite powerful in helping the publisher rank for the query "used cars in Phoenix".

2. Authority. Within any given market space there are sites that are considered by the search engine as authoritative. One of the more famous papers on the topic is that of Apostolos Gerasoulis and others at Rutgers University on Applying Link Analysis to Web Search. This paper became the basis of the Teoma algorithm; Teoma was later acquired by Ask Jeeves, and the algorithm became part of the Ask algorithm. What made this algorithm unique was its focus on evaluating links on the basis of their relevance and authority. Google's original PageRank algorithm did not incorporate the notion of relevance. This is a critical part of what Google does today, but Teoma was first to offer a commercial implementation of this idea.

Teoma introduced the notion of hubs, which are sites that link to most of the important sites relevant to a particular topic, and authorities, which are sites that are linked to by most of the sites relevant to a particular topic. The key concept here is that each topic area that a user can search on will have authority sites specific to that topic area. The authority sites for used cars are different from the authority sites for baseball. So if a publisher has a site about used cars, he should get links from web sites that are considered by the search engine to be an authority on used cars (or perhaps more broadly, cars). However, the search engines don't tell you which sites are authoritative, making the publisher's job more difficult.

3. Trust. Search engines attempt to measure how much they trust a site. If a site is highly trusted, its vote will count for more than if it is not that trusted. A trusted site will also tend to have higher rankings on related search phrases. One basic concept for evaluating the "trust" of a web site is to review the site's link profile to see what other trusted sites link to it. Links from trusted sites improve the trust score of the site and page receiving the links. Evaluation of a site's trust has evolved significantly from this simple notion.

In 2004, Yahoo! and Stanford University published a paper titled Combating Web Spam with TrustRank (PDF). The basis of this paper was the notion of using manual human review to identify a small set of seed pages from sites that were deemed to be the most trusted/authoritative. The trust level of other sites, other than the manually identified ones, would be evaluated based on how many clicks away they are from the seed sites. Sites that are one click away from a seed site are highly unlikely to be spam sites, so they end up being considered pretty trustworthy. Sites that are two clicks from a seed site still have a fairly low probability of being spam sites, but are a bit more likely to be spam than sites that are only one click away, and so forth. Of course, being one click away from multiple seed sites is the best of all worlds. The paper suggests that using this approach removes the inherent risk in having an algorithm determine the trustworthiness of a site, and potentially coming up with false positives/negatives.
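As a rough sketch of the idea (not the published TrustRank algorithm), trust can be seeded on a few hand-reviewed sites and then decayed with each click of distance from those seeds. The sites, the graph, and the 0.5 decay factor below are all invented for illustration.

    from collections import deque

    # Hypothetical link graph: site -> sites it links to.
    graph = {
        "seed-university.edu": ["industry-hub.com"],
        "industry-hub.com": ["nice-blog.com", "spammy-directory.info"],
        "nice-blog.com": [],
        "spammy-directory.info": ["spam-farm.biz"],
        "spam-farm.biz": [],
    }

    seeds = {"seed-university.edu"}  # manually reviewed, fully trusted sites
    decay = 0.5                      # trust halves with each click away from a seed

    trust = {site: 0.0 for site in graph}
    queue = deque()
    for seed in seeds:
        trust[seed] = 1.0
        queue.append(seed)

    # Propagate trust outward: a site keeps the highest trust reachable from
    # any seed, decayed once per click of distance.
    while queue:
        site = queue.popleft()
        for linked in graph[site]:
            propagated = trust[site] * decay
            if propagated > trust[linked]:
                trust[linked] = propagated
                queue.append(linked)

    for site, score in sorted(trust.items(), key=lambda kv: -kv[1]):
        print(round(score, 2), site)

In this toy graph the site one click from the seed ends up with a trust of 0.5, while the spam farm three clicks out gets only 0.125, matching the intuition described in the paper.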

The researchers who wrote the paper on TrustRank also authored an interesting paper on a concept they call spam mass (PDF). This paper focuses on evaluating how much of a site's (unadjusted) ranking value comes from spammy inbound links. The greater the impact of those links, the more likely the site itself is spam. Similarly, if the search engine determines that a large percentage of a site's links are purchased, or likely to have been purchased, this could be problematic as well.

Expanding on this slightly, you can also think about the notion of "Reverse TrustRank". This is the notion that if your site links to spammy sites, your site's trust should be lowered. The theory provides ample motivation to take care not to link to any bad sites. It is highly likely that Google, Yahoo!, and Bing all use some form of trust measurement to evaluate web sites, and that this trust metric can be a major factor in rankings.


The EDU Myth

On a specific note, let's consider for a moment the "EDU myth." Many people believe that a link is better if it comes from a ".edu" (or a ".gov") Top Level Domain ("TLD"). However, it does not make sense for search engines to look at it so simply. There are many forums, blogs, and other pages on .edu domains that are easily manipulated by spammers to gain links to their sites. For this reason, search engines cannot simply imbue a special level of trust or authority to a site because it is on a .edu TLD.

However, it is true that .edu domains are often authoritative or trusted, but this is a result of the link analysis that defines a given college or university as an authority on one or more topics. The result is that there can be (and are) domains that are authoritative or trusted on one or more topics in some sections of the site, yet have other sections that are actively being abused by spammers. Search engines deal with this problem by varying their assessment of a domain's authority and trust across the domain. The publisher's http://yourdomain.com/usedcars section may be considered authoritative or trusted on the topic of used cars, but http://yourdomain.com/newcars might not be authoritative or trusted on the topic of new cars.

Ultimately, every site, and every page on every site, gets evaluated for the links it has on a topic by topic basis. Further, each section and page of a site also gets evaluated on this basis. A certain link profile gives a page more authority or trust on a given topic, making that page likely to rank higher on queries for that topic, and also making the links it gives to other websites related to that topic more valuable.

Structuring a Link Building Campaign

One of the most difficult parts of implementing a link building campaign is deciding what the strategy will be. There are many types of campaigns that can be chosen, and making those decisions can often be difficult. It is also a critical decision, as publishers will want and need to get the highest possible return on their investment in link building.

There are five major components to a link building campaign. These are:

  1. Getting links from authoritative domains
  2. Getting links from a volume of domains
  3. Obtaining links to pages other than the home page (also known as “deep links”)
  4. Local links for local rankings
  5. Getting the anchor text needed to drive rankings.

Each of these components will be discussed in more detail.

1. Links from Authoritative Domains

The highest authority sites in the publisher's space are the ones from which they should want to get links. A good way to start structuring a link building campaign is to identify the high authority sites in a given space and then determine what it will take to get a link from them. The first complication is that the search engines provide no direct measurement of a web site's authority. There are limited amounts of publicly available data. These include Google PageRank, as displayed on the Google Toolbar, and total backlinks to a site, as extracted from third party tools such as SEOmoz's Open Site Explorer.

However, one really good way to start the process is to determine whether major government sites, major university sites, or major media sites are potentially authoritative for a given site's topic matter. For example, if the New York Times has a major section of their newspaper devoted to the publisher's topic (or something closely related to it), it is a good bet that it is considered authoritative on the topic. Taking this first step simply depends on the publisher using their knowledge of the market and who the major players are. The next step, narrowing down which sites may be the top authorities at a more granular level, is harder.

One way to approach this is to collect the most likely suspects, 10 of them for example, and then see which of those 10 has the most links from the other nine. You can obtain authority measurement data by using Open Site Explorer (OSE). OSE metrics include Domain Authority and Page Authority, both of which are useful to know. You can also use Open Site Explorer or Majestic SEO to pull back link data for different potential authority web sites and see, via manual analysis, which of the other potential authority sites link to them.
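Once that link data has been exported, the cross-check itself is simple counting. The sketch below assumes you have already pulled each candidate's linking domains from Open Site Explorer or Majestic SEO into a small mapping; all of the domain names are made up.

    # Hypothetical export: candidate domain -> set of domains linking to it
    # (compiled by hand from Open Site Explorer / Majestic SEO reports).
    inbound_links = {
        "usedcarnews-example.com": {"autosite-example.com", "carblog-example.com",
                                    "phoenixcars-example.com"},
        "autosite-example.com": {"usedcarnews-example.com", "carblog-example.com"},
        "carblog-example.com": {"usedcarnews-example.com"},
        "phoenixcars-example.com": set(),
    }

    candidates = set(inbound_links)

    # For each candidate, count how many of the *other* candidates link to it.
    scores = {
        site: len(linkers & (candidates - {site}))
        for site, linkers in inbound_links.items()
    }

    for site, count in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(count, "of the other candidates link to", site)

The candidate that the most other candidates link to is the best guess at the authority of the group, which is the judgment described in the next paragraph.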

If a potential authority domain has links from most or all of the other prospective authority sites that were identified, it's a good bet that the search engines see it as authoritative. It should be noted that search engines probably also consider the number of different relevant domains that link to a site as a factor in determining its authority. Identifying how many different domains link to a site may not be that hard using the right tools, but determining the relevance of a large number of links is quite a bit harder.

Once this type of analysis is complete, there is still no way to be 100% sure that the identified sites will be seen by the search engines as authoritative. This means the publisher may identify 10 sites they think are authoritative, and perhaps only 6 of them are actually seen as authoritative by the search engines. But this is about the best a publisher can do in identifying authoritative target sites.

As outlined above, the next major step is that the publisher has to figure out what it will take to get these types of sites to link to his site. Is it a particular piece of content, or a special tool? Is it sufficient to publish this great data, information, infographic, content or tool on the publisher's site, or should it be syndicated in some fashion to the site you think may be considered an authority?

There are also direct and indirect approaches. Contacting a site directly and asking for a link is a direct method. Using PR or social media campaigns is an indirect method. Either can work as a strategy for getting a link from a given web site. As publishers look at each strategy in turn, there will be a variety of issues they must consider. These will be discussed later on in this guide.

2. Links from a Volume of Domains

It is also extremely valuable to get links from a large number of domains. 100 links from one domain are not nearly as valuable as one link from each of 100 different domains (assuming that the pages involved are likewise equal). The reason for this is that search engines want to count a link to a site as an endorsement by that site. In addition, the search engines want that to be an editorial decision by an informed person. 100 links from one domain requires only one editorial decision, but one link from 100 domains most likely involves 100 editorial decisions. The bottom line is that it’s important to get links from many domains, which complements the process of pursuing authoritative links. As with authoritative links, there are many different strategies one can use to try and get links in volume. Direct and indirect means can be used to get links in volume, although direct means will require significant effort over a sustained period of time.
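A quick way to track how a site is doing on this measure is to count distinct linking root domains rather than raw links. The sketch below assumes a flat export of linking-page URLs (the URLs are invented) and uses the last two labels of the hostname as a rough proxy for the linking domain.

    from urllib.parse import urlparse

    # Hypothetical backlink export: one URL per linking page.
    backlinks = [
        "http://blog-example.com/post-1",
        "http://blog-example.com/post-2",
        "http://news-example.org/cars/review",
        "http://forum-example.net/thread/42",
    ]

    def root_domain(url):
        # Rough proxy for the linking domain: the last two labels of the hostname.
        host = urlparse(url).hostname or ""
        return ".".join(host.split(".")[-2:])

    unique_domains = {root_domain(u) for u in backlinks}
    print(len(backlinks), "links from", len(unique_domains), "distinct domains")

Here four links come from only three domains; the gap between those two numbers is exactly the diversity issue described above.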

3. Links to Pages Other Than the Home Page (Deep Links)
It is also very useful to get links to pages in a site other than its home page. This is particularly true for larger sites, but it also applies to sites of any size. Publishers should look to get links to each major section of their site. Each page on a web site has the opportunity to rank for different search terms, and supporting those pages with direct links will help them rank better. In addition, these direct links to lower level pages represent anchor text opportunities for those pages as well.

In principle, each major theme or topic of the site should be the subject of its own link campaign. It's also useful to get links to sub-pages of each theme. The publisher may not want to launch focused campaigns on each such sub-page, but if the campaign is structured in such a way that most links go to the highest level page of each theme, and some links go to sub-pages, that would be a very effective use of link building resources. Even a small number of links to lower level pages will have a surprisingly big impact (see the Disproportionate Value of Deep Links for more info on this topic).

4. Local Links for Local Rankings

A publisher with a page related to real estate in Seattle may want that page to rank for search terms such as "Seattle Real Estate". In addition to doing on page optimization and general link building, the publisher should also look to obtain links from other local businesses. For example, a link from the local chamber of commerce is likely to be helpful in ranking for local search terms. There are many other similar sources, such as local business directories, the local chapter of the Better Business Bureau, local libraries, local businesses, and local clubs. Getting links from these sources reinforces the local theme of the publisher's site or web page. This is equally important, or perhaps even more important, in international settings. If the publisher wants traffic from Google UK, they should plan on getting some links from other sites hosted in the UK.

5. Get the Anchor Text Needed to Drive Rankings

Anchor text has been rated by leading SEOs as one of the most powerful factors in search rankings. Search engine representatives have also acknowledged that anchor text is a significant ranking factor. They use anchor text to provide further evidence of what the page receiving the link is about. Since these are assumed to be specified by the person giving the link, it is a factor that is potentially very powerful. Search engines interpret anchor text as if an editor went through a process to classify the site (with some filtering, of course).

If a site has a page selling used Ford cars and the page has a lot of links pointing to it that use the anchor text “Used Ford Cars,” the anchor text is likely to help the page’s rankings for related search terms. Also of note is that most web sites do not need to do anything at all to rank for their company or web site name. The search engines have a very high success rate in delivering the correct result in the top spot for a branded search, because many of the links you receive will use your company name as all or part of the anchor text for the link. While these links don’t use the keyword rich anchor text you might prefer, they still have value.

This is one reason why some SEOs choose to purchase links. When a publisher purchases a link, they normally specify the exact anchor text that they want; after all, they are buying an ad, and the ad should say what they want. The existence of this market then becomes a reason for search engines to place less weight on anchor text, as it makes the signal less reliable for them to use in ranking a site. That said, search engines do still place a lot of weight on this factor.

It is also possible to implement “white hat” link building campaigns and get the anchor text that you want. The techniques in the referenced article can be quite effective and do not come with the risk of being seen as spam by the search engines.

Other Considerations

There are other factors that one must consider in structuring a link building campaign. These are:

1. “Looking” Natural
Many SEOs talk about the need for a web site's links to "look natural." What they mean by this is that search engines shouldn't perceive the links to be artificially manipulated. For example, if a publisher has links from 10 different domains and every one of them is a site wide link, and they all use very similar anchor text which is not the name of the web site, it's a good bet that those links are the result of manipulative behavior by the publisher.

To dig into another example a bit more deeply, the most popular forms of anchor text are:

  • The site name
  • The site URL
  • “Click here”, “more”, or a similar derivative

This is just a fact of life for links that are naturally given. For that reason, if a publisher gets a very large percentage of the links to his site with the exact same keyword rich anchor text, he is asking for trouble. This is largely only a consideration for publishers who engage in manipulative schemes to obtain their links. However, publishers not engaging in such practices also have reason to be aware of the issue and understand how it affects their link building efforts.
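As a rough way to audit this for your own site, assuming you have exported the anchor text of your inbound links from a backlink tool, a quick tally shows whether one keyword-rich phrase dominates the profile. The anchor text values below are made up.

    from collections import Counter

    # Hypothetical anchor text pulled from a backlink report.
    anchors = [
        "Example Widgets Co", "examplewidgets.com", "click here",
        "left handed widgets", "left handed widgets", "left handed widgets",
        "more", "Example Widgets Co",
    ]

    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())

    # Print each anchor's share of the overall link profile.
    for anchor, count in counts.most_common():
        print("{:5.1%}  {}".format(count / total, anchor))

In this made-up profile the exact-match phrase "left handed widgets" accounts for 37.5% of all anchors, which is the sort of skew the paragraph above warns about; a profile dominated by the site name, the URL, and generic phrases looks far more natural.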

However, the concept of “looking natural” is a dangerous one. It is far better to think about “being natural” in your approach.

2. Distributed Sources
Search engines look at many different types of signals when ranking sites. As a component of reducing the influence of paid links, one such factor is looking at the variance in the sources of the links and the anchor text. Publishers that have obtained a greater balance in their linking structure could potentially be ranked higher than other publishers. This is one key reason for implementing multiple link building strategies at the same time, including approaching different types of target sites. For example, a publisher may choose to concentrate resources on requesting links directly from major universities while also implementing a significant PR effort, or perhaps a campaign simultaneously targeted at bloggers and traditional media. These types of strategies have the dual benefits of being more natural and providing a broader balance to the link profile of the site.

3. Permanence
Another important factor is to realize that link building should never stop. An effective link building campaign is a competitive advantage for a publisher. Achieving initial success with a link building campaign (by getting the desired rankings in search engines) and then stopping only results in giving the publisher's competition a chance to catch up and pass them. Once you get into the lead, you need to maintain it.

4. Anti-Spam Measures by Search Engines
Search engines have only one goal in mind: improving the quality of their index (which leads to increased market share and increased revenues). This is the entire reason for Google’s policies regarding link buying for PageRank purposes and other similar techniques. Site owners cannot overlook this in deciding on their link building strategies. Companies that make billions of dollars in revenue every year have a lot of money to invest in protecting their business. As a result, the budget that search engines have for fighting spam is considerable.

5. Competition
Search engines evaluate link profiles in the broad context of the market overall. If you are in the business of selling golf clubs online, your link profile will be compared to that of your competition. A key element of any link building campaign is to understand the competitive landscape and what you will need to do to win within it. You don't want to spend a year pursuing a link building program, meet all of your program goals, and still not meet your business goals.

Public companies such as Google look for scalable structures to their business. Their clear preference is to improve their algorithm to fight spam. It is difficult for them to implement large teams of human reviewers to look at one site after another to determine whether or not it is gaining higher rankings for reasons that are in conflict with their algorithms (though note that Google does have a team of reviewers based overseas that do just that).

This is what leads some to say that Google and spammers are engaged in an arms race. But even those who are known for black hat expertise stated clearly, in the Black Hat, White Hat panel at SES San Jose 2008, that they use what they consider white hat practices with their clients, and only experiment with black hat techniques on sites they consider throwaway domains; their purpose in pursuing those techniques is to learn more about the current algorithms.

6. Temporal Factors
Search engines also keep detailed data on when they discover the existence of a new link or the disappearance of a link. They can perform quite a bit of interesting analysis with this type of data. Here are some examples:

  • When did the link first appear? This is particularly interesting when considered in relationship to the appearance of other links. Did it happen immediately after you received that link from the New York Times?
  • When did the link disappear? Some of this is quite routine, such as links that appear in blog posts that start on the home page of a blog and then get relegated to archive pages over time. However, if a link disappears shortly after you roll out a major new section on your site, that could be an entirely different (negative) type of signal.
  • How long has the link existed? You can potentially count a link for more or less if it has been around for a long time. Whether or not you choose to count it for more or less could depend on the authority/trust of the site providing the link, the temporal nature of the content (e.g. news information gets old and less valuable over time), or other factors.
  • Rate of adding links. Did you go from one link per week to 100 per day? Or vice versa? Such drastic changes in the rate of link acquisition could also be a significant sign. Whether or not this is taken as a positive signal depends on the circumstances. Does it reflect a link buying campaign gone wild? Or has the site suddenly become newsworthy?
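To see the kind of temporal analysis being described, here is a small sketch that groups link discovery dates by month to highlight spikes or lulls in link acquisition. The dates and URLs are invented; real data would come from a backlink tool's first-seen reports.

    from collections import Counter
    from datetime import date

    # Hypothetical discovery data: (linking URL, date the link was first seen).
    discovered = [
        ("http://blog-example.com/a", date(2011, 3, 2)),
        ("http://news-example.org/b", date(2011, 3, 9)),
        ("http://forum-example.net/c", date(2011, 6, 1)),
        ("http://dir-example.info/d", date(2011, 6, 2)),
        ("http://dir-example.info/e", date(2011, 6, 2)),
    ]

    # Count newly discovered links per month to spot sudden changes in the rate.
    per_month = Counter(d.strftime("%Y-%m") for _, d in discovered)

    for month in sorted(per_month):
        print(month, "new links:", per_month[month])

A sudden jump from a trickle of new links to dozens per day is exactly the sort of pattern the bullets above suggest a search engine would notice; whether it reads as newsworthiness or as a link buying spree depends on the rest of the evidence.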


Determining a Link’s Value

Putting together a link campaign often starts with researching sites that would potentially link to the publisher’s site, and then determining the relative value of each potential linker. There are several basic factors that go into this, including:

  1. The PageRank of the site. This is a factor, but it has declined in importance. In addition, it is hard to determine what this value is. Many people mistakenly assume that the PageRank of the home page of a site is the PageRank of the domain. This is not the case (it is only the PageRank of the home page). One of the best ways to get an estimate of this is to use SEOmoz’ Open Site Explorer which performs a calculation of Domain mozRank.
  2. How trusted is the domain? I think of trust as an enabling factor. Imagine a domain with a link profile with an arbitrary value of 100, and a trust value of 0.2. In concept, you can multiply the two scores together and get a final score of 20. If that domain then gets a link from another site that adds 5 points to its link profile value and 0.05 to its trust value, that is a double win: its final link score leaps up to 26.25 (105 × 0.25). While the actual algorithm is undoubtedly significantly different, the notion that trusted links carry much more value is one that is generally accepted.
  3. The perceived authority of the site. While there is a relationship between authority and PageRank, they do not have a 1 to 1 relationship. Authority relates to how the sites in a given market space are linked to, whereas PageRank measures aggregate raw link value without regard to the market space. So higher authority sites will tend to have higher PageRank, but this is not always the case.
  4. The PageRank of the Linking Page.
  5. The perceived authority of the Linking Page.
  6. How much of the Domain's Trust accrues to the Linking Page? As we outlined above in our discussion of the "EDU Myth," just because a domain is trusted does not mean that a particular page on that domain is trusted.
  7. The number of outbound links on the Linking Page. This is important because the Linking Page can vote its "Passable PageRank", but each page it links to consumes a portion of that PageRank, leaving less to be passed on to other pages. This can be expressed mathematically: for a page with Passable PageRank n and r outbound links, Passed PageRank = n/r. (Passable PageRank is as defined in the SEOmoz article on PageRank.) This is a rough formula, but the bottom line is that the more outbound links a page has, the less the value of a link from that page. A spreadsheet-style sketch combining these factors appears just after this list.
  8. The relevance of the linking page and site.
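Pulling these factors together, a spreadsheet-style calculation might look something like the sketch below. It follows the rough formulas in this section (Passed PageRank = n/r, scaled by trust and relevance as simple multipliers); the site names, numbers, and weighting are all invented for illustration, not real tool output or a known ranking formula.

    # Rough, illustrative scoring of prospective linking pages.
    prospects = [
        # (site, passable PageRank n, outbound links r, trust 0-1, relevance 0-1)
        ("phoenix-chamber-example.org",   8.0,  40, 0.9, 1.0),
        ("carblog-example.com",           5.0,  25, 0.6, 0.8),
        ("random-directory-example.biz",  3.0, 400, 0.1, 0.2),
    ]

    def link_value(n, r, trust, relevance):
        passed = n / r                       # factor 7: many outbound links dilute the vote
        return passed * trust * relevance    # factors 2 and 8 as simple multipliers

    for site, n, r, trust, relevance in prospects:
        print(site, "estimated link value:", round(link_value(n, r, trust, relevance), 3))

However crude, a table like this makes it much easier to sort prospects into the four value tiers discussed next and to decide where to spend outreach effort.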

Organizing this data in a spreadsheet, or at least being consciously aware of these factors, when putting together a link building campaign is a must. One tool that can help with this task is the SEOmoz Page Strength Tool. In this guide, we are going to assume that the sites get divided into four possible categories:

  1. Low Value Sites
  2. Medium Value Sites
  3. High Value Sites
  4. Very High Value Sites

The level of effort put into obtaining links from each type of site will be discussed further below. We will also discuss more tools that can be used for evaluating a link’s value.


The Role of Content in Building Links

In natural link building the publisher must provide compelling content (or data, or tools …). Publishers need a good reason to provide links to another site, and it is not something they do frivolously. Superior content is the key to obtaining such links. Aggressive publishers can even let their content strategy be guided by their link building strategy. This is not to say that they should change their business itself for their link building strategy. Normally, however, there are many different types of content a site could produce. The concept is simply to identify the link building targets (whether they are market segments or individual web sites), determine what content those targets will need to see to potentially provide a link (for example, what type of content do they link to now?), and then tweak the content plan accordingly.

It is also important to consider the balance between commerce and content value. Publishers are very slow to give links to purely commercial sites, unless they are major brands. But even major brands can receive more links by implementing valuable non-commercial content that is useful to users, and placing it on their site in a manner that is separated from the commercial part of their site.

Content is also at the heart of achieving link building nirvana, which is having a site so good that people discover it and link to it without any effort on the publisher's part. This can be done, but does require that the publisher create content that truly stands out for the topics that their site covers. You can only achieve this by becoming a leader or recognized expert in your field.

Another aspect of content is the decision of where to place it. There are two major alternatives for content placement:

1. On Site
Of course, the content may be placed on the publisher's site. Within the publisher's site, there are other decisions to be made, such as whether the content goes in a special section or is integrated throughout the site. As an example of how this decision may be made, an e-tail site that publishes a large catalog of products may not want all (or even some) of its catalog pages laden with a lot of article content. A site like this might build a separate section with all kinds of tips, tricks, and advice related to the products they sell. On the other hand, an advertising supported site might want to integrate the content throughout the main body of the site.

2. Off Site
A publisher may also choose to place the content on another site (e.g. syndicating it). One reason for this would be to provide the content to another site in return for a link back to their site. This can be quite an effective link building strategy. If you pursue this strategy, consider implementing unique new content for syndication instead of syndicating articles from your site. Having others publish articles from your site will result in the engines seeing your article as duplicate content, and this can have some undesirable effects.

Some publishers also implement blogs on other sites, simply because that is where the blog publishing platform resides. These then get integrated into the main site through the cross linking structure. This is not the same as a syndication strategy, because the links from these types of blog platforms do not inherit the authority or trust of the domain on which they reside. This is more akin to launching a separate new domain, putting a blog on it, and linking to yourself. You can do that, but it only has value once that separate domain has built up its own link profile.

 

Section II: Professional Link Building Strategies

Asking for a Link

Sometimes simple is best. If the site looking for links is a high quality site with unique and/or authoritative content, the publisher may simply need to tell the world about what they have. They can do this via a PR strategy (which we will discuss later) or by simply emailing other publishers. If there is already a relationship between the publisher requesting the link and the publisher being asked to provide the link, this process is pretty easy. The requesting site sends the other party a note with whatever pitch they want to make. This pitch is easy to personalize, and the nature of what is said is likely to be guided by the existing relationship.

When one publisher is in the process of establishing a contractual business relationship with another publisher, it is also relatively easy. Assuming that they are going to sign a contract with this publisher, all that needs to be done is to make linking to the site a term of the agreement. For example, if you are granting a third party the right to resell your products, make a link back to you a part of the deal.

Publishers that have a standard contract that they use with many people can derive substantial benefit from adding a link clause to that agreement. Such links still represent endorsements, even though they were contractually required, because no one forced the other party to sign the contract, and they would not be interested in reselling or distributing the products or services of your site unless they were prepared to endorse them.

However, if the publisher requesting a link does not know the people they are contacting, it is a very different ballgame. Deciding whether or not to contact such a site, including how much effort to put in, can be summarized quickly with the following table:

 

  • Lower Value Sites: Targets may be identified as a result of large scale research. It may not be worth the trouble to contact such sites, but if it is, contact is by email and is personalized, though likely in a semi-automated way. These types of sites are relied on to obtain a volume of links. No customized content is developed.
  • Medium Value Sites: Targets may result either from large scale research or from the industry knowledge of the publishers of the site requesting links. Contact is by email and is personalized, likely done by a human, but with only low to moderate levels of effort. No customized content is developed.
  • High Value Sites: Targets are identified by principals of the business or senior marketing people. Email contact is entirely custom and tailored to the site being contacted. Phone calls may also be used in pursuit of these links. Content may be developed just to support the campaign to get links from these types of sites.
  • Very High Value Sites: Targets are identified by principals of the business or senior marketing people, or as a result of research. Email contact is entirely custom and highly tailored to the site being contacted. Phone calls may also be used in pursuit of these links. Face to face visits may also be involved. Content may be custom developed just to support the campaign to get links from these types of sites.

Basic Email Pitch

Assuming that the Requester does not know the Requestee, there are a few simple guidelines that should go into the link pitch:

  1. Keep it simple and short. The Requestee is receiving an email that is unsolicited. They are not going to read a 2 page email, or even a 1 page email.
  2. Clearly articulate the request. It is an investment to get someone to read an email, and it is critical that the pitch be clear about the desired result.
  3. Clearly articulate why the Link Destination Site deserves a link. This, generally speaking, involves pointing at the great content, data, or tools on the site, and perhaps citing some major endorsements.
  4. Follow each and every guideline of the CAN-SPAM Act. Unsolicited emails are not illegal as long as they follow the guidelines of the Act. Don't even think about violating them. Serious CAN-SPAM violations can result in jail time.


Creating a Value Proposition for Direct Requests

You must make your pitch interesting enough for the site you are contacting to give you a link. As noted above, this starts with understanding what content or tools the requestee site might be interested in. For purposes of clarity: sites do not link to other sites for the purpose of helping those sites make money. They link because they perceive that their users might value the content or tools on the site. This relates to the fundamental structure of the web, which is designed around the notion of interlinking related documents. Positioning a site, or a section of a site, as being related to the Requestee site is a requirement of each link building request.

With High Value Sites and Very High Value sites, it may be worth spending quite a bit of time and energy on putting together this value proposition. In many cases, it is even worth doing custom content or tool development to increase the perceived value and relevance of the content to the site being contacted. In all cases, developing a link request value proposition begins with understanding the nature of the target site's content, and then deciding how to match up your content with it.

Requests Via Social Media Sites

It is also possible to reach out to people through social networks, such as Facebook, LinkedIn, and Twitter. This is similar to emailing people with a few important distinctions:

  1. Publishers can send out communications to their friends on those networks. Assuming that they have treated this designation (that of friend) with any level of seriousness, instead of “friending” everybody in sight, the communication can be a bit more informal.
  2. Publishers can also join groups on these networks related to their market space, and then send messages out to those groups. These groups will provide them with the ability to reach new people with related interests.
  3. Messages broadcast through these networks cannot be personalized, so a more general message needs to be tailored for these types of broadcasts.
  4. These are social networks. Beware of broadcasting too many messages or poorly targeted messages. Many publishers have made this mistake and become a pariah in the communities, and lost the leverage that these communities bring in the process.
  5. Personalized messages can be sent on a 1 to 1 basis as well.

In fact, one strategy for approaching a High Value Site or a Very High Value Site is to make initial contact by friending someone senior who works at the company that publishes such a Target Site. Then you can establish a relationship with that senior person that makes a later request for a link more likely to succeed.

Strategies for doing this involve learning as much about the person as possible, interacting around those interests, and possibly helping them with a thing or two. Once the relationship is established, a more informal approach can be used to introduce the great content on your site.

For example, on LinkedIn you can network and make connections with people who have known you in the past, and then use those connections to get introductions to others. Or, you can buy a higher level of service from LinkedIn, in which case you do not even need the introduction and can reach out directly through LinkedIn to someone you want to get to know. These messages are called "InMails", and the number you are allowed to send is limited depending on the level of service you buy, starting at 10 InMails per month for $49.95.

One of the key benefits of these communications is that their open rate is far higher than that of an unsolicited email sent to someone you are reaching for the first time (for one thing, you are not subject to email spam filters). The number of InMails you can send is fairly limited (a maximum of 50 per month with the highest level of service), but this is still a very useful way to reach out to the critical targets in your link building plan.

Creative Link Request Tactics

Creative link requests can come in a few forms:

  1. Make the link request stand out through creative presentation. This requirement might make a publisher think about dancing girls, for example, but this might not work for a woman who might prefer dancing men, and in any event is a bit over the top. Publishers that pursue the notion of creative presentation need to be careful not to make it look like a commercial. It's more important to find a creative way to make the point stand out.
  2. Offer something of unique value. For example, one way to do this is to write a great article and offer it in return for an attribution link (aka Content Syndication). We will talk about this particular tactic and other creative tactics later in this document.
  3. Understand the mindset of the person you are contacting. You are contacting someone you don’t know, with a request they don’t expect, for a task that they really did not have on their list for today, or for that matter, this month. They don’t know you, and you are interrupting them, and asking for a favor. Sounds compelling doesn’t it? Let’s provide a few examples to illustrate some alternatives:
    • Become a recognized authority in your industry. Speak at industry conferences and write articles in major periodicals related to your market space. This will increase your credibility, and that helps with getting links.
    • Conduct a training session on a related topic on the premises of the Target Site.
    • Build a relationship with an important person from the Target Site. This can be done through meeting at industry conferences, networking through social media, getting an introduction from a friend, and many other methods. This tactic can unfold over many months, with the relationship being developed with no mention of getting a link. Then after the relationship is fully developed, you can let your contact know about some linkworthy thing that you have done.

This may seem like an extensive amount of work, but it might be worth it to get a link on this PageRank 9 site (Nasa.gov):

Or this PageRank 10 site from a small search engine called Google:

In all of these strategies, the key is to develop trust before making the request. In fact, ideally, the publisher never makes the request at all, but the context of the relationship causes the link to be given. It is far easier to get a link when there is trust in place than when there is not.

Directories

Directories can be a great way to obtain links. There are a large number of directories out there, and they may or may not require money in order to obtain a listing. Examples of high quality directories that are free include:

Examples of high quality directories which require a fee are:

A more comprehensive list of directories is available from Strongest Links.

What Search Engines Want From Directories

Not all directories provide links that the search engines value. The key to whether a search engine will value a directory link lies in the editorial policy of the directory. The essential factors that the search engines look for are:

  1. The fee paid is made in payment for an editorial review, not for a link.
  2. Editors may at their whim change the location, title, and description of the listing.
  3. Editors may reject the listing altogether.
  4. Regardless of the outcome, the directory keeps the money (even if the publisher doesn’t get a listing).
  5. The directory has a track record of rejecting submissions. The inverse of this, which is more measurable, is that the quality of the sites listed in the directory is high.

The following is an extract from my blog post on The Role of Directories in Link Building:

Ultimately, “Anything for a buck” directories do not enforce editorial judgment, and therefore the listings do not convey value to the search engines.

To take a closer look at this, let’s examine some of the key statements from Yahoo!’s Directory Submission Terms:

  • For web sites that do not feature adult content or services, the Yahoo! Directory Submit service costs US$299 (nonrefundable) for each Directory listing that is submitted.
  • I understand that there is no guarantee my site will be added to the Yahoo! Directory.
  • I understand that Yahoo! reserves the right to edit my suggestion and category placement; movement or removal of my site will be done at Yahoo!'s sole discretion.

Classifying Directories

We can divide directories into 3 buckets:

  1. Directories That Provide Sustainable Links. These are directories that comply with the policies as outlined above. Most likely, these links will continue to pass link juice for the foreseeable future.
  2. Directories That Pass Link Juice that May Not Be Sustainable. These are directories that don’t comply with the policies as outlined above. The reason such directories exist is that search engines tend to use an “innocent until proven guilty” approach, so the search engine must proactively make a determination of guilt before a directory’s ability to pass link juice is turned off. Even so, link juice from these types of directories is probably not going to be passed in the long term.
  3. Directories That Do Not Pass Link Juice. These are the directories that have already been flagged by the search engines. They do not pass any value. In fact, submission to a large number of them could be seen as a spam signal, although it is unlikely that any action would be taken on this signal alone.


Detecting Directories that Pass Link Juice

The process is relatively simple for directories that pass sustainable links, as defined above. The steps are:

  • Investigate their editorial policies and see if they conform to what search engines want.
  • Investigate their track record. Do they enforce their policy for real? This may be a bit subjective, but if there are lots of junky links in their directory, chances are that the policy is just lip service.
  • As another check, search on the directory name and see if there is any SEO scuttlebutt about the directory as well, and then read what it says to see specifics.


The process is a bit harder for directories that do not conform to the policies search engines prefer. There are still some things the publisher can do:

  • Search on the name of the directory to see if it shows up in the search engine results. If not, definitely stay away from it.
  • Take a unique phrase from the home page of the directory and see if that shows in the search engine results. If not, stay away from it.
  • Do they have premium sponsorships for higher level listings? A sure signal to search engines about their editorial policies.
  • Do they promote search engine value instead of traffic? Another bad signal.
  • Evaluate their inbound links. If they are engaged in shady link building tactics, it’s a good idea to stay away from them.

A good reference article for detecting bad directories is Rand Fishkin’s article on what makes a good web directory.

 

Content-Based Link Building Strategies

As outlined before, publishers do not link out to other sites just to help those other sites make money. In the eyes of the Requestee, there needs to be a purpose for giving out the link. Of course, paid links are one way of doing that, but we will discuss that later. For now, let's review six major ways to use content to build links:

1. Creating a Rich Resource Site

The principle here is simple. Create one of the best resources in a given market space and people will link to it. In an ideal world, the publisher can reach link building nirvana, where the links come to them without their doing anything. However, this is a difficult state to reach.

Ultimately, even if the publisher must take on the burden of promoting their content to get links, the key for the publisher is to establish their site as an expert in its field by publishing truly authoritative content (not authority in the search engine sense discussed previously, but recognized by people in their market as authoritative). This is a serious undertaking in any field, but it is extremely effective, especially if the field is already established. One way to make the strategy a bit easier is to focus on one particular vertical aspect of the field.

For example, if the general market space is “widgets,” there may be an opportunity to become an expert on a vertical aspect, such as “left handed widgets.” A specific area of vertical expertise such as this fictitious example is a very, very effective way to establish a new presence in a crowded market. Once this type of resource has been created, simply “Asking for a Link” as outlined above will often be a very effective strategy.

2. Rifle Shot Content

Consider producing a single killer article (or a small set of killer articles). There are two ways to do this:

  1. Answer a publicly stated need. This may come about because someone at the Target Site lets a need be known to the public. An example of this is the 2007 Web Analytics Shootout report published by Stone Temple Consulting. The Web Analytics report was spawned by this blog post by Rand on SEOmoz titled Free Link Bait Idea. Rand identified a need, the need was answered, and links resulted.
  2. Determine an unfulfilled need through research. Study your market space and find ways to do something stunningly unique that attracts the attention of the market.

3. Content Syndication

The previous two content based strategies were based on the notion that the content developed would be placed on the publisher’s site. This does not have to be the case. It is entirely possible to develop content with the intent of publishing it on someone else’s site. In return for providing the article, the author gets a link back to their site. It is also often possible to get targeted anchor text in this scenario. Watch the boundaries here though, because if the anchor text is unrelated to the article itself, it will not comply with what the search engines want publishers to do for link building techniques.

There are a couple of important points to watch for when syndicating content:

  1. Most publishers should not distribute articles that are published in the same form on the publisher's own site. Search engines will see this as duplicate content. While the search engine's goal is to recognize the original author of a piece of content, it is a difficult job to do perfectly, and it does happen that the search engines make mistakes. When looking to distribute content published on a site, the best practice is to write a new article on the same topic area, but make it different in structure and in its material points, and syndicate that version of the article. Then, if the site publishing the syndicated article ranks highly for key search terms, it is not a problem for the author's site. If a publisher does choose to take an article from their site and distribute it in an unmodified form, they should make sure that the site publishing the article includes a link back to the original article on the publisher's site. This increases the likelihood that the search engine will handle the duplicate content situation correctly by recognizing the original publisher of the article.
  2. When considering the syndication of content to a High Value Site, or a Very High Value Site, it makes sense to study the content needs of the site and custom tailor the content to those needs. This practice maximizes the chances of the acceptance of the article by the Target Site.

One variant of content syndication is to generate articles and then submit them to "article directories". There are many of these types of sites, and frankly some of them are pretty trashy. But there are OK ones as well. The distinction between the trashy and good ones is relatively easy to recognize, based on the quality of the articles that they have published. However, the common belief in the SEO community is that there is not much value in this strategy any more, because only a few article directories are believed to pass links of value, and even their value is likely to be limited.

Another type of content syndication is guest posting. This is the notion of specifically targeting blogs and then offering them content to publish on their site. Blogs are a particularly interesting target because: (1) they are often hungry for fresh content, and (2) there are many blogs out there that accept guest posts fairly routinely, yet have strict editorial standards.

One way to do this is to develop a list of sites that accept guest posts on topics that fit your site and business, find out what they typically look for in content by reading their blogs, and then reach out to them and suggest a post that you could provide. It helps if you have demonstrable expertise (showing them your site may be sufficient). Then, once they indicate a willingness to accept such an article, get it written and deliver it to them.

It is pretty well accepted that authors of such guest posts get to have an attribution link incorporated in the post, usually at the top or at the bottom of the post. If you choose to link back to your site in the body of the post, it is important that the page linked to have a strong tie to the article. For example, it could be the source of some key supporting data referred to in the article. Of course, it is possible to push the limits on some of these things, but you run the risk of rejection, or of becoming known as someone who does not play nice with the community.

A third form of content syndication is to develop an infographic. You can see numerous examples of infographics here. Infographics work because the web is a highly visual place. Since the volume of information on the web is growing without bound, people look for ways to absorb new information with the least amount of effort possible. A highly visual chart can make an interesting or surprising point clear in a matter of seconds. For that reason, if you develop a particularly compelling infographic and offer it up for people to publish on their own sites in return for a link, this can work quite well.

There are two key aspects to an infographic strategy:

  1. Developing compelling content – something that people will want to put on their site
  2. Finding an effective means of promoting it. For example, if you have a strong presence on social sites such as Facebook and Twitter, you can make a lot of people aware of the new content in a hurry.


4. Social News and Bookmarking Sites

Social media sites such as Digg, Reddit, StumbleUpon, Delicious, and others can play a big role in a link building campaign. Becoming “popular” on these sites can bring in a tremendous amount of traffic and links. While social news sites like Digg and Reddit bring lots of traffic, this traffic is usually of low quality and will have a very low revenue impact on the site receiving it. The real ball game is to get the links that result. For example, stories that make it to the home page of Digg can receive tens of thousands of visitors and hundreds of links. While many of these links are transient in nature, there is also a significant number of high quality links that result.

Better still, articles about topics that also happen to relate to very competitive keywords can end up ranking very well, quite quickly, for those very competitive keywords. The key insight into how to make that happen is to use the competitive keyword in the title of the article itself and in the title of the article submission. These are the two most common elements grabbed by people linking to such articles when selecting the anchor text they use. Similar strategies can work well on a smaller scale with other social news sites like Mixx and Propeller.

Sites like Delicious and StumbleUpon are different in structure. Delicious is a tagging (or bookmarking) site used by users to mark pages on the web that they want to be able to find easily later. StumbleUpon shares some similarities with Delicious, but also offers a content discovery aspect to its service. Both of these sites have “popular” pages for content that is currently hot on their sites. Getting on those pages can bring lots of traffic to a site. This traffic is of higher quality than you get from social news sites.

Users coming to a site via tagging sites are more likely to be genuinely interested in the topic in a deeper way than users whose eye was caught by a snappy article title on a social news site. So while the traffic may be quite a bit lower than from the social news sites, publishers can also earn some quality links in the process. Tagging sites are best used in attempting to reach and develop relationships with major influencers in a market space. Some of these may link to the publisher, and these are potentially significant links.

Social media sites also allow publishers to create their own profiles. These profiles can, and should, include links back to the publisher's site. Learn more about social media sites here.

5. Facebook and Twitter

Social media sites can also play a strong role in Natural link building. This can happen at several levels, and the nature of the opportunity varies by site. These opportunities include:

  1. Developing a large audience which can be used in a PR-like manner to get the word out about new programs you launch or publish on your site. For example, publishers that have successfully obtained more than 10,000 followers on Twitter report that they can tweet about a new piece of content on their site and it will result in many links to that new content.
  2. Driving viral activity through the social media site. For example, if you have a Facebook fan page with a large number of fans, and you announce a new program, many of them may comment about it on their pages, and then all of their friends will see their updates. Then, some of them may likewise mention the program in one of their updates, and so forth.

These are just some of the things you can do with social media sites. The key is to develop a large number of friends/followers/connections and then leverage the particular platform in an appropriate way. These two articles from SEOmoz cover many of these mechanics in more detail. As with all social media strategies, you need to play nice in the community and act like a member. Social networks are quick to punish those who violate their guidelines or the norms of civilized behavior.

In addition, it is important to be aware that you will need to invest a significant amount of time in each site with which you choose to work. As a rule of thumb, I usually suggest a minimum of 10 hours per week, and you get more benefit by doing even more. However, fresh college grads can often be used to do much of the work, so it does not need to be a huge expense.

6. Getting Links From Social Media Profiles

Some social media sites, such as LinkedIn, allow you to link back to your own sites in your personal profile, and these links are not NoFollowed (meaning they pass link juice). Leveraging this can be a great tactic, as it is simple and immediate.

In the case of LinkedIn, the process takes a few steps:

  • Login to LinkedIn.
  • Click “Account & Settings” in the top right corner of the LinkedIn screen.
  • Click on “My Profile” to edit it.
  • Click “Websites” to edit your websites.
  • This will present you with the ability to edit your additional information (click “edit”).
  • Add a web site listing. Start with the box that says “Choose” and select “other.” This is what will allow you to specify keyword-rich anchor text.
  • Then enter in the correct URL and click “Save Changes.”
  • Next we have to make your websites visible to the public (and the search engines).
  • On the upper right of your screen you will see a link titled “Edit Public Profile Settings.” Click on it.
  • On the next screen, under Public Profile, make sure that “websites” is checked. This is what tells LinkedIn to display that publicly.
  • Click “Save Changes.”

Note that the above process was what was required as of early 2009, but the specifics may evolve over time as LinkedIn releases updates.

10 other social media sites that do not NoFollow links in their public profiles are:

  1. Flickr
  2. Digg
  3. Propeller
  4. Technorati
  5. MyBlogLog
  6. BloggingZoom
  7. Current
  8. Kirtsy
  9. PostOnFire
  10. CoRank

7. Blogging for Links

Blogging can also be quite effective in link development. How effective a blog will be is highly dependent on the content on it, the market space, and how it is promoted by the publisher. The first thing to realize when starting a blog is that it is a serious commitment. No blog will succeed if it does not publish content on a regular basis. How frequently a blog needs to publish depends on the market space. For some blogs, one post a week is enough. For others, it really needs to be 2-3 times per week, or even more.

Blogging is very much about reputation building as well. Quality content and/or very novel content is a key to success. However, when that first blog post goes up, the blog will not yet be well known, will not likely have a lot of readers, and those that do come by are less likely to link to a little-known blog. In short, starting a new blog for the purpose of obtaining links is a process that can take a long time. But it can be a very, very effective tool for link building. It's just that patience and persistence are required.

Here are a few key things to think about when blogging for links:

  1. One of the best places to get links to a blog is from other blogs. This is best done by targeting relationships with major bloggers and earning their trust and respect.
  2. Be patient when developing relationships with other blogs. Trust and respect don't come overnight, and they certainly don't result from starting the relationship with a request for a link.
  3. The publisher should target a portion of their content at the interests of other major bloggers in their market area. Over time, this process should turn into links to the publisher's blog from such other major bloggers.
  4. Once you obtain links from major bloggers, other less well-known bloggers will see those links on the major blogs and begin to follow suit.

It is also important to leverage the social nature of the blog. Publishers should try to provide a personalized response to every person who comments on their blog. One effective way to do this is to send each and every one of them a personalized response by direct email that shows that the comment was read. This helps deepen the interest of the commenter, creates a feeling of personal connection, and increases the chance that the commenter will return and possibly add more comments. Nurturing the dialog on a blog in this fashion helps that dialog grow faster, and blogs with an active dialog taking place on them are more likely to receive links.

8. Widgets

Widgets are also a way of syndicating content to third party sites. The concept is to develop a widget and provide it to other publishers, allowing them to place it on their sites in return for an attribution link back to your site. However, most widgets are implemented in some form of JavaScript, and this may result in any links embedded within the widget being invisible to the search engines. It is possible, though, to implement a widget in such a way that it has an HTML wrapper around it with a simple HTML text link in it, which is quite visible to the crawler. Popular widgets can get adopted by a large number of web sites and can result in a large number of links. Widget campaigns can also result in links to deep pages on your site.

A word of caution is merited. Widgets used for link building should be closely related to the content of the page receiving the link. An example of a campaign that ignored this principle is discussed in this post: Another Paid Links Service Disguised As Hit Counter. Four days after this post went up, the sites referenced lost all of their high rankings in Google, and therefore lost most of their traffic. The main reason for this is that the links given with the hit counter were unrelated to the widget and hidden in the <noscript> portion of the hit counter.

Be aware that making the link visible is not enough to make this practice legitimate in the search engines’ eyes. Google has confirmed that they consider tactics like the use of unrelated widgets for link building, such as a hit counter, a no-no, even if the link is visible. The underlying reason for this is that if the link is unrelated to the widget, it is pretty unlikely that the link given actually represents an endorsement of the web page receiving the link. The goal of the person installing the widget is to get the benefit of the contents of the widget. On the other hand, using this same strategy where there is a close relationship between the widget and the link given is far more likely to represent a real endorsement, particularly if content on the page receiving the link is the source of what is included in the widget.

Backlinking for Links

Backlinking, or seeing who links to whom, is one of the oldest methods of putting together a link campaign. If someone links to a competitor, there is a decent chance that they might be willing to link to you. The process starts by identifying sites related to the market space of the publisher, including directly competitive sites and non-competitive authoritative sites. Then, using tools such as Open Site Explorer, you obtain a list of the sites linking to those sites.

Third party tools such as SEOmoz's Open Site Explorer or Majestic SEO can provide backlink data in a spreadsheet format, and we will discuss these later on in this guide. You can also use Yahoo! Site Explorer, but Yahoo! has already made it clear that this tool will shut down by 2012 as a result of the Microsoft-Yahoo! deal.

One of the nice things about Yahoo! Site Explorer is that you can download the first 1,000 results into a TSV file that you can pop straight into Excel.

Whatever tool you use, you can filter the list of links by sorting by URL and removing all the internal links (i.e., links from targetdomain.com to targetdomain.com), or use the tool to do that automatically for you. This will return a list of all the third party links to the site.
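To make that filtering step concrete, here is a minimal Python sketch, assuming a tab-separated export named backlinks.tsv with the linking URL in the first column (the file name, column position, and header row are assumptions on my part; adjust them to match whatever your tool actually produces):

import csv

TARGET_DOMAIN = "targetdomain.com"  # the site whose backlinks are being analyzed

external_links = []
with open("backlinks.tsv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f, delimiter="\t")
    next(reader)  # skip the header row, if the export has one
    for row in reader:
        linking_url = row[0]
        # Drop internal links, i.e. links from targetdomain.com to targetdomain.com
        if TARGET_DOMAIN not in linking_url:
            external_links.append(row)

with open("external_links.tsv", "w", newline="", encoding="utf-8") as f:
    csv.writer(f, delimiter="\t").writerows(external_links)

print(len(external_links), "third party links kept")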

Once this list is obtained, the next steps are:

  1. Capture, at a minimum, the URL of the linking page, the anchor text used, the PR of the linking page, and the PR of the home page of the site containing the linking page. This step is an arduous process unless you use a third party tool to pull that data for you. If you use Open Site Explorer make sure to include Domain mozRank, Page mozRank, Domain mozTrust, and Page mozTrust.
  2. Analyze the linking pages to determine which ones are worthy of individualized attention. Campaigns can be designed very specifically around these sites, as discussed earlier in this document.
  3. Research the linking pages and try to determine the best contact to use in asking them for a link. This work should be done manually for two major reasons:
    • Using a computer program to extract contact information from a web site is a violation of the CAN-SPAM Act (just don’t go there).
    • Computer programs may not return the best contact to use, and the contact returned by such programs may in fact not be useful at all.
  4. Develop an email campaign using the guidelines provided above.

Performing backlink analysis on competitive sites is one great way to pursue this. If a site links to a competitor, there is a decent chance that it might link to your site. If your content is better than that of the competitor, this may even result in the competitor's link being replaced by yours. Count that as a double win.

It is also interesting to perform backlink analysis on authoritative sites, even if they are not competitors. The sites linking to these authority sites in your space have an interest in the market space, and could be a good source of links. As always, publishers need to be careful that they are presenting a good value proposition to the Target Sites in order to maximize their return.

Press Releases and PR

Press releases are one of the staples of a PR campaign. Developing SEO optimized press releases can be a very effective link building technique as well. To be effective for link building, these need to be distributed by wire services such as PRWeb and BusinessWire.

The key to optimizing a press release is to choose keywords for the title that make it likely to be picked up by major news media editors. Having the correct keywords will align a press release with the search terms the news media editor uses. These terms are not the same as those used in web search, because they relate to search queries by highly knowledgeable people searching through feeds and/or databases of press releases. Designing a press release to attract attention is a valuable skill to develop.

Once a press release is picked up by a news editor, if the content is compelling, there is now a chance that they will write about it. Press exposure like this is awesome. The publisher gets the link from the editor’s site, and some of their readers will like the site and link to it too.

Direct Contact of News Media

Another staple of traditional PR is the practice of making direct contact with bloggers, news editors and writers and getting them interested in a story. In other words, instead of a shotgun approach like a press release, it’s a rifle shot approach. The idea is to conduct a highly customized specific effort to appeal to the interests of the editor or writer. In today’s environment, news media encompasses traditional magazines and papers, as well as bloggers. High value blogs can be just as influential as a major news site. Contacting the major players in a highly personalized way makes a lot of sense. Use common relationship building rules, such as making sure to bring value to the person being contacted. Study the things they like, write about, or are interested in. Use that knowledge to tailor the approach to them.

Partnerships and Licensing

One of the simplest ways for a publisher to obtain a link is to incorporate it into the business terms of the contracts that they execute. For example, if a business uses distributors or resellers to sell their products, they can have the distributor or reseller link back to the publisher’s site. This may seem like a compensated link, but that is at least a debatable point. The third party would not want to distribute or resell someone’s product unless they were willing to endorse it. Linking to the publisher’s site is simply one element of that endorsement.

There are two types of related programs that are seen by Google as being “non-editorial” in nature (i.e. they would prefer that these links have no value). These are:

1. Affiliate Programs
Affiliate programs are when one web site links to another, and the site providing the link gets paid whenever a visitor goes to the publisher's site and buys something or completes some other form of conversion. Links in affiliate programs often come in a form similar to the following:

http://www.yourdomain.com?id=1234

The ID code is used to track the visitors from the affiliate to determine when a transaction or conversion occurs. The basic problem with this is that http://www.yourdomain.com?id=1234 is seen as duplicate content to the original page: http://www.yourdomain.com. This can cause some problems. In addition, Google has publicly indicated that it does not want people to use affiliate programs as a method for getting links that pass PageRank. They have stated this several times, including in this interview with Matt Cutts. Matt Cutts has also indicated that Google does a pretty good job at detecting when a link is an affiliate link.

For publishers that want to push the envelope a little bit, it is possible to implement a 301 redirect from http://www.yourdomain.com?id=1234 to http://www.yourdomain.com. This can be done by having a web application recognize the incoming affiliate link, cookie the user, and then execute the redirect. Publishers who use this technique need to recognize, however, that this type of link campaign carries some real risk.
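As a rough sketch of how that could be wired up, here is a minimal Flask example (the framework choice, the affiliate_id cookie name, and the 30-day cookie lifetime are all assumptions for illustration; the id parameter matches the example URL above):

from flask import Flask, request, redirect

app = Flask(__name__)

@app.route("/")
def home():
    affiliate_id = request.args.get("id")
    if affiliate_id:
        # Recognize the incoming affiliate link, cookie the user for tracking,
        # then 301 redirect to the canonical URL without the tracking parameter.
        response = redirect("http://www.yourdomain.com/", code=301)
        response.set_cookie("affiliate_id", affiliate_id, max_age=30 * 24 * 3600)
        return response
    return "Home page content"

if __name__ == "__main__":
    app.run()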

2. Discount Programs
This is a closely related concept, but it removes the element of compensation. The idea here is for the publisher to offer visitors from certain web sites a discount on the publisher's product. Based on this, the publisher can encourage those other sites to put a link on their site back to the publisher's site, noting the availability of the discount. This may appeal to such other sites as it allows them to offer real value to their visitors.

However, this is also seen by Google as a questionable practice. While it would seem that the discount offer would only be interesting to the third party site if they valued the product (or think their visitors will), Matt Cutts clearly indicated that Google does not want to value such links in the above referenced interview.

Buying Links for SEO

One of the more popular techniques is to buy links. It has two significant advantages:

  • It seems like it should be easy. There is no need to sell the quality of the content on your site. The only things that need to happen are determining that the Target Site is willing to sell, and setting a price.
  • Since the link is an ad, the buyer can simply specify the anchor text they want. Anchor text is a powerful ranking signal, and this is one of the major reasons that people engage in link buying.

Google’s Policy on Paid Links

The major downside is that buying links for SEO is against Google’s Webmaster Guidelines, and Google puts a lot of effort into detecting and disabling paid links. The policy on paid links can be briefly summarized as: Links given in return for compensation should not be obtained for purposes of improving PageRank / passing link juice.

Google is not saying that publishers should not be able to buy ads on the web. Their policy is that links which are purchased should only be purchased for the traffic and branding value that they bring. Google also recommends that publishers use the NoFollow attribute on such ads, which means that they will have no SEO value.

On another note, PPC campaigns using Adwords, Yahoo! Search Marketing, etc., are not considered a violation of the policy against paid links. This is because they are easily recognized by the search engine crawlers and do not pass link juice.


Methods for Buying Links

There are several major methods for buying links. These include:

  1. Direct link advertising purchases. This method involves contacting sites directly and asking them if they are willing to sell text link ads. Many sites have pages that describe their ad sales policies. However, sites that openly say that they sell text links are more likely to get caught in a human review, resulting in the link being disabled from passing PageRank by Google.
  2. Link brokers. The next method is the use of link brokers. These are companies that specialize in identifying sites selling links and reselling that inventory to publishers looking to buy such links. The major danger here is that ad brokers may have a template of some sort for their ads, and a template can be recognized by a spider as being from a particular broker.
  3. Charitable donations. Many sites of prominent institutions request charitable contributions. Some of these provide links to larger donors. Search for pages like this one on genomics.xprize.org that link to their supporters. Sometimes these types of links don't cost a ton of money. This tactic is, however, frowned on by Google, and best used with care. One way a publisher can potentially make the tactic more acceptable is to support causes that are related in a material way to their site. However, it is not clear that this would be acceptable to Google either. Finding these types of sites may seem hard, but the search engines can help with this. Aaron Wall wrote a post about how to find donor links a long time ago, and the advice he gave then still holds true. Publishers can search on terms such as these:
    • Sponsors
    • Donors
    • Donations
    • “Please visit our sponsors”
    • “Thanks to the following donors”

    These searches can be filtered a bit further to narrow the list down by adding operators such as site:.org or site:.edu to the end of the search command.
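If you want to generate these searches in bulk rather than typing them out, a small Python sketch like the following can help (the topic keyword is a hypothetical placeholder for a cause related to the publisher's market):

phrases = [
    "sponsors",
    "donors",
    "donations",
    '"Please visit our sponsors"',
    '"Thanks to the following donors"',
]
site_filters = ["", "site:.org", "site:.edu"]
topic = "genomics"  # hypothetical topic related to the publisher's market

# Print every combination so it can be pasted into a search engine by hand.
for phrase in phrases:
    for site_filter in site_filters:
        print(" ".join(part for part in (topic, phrase, site_filter) if part))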

Strategies Which Are Not Considered Buying Links

It’s worth noting some strategies where money is involved in obtaining a link, yet it is not considered buying a link. Here are some examples:

  1. Using a consultant to promote articles on a social media site such as Digg
  2. Paying a PR firm to promote a site
  3. Paying a link building firm to ask for (as opposed to buy) links on your behalf

The key point is that these strategies do not compensate the Target Site itself for the links given, and the links are given freely.

Ultimately, I do not recommend buying links, and neither does SEOmoz. Buying links that the search engines cannot detect is hard work. The same level of effort can get you truly natural links without the risk.

Reciprocal Links

Another common practice is to trade links between web sites. As with buying links, in many cases websites will agree to trade links without regard to the quality of the content on the publisher's site. Of course, this illustrates the problem of a lack of editorial judgment by the parties participating in the link swap.

Link trading used to be done by publishers in large volume. However, less of it occurs today, as more publishers are aware of the limitations of this technique. In early 2006, Google implemented an update that was called “Big Daddy.” What this update did was devalue swapped links involving websites that had an unnaturally high number of reciprocal links.

Another factor to be concerned about is the relevance of the site with which the link is traded. Poor relevance may limit the value of the link received. In addition, linking to unrelated sites can dilute the relevance of the site of the publisher. The bottom line is that a high volume campaign focused on reciprocal links is a bad idea.

However, this does not mean a publisher should never swap links. There is certainly a time and a place for it. For example, swapping a link with an authoritative site in the same market space as the publisher can be very beneficial indeed. One rule of thumb to use is whether or not the site that the publisher is considering swapping links with is something that they might link to even if they did not get a link in return. If the answer is yes, then the relevance and authority of the site may justify a link swap.

Links from Pages Without Editorial Control

There are many thousands of places on the web where links can be obtained without any editorial control. Some prime examples of this are:

  • Forums
  • Blogs
  • Guestbooks
  • Social media sites
  • Wikis
  • Discussion boards
  • Job posting sites

These types of sites often allow people to leave behind comments or text without any review. A common spam tactic is to write a bot that crawls around the web looking for blogs and forums, and then put machine-generated comments in them containing a link back to the spammer’s site (or their client’s site). Most of the forums and blogs that have these comments added to them will automatically NoFollow all user contributed links, catch them in their spam filters, or find them manually and remove them. However, the spammer does not care because not all of the sites will catch and eliminate the comments or NoFollow them.

Throwing millions of comments out using a bot costs the spammer very little. If it results in a few dozen links that are not NoFollowed, then it is worth their effort. This is a deep black hat tactic and one that all the search engines do not want publishers to use. It is highly risky, and getting caught is very likely to result in getting banned.

NoFollow

NoFollow is a directive that the search engines have agreed should mean that links so tagged will not pass any PageRank or link juice to the site receiving the link. It comes in two forms: the NoFollow meta tag and the NoFollow attribute.

NoFollow Meta Tag

The NoFollow meta tag is implemented by placing code similar to the following in the <head> section of a given web page:

<meta name="robots" content="NoFollow">

When this meta tag is seen by the search engines, it tells them that none of the links on the web page should pass any link juice at all to the pages they link to.

NoFollow Attribute

The NoFollow attribute is meant to allow a more granular level of control than the NoFollow meta tag. Its implementation looks something like this:

<a href="http://www.yourdomain.com/page37.html" rel="NoFollow">

When used as an attribute, the only link that is affected is the one contained within the same anchor statement (in this case, the link to www.yourdomain.com/page37.html). This allows publishers to select only specific links that they want to NoFollow from a given page.

NoFollow Uses and Scams

One of the most common uses of NoFollow is to NoFollow all links in comments on a blog or in a forum. This is a common and legitimate use of the tactic, as it is used to prevent spammers from flooding forums and blogs with useless comments that include links to their sites.

There used to be some common scams people implemented using NoFollow. One simple example of this was that someone would propose a link swap and provide the return link they promised, but NoFollow it. The effect of this was that it looked like a clean one way link from one site to the other, even though a swap was done.

The presumption behind this scam was that by NoFollowing the outbound link, the cheating site was conserving that link juice and saving it for its own site. This presumption is now known to be incorrect as a result of Matt Cutts's statements at SMX Advanced 2009 and his follow-up post titled PageRank Sculpting. Google has now made it clear that the link juice is simply thrown away when you use NoFollow on a link, so perpetrators of this scam no longer receive any benefit from using NoFollow.

Publishers who are actively link building still need to be aware of NoFollow. There will be Target Sites that offer only NoFollow links for reasons not related to a scam scheme, and these sites provide links that have little or no SEO benefit (there may be other non-SEO benefits in those links, such as branding and traffic).
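When reviewing Target Sites, it helps to check quickly whether the links they hand out are NoFollowed. Here is a minimal sketch using the third party requests and BeautifulSoup libraries (the choice of libraries and the example URL are assumptions; any HTML parser would do):

import requests
from bs4 import BeautifulSoup

def split_by_nofollow(page_url):
    # Return (followed, nofollowed) lists of link URLs found on a page.
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    followed, nofollowed = [], []
    for anchor in soup.find_all("a", href=True):
        rel_tokens = [token.lower() for token in (anchor.get("rel") or [])]
        if "nofollow" in rel_tokens:
            nofollowed.append(anchor["href"])
        else:
            followed.append(anchor["href"])
    return followed, nofollowed

followed, nofollowed = split_by_nofollow("http://www.yourdomain.com/page37.html")
print(len(followed), "followed links;", len(nofollowed), "NoFollowed links")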

Helpful Search Terms, Phrases, and Advanced Query Parameters

The search engines provide a rich array of tools to perform link related research. The following are some search commands that are particularly useful.

1. inanchor:keyword

This command is useful in analyzing the relevance of a particular page based on the links pointing to it. For example, the search “inanchor:seo” returns the pages that have the best inbound links including the word SEO in the anchor text.

Publishers can go one step further and look at the pages on a particular website that have the best anchor text for a keyword as follows:

“inanchor:keyword site:domain-to-check.com”

This search is great for checking if a site has unnatural links. If 80% or more of the site’s links include the major keyword for their market (and it’s not part of their company name), it’s a sure bet that they are buying links or doing something similar to steer the anchor text. The operator is very valuable in learning about a competitor’s site and its strength in competing for the keyword. There are other related operators as well:

intext:keyword – shows pages that have the phrase in their text
intitle:keyword – shows pages that have the phrase in the page title
inurl:keyword – shows pages that have the phrase in the URL of the page

2. yourdomain.com -site:yourdomain.com, with the &as_qdr parameter

This one is a bit trickier. In Google, perform a search on domain-to-check.com -site:domain-to-check.com. Then add “&as_qdr=d” to the end of the URL on the search results page and reload the page. This will show the mentions that domain-to-check.com has received in the past 24 hours.

This command is very useful in evaluating the competitive strength of a site. Trying it for Microsoft.com, for example, returned roughly 39,600 mentions from the past 24 hours at the time this was written.


Publishers who are competing against someone who receives that many mentions in a day are in deep trouble, or should focus on ranking somewhat lower than that competitor! Fortunately, your competition probably gets far fewer mentions in a day than Microsoft does.

Here are some variants of the operator that can be used at the end of the URL:

  1. &as_qdr=w (past week)
  2. &as_qdr=m (past month)
  3. &as_qdr=y (past year)
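If you monitor mentions of several competitors on a regular basis, the search URLs can be generated rather than typed by hand. A small sketch, assuming the q and as_qdr parameters on Google's results URL behave as described above:

from urllib.parse import quote_plus

def mention_search_url(domain, period="d"):
    # period: "d" (past day), "w" (past week), "m" (past month), "y" (past year)
    query = f"{domain} -site:{domain}"
    return f"https://www.google.com/search?q={quote_plus(query)}&as_qdr={period}"

for competitor in ["domain-to-check.com", "microsoft.com"]:
    print(mention_search_url(competitor, period="d"))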

3. intext:domain-to-check.com

This command can help a publisher rapidly identify sites that reference “domain-to-check.com” but don’t implement that reference as a link. This command is very powerful because it can be used to identify “lost” links to the publisher’s domain. By identifying sites that reference the publisher’s domain, the publisher can go and contact those sites and ask them to convert it into a link. Since these are sites that are already endorsing the publisher’s site (most likely they are at any rate), the conversion rate can be expected to be reasonably high.


Section III: Valuable Tools & Search Queries for Link Building

There are lots of tools available in the market for use in link building. This section will summarize some of the more interesting ones.

Note, however, that some of the articles you find on the web on the topic of advanced search operators will include references to the linkdomain: command that Yahoo! used to support. However, as of September 2010 that command is no longer supported.

 

The Basics

The first thing a publisher should do is develop an understanding of the links they already have. The two most basic tools for doing that are:

1. Google Webmaster Tools is a powerful first step. With Google Webmaster Tools, publishers can easily download a spreadsheet of links that Google has in its database (note that it may not include some links that Google does not consider significant). Publishers can only use this tool to see the links to their own site.

2. Yahoo! Site Explorer will allow a publisher to see all of their backlinks. Yahoo! Site Explorer will also allow a publisher to see the backlinks for any web site, not just their own. You can also extract up to 1,000 of these into a spreadsheet.

For quick and dirty link totals, it is handy to make use of a Firefox plugin known as Search Status. This provides basic link data on the fly with just a couple of mouse clicks.

Notice also that it offers an option for highlighting NoFollow links, as well as many other capabilities. It's a great tool to help pull link counts much more quickly than would otherwise be possible.

 

Competitive Backlink Analysis

One of the most important things a publisher does is obtain links to his/her site. A great way to do that is to see who links to the competition. Yahoo! Site Explorer is one tool for doing just that, but its major flaw is that you cannot extract more than 1,000 links into a spreadsheet. In addition, Yahoo! has announced that the tool has a limited future.

 

Open Site Explorer

However, advanced tools exist to do much of the work for the publisher. One example of such a tool is SEOmoz’s Open Site Explorer. Open Site Explorer is a tool that has been developed based on a crawl of the web conducted by SEOmoz, as well as a variety of parties engaged by SEOmoz. You can see a complete list of Open Site Explorer’s sources here.

Open Site Explorer is designed to be the tool for SEOs to use in really mapping links across the web. It gets around the 1,000 URL limit imposed by Yahoo! and will let you get as many links as are found by Open Site Explorer and extract them all into a spreadsheet.

A sample report includes mozRank and mozTrust scores for each URL and domain listed. The tool also flags redirects, and you can easily obtain additional data, such as the Google PageRank scores for the linking page and its domain. Few other tools provide measurements of trust, and this is a powerful factor in link value.

The beauty of this tool is that it allows SEOs and publishers to collect competitive intelligence on other people's domains, much as they do with Yahoo! Site Explorer today, but to escape the 1,000 result limit and also get SEOmoz's proprietary mozRank and mozTrust scores. The importance of mozRank and mozTrust cannot be overstated. The PageRank score published by Google through its toolbar is known not to be a reasonable approximation of the value of a given page; the Google algorithm has simply evolved far past the original simplistic concept of PageRank that Google used in its early days. In addition, all the data is extractable into a spreadsheet to allow for easy offline processing.

Majestic SEO

Majestic SEO is a UK-based company that offers a powerful link analysis tool. They get their data by having established a network of user computers that donate CPU time for crawling purposes (a computer's unused CPU cycles are used to crawl the web). This data is then pulled back into a link database, which you can then access as a customer. Majestic SEO offers a historical view of links to a site, and the data can also be filtered to show only the number of linking domains.

This helps filter out the impact of sitewide links on linking totals. Majestic SEO also allows you to extract large quantities of link data into spreadsheets for your own offline analysis.
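The same linking-domain view can be approximated offline from any raw link export. Here is a minimal sketch, assuming a CSV export with the linking URL in a column named "URL" (the file name and column name are assumptions; exports differ between tools):

import csv
from collections import Counter
from urllib.parse import urlparse

linking_domains = Counter()
with open("link_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["URL"]).netloc.lower()
        # Collapse www. so sitewide links from one host count as a single domain
        if host.startswith("www."):
            host = host[4:]
        linking_domains[host] += 1

print(len(linking_domains), "unique linking domains")
for domain, link_count in linking_domains.most_common(10):
    print(domain, link_count)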

Other Tools

Using some method for extracting link data in this fashion is a must for performing competitive backlink analysis.

Over at Search Engine Journal, Ann Smarty did a nice write-up of SEO tools. In it she mentioned Domain Inbound Links Checker, which appears to provide some quality data. However, many third party tools used to rely on the Yahoo! Search API, which is no longer available given the deal between Yahoo! and Bing, so check to make sure that the output of a tool makes sense before committing to use it.

Determining Link Value

Early on in this guide, we established the notion of evaluating sites (and pages) to determine if they were Low Value, Medium Value, High Value, or Very High Value. We established the following parameters as being important:

  1. The PageRank of the site
  2. How trusted is the domain?
  3. The perceived authority of the site
  4. The PageRank of the page
  5. How much of the domain’s trust accrues to the page?
  6. The perceived authority of the page
  7. The number of outbound links on the page
  8. The relevance of the link

There are other factors that one could take into account. These can include the number of Delicious (and other bookmarking sites) tags a site has, or the total number of citations of the site on the web which are not implemented as links. One tool that helps with this is the Trifecta Tool, which is accessible via an SEOmoz PRO account. Most SEO firms that engage in link building for their clients will either leverage a tool such as Trifecta or an internally developed tool.
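For teams without access to such a tool, a rough internal scoring sheet can serve as a starting point. The sketch below is purely illustrative; the weights, scales, and thresholds are hypothetical and are not part of the parameters defined earlier in this guide:

def classify_link_value(domain_authority, page_authority, domain_trust,
                        page_trust, outbound_links, relevance):
    # Inputs are assumed to be on a 0-10 scale, except outbound_links (a raw
    # count) and relevance (0.0 to 1.0). Tune these to your own data.
    score = (domain_authority + page_authority + domain_trust + page_trust) / 4.0
    score *= relevance                        # irrelevant links are worth little
    score *= 20.0 / max(outbound_links, 20)   # discount pages with many outbound links
    if score >= 7:
        return "Very High Value"
    if score >= 5:
        return "High Value"
    if score >= 3:
        return "Medium Value"
    return "Low Value"

print(classify_link_value(8, 6, 7, 5, outbound_links=40, relevance=0.9))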

General Link Building Tools

Some other link building tools include:

  1. SoloSEO’s Link Search Tool provides an extensive list of links out to valuable advanced Google queries based on the selected search terms.
  2. Aaron Wall’s SEO Book Tools Page has a rich set of tools for link research and beyond.


Advanced Tools

Google Webmaster Tools
Google Webmaster Tools can do more for your link building campaign than just provide you with a list of the links you already have. Looking at the 404 errors report in Google Webmaster Tools can be very illuminating.


For example, the report might show requests for http://www.stonetemple.com/stc_services.shtml when the correct URL is http://www.stonetemple.com/STC_Services.shtml. This type of data in the 404 report is a sure sign that someone has linked to the publisher's site using an incorrect URL. The problem with this is that the link brings no benefit to the site, as links to a 404 page are not counted. However, this is easily fixed by 301 redirecting from http://www.stonetemple.com/stc_services.shtml to http://www.stonetemple.com/STC_Services.shtml.

On an Apache web server, this can be implemented with a single line in the .htaccess file that looks like this:

redirect 301 /stc_services.shtml http://www.stonetemple.com/STC_Services.shtml
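When the 404 report contains many such near-miss URLs, the redirect lines can be generated in bulk. Here is a minimal sketch, assuming a plain-text file of 404 paths and another of the site's correct paths (both file names are hypothetical):

# Map each broken path from the 404 report to a correct path that differs
# only in letter case, and emit the corresponding .htaccess redirect line.
with open("correct_paths.txt") as f:
    correct_by_lower = {p.strip().lower(): p.strip() for p in f if p.strip()}

with open("404_paths.txt") as f:
    for line in f:
        broken = line.strip()
        match = correct_by_lower.get(broken.lower())
        if match and match != broken:
            print(f"redirect 301 {broken} http://www.stonetemple.com{match}")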

 

Recovering Lost Links

Many times people reference a publisher’s site without implementing the reference as a true link. Mainstream media does this quite a bit – they will cite a source without implementing it as a link. The result is that the user can’t click on it and the search engine does not count it as a link to the publisher’s site. Some of the sites that reference a publisher’s site might be willing to make it a link if they are asked. After all, they are already endorsing the publisher’s content by referencing it, and perhaps the failure to implement it as a link was simply an oversight.

Finding these types of citations is easy using the following command:

intext:seomoz.org -site:seomoz.org

Once the publisher has executed this command on their site, the next step is to review the sites listed and then determine which ones are worth contacting to request a link. This should provide a fairly high return.
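When the result list is long, a quick automated pass can separate pages that already link to the domain from those that only mention it in plain text, so outreach goes only where it is needed. A rough sketch, again assuming the requests and BeautifulSoup libraries and a hypothetical mentions.txt file of candidate URLs:

import requests
from bs4 import BeautifulSoup

YOUR_DOMAIN = "seomoz.org"  # the domain from the example search above

with open("mentions.txt") as f:  # one mentioning page URL per line
    pages = [line.strip() for line in f if line.strip()]

for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    has_link = any(YOUR_DOMAIN in a["href"] for a in soup.find_all("a", href=True))
    print("already links" if has_link else "mention only - worth contacting", page)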
 

Section IV: Tracking Link Campaign Results

Once a link campaign is launched, it is important to track the results. This is also a difficult process. One of the basic ways to do this is to use the link reports in Google Webmaster Tools and Bing Webmaster Tools for ongoing comparisons. A tool such as Open Site Explorer or Majestic SEO can be run to get the current inbound links. Then you can find the links that appear in the tool’s link report and remove the ones that are in your existing database. The links that are left are the ones that are new to your site.
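In practice this comparison is a simple set difference between the latest export and the links already recorded. A minimal sketch, assuming one linking URL per line in each file (the file names are hypothetical):

# Compare the latest link export against the historical database of known links.
with open("known_links.txt") as f:
    known = {line.strip() for line in f if line.strip()}

with open("latest_export.txt") as f:
    latest = {line.strip() for line in f if line.strip()}

new_links = sorted(latest - known)
print(len(new_links), "new links since the last check")
for url in new_links:
    print(url)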

The other thing that can be done is to make use of the analytics software in use on the publisher’s site. For example, publishers can look at the referrers report on a daily basis and recognize when a brand new referrer shows up. The reason that this works (for the most part) is that people generally perform a quick test of a new link when they implement it to make sure it is working. This should show up as a new referral in the analytics software. As before, it is helpful to maintain a database (or spreadsheet) with a list of all historical referrers to make it easier to recognize when a new one comes in.
 

Section V: Putting It All Together

Starting Strategically

It is critical to start strategically when implementing a link building campaign. It’s easy to start down the wrong track and waste a lot of valuable time, so this should be avoided. Here are some of the major steps for assembling a strategy:

1. Identify Types of Link Targets

There are many different ways to classify types of link targets. Some examples include:

  • Major media sites
  • Social media sites (including Digg, Propeller, Delicious, StumbleUpon, and others)
  • Blogs
  • Universities and colleges
  • Government sites
  • Sites that link to your competitors
  • Related hobbyist sites

These types of lists can get a lot more granular. There might be seven different categories of blogs that represent good targets for a publisher to pursue, each requiring a somewhat different link building strategy. In the first stage of putting together a link building strategy, publishers should focus on building a list of all the different options.

2. Identify Strategies for Each Segment

The next step is to identify what it will take to get links from a particular market segment. Publishers who want to get links from “green” bloggers, for example, will need to create something that appeals to that audience.

This step should not be underestimated. Without the right content, data, or tools to appeal to a given audience, the publisher will not be successful. In addition, understanding the content or tools that need to be developed is critical to understanding the cost of a link building strategy. Part of this step is also to identify the content packaging. For example, if the content is to be delivered using a widget, resources will need to be allocated to build the widget. Another question to ask is: will the content be placed on the publisher's site or on a third party site (i.e., syndicated)?

3. Identify Channel Strategies

This relates to the method of reaching the Target Sites. As defined before, there are indirect means and direct means for reaching sites. Examples of indirect means are:

  • PR
  • Social media sites
  • Writing a column in a major media magazine
  • Speaking at conferences

There are also many direct means. Some of these include:

  • Sending emails
  • Calling people
  • Sending snail mail
  • Content syndication

4. Find Out What the Competition is Doing

The next step is to see what segments the competition is pursuing. This is useful from two perspectives:

  • Determine where they are successful and follow in their footsteps
  • Find out what segments they have ignored and use that to get ahead

Of course, a major part of this is also nailing down specific links the competition has and identifying the ones that have the highest value. Publishers should use the competitive review to provide valuable input into finalizing their strategy.

5. Review the Cost of the Various Strategies

How much will it cost to pursue a particular strategy? Will the strategies meet the business goals? Are there brand considerations? What will the cost be in terms of money and resources? These questions must be answered before finalizing. It is no good to have the ultimate link building strategy defined if the publisher is unable to pursue it.

6. Finalize the Strategies to Pursue

The final step of the strategic phase is the easiest. Review the available information that has been put together and decide on an affordable but effective set of strategies to pursue. To do this, it might make sense to build out a chart that helps you visualize the level of effort involved vs. the return.

Effort to Pursue    Campaign Value
Low                 High
Low                 Medium
Low                 Low
Medium              High
Medium              Medium
Medium              Low
High                High
High                Medium
High                Low

It may seem a bit simplistic, but this level of visualization can be extremely helpful. Obviously you pursue the low effort – high value campaigns as a first priority, and you probably never get to the high effort – low value campaigns. You bring in other campaigns as makes sense along the way.

 


Execution

A world class link building campaign is always a large effort. The resources need to be lined up and focused on the goal. Publishers always experience bumps in the road and difficulties, but it is critical to line up the resources and go for it. It is also critical to be persistent. It is amazing how poorly many companies execute their strategies. Publishers that execute relentlessly and with vision inevitably gain an edge over many of their competitors. Of course, if there are other competitors that also execute relentlessly, it becomes even more important to push hard on link building.

Conducting Strategic Reviews

Link building strategies should evolve in the normal course of business. As campaigns are pursued, lessons are learned, and this information can be fed back into the process. New strategies are also conceived over time, and some of these are great ones to pursue. Also, sometimes the initial strategy goes great for a while but begins to run out of steam. Publishers should have a constant stream of ideas that they are feeding into their link building plans.


Creating a Link Building Culture

Publishers should also train many people within the organization about their link building plan, its goals, and how it will help the business. The purpose of this is to engage the creativity of multiple team members in feeding the stream of link building ideas. The more ideas, the better. The quality of a link building campaign is directly proportional to the quality of the ideas that are driving it.
 

Never Stop

Link building is not something that you do once, or once in a while. We live in a culture where the search engine plays a large role in the well being of a business. Consider the business that implements a great link building campaign, gets to where they want to, and then stops. What happens to them when their competitors just keep on going? They get passed and left behind. Publishers who are lucky enough to get in front need to be prepared to fight to stay there. Publishers who are not out in front should fight to get there.

 

Section VI: Conclusion

Links are the engine that drives search engine rankings. This is not meant to minimize the importance of on-page SEO (also known as “technical SEO”); a poorly optimized website will have problems getting traffic no matter how many links it gets. But because links and their anchor text are used by the search engines to determine how important a site is to a particular topic, they can be thought of as the multiplier of traffic potential.

Think of it in housing terms. When you buy a house, the first thing you probably should do is make sure all the major systems (electrical, heating, plumbing, etc.) are working safely and correctly, before you start that kitchen remodeling project or add onto the house. When you go to sell the house, however, the more visible improvements you have made, such as the remodeling project, will have a direct impact on the sales price. The work you did on the major systems won't have much impact on the price, but it will affect whether or not you will be able to sell the house (because people don't want to move into a house with major systems problems). And so it is with technical SEO and link building. You won't be able to get traffic if your site architecture is broken or there is no real content on the site, but link building drives the site's value.

In addition, think of link building as a marketing function, much like PR. Link building is about getting the word out about the great stuff you have on your site. Because of the dynamics of getting someone to link to you, you have to sell them on the quality of what you have, even if you make use of some form of incentive (other than compensation-based incentives). Where link building differs from PR is that it has specific goals for the outcome desired from the promotional activities, and there are specific types of tactics and targets a link builder will choose as a result.

Finally, think of link building as a cost of doing business. You don't want to make the mistake of doing a campaign, then stopping, then starting again, then stopping. All this does is give a more determined competitor the chance to get in front of you and stay there, because they do link building continuously and you don't. Don't let them do that. There is so much to be gained by increasing your search traffic – it is an opportunity that few can afford to ignore.

Go get yourself some links with Natural link building
