In all of these strategies, the key is to develop trust before making the request. Ideally, the publisher never makes the request at all; the context of the relationship causes the link to be given. It is far easier to get a link when trust is in place than when it is not.
Directories
Directories can be a great way to obtain links. There are a large number of directories out there, and they may or may not require money in order to obtain a listing. Examples of high quality directories that are free include:
Examples of high quality directories which require a fee are:
A more comprehensive list of directories is available from Strongest Links.
What Search Engines Want From Directories
Not all directories provide links that the search engines value. Whether a search engine values a directory link comes down to the editorial policy of the directory. The essential factors that the search engines look for are:
- The fee paid is made in payment for an editorial review, not for a link.
- Editors may at their whim change the location, title, and description of the listing.
- Editors may reject the listing altogether.
- Regardless of the outcome, the directory keeps the money (even if the publisher doesn’t get a listing).
- The directory has a track record of rejecting submissions. The inverse of this, which is more measurable, is that the quality of the sites listed in the directory is high.
The following is an extract from my blog post on The Role of Directories in Link Building:
Ultimately, “Anything for a buck” directories do not enforce editorial judgment, and therefore the listings do not convey value to the search engines.
To take a closer look at this, let’s examine some of the key statements from Yahoo!’s Directory Submission Terms:
- For web sites that do not feature adult content or services, the Yahoo! Directory Submit service costs US$299 (nonrefundable) for each Directory listing that is submitted.
- I understand that there is no guarantee my site will be added to the Yahoo! Directory.
- I understand that Yahoo! reserves the right to edit my suggestion and category placement; movement or removal of my site will be done at Yahoo!’s sole discretion.
Classifying Directories
We can divide directories into 3 buckets:
- Directories That Provide Sustainable Links. These are directories that comply with the policies as outlined above. Most likely, these links will continue to pass link juice for the foreseeable future.
- Directories That Pass Link Juice that May Not Be Sustainable. These are directories that don’t comply with the policies as outlined above. The reason such directories exist is that search engines tend to use an “innocent until proven guilty” approach, so the search engine must proactively make a determination of guilt before a directory’s ability to pass link juice is turned off. Even so, link juice from these types of directories is probably not going to be passed in the long term.
- Directories That Do Not Pass Link Juice. These are the directories that have already been flagged by the search engines. They do not pass any value. In fact, submission to a large number of them could be seen as a spam signal, although it is unlikely that any action would be taken on this signal alone.
Detecting Directories that Pass Link Juice
The process is relatively simple for directories that pass sustainable links, as defined above. The steps are:
- Investigate their editorial policies and see if they conform to what search engines want.
- Investigate their track record. Do they enforce their policy for real? This may be a bit subjective, but if there are lots of junky links in their directory, chances are that the policy is just lip service.
- As another check, search on the directory’s name and see whether there is any SEO scuttlebutt about it, then read what turns up for specifics.
The process is a bit harder for directories that do not conform to the policies search engines prefer. There are still some things the publisher can do:
- Search on the name of the directory to see if it shows up in the search engine results. If not, definitely stay away from it.
- Take a unique phrase from the home page of the directory and see if that shows in the search engine results. If not, stay away from it.
- Do they sell premium sponsorships for higher-level listings? That is a clear signal to search engines that placement is driven by payment rather than editorial judgment.
- Do they promote the search engine value of their listings instead of the traffic those listings bring? Another bad signal.
- Evaluate their inbound links. If they are engaged in shady link building tactics, it’s a good idea to stay away from them.
A good reference article for detecting bad directories is Rand Fishkin’s article on what makes a good web directory.
Content-Based Link Building Strategies
As outlined before, publishers do not link out to other sites just to help those other sites make money. In the eyes of the Requestee, there needs to be a purpose for giving out the link. Of course, paid links are one way of doing that, but we will discuss that later. For now, let’s review eight major ways to use content to build links:
1. Creating a Rich Resource Site
The principle here is simple. Create one of the best resources in a given market space and people will link to it. In an ideal world, the publisher can reach link building nirvana, where the links come to them without their doing anything. However, this is a difficult state to reach.
Ultimately, even if the publisher must take on the burden of promoting their content to get links, the key for the publisher is to establish their site as an expert in its field by publishing truly authoritative content (not authority in the search engine sense discussed previously, but recognized by people in their market as authoritative). This is a serious undertaking in any field, but it is extremely effective, especially if the field is already established. One way to make the strategy a bit easier is to focus on one particular vertical aspect of the field.
For example, if the general market space is “widgets,” there may be an opportunity to become an expert on a vertical aspect, such as “left handed widgets.” A specific area of vertical expertise such as this fictitious example is a very, very effective way to establish a new presence in a crowded market. Once this type of resource has been created, simply “Asking for a Link” as outlined above will often be a very effective strategy.
2. Rifle Shot Content
Consider producing a single killer article (or a small set of killer articles). There are two ways to do this:
- Answer a publicly stated need. This may come about because someone at the Target Site lets a need be known to the public. An example of this is the 2007 Web Analytics Shootout report published by Stone Temple Consulting. The Web Analytics report was spawned by this blog post by Rand on SEOmoz titled Free Link Bait Idea. Rand identified a need, the need was answered, and links resulted.
- Determine an unfulfilled need through research. Study your market space and find ways to do something stunningly unique that attracts the attention of the market.
3. Content Syndication
The previous two content-based strategies were based on the notion that the content developed would be placed on the publisher’s site. This does not have to be the case. It is entirely possible to develop content with the intent of publishing it on someone else’s site. In return for providing the article, the author gets a link back to their site. It is also often possible to get targeted anchor text in this scenario. Watch the boundaries here, though: if the anchor text is unrelated to the article itself, the tactic will not comply with what the search engines want publishers to do when building links.
There are a couple of important points to watch for when syndicating content:
- Most publishers should not distribute articles that are published in the same form on the publisher’s own site. Search engines will see this as duplicate content. While the search engines’ goal is to recognize the original author of a piece of content, it is a difficult job to do perfectly, and they do make mistakes. When looking to distribute content already published on a site, the best practice is to write a new article on the same topic, make it different in structure and in its material points, and syndicate that version instead. That way, if the site publishing the syndicated article ranks highly for key search terms, it is not a problem for the author’s site. If a publisher does choose to take an article from their site and distribute it in an unmodified form, they should make sure that the site publishing the article includes a link back to the original article on the publisher’s site. This increases the likelihood that the search engine will handle the duplicate content situation correctly by recognizing the original publisher of the article.
- When considering the syndication of content to a High Value Site, or a Very High Value Site, it makes sense to study the content needs of the site and custom tailor the content to those needs. This practice maximizes the chances of the acceptance of the article by the Target Site.
One variant of content syndication is to write articles and then submit them to “article directories”. There are many of these types of sites, and frankly some of them are pretty trashy, but there are OK ones as well. The distinction between the trashy and the good ones is relatively easy to recognize, based on the quality of the articles they have published. However, common belief in the SEO community is that there is not much value in this strategy any more, because only a few article directories are believed to pass links of value, and even their value is likely to be limited.
Another type of content syndication is guest posting. This is the notion of specifically targeting blogs and offering them content to publish on their sites. Blogs are a particularly interesting target because: (1) they are often hungry for fresh content, and (2) many blogs accept guest posts fairly routinely, yet have strict editorial standards.
One way to do this is to develop a list of sites that accept guest posts on topics that fit your site and business, find out what they typically look for in content by reading their blogs, and then reach out to them and suggest a post you could provide. It helps if you have demonstrable expertise (showing them your site may be sufficient). Then, once they indicate a willingness to accept such an article, write it and deliver it to them.
It is pretty well accepted that authors of such guest posts get an attribution link incorporated in the post, usually at the top or the bottom. If you choose to link back to your site in the body of the post, it is important that the page linked to has a strong tie to the article. For example, it could be the source of some key supporting data referred to in the article. Of course, it is possible to push the limits on some of these things, but you run the risk of rejection, or of becoming known as someone who does not play nice with the community.
A third form of content syndication is to develop an infographic. You can see numerous examples of infographics here. Infographics work because the web is a highly visual place. Since the volume of information on the web is growing without bound, people look for ways to absorb new information with the least amount of effort possible. A highly visual chart can make an interesting or surprising point clear in a matter of seconds. For that reason, if you develop a particularly compelling infographic and offer it up for people to publish on their own sites in return for a link, this can work quite well.
There are two key aspects to an infographic strategy:
- Developing compelling content – something that people will want to put on their site
- Finding an effective means of promoting it. For example, if you have a strong presence on social sites such as Facebook and Twitter, you can make a lot of people aware of the new content in a hurry.
4. Social News and Bookmarking Sites
Social media sites such as Digg, Reddit, StumbleUpon, Delicious, and others can play a big role in a link building campaign. Becoming “popular” on these sites can bring in a tremendous amount of traffic and links. While social news sites like Digg and Reddit bring lots of traffic, this traffic is usually of low quality and will have a very low revenue impact on the site receiving it. The real ball game is to get the links that result. For example, stories that make it to the home page of Digg can receive tens of thousands of visitors and hundreds of links. While many of these links are transient in nature, there is also a significant number of high quality links that result.
Better still, articles about topics that also happen to relate to very competitive keywords can end up ranking very well, quite quickly, for those very competitive keywords. The key insight into how to make that happen is to use the competitive keyword in the title of the article itself and in the title of the article submission. These are the two most common elements grabbed by people linking to such articles when selecting the anchor text they use. Similar strategies can work well on a smaller scale with other social news sites like Mixx and Propeller.
Sites like Delicious and StumbleUpon are different in structure. Delicious is a tagging (or bookmarking) site used by users to mark pages on the web that they want to be able to find easily later. StumbleUpon shares some similarities with Delicious, but also offers a content discovery aspect to its service. Both of these sites have “popular” pages for content that is currently hot on their sites. Getting on those pages can bring lots of traffic to a site. This traffic is of higher quality than you get from social news sites.
Users coming to a site via tagging sites are more likely to be genuinely interested in the topic in a deeper way than users who got their eye caught by a snappy article title on a social news site. So while the traffic may be quite a bit lower than on the social news sites, publishers can also earn some quality links in the process. Tagging sites are best used in attempting to reach and develop relationships with major influencers in a market space. Some of these may link to the publisher, and these are potentially significant links.
Social media sites also allow publishers to create their own profiles. These profiles can, and should, include links back to the publisher’s site. Learn more about social media sites here.
5. Facebook and Twitter
Social media sites can also play a strong role in Natural link building. This can happen at several levels, and the nature of the opportunity varies by site. These opportunities can be divided into categories such as:
- Developing a large audience that can be used in a PR-like manner to get the word out about new programs you launch or publish on your site. For example, publishers that have successfully obtained more than 10,000 followers on Twitter report that they can tweet about a new piece of content on their site and it will result in many links to that new content.
- Driving viral activity through the social media site. For example, if you have a Facebook fan page with a large number of fans, and you announce a new program, many of them may comment about it on their pages, and then all of their friends will see their updates. Then, some of them may likewise mention the program in one of their updates, and so forth.
These are just some of the things you can do with social media sites. The key is to develop a large number of friends/followers/connections and then leverage the particular platform in an appropriate way. These two articles from SEOmoz cover many of these mechanics in more detail. As with all social media strategies, you need to play nice in the community and act like a member. Social networks are quick to punish those who violate their guidelines or the norms of civilized behavior.
In addition, it is important to be aware that you will need to invest a significant amount of time in each site with which you choose to work. As a rule of thumb, I usually suggest a minimum of 10 hours per week, and you do get more benefit by doing even more. However, fresh college grads can often be used to do much of the work, so it does not need to be a huge expense.
6. Getting Links From Social Media Profiles
Some social media sites, such as LinkedIn, allow you to link back to your own sites in your personal profile, and these links are not NoFollowed (meaning they pass link juice). Leveraging this can be a great tactic, as it is simple and immediate.
In the case of LinkedIn, the process takes a few steps:
- Login to LinkedIn.
- Click “Account & Settings” in the top right corner of the LinkedIn screen.
- Click on “My Profile” to edit it.
- Click “Websites” to edit your websites.
- This will present you with the ability to edit your additional information (click “edit”).
- Add a web site listing. Start by clicking the box that says “Choose” and selecting “other.” This is what will allow you to specify keyword-rich anchor text.
- Then enter in the correct URL and click “Save Changes.”
- Next we have to make your websites visible to the public (and the search engines).
- On the upper right of your screen you will see a link titled “Edit Public Profile Settings.” Click on it.
- On the next screen, under Public Profile, make sure that “websites” is checked. This is what tells LinkedIn to display that publicly.
- Click “Save Changes.”
Note that the above process was what was required as of early 2009, but the specifics may evolve over time as LinkedIn releases updates.
10 other social media sites that do not NoFollow links in their public profiles are:
- Flickr
- Digg
- Propeller
- Technorati
- MyBlogLog
- BloggingZoom
- Current
- Kirtsy
- PostOnFire
- CoRank
7. Blogging for Links
Blogging can also be quite effective in link development. How effective a blog will be is highly dependent on the content on it, the market space, and how it is promoted by the publisher. The first thing to realize when starting a blog is that it is a serious commitment. No blog will succeed if it does not publish content on a regular basis. How frequently a blog needs to publish depends on the market space. For some blogs, one post a week is enough. For others, it really needs to be 2-3 times per week, or even more.
Blogging is very much about reputation building as well. Quality content and/or very novel content is a key to success. However, when that first blog post goes up, the blog will not yet be well known, will not likely have a lot of readers, and those who do come by are less likely to link to a little-known blog. In short, starting a new blog for the purpose of obtaining links is a process that can take a long time. But it can be a very, very effective tool for link building. It’s just that patience and persistence are required.
Here are a few key things to think about when blogging for links:
- One of the best places to get links to a blog is from other blogs. This is best done by targeting relationships with major bloggers and earning their trust and respect.
- Be patient when developing relationships with other blogs. Trust and respect don’t come overnight, and they certainly don’t result from starting the relationship with a request for a link.
- The publisher should target a portion of their content at the interests of other major bloggers in their market area. Over time, this process should turn into links to the publisher’s blog from such other major bloggers.
- Once you obtain links from major bloggers, other less well-known bloggers will begin to see links on the major blogs and will begin to follow suit.
It is also important to leverage the social nature of the blog. Publishers should try to provide a personalized response to every person who comments on their blog. One effective way to do this is to send each and every one of them a personalized response by direct email that shows that the comment was read. This helps deepen the interest of the commenter, creates a feeling of personal connection, and increases the chance that the commenter will return and possibly add more comments. Nurturing the dialog on a blog in this fashion helps that dialog grow faster, and blogs with an active dialog taking place on them are more likely to receive links.
8. Widgets
Widgets are also a way of syndicating content to third-party sites. The concept is to develop a widget and provide it to other publishers to place on their sites in return for an attribution link back to your site. However, most widgets are implemented in some form of JavaScript, which can leave any links embedded within the widget invisible to the search engines. It is possible, though, to implement a widget with an HTML wrapper around it that contains a simple HTML text link, which is quite visible to the crawler. Popular widgets can get adopted by a large number of web sites and can result in a large number of links. Widget campaigns can also result in links to deep pages on your site.
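To make the wrapper idea concrete, here is a minimal sketch in Python that assembles the kind of embed snippet described above. The widget script URL, destination URL, and anchor text are hypothetical placeholders, and the exact markup a real widget provider uses will differ.

```python
# Minimal sketch (hypothetical URLs and anchor text): build a widget embed
# snippet whose attribution link is plain HTML outside the <script> tag,
# so crawlers can see it even if they ignore the JavaScript.
WIDGET_SCRIPT_URL = "http://www.example-widgets.com/widget.js"      # hypothetical
ATTRIBUTION_URL = "http://www.yourdomain.com/left-handed-widgets"   # hypothetical
ATTRIBUTION_TEXT = "Left Handed Widgets"                            # hypothetical

def build_embed_snippet() -> str:
    """Return the HTML a partner site would paste into its own pages."""
    return (
        '<div class="widget-wrapper">\n'
        f'  <script type="text/javascript" src="{WIDGET_SCRIPT_URL}"></script>\n'
        # The attribution link below is ordinary HTML rather than written by
        # JavaScript, so it stays visible to search engine crawlers.
        f'  <a href="{ATTRIBUTION_URL}">{ATTRIBUTION_TEXT}</a>\n'
        '</div>'
    )

if __name__ == "__main__":
    print(build_embed_snippet())
```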
A word of caution is merited. Widgets used for link building should be closely related to the content of the page receiving the link. An example of a campaign that ignored this principle is discussed in this post: Another Paid Links Service Disguised As Hit Counter. Four days after this post went up, the sites referenced lost all of their high rankings in Google, and therefore lost most of their traffic. The main reason for this is that the links given with the hit counter were unrelated to the widget and hidden in the <noscript> portion of the hit counter.
Be aware that making the link visible is not enough to make this practice legitimate in the search engines’ eyes. Google has confirmed that they consider tactics like the use of unrelated widgets for link building, such as a hit counter, a no-no, even if the link is visible. The underlying reason for this is that if the link is unrelated to the widget, it is pretty unlikely that the link given actually represents an endorsement of the web page receiving the link. The goal of the person installing the widget is to get the benefit of the contents of the widget. On the other hand, using this same strategy where there is a close relationship between the widget and the link given is far more likely to represent a real endorsement, particularly if content on the page receiving the link is the source of what is included in the widget.
Backlinking for Links
Backlinking, or seeing who links to whom, is one of the oldest methods of putting together a link campaign. If someone links to a competitor, there is a decent chance that they might be willing to link to you. The process starts by identifying sites related to the market space of the publisher, including directly competitive sites and non-competitive authoritative sites. Then, using tools such as Open Site Explorer, you obtain a list of the sites linking to those sites.
Third party tools such as SEOmoz’s Open Site Explorer or Majestic SEO can provide you backlink data in a spreadsheet format, and we will discuss these later on in this guide. You can also use Yahoo Site Explorer, but Yahoo! has already made it clear that this will shut down by 2012 as a result of the Microsoft-Yahoo! deal. Yahoo! Site Explorer provides a result set that looks like this:
One of the nice things about this is that you can download the first 1,000 results into a TSV file that you can pop straight into Excel, which gives you a list that looks like this:
Whatever tool you use, you can filter the list of links by sorting by URL and removing all the internal links (i.e., links from targetdomain.com to targetdomain.com), or use the tool to do that automatically for you. This will return a list of all the third-party links to the site.
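Here is a rough sketch of that filtering step, assuming the export has been saved as a CSV file named links.csv with a "Source URL" column; real exports from Yahoo! Site Explorer, Open Site Explorer, or Majestic SEO use their own column names and formats, so adjust accordingly.

```python
# Minimal sketch: drop internal links from an exported backlink list.
import csv
from urllib.parse import urlparse

TARGET_DOMAIN = "targetdomain.com"  # the site whose backlinks were exported

def is_internal(url: str, domain: str = TARGET_DOMAIN) -> bool:
    """True when the linking page lives on the target domain itself."""
    host = urlparse(url).netloc.lower()
    return host == domain or host.endswith("." + domain)

with open("links.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

external_links = [row for row in rows if not is_internal(row["Source URL"])]
print(f"{len(external_links)} third-party links out of {len(rows)} total")
```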
Once this list is obtained, the next steps are:
- Capture, at a minimum, the URL of the linking page, the anchor text used, the PR of the linking page, and the PR of the home page of the site containing the linking page. This step is an arduous process unless you use a third party tool to pull that data for you. If you use Open Site Explorer make sure to include Domain mozRank, Page mozRank, Domain mozTrust, and Page mozTrust.
- Analyze the linking pages to determine which ones are worthy of individualized attention. Campaigns can be designed very specifically around these sites, as discussed earlier in this document.
- Research the linking pages and try to determine the best contact to use in asking them for a link. This work should be done manually for two major reasons:
- Harvesting contact information from a web site with a computer program can run afoul of the CAN-SPAM Act (just don’t go there).
- Computer programs may not return the best contact to use, and the contact returned by such programs may in fact not be useful at all.
- Develop an email campaign using the guidelines provided above.
Performing backlink analysis on competitive sites is one great way to pursue this. If a site links to a competitor, there is a decent chance that it might link to your site. If your content is better than the competitor’s, this may even result in the competitor’s link being replaced by yours. Count that as a double win.
It is also interesting to perform backlink analysis on authoritative sites, even if they are not competitors. The sites linking to these authority sites in your space have an interest in the market space, and could be a good source of links. As always, publishers need to be careful that they are presenting a good value proposition to the Target Sites in order to maximize their return.
Press Releases and PR
Press releases are one of the staples of a PR campaign. Developing SEO optimized press releases can be a very effective link building technique as well. To be effective for link building, these need to be distributed by wire services such as PRWeb and BusinessWire.
The key to optimizing a press release is to choose keywords for the title that make it likely to be picked up by major news media editors. Having the correct keywords will align a press release with the search terms the news media editor uses. These terms are not the same as those used in web search, because they relate to search queries by highly knowledgeable people searching through feeds and/or databases of press releases. Designing a press release to attract attention is a valuable skill to develop.
Once a press release is picked up by a news editor, if the content is compelling, there is now a chance that they will write about it. Press exposure like this is awesome. The publisher gets the link from the editor’s site, and some of their readers will like the site and link to it too.
Direct Contact of News Media
Another staple of traditional PR is the practice of making direct contact with bloggers, news editors and writers and getting them interested in a story. In other words, instead of a shotgun approach like a press release, it’s a rifle shot approach. The idea is to conduct a highly customized specific effort to appeal to the interests of the editor or writer. In today’s environment, news media encompasses traditional magazines and papers, as well as bloggers. High value blogs can be just as influential as a major news site. Contacting the major players in a highly personalized way makes a lot of sense. Use common relationship building rules, such as making sure to bring value to the person being contacted. Study the things they like, write about, or are interested in. Use that knowledge to tailor the approach to them.
Partnerships and Licensing
One of the simplest ways for a publisher to obtain a link is to incorporate it into the business terms of the contracts that they execute. For example, if a business uses distributors or resellers to sell their products, they can have the distributor or reseller link back to the publisher’s site. This may seem like a compensated link, but that is at least a debatable point. The third party would not want to distribute or resell someone’s product unless they were willing to endorse it. Linking to the publisher’s site is simply one element of that endorsement.
There are two types of related programs that are seen by Google as being “non-editorial” in nature (i.e. they would prefer that these links have no value). These are:
1. Affiliate Programs
Affiliate programs are arrangements in which one web site links to another, and the site providing the link gets paid whenever a visitor follows it to the publisher’s site and buys something or completes some other form of conversion. Links in affiliate programs often come in a form similar to the following:
http://www.yourdomain.com?id=1234
The ID code is used to track the visitors from the affiliate to determine when a transaction or conversion occurs. The basic problem with this is that http://www.yourdomain.com?id=1234 is seen as duplicate content to the original page: http://www.yourdomain.com. This can cause some problems. In addition, Google has publicly indicated that it does not want people to use affiliate programs as a method for getting links that pass PageRank. They have stated this several times, including in this interview with Matt Cutts. Matt Cutts has also indicated that Google does a pretty good job at detecting when a link is an affiliate link.
For publishers that want to push the envelope a little bit, it is possible to implement a 301 redirect from http://www.yourdomain.com?id=1234 to http://www.yourdomain.com. This can be done by having a web application recognize the incoming affiliate link, cookie the user, and then execute the redirect. Publishers who use this technique need to recognize, however, that the basis of this type of link campaign carries some real risk.
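The following is a minimal sketch of that approach using Flask, purely for illustration: the affiliate parameter name (id) comes from the example URL above, while the cookie name and canonical URL are assumptions. A production affiliate system would also need to record the attribution for conversion tracking, and, as noted, this tactic carries real risk.

```python
# Minimal sketch: recognize an incoming affiliate link, cookie the visitor,
# and 301 redirect to the canonical URL so only one version of the page
# accumulates link value.
from flask import Flask, request, redirect

app = Flask(__name__)
CANONICAL_URL = "http://www.yourdomain.com/"  # assumed canonical landing page

@app.route("/")
def landing():
    affiliate_id = request.args.get("id")  # e.g. /?id=1234 from the affiliate link
    if affiliate_id:
        # Remember which affiliate referred this visitor, then send the
        # browser (and any crawler) on to the canonical URL with a 301.
        response = redirect(CANONICAL_URL, code=301)
        response.set_cookie("affiliate_id", affiliate_id, max_age=30 * 24 * 3600)
        return response
    return "Welcome to the canonical landing page."
```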
2. Discount Programs
This is a closely related concept, but it removes the element of compensation. The idea here is for the publisher to offer visitors from certain web sites a discount on the publisher’s product. Based on this, the publisher can encourage those other sites to put a link on their site back to the publisher’s site, noting the availability of the discount. This may appeal to such other sites as it allows them to offer real value to their visitors.
However, this is also seen by Google as a questionable practice. While it would seem that the discount offer would only be interesting to the third party site if they valued the product (or think their visitors will), Matt Cutts clearly indicated that Google does not want to value such links in the above referenced interview.
Buying Links for SEO
One of the more popular techniques is to buy links. It has two significant advantages:
- It seems like it should be easy. There is no need to sell the quality of the content on your site. The only things that need to happen are determining that the Target Site is willing to sell, and setting a price.
- Since the link is an ad, the buyer can simply specify the anchor text they want. Anchor text is a powerful ranking signal, and this is one of the major reasons that people engage in link buying.
Google’s Policy on Paid Links
The major downside is that buying links for SEO is against Google’s Webmaster Guidelines, and Google puts a lot of effort into detecting and disabling paid links. The policy on paid links can be briefly summarized as: Links given in return for compensation should not be obtained for purposes of improving PageRank / passing link juice.
Google is not saying that publishers should not be able to buy ads on the web. Their policy is that links which are purchased should only be purchased for the traffic and branding value that they bring. Google also recommends that publishers use the NoFollow attribute on such ads, which means that they will have no SEO value.
On another note, PPC campaigns using Adwords, Yahoo! Search Marketing, etc., are not considered a violation of the policy against paid links. This is because they are easily recognized by the search engine crawlers and do not pass link juice.
Methods for Buying Links
There are four major methods for buying links. These are:
- Direct link advertising purchases. This method involves contacting sites directly and asking them if they are willing to sell text link ads. Many sites have pages that describe their ad sales policies. However, sites that openly say that they sell text links are more likely to get caught in a human review, resulting in the link being disabled from passing PageRank by Google.
- Link brokers. The next method is the use of link brokers. These are companies that specialize in identifying sites selling links and reselling that inventory to publishers looking to buy such links. The major danger here is that ad brokers may have a template of some sort for their ads, and a template can be recognized by a spider as being from a particular broker.
- Charitable donations. Many sites of prominent institutions request charitable contributions. Some of these provide links to larger donors. Search for pages like this one on genomics.xprize.org that list their supporters and provide a link to them. Sometimes these types of links don’t cost a ton of money. This tactic is, however, frowned on by Google, and best used with care. One way a publisher can potentially make the tactic more acceptable is to support causes that are related in a material way to their site. However, it is not clear that this would be acceptable to Google either. Finding these types of sites may seem hard, but the search engines can help with this. Aaron Wall wrote a post about how to find donor links a long time ago, and the advice he gave then still holds true. Publishers can search on terms such as these:
- Sponsors
- Donors
- Donations
- “Please visit our sponsors”
- “Thanks to the following donors”
These can be filtered further by adding operators such as site:.org or site:.edu to the end of the search query.
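As a small illustration, the sketch below combines the phrases listed above with those site: filters to produce ready-to-paste Google queries; the phrase list simply mirrors this section and can be extended freely.

```python
# Minimal sketch: build Google queries for finding donation-based link pages.
PHRASES = [
    "sponsors",
    "donors",
    "donations",
    '"please visit our sponsors"',
    '"thanks to the following donors"',
]
SITE_FILTERS = ["site:.org", "site:.edu"]

queries = [f"{phrase} {site}" for phrase in PHRASES for site in SITE_FILTERS]
for q in queries:
    print(q)
# e.g.  "thanks to the following donors" site:.edu
```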
Strategies Which Are Not Considered Buying Links
It’s worth noting some strategies where money is involved in obtaining a link, yet it is not considered buying a link. Here are some examples:
- Using a consultant to promote articles on a social media site such as Digg
- Paying a PR firm to promote a site
- Paying a link building firm to ask for (as opposed to buy) links on your behalf
The key point is that these strategies do not compensate the Target Site itself for the links given, and the links are given freely.
Ultimately, I do not recommend buying links, and neither does SEOmoz. Buying links that the search engines cannot detect is hard work. The same level of effort can get you truly natural links without the risk.
Reciprocal Links
Another common practice is to trade links between web sites. As with buying links, in many cases websites will agree to trade links without regard to the quality of the content on the publisher’s site. Of course, this illustrates the problem of a lack of editorial judgment by the parties participating in the link swap.
Link trading used to be done by publishers in large volume. However, less of it occurs today, as more publishers are aware of the limitations of this technique. In early 2006, Google implemented an update that was called “Big Daddy.” What this update did was devalue swapped links involving websites that had an unnaturally high number of reciprocal links.
Another factor to be concerned about is the relevance of the site with which the link is traded. Poor relevance may limit the value of the link received. In addition, linking to unrelated sites can dilute the relevance of the site of the publisher. The bottom line is that a high volume campaign focused on reciprocal links is a bad idea.
However, this does not mean a publisher should never swap links. There is certainly a time and a place for it. For example, swapping a link with an authoritative site in the same market space as the publisher can be very beneficial indeed. One rule of thumb to use is whether or not the site that the publisher is considering swapping links with is something that they might link to even if they did not get a link in return. If the answer is yes, then the relevance and authority of the site may justify a link swap.
Links from Pages Without Editorial Control
There are many thousands of places on the web where links can be obtained without any editorial control. Some prime examples of this are:
- Forums
- Blogs
- Guestbooks
- Social media sites
- Wikis
- Discussion boards
- Job posting sites
These types of sites often allow people to leave behind comments or text without any review. A common spam tactic is to write a bot that crawls around the web looking for blogs and forums, and then put machine-generated comments in them containing a link back to the spammer’s site (or their client’s site). Most of the forums and blogs that have these comments added to them will automatically NoFollow all user contributed links, catch them in their spam filters, or find them manually and remove them. However, the spammer does not care because not all of the sites will catch and eliminate the comments or NoFollow them.
Throwing millions of comments out using a bot costs the spammer very little. If it results in a few dozen links that are not NoFollowed, then it is worth their effort. This is a deep black hat tactic and one that none of the search engines want publishers to use. It is highly risky, and getting caught is very likely to result in getting banned.
NoFollow
The NoFollow meta tag or attribute is one that the search engines have agreed should mean that the links so tagged will not pass any PageRank or link juice to the site receiving the link. It comes in two forms, the NoFollow meta tag and the NoFollow attribute.
NoFollow Meta Tag
The NoFollow meta tag is implemented by placing code similar to the following in the <head> section of a given web page:
<meta name="robots" content="NoFollow">
When this meta tag is seen by the search engines, it tells them that none of the links on the web page should pass any link juice at all to the pages they link to.
NoFollow Attribute
The NoFollow attribute is meant to allow a more granular level of control than the NoFollow meta tag. Its implementation looks something like this:
<a href="http://www.yourdomain.com/page37.html" rel="NoFollow">
When used as an attribute, the only link that is affected is the one contained within the same anchor statement (in this case, the link to www.yourdomain.com/page37.html). This allows publishers to select only specific links that they want to NoFollow from a given page.
NoFollow Uses and Scams
One of the most common uses of NoFollow is to NoFollow all links in comments on a blog or in a forum. This is a common and legitimate use of the tactic, as it is used to prevent spammers from flooding forums and blogs with useless comments that include links to their sites.
There used to be some common scams people implemented using NoFollow. One simple example of this was that someone would propose a link swap and provide the return link they promised, but NoFollow it. The effect of this was that it looked like a clean one way link from one site to the other, even though a swap was done.
The presumption behind this scam was that by NoFollowing the outbound link, the cheating site was conserving that link juice and saving it for its own site. This presumption is now known to be incorrect as a result of Matt Cutts’s statements at SMX Advanced 2009 and his follow-up post titled PageRank Sculpting. Google has now made it clear that the link juice is simply thrown away when you use NoFollow on a link, so perpetrators of this scam no longer receive any benefit from using NoFollow.
Publishers who are actively link building still need to be aware of NoFollow. There will be Target Sites that offer only NoFollow links for reasons unrelated to any scam, and these sites provide links that have little or no SEO benefit (though there may be other non-SEO benefits in those links, such as branding and traffic).
Helpful Search Terms, Phrases, and Advanced Query Parameters
The search engines provide a rich array of tools to perform link-related research. The following shows some useful search commands, as well as links to additional resources for learning more about query parameters.
1. inanchor:keyword
This command is useful in analyzing the relevance of a particular page based on the links pointing to it. For example, the search “inanchor:seo” returns the pages that have the best inbound links including the word SEO in the anchor text.
Publishers can go one step further and look at the pages on a particular website that have the best anchor text for a keyword as follows:
“inanchor:keyword site:domain-to-check.com”
This search is great for checking if a site has unnatural links. If 80% or more of the site’s links include the major keyword for their market (and it’s not part of their company name), it’s a sure bet that they are buying links or doing something similar to steer the anchor text. The operator is very valuable in learning about a competitor’s site and its strength in competing for the keyword. There are other related operators as well:
- intext:keyword – shows pages that have the phrase in their text
- intitle:keyword – shows pages that have the phrase in the page title
- inurl:keyword – shows pages that have the phrase in the URL of the page
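To make the 80 percent heuristic described a few lines above concrete, here is a rough sketch that computes what share of a site’s backlinks use a given keyword in their anchor text. It assumes the backlinks were exported to a CSV named backlinks.csv with an "Anchor Text" column, which is an assumption about the export format rather than a fixed standard.

```python
# Minimal sketch: what fraction of exported backlinks use the money keyword
# in their anchor text?
import csv

KEYWORD = "seo"  # the keyword being checked

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    anchors = [row.get("Anchor Text", "").lower() for row in csv.DictReader(f)]

keyword_links = sum(1 for a in anchors if KEYWORD in a)
share = keyword_links / len(anchors) if anchors else 0.0
print(f"{share:.0%} of {len(anchors)} links use '{KEYWORD}' in the anchor text")
if share >= 0.8:
    print("This anchor text profile looks unnaturally concentrated.")
```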
2. yourdomain.com -site:yourdomain.com, with the &as_qdr parameter
This one is a bit trickier. In Google, perform a search on domain-to-check.com -site:domain-to-check.com. Then add “&as_qdr=d” to the end of the URL on the search results page and reload the page. This will show the mentions that domain-to-check.com has received in the past 24 hours.
This command is very useful in evaluating the competitive strength of a site. Trying it for Microsoft.com results in the following:
Publishers who are competing against someone who has received 39,600 mentions in the past 24 hours are in deep trouble or should focus on ranking somewhat lower than that competitor! Fortunately, your competition probably gets fewer mentions in a day than Microsoft.
Here are some variants of the operator that can be used at the end of the URL:
- &as_qdr=w (past week)
- &as_qdr=m (past month)
- &as_qdr=y (past year)
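As a quick illustration of assembling that URL by hand, the sketch below builds the “recent mentions” results URL and appends the &as_qdr parameter. The base search endpoint and q parameter are the standard Google ones, but Google’s result-page parameters can change over time, so treat this as illustrative rather than guaranteed.

```python
# Minimal sketch: build the "recent mentions" search URL described above.
from urllib.parse import urlencode

def mentions_url(domain: str, period: str = "d") -> str:
    """period: 'd' = past day, 'w' = past week, 'm' = past month, 'y' = past year."""
    query = f"{domain} -site:{domain}"
    return "https://www.google.com/search?" + urlencode({"q": query, "as_qdr": period})

print(mentions_url("domain-to-check.com"))        # past 24 hours
print(mentions_url("domain-to-check.com", "w"))   # past week
```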
3. intext:domain-to-check.com
This command can help a publisher rapidly identify sites that reference “domain-to-check.com” but don’t implement that reference as a link. This command is very powerful because it can be used to identify “lost” links to the publisher’s domain. By identifying sites that reference the publisher’s domain, the publisher can go and contact those sites and ask them to convert it into a link. Since these are sites that are already endorsing the publisher’s site (most likely they are at any rate), the conversion rate can be expected to be reasonably high.
Sources & Additional Resources
Here are some additional resources for learning about search operators and how they can help with a variety of tasks:
Section III: Valuable Tools & Search Queries for Link Building
There are lots of tools available in the market for use in link building. This section will summarize some of the more interesting ones.
Note, however, that some of the articles you find on the web on the topic of advanced search operators will include references to the linkdomain: command that Yahoo! used to support. However, as of September 2010 that command is no longer supported.
The Basics
The first thing a publisher should do is develop an understanding of the links they already have. The 2 most basic tools for doing that are:
1. Google Webmaster Tools is a powerful first start. With Google Webmaster Tools, publishers can easily download a spreadsheet of links that Google has in its database (note that they might not include some links that they do not consider significant). Publishers can only use this tool to see the links to their own site. Here is a screenshot of how it looks:
2. Yahoo! Site Explorer will allow a publisher to see all of their backlinks. Yahoo! Site Explorer will also allow a publisher to see the backlinks for any web site, not just their own. You can also extract up to 1000 of these into a spreadsheet.
For quick and dirty link totals, it is handy to make use of a Firefox plugin known as Search Status. This provides basic link data on the fly with just a couple of mouse clicks. Here is the menu that is shown with regard to backlinks:
Notice also that it offers an option for highlighting NoFollow links, as well as many other capabilities. It’s a great tool to help pull numbers like these much more quickly than would otherwise be possible.
Competitive Backlink Analysis
One of the most important things a publisher does is obtain links to his/her site. A great way to do that is to see who links to the competition. Yahoo! Site Explorer is one tool for doing just that, but its major flaw is that you cannot extract more than 1,000 links into a spreadsheet. In addition, Yahoo! has announced that the tool has a limited future.
Open Site Explorer
However, advanced tools exist to do much of the work for the publisher. One example of such a tool is SEOmoz’s Open Site Explorer. Open Site Explorer is a tool that has been developed based on a crawl of the web conducted by SEOmoz, as well as a variety of parties engaged by SEOmoz. You can see a complete list of Open Site Explorer’s sources here.
Open Site Explorer is designed to be the tool for SEOs to use in really mapping links across the web. It gets around the 1,000 URL limit imposed by Yahoo! and will let you get as many links as are found by Open Site Explorer and extract them all into a spreadsheet.
The following is a sample report:
Included with this report are mozRank and mozTrust scores for each URL and domain listed. The tool also flags redirects, and you can easily obtain additional data, such as the Google PageRank scores for the linking page and its domain. No other tool exists to provide measurements of trust, and this is a powerful factor in link value.
The beauty of this tool is that it allows SEOs and publishers to collect competitive intelligence on other people’s domains, much like they do with Yahoo! Site Explorer today, but to escape the 1,000 result limit and also get SEOmoz’s proprietary mozRank and mozTrust scores. The importance of mozRank and mozTrust cannot be overstated. The PageRank score published by Google through their toolbar is known not to be a reasonable approximation of the value of a given page. The Google algorithm has simply evolved far past the original simplistic concept of PageRank that Google used in its early days. In addition, all the data is extractable into a spreadsheet to allow for easy offline processing.
Majestic SEO
Majestic SEO is a UK-based company that offers a powerful link analysis tool. It gets its data from a network of user computers whose owners donate unused CPU cycles for crawling the web. This data is then pulled back into a link database, which customers can then access. Here is an example of Majestic SEO’s historical view of links to a site:
You can also see this data filtered to show only the number of linking domains:
This helps filter out the impact of sitewide links on linking totals. Majestic SEO also allows you to extract large quantities of link data into spreadsheets for your own offline analysis.
Other Tools
Using some method for extracting link data in this fashion is a must for performing competitive backlink analysis.
Over at Search Engine Journal, Ann Smarty did a nice write-up of SEO tools. In it she mentioned Domain Inbound Links Checker, which appears to provide some quality data. However, many third-party tools used to rely on the Yahoo! Search API, which is no longer available given the deal between Yahoo! and Bing, so check that a tool’s output makes sense before committing to use it.
Determining Link Value
Early on in this guide, we established the notion of evaluating sites (and pages) to determine if they were Low Value, Medium Value, High Value, or Very High Value. We established the following parameters as being important:
- The PageRank of the site
- How trusted is the domain?
- The perceived authority of the site
- The PageRank of the page
- How much of the domain’s trust accrues to the page?
- The perceived authority of the page
- The number of outbound links on the page
- The relevance of the link
There are other factors that one could take into account. These can include the number of Delicious (and other bookmarking sites) tags a site has, or the total number of citations of the site on the web which are not implemented as links. One tool that helps with this is the Trifecta Tool, which is accessible via an SEOmoz PRO account. Most SEO firms that engage in link building for their clients will either leverage a tool such as Trifecta or an internally developed tool.
General Link Building Tools
Some other link building tools include:
- SoloSEO’s Link Search Tool provides an extensive list of links out to valuable advanced Google queries based on the selected search terms.
- Aaron Wall’s SEO Book Tools Page has a rich set of tools for link research and beyond.
Advanced Tools
Google Webmaster Tools
Google Webmaster Tools can do more for your link building campaign than just provide you with a list of the links you already have. Looking at the 404 errors report in Google Webmaster Tools can be very illuminating. Here is an example output from that report:
The URL circled in red is of particular interest. The correct URL is http://www.stonetemple.com/STC_Services.shtml. This type of data in the 404 report is a sure sign that someone has linked to the publisher’s site using an incorrect URL. The problem with this is that the link brings no benefit to the site, as links to a 404 page are not counted. However, this is easily fixed by 301 redirecting from http://www.stonetemple.com/stc_services.shtml to http://www.stonetemple.com/STC_Services.shtml.
On an Apache web server, this can be implemented in one single line in the .htaccess file that looks like this:
redirect 301 /stc_services.shtml http://www.stonetemple.com/STC_Services.shtml
Recovering Lost Links
Many times people reference a publisher’s site without implementing the reference as a true link. Mainstream media does this quite a bit – they will cite a source without implementing it as a link. The result is that the user can’t click on it and the search engine does not count it as a link to the publisher’s site. Some of the sites that reference a publisher’s site might be willing to make it a link if they are asked. After all, they are already endorsing the publisher’s content by referencing it, and perhaps the failure to implement it as a link was simply an oversight.
Finding these types of citations is easy using the following command:
intext:seomoz.org -site:seomoz.org
Once the publisher has executed this command on their site, the next step is to review the sites listed and then determine which ones are worth contacting to request a link. This should provide a fairly high return.
Section IV: Tracking Link Campaign Results
Once a link campaign is launched, it is important to track the results. This is also a difficult process. One of the basic ways to do this is to use the link reports in Google Webmaster Tools and Bing Webmaster Tools for ongoing comparisons. A tool such as Open Site Explorer or Majestic SEO can be run to get the current inbound links. Then you can find the links that appear in the tool’s link report and remove the ones that are in your existing database. The links that are left are the ones that are new to your site.
The other thing that can be done is to make use of the analytics software in use on the publisher’s site. For example, publishers can look at the referrers report on a daily basis and recognize when a brand new referrer shows up. The reason that this works (for the most part) is that people generally perform a quick test of a new link when they implement it to make sure it is working. This should show up as a new referral in the analytics software. As before, it is helpful to maintain a database (or spreadsheet) with a list of all historical referrers to make it easier to recognize when a new one comes in.
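As a rough sketch of the comparison described in the two paragraphs above, the snippet below treats the historical database and the current export as simple CSV files (previous_links.csv and current_links.csv, each with a "Source URL" column, all of which are assumed names) and reports anything in the current list that has not been seen before. The same set-difference approach works for a list of analytics referrers.

```python
# Minimal sketch: find linking URLs (or referrers) that are new since the
# last check by comparing the current export against a historical list.
import csv

def load_urls(path: str) -> set[str]:
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Source URL"].strip().lower() for row in csv.DictReader(f)}

known = load_urls("previous_links.csv")   # the historical database
current = load_urls("current_links.csv")  # this week's export

new_links = sorted(current - known)
print(f"{len(new_links)} new linking URLs since the last check:")
for url in new_links:
    print(" ", url)
```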
Section V: Putting It All Together
Starting Strategically
It is critical to start strategically when implementing a link building campaign. It’s easy to start down the wrong track and waste a lot of valuable time, so this should be avoided. Here are some of the major steps for assembling a strategy:
1. Identify Types of Link Targets
There are many possible different ways to classify types of link targets. Some examples include:
- Major media sites
- Social media sites (including Digg, Propeller, Delicious, StumbleUpon, and others)
- Blogs
- Universities and colleges
- Government sites
- Sites that link to your competitors
- Related hobbyist sites
These types of lists can get a lot more granular. There might be seven different categories of blogs that represent good targets for a publisher to pursue, each requiring a somewhat different link building strategy. In the first stage of putting together a link building strategy, publishers should focus on building a list of all the different options.
2. Identify Strategies for Each Segment
The next step is to identify what it will take to get links from a particular market segment. Publishers who want to get links from “green” bloggers, for example, will need to create something that appeals to that audience.
This step should not be underestimated. Without the right content, data, or tools to appeal to a given audience, the publisher will not be successful. In addition, understanding the content or tools that need to be developed is critical to understanding the cost of a link building strategy. Part of this step is also to identify the content packaging. For example, if the content is to be delivered using a widget, resources will need to be allocated to build the widget. Another question to ask is: will the content be placed on the publisher’s site or on a third party site (i.e., syndicated)?
3. Identify Channel Strategies
This relates to the method of reaching the Target Sites. As defined before, there are indirect means and direct means for reaching sites. Examples of indirect means are:
- PR
- Social media sites
- Writing a column in a major media magazine
- Speaking at conferences
There are also many direct means. Some of these include:
- Sending emails
- Calling people
- Sending snail mail
- Content syndication
4. Find Out What the Competition is Doing
The next step is to see what segments the competition is pursuing. This is useful from two perspectives:
- Determine where they are successful and follow in their footsteps
- Find out what segments they have ignored and use that to get ahead
Of course, a major part of this is also nailing down specific links the competition has and identifying the ones that have the highest value. Publishers should use the competitive review to provide valuable input into finalizing their strategy.
5. Review the Cost of the Various Strategies
How much will it cost to pursue a particular strategy? Will the strategies meet the business goals? Are there brand considerations? What will the cost be in terms of money and resources? These questions must be answered before finalizing. It is no good to have the ultimate link building strategy defined if the publisher is unable to pursue it.
6. Finalize the Strategies to Pursue
The final step of the strategic phase is the easiest. Review the available information that has been put together and decide on an affordable but effective set of strategies to pursue. To do this, it might make sense to build out a chart that helps you visualize the level of effort involved vs. the return.
Effort to Pursue | Campaign Value
---------------- | --------------
Low | High
Low | Medium
Low | Low
Medium | High
Medium | Medium
Medium | Low
High | High
High | Medium
High | Low
It may seem a bit simplistic, but this level of visualization can be extremely helpful. Obviously, you pursue the low effort – high value campaigns as a first priority, and you probably never get to the high effort – low value campaigns. You bring in other campaigns as makes sense along the way.
Execution
A world class link building campaign is always a large effort. The resources need to be lined up and focused on the goal. Publishers always experience bumps in the road and difficulties, but it is critical to line up the resources and go for it. It is also critical to be persistent. It is amazing how poorly many companies execute their strategies. Publishers that execute relentlessly and with vision inevitably gain an edge over many of their competitors. Of course, if there are other competitors that also execute relentlessly, it becomes even more important to push hard on link building.
Conducting Strategic Reviews
Link building strategies should evolve in the normal course of business. As campaigns are pursued, lessons are learned, and this information can be fed back into the process. New strategies are also conceived over time, and some of these are great ones to pursue. Also, sometimes the initial strategy goes great for a while but begins to run out of steam. Publishers should have a constant stream of ideas that they are feeding into their link building plans.
Creating a Link Building Culture
Publishers should also train many people within the organization about their link building plan, its goals, and how it will help the business. The purpose of this is to engage the creativity of multiple team members in feeding the stream of link building ideas. The more ideas, the better. The quality of a link building campaign is directly proportional to the quality of the ideas that are driving it.
Never Stop
Link building is not something that you do once, or once in a while. We live in a culture where the search engine plays a large role in the well being of a business. Consider the business that implements a great link building campaign, gets to where they want to, and then stops. What happens to them when their competitors just keep on going? They get passed and left behind. Publishers who are lucky enough to get in front need to be prepared to fight to stay there. Publishers who are not out in front should fight to get there.
Section VI: Conclusion
Links are the engine that drives search engine rankings. This is not meant to minimize the importance of on-page SEO (also known as “technical SEO”). A poorly optimized website will have problems getting traffic no matter how many links it gets. But because links and their anchor text are what the search engines use to determine how important a site is, and how relevant it is to a particular topic, they can be thought of as the multiplier of traffic potential.
Think of it in housing terms. When you buy a house, the first thing you probably should do is make sure all the major systems (electrical, heat, plumbing, and so on) are working safely and correctly. You don’t start with that kitchen remodeling project or an addition to the house. When you go to sell the house, however, the more visible improvements you have made, such as the remodeling project, will have a direct impact on the sales price. The work you did on the major systems won’t have much impact on the price, but it will affect whether you are able to sell the house at all (because people don’t want to move into a house with major systems problems). And so it is with technical SEO and link building. You won’t be able to get traffic if your site architecture is broken or there is no real content on the site, but link building drives the site’s value.
In addition, think of link building as a marketing function, much like PR. Link building is about getting the word out about the great stuff you have on your site. Because of the dynamics of getting someone to link to you, you have to sell them on the quality of what you have, even if you make use of some form of incentives (other than compensation based incentives). Where link building differs from PR is that link building has specific goals about the outcome which is desired as a result of the promotional activities, and there are specific types of tactics and targets a link builder will choose as a result.
Finally, think of link building as a cost of doing business. You don’t want to make the mistake of running a campaign, then stopping, then starting again, then stopping. All this does is give a more determined competitor the chance to get in front of you and stay there, because they do link building continuously and you don’t. Don’t let them do that. There is so much to be gained by increasing your search traffic – it is an opportunity that few can afford to ignore.
Go get yourself some links with Natural link building