Posts Tagged ‘Google’

SEO After Google Panda and Penguin

Published by Todd Herman on July 31st, 2012 - in Penguin update

Google Panda is an algorithm update that can influence search engine results pages (SERPs). It works by reducing traffic to sites with low quality content. The main factors affecting a website's search performance under Panda include duplicate content, page headings that do not match the page's content, and a high amount of advertising on the site.

Similar to Panda, Google Penguin, released in April 2012, is another Google update affecting SEO marketing for your business.

The Penguin update targets a number of spam tactics such as cloaking, keyword stuffing and content spinning. The following paragraphs describe SEO marketing after the Google Panda and Penguin updates.

How does Google Panda affect SEO?

Eliminating low quality content is one of the main intentions of the Google Panda update. It drives more traffic to sites with good content and helps internet users find what they need. To avoid the risk of a Panda penalty, internet marketers are advised to take care with what they publish: Google promotes good, useful and genuine content.

Tips to improve SEO after Google Panda

* Create the finest quality content

Publishing high quality content is one of the best ways to improve SEO marketing after Google Panda. To protect a website from Panda, administrators are advised to remove keyword stuffed and filler content from their pages.
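
As a rough illustration only (this is not an official Panda threshold, and the sample copy and 3% cutoff are invented), a simple keyword-density check can help flag copy that may read as stuffed:

    import re

    def keyword_density(text, keyword):
        """Return the share of words in `text` that equal `keyword` (case-insensitive)."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words)

    # Made-up copy and a made-up 3% threshold, purely for demonstration.
    page_text = "Cheap shoes. Buy cheap shoes online. Our cheap shoes are the best cheap shoes."
    density = keyword_density(page_text, "cheap")
    print(f"Keyword density: {density:.1%}")
    if density > 0.03:
        print("Possible keyword stuffing - consider rewriting this copy.")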

* Design a user friendly website

Website design plays an important role in SEO marketing. To avoid trouble with Google Panda, internet marketers are advised to choose a user friendly design for their website.

Apart from user friendly web design, SEO can also be improved by promoting product reviews, encouraging customer feedback and participating actively in social media sites like Twitter and Facebook.

* Update content

Updating content regularly is another important way to improve SEO marketing after Google Panda. While updating, never hesitate to remove keyword stuffed content from the site.

How does Penguin affect SEO?

The Penguin update is dedicated to penalizing spammers. Backlinks from content farms are one of the main factors that can get a website hit by Penguin. The best way to avoid this trouble is to earn relevant links by adding unique, high quality content to the site.

Tips to improve SEO marketing after Penguin

* Avoid heavy use of JavaScript

How many of you use JavaScript in your web design? Content that is delivered only through JavaScript or Flash can reduce a site's search engine visibility, because search engines may not see it. It is therefore recommended to keep important content and links in plain HTML rather than relying on JavaScript.
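
A quick sanity check, sketched here purely as an illustration (the URL and phrase below are placeholders), is to fetch a page without executing any JavaScript and confirm that the text you care about already appears in the raw HTML:

    import urllib.request

    def phrase_in_raw_html(url, phrase):
        """Fetch the page without executing JavaScript and look for the phrase in the raw HTML."""
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        return phrase.lower() in html.lower()

    # Placeholder URL and phrase - substitute a real page and a sentence crawlers must see.
    if phrase_in_raw_html("https://example.com/", "Example Domain"):
        print("Phrase found in the raw HTML, so it is visible without JavaScript.")
    else:
        print("Phrase missing from the raw HTML; it may only be rendered client-side.")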

* Limit the use of media files

Heavy use of media files can negatively impact a website's performance. Large files increase loading time and slow the site down, which in turn reduces traffic and hurts SEO marketing.
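
As an illustration only, the sketch below totals the sizes that servers report for a page's media files, so you can see how much weight they add; the asset URLs are placeholders you would replace with your own:

    import urllib.error
    import urllib.request

    def asset_size_bytes(url):
        """Return the Content-Length a server reports for an asset, or 0 if unavailable."""
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request) as resp:
                return int(resp.headers.get("Content-Length", 0))
        except (urllib.error.URLError, ValueError):
            return 0

    # Placeholder asset URLs - replace with the images and videos your pages actually load.
    assets = [
        "https://example.com/hero-banner.jpg",
        "https://example.com/product-video.mp4",
    ]
    total = sum(asset_size_bytes(u) for u in assets)
    print(f"Total media weight: {total / 1024:.0f} KiB")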

* User friendly website navigation

Following a proper website structure and design is one of the main steps to improve SEO marketing after Penguin. For easier navigation, website administrators are advised to place important categories such as Home in a clearly visible area of the site.

* Normalizing URLs

Normalizing URLs helps avoid duplicate content: when the same page is reachable through several URL variants, consolidating them into one canonical form protects the page from being treated as duplicate content.
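
A minimal sketch of what URL normalization can look like (the exact rules vary by site, and the tracking parameters stripped here are just common examples):

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

    def normalize_url(url):
        """Collapse equivalent URL variants into one canonical form."""
        scheme, netloc, path, query, _ = urlsplit(url)
        scheme = scheme.lower()
        netloc = netloc.lower()
        # Drop default ports.
        if (scheme == "http" and netloc.endswith(":80")) or (scheme == "https" and netloc.endswith(":443")):
            netloc = netloc.rsplit(":", 1)[0]
        # Remove trailing slashes (keeping the root path).
        path = path.rstrip("/") or "/"
        # Drop common tracking parameters and sort the rest for a stable order.
        params = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
                  if k.lower() not in TRACKING_PARAMS]
        query = urlencode(sorted(params))
        return urlunsplit((scheme, netloc, path, query, ""))  # fragment dropped

    print(normalize_url("HTTP://Example.com:80/Shoes/?utm_source=ad&color=red"))
    # -> http://example.com/Shoes?color=red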

7/31/2012 – Todd Herman

New Google Webmaster Help Videos?

Published by Todd Herman on June 25th, 2012 - in Guest Post SEO

Just received nine new “Google Webmaster Help Videos”, and kudos goes out to Google, as they are very informative and easy to understand. These videos help you recognize the factors that place a website at the top of Google's search results, which helps website owners who want to perform their own SEO.

There are no secrets in these videos; they are plain and to the point. The message is clear, and it's the same message they have been sending from the beginning:

Take out your website garbage!

Todd.

Tags: Help, New, Videos, Webmaster

Google, Bing & Yahoo in Partnership to Sell Top Organic Local Listings?

Published by Todd Herman on February 7th, 2012 - in Guest Post SEO

A new service offered by Bruce Clay Inc. called Local Paid Inclusion should raise a few eyebrows in the search marketing community – if there’s any truth to it, that is. Officially in alpha, according to backend partner Universal Business Listings, the LPI program will offer top organic rankings in local listings for a fee, sources told Search Engine Watch.

UBL’s Doyal Bryant told SEW in a phone interview that the service is on hold, at the very least until next week, while the organizations test and troubleshoot.

However, both Google and Bing strongly deny any such program is in development.

“We are not working on any program that enables a site to pay to increase ranking in organic search results,” according to a Google spokesperson.

Bing also denied taking part in such a program.

“Bing is not working on the Local Paid Inclusion program and would not consider giving preferential treatment to advertisers in organic search results,” according to a Microsoft spokesperson.

Now UBL is also denying any involvement:

Universal Business Listing denies any association with articles and news reports about a “paid inclusion” business listing service. The company has made no such announcements or claims, particularly in regards to Google. It has no product announcements pending.

Bruce Clay Inc is a reseller of UBL’s existing business listing syndication service and is not currently testing any new service from our company.

One program partner explained to Search Engine Watch how it will work:

“Using Google as an example, a local business in the ‘organic places’ area can pay a small monthly fee and this program moves them to the top area of the Places results. So essentially, it creates a premium section at the top of the Places results that never before existed, and a local business can pay a fee to appear in that area. As a result, whenever Places appears on the first page of Google results, and you are in the Local Paid Inclusion program, you should appear in that area of the first page of Google results.”

The Local Paid Inclusion website, owned by Bruce Clay, states:

“In January of 2012 we were approached to participate in a new and exciting program: Local Paid Inclusion (LPI). We’re offering it directly to local businesses, to chains of businesses, to resellers and through large distribution channels. We have an exclusive agreement to distribute LPI to domain registrars. Local Paid Inclusion is a Google, Yahoo and Bing official service that is offered as an approved official contracted program in cooperation with those search engines. This is a program supported by the search engines directly – and you can order it here. The search engines do not sell this directly.”

The website also has pricing information:

The paid inclusion prices are based upon value: First page local results rankings for an average of less than $1.70 per day. If Call Tracking is involved that call fee is extra. This fee covers up to 30 keywords appropriate to your profile page and business, making the fee about $0.06 per day per keyword… less expensive than PPC and definitely higher impact because it is in the organic results area of the search results page.

One source said plans called for the paid listings to be categorized as organic and would not be marked paid, advertising, or sponsored – they would blend seamlessly with organic local listings. This offering is not connected with Google’s AdWords Express program or other similar programs offered by others in the arrangement, but would create new space for LPI program listings.

“It’s a really exciting program, when we’re ready we’ll start talking about it,” Bryant told us earlier today.

This arrangement raises a few important questions:

  • Should search engines profit from the sale of organic listings?
  • Should this type of paid advertising be marked as such?
  • How does paid organic inclusion affect the quality of local listings, when paid listings can outrank those chosen for the top spot based on relevancy, geography, ratings, or other factors?
  • Hasn’t paid inclusion died a slow death a few times already?

For now, there are more questions than answers. As it stands, the players have been dragged into the limelight on this one kicking and screaming, you could say; the search engines involved are reportedly working out technical issues and did not want the program announced for another two to three weeks.

Editor’s note: This story has been updated to include statements from Google, Bing, and UBL.

Google May Penalize Your Site for Having Too Many Ads

Published by Todd Herman on November 30th, 2011 - in Guest Post Internet Marketing

Google is looking at penalizing ad heavy sites that make it difficult for people to find good content on web pages, Matt Cutts, head of Google’s web spam team, said yesterday at PubCon during his keynote session.

“What are the things that really matter? How much content is above the fold,” Cutts said. “If you have ads obscuring your content, you might want to think about it,” implying that if a user is having a hard time viewing content, the site may be flagged as spam.

Google has been updating its algorithms over the past couple of months through its various Panda updates. After looking at the sites Panda penalized during the initial rollout, one working theory was that Google was dropping the rankings of sites with too many ads "above the fold."

This is an odd stance, considering Google AdSense Help essentially tells website publishers to place ads above the fold by noting, “All other things being equal, ads located above the fold tend to perform better than those below the fold.”
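
Neither Google nor Cutts published a formula, but the general idea of an above-the-fold ad-to-content ratio can be sketched roughly as follows; the element sizes and the 600-pixel fold are invented numbers for illustration:

    FOLD_PX = 600  # assumed height of the first screen, in pixels

    # Hypothetical element boxes for one page layout: (top_px, height_px, is_ad).
    # Widths are ignored (every element is treated as full-width) to keep the sketch simple.
    elements = [
        (0,   90,  False),  # header / logo
        (90,  250, True),   # leaderboard and sidebar ads
        (340, 400, False),  # article body, extending below the fold
        (740, 250, True),   # in-content ad, entirely below the fold
    ]

    def above_fold_pixels(top, height, fold=FOLD_PX):
        """Vertical pixels of an element that fall within the first screen."""
        if top >= fold:
            return 0
        return min(top + height, fold) - top

    ad_px = sum(above_fold_pixels(t, h) for t, h, is_ad in elements if is_ad)
    content_px = sum(above_fold_pixels(t, h) for t, h, is_ad in elements if not is_ad)
    ratio = ad_px / (ad_px + content_px)
    print(f"Above-the-fold ad share: {ratio:.0%}")  # about 42% for this made-up layout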

Cutts also encouraged any website owners who feel their site has been wrongly marked as spam to report it to him and his team. He said a team of web spam experts looks into problem sites, and that the Google algorithm still gets a site or two wrong in its changes.

SEO Is Not Dead, It Is Always Evolving

Leo Laporte took the stage Tuesday as the keynote speaker at PubCon. Laporte talked about video and how getting your audience involved with you is the next step in online media.

Later, Laporte said he believes SEO will be dead in the next six months. As you’d expect, the crowd responded negatively to this assertion, even causing many of them to walk out.

Yesterday, Cutts responded during his keynote talk. He started by setting the record straight, letting everyone in the audience know that SEO will still be here in six months, and in six years for that matter.

Cutts joked about a tweet describing him "spitting out his morning coffee" in reaction to Laporte's statement the previous morning. He treated the whole thing as more of a joke and laughed it off.

SEO will always be evolving, Cutts told the audience. Search keeps getting better and more personalized for each of us, and Google will keep striving to give people the best results possible, including fresh, real-time results.

Later he noted that even if Google and every other search engine were to disappear, Internet marketing and SEO would still be alive because of social media. Looks like SEO is here to stay!

Tags: ads, Penalize

Google Algorithm Changes: Google Lists Ten New Ones

Published by Todd Herman on November 15th, 2011 - in Guest Post SEO

Google put out a blog post highlighting ten recent algorithm changes. Don’t get too excited about the SEO possibilities though, because Google only listed the ones it deemed non-gamable.

These are changes that Google has made over the past couple of weeks. Here’s the list, as explained on Google’s Inside Search Blog:

  • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
  • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
  • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page’s title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page’s content.
  • Length-based autocomplete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
  • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
  • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
  • Fresher, more recent results: As we announced just over a week ago, we’ve made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
  • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
  • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
  • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.

Google said last week that it aims to be more transparent about algorithm changes going forward, and intends to announce major updates as they launch, much like they did with Panda and the recent freshness update.

Google also says it is testing algorithm changes that will look more closely at ad-to-content ratio for the portion of a page that resides “above the fold.” Expect this to be a more critical signal in 2012.

By Chris Clum

Tags: algorithm updates, search

Google Places SEO

Published by Todd Herman on July 6th, 2011 - in Guest Post SEO

Google Places SEO: Lessons Learned from Rank Correlation Data

In early June of this year, SEOmoz released some ranking correlation data about Google's web results and how they mapped against specific metrics. This exciting work gave us valuable insight into Google's ranking system, confirming many assumptions and opening up new lines of questioning. When Google announced its new Places Results at the end of October, we couldn't help but want to learn more.

In November, we gathered data for 220 search queries – 20 US cities and 11 business “types” (different kinds of queries). This dataset is smaller than our web results set, and was intended as an initial data gathering project before we dove deeper, but our findings proved surprisingly significant (from a statistical standpoint) and thus, we're making the results and report publicly available.

As with our previous collection and analysis of this type of data, it’s important to keep a few things in mind:

  1. Correlation ≠ Causation – the findings here are merely indicative of what high ranking results are doing that lower ranking results aren't (or, at least, are doing less of). It's not necessarily the case that any of these factors are the cause of the higher rankings; they could merely be a side effect of pages that perform better. Nevertheless, it's always interesting to know what higher ranking sites/pages are doing that their lower ranking peers aren't.
  2. Statistical Significance – the report specifically highlights results that are more than two standard errors away from zero (98%+ chance of a non-zero correlation). Many of the factors we measured fall into this category, which is why we're sharing despite the smaller dataset. In terms of the correlation numbers, remember that 0.00 is no correlation and 1.0 is perfect correlation. In our opinion, in algorithms like Google's, where hundreds of factors are supposedly at play together, correlations in the 0.05-0.1 range are interesting and those in the 0.1-0.3 range are potentially worth more significant attention.
  3. Ranked Correlations – the correlations compare pages that ranked higher against those that ranked lower, and the datasets in the report and below report average correlations across the entire dataset (except where specified), with standard error as the measure of accuracy (a minimal example of this kind of calculation is sketched just after this list).
  4. Common Sense is Essential – you'll see some datapoints, just like in our web results set, suggesting that sites not following commonly held “best practices” (like using the name of the queried city in your URL) achieve better rankings. We strongly urge readers to use this data as a guideline, but not a rule (for example, it could be that many results using the city name in the URL are national chains with multiple “city” pages, and thus aren't as “local” in Google's eyes as their peers).
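
For readers who want to run this kind of analysis on their own result sets, here is a minimal sketch of the calculation referenced in point 3, using invented positions and metric values; it is not the report's actual code:

    from math import sqrt
    from statistics import mean, stdev
    from scipy.stats import spearmanr

    # Invented data: for each of three search result sets, the listings' positions (1 = top)
    # and one metric per listing, such as review count. The real study used 220 result sets.
    result_sets = [
        ([1, 2, 3, 4, 5], [84, 45, 60, 22, 31]),
        ([1, 2, 3, 4, 5], [12, 40, 9, 15, 3]),
        ([1, 2, 3, 4, 5], [55, 20, 25, 18, 7]),
    ]

    correlations = []
    for positions, metric in result_sets:
        rho, _ = spearmanr(positions, metric)
        correlations.append(-rho)  # flip the sign so "more of the metric near the top" is positive

    avg = mean(correlations)
    std_err = stdev(correlations) / sqrt(len(correlations))  # standard error of the mean correlation
    print(f"Mean Spearman correlation: {avg:.2f} +/- {std_err:.2f}")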

With those out of the way, let’s dive into the dataset.

  • The 20 cities included:
    • Indianapolis
    • Austin
    • Seattle
    • Portland
    • Baltimore
    • Boston
    • Memphis
    • Denver
    • Nashville
    • Milwaukee
    • Las Vegas
    • Louisville
    • Albuquerque
    • Tucson
    • Atlanta
    • Fresno
    • Sacramento
    • Omaha
    • Miami
    • Cleveland
  • The 11 Business Types / Queries included:
    • Restaurants
    • Car Wash
    • Attorneys
    • Yoga Studio
    • Book Stores
    • Parks
    • Ice Cream
    • Gyms
    • Dry Cleaners
    • Hospitals

Interestingly, the results we gathered seem to indicate that across multiple cities, the Google Places ranking algorithm doesn't differ much, but when business/query types are considered, there are indications that Google may indeed be changing up how the rankings are calculated (an alternative explanation is that different business segments simply have dramatically different weights on the factors depending on their type).

For this round of correlation analysis, we contracted Dr. Matthew Peters (who holds a PhD in Applied Math from Univ. of WA) to create a report of his findings based on the data. In discussing the role that cities/query types played, he noted:

City is not a significant source of variation for any of the variables, suggesting that Google's algorithm is the same for all cities. However, for 9 of the 24 variables we can reject the null hypothesis that business type is not a significant source of variation in the correlation coefficients at α = 0.05. This is highly unlikely to have occurred by chance. Unfortunately there is a caveat to this result. The results from ANOVA assume the residuals are normally distributed, but in most cases the residuals are not normal as tested with a Shapiro-Wilk test.

You can download his full report here.
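
To make the quoted procedure concrete, here is a minimal sketch of the same two steps, a one-way ANOVA across business types followed by a Shapiro-Wilk test on the residuals, using invented correlation values rather than the report's data:

    from scipy.stats import f_oneway, shapiro

    # Invented per-city correlation values for one variable, grouped by business type.
    restaurants = [0.12, 0.18, 0.09, 0.15, 0.11]
    attorneys = [0.31, 0.27, 0.35, 0.22, 0.30]
    gyms = [0.05, 0.08, 0.02, 0.07, 0.04]
    groups = [restaurants, attorneys, gyms]

    # One-way ANOVA: does business type explain variation in the correlation coefficients?
    f_stat, p_value = f_oneway(*groups)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("Reject the null hypothesis: business type is a significant source of variation.")

    # ANOVA assumes normally distributed residuals; check that assumption with Shapiro-Wilk.
    residuals = [x - sum(g) / len(g) for g in groups for x in g]
    w_stat, shapiro_p = shapiro(residuals)
    print(f"Shapiro-Wilk on residuals: W = {w_stat:.2f}, p = {shapiro_p:.4f}")
    if shapiro_p < 0.05:
        print("Residuals deviate from normality, so treat the ANOVA result with caution.")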

Next, let’s look at some of the more interesting statistical findings Matt discovered. These are split into 4 unique sections, and we’re looking only at the correlations with Places results (though the data and report also include web results).

Correlation with Page-Specific Link Popularity Factors

NOTE: In this data, mozRank and PageRank are not significantly different than zero.

Domain-Wide Link Popularity Factors

NOTE: In this data, all of the metrics are significant.

Keyword Usage Factors

All data comes directly from the results page URL or the Places page/listing. Business keyword refers to the type, such as “ice cream” or “hospital” while city keyword refers to the location, such as “Austin” or “Portland.” The relatively large, negative correlation with the city keyword in URLs is an outlier (as no other element we measured for local listings had a significant negative correlation). My personal guess is nationwide sites trying to rank individually on city-targeted pages don’t perform as well as local-only results in general and this could cause that biasing, but we don’t have evidence to prove that theory and other explanations are certainly possible.

NOTE: In this data, correlations for business keyword in the URL and city keyword in the title element were not significantly different than zero.

Places Listings, Ratings + Reviews Factors

All data comes directly from Google Places’ page about the result.

NOTE: In this data, all of the metrics are significant.

Interesting Takeaways and Notes from this Research:

  • In Places results, domain-wide link popularity factors seem more important than page-specific ones. We’ve heard that links aren’t as important in local/places and the data certainly suggest that’s accurate (see the full report to compare correlations), but they may not be completely useless, particularly on the domain level.
  • Using the city and business type keyword in the page title and the listing name (when claiming/editing your business’s name in the results) may give a positive boost. Results using these keywords seem to frequently outrank their peers. For example:

  • More is almost always better when it comes to everything associated with your Places listing – more related maps, more reviews, more “about this place” results, etc. However, this metric doesn’t appear as powerful as we’d initially thought. It could be that the missing “consistency” metric is a big part of why the correlations here weren’t higher.
  • Several things we didn’t measure in this report are particularly interesting and it’s sad we missed them. These include:
    • Proximity to centroid (just tough to gather for every result at scale)
    • Consistency of listings (supposedly a central piece of the Local rankings puzzle) in address, phone number, business name, type
    • Presence of specific listing sources (like those shown on GetListed.org for example)
  • This data isn’t far out of whack with the perception/opinions of Local SEOs, which we take to be a good sign, both for the data, and the SEOs surveyed :-)

Our hope is to do this experiment again with more data and possibly more metrics in the future. Your suggestions are, of course, very welcome.


As always, we invite you to download the report and raw data and give us any feedback or feel free to do your own analyses and come to your own conclusions. It could even be valuable to use this same process for results you (or your clients) care about and find the missing ingredients between you and the competition.

p.s. Special thanks to Paris Childress and Evgeni Yordanov for help in the data collection process.

© 2013 https://canadaseopro.ca - Canadian SEO Blog