
Post-Penguin SEO In 2013: Foundational Strategies

According to Matt Cutts, Google’s well-known search engineer, Google changes its algorithms around 500 times per year, which works out to more than one change per day on average. If you think about it, keeping up with that many changes is impossible. Instead of chasing Penguin, Panda, and every other individual update, you need to go deeper than the commonly employed SEO techniques. Let’s take a look at some foundational strategies:

Share-worthy Guest Posts

We all know that guest blogging is a hotly debated topic in the SEO community. It is arguably one of the best ways to get traffic and backlinks and to improve your social profile, all at the same time. But just as we permanently ruined article directories by stuffing our articles with keywords, many people are now misusing guest blogging. There are hundreds of thousands of blogs out there, full of useless, keyword-stuffed content with no real value.

The links within these so-called guest posts never get clicked, and the pages have abnormally high bounce rates. However, the glut of mediocre guest posts makes it easier for you to stand out, especially if your content is unique, original, and share-worthy. Truly useful content gets passed around on social networks, and legitimate referral traffic from social networks is an indicator of human activity. Search engines look for pages with high levels of human activity and actively measure social signals such as Facebook likes, tweets, shares, and comments.

With this said, you have an even greater incentive to write useful content rather than writing content for the sake of it. Think of guest blogging as an opportunity to make an impression on the host website’s visitors. You will only get one chance, but if you write an innovative and unique post with a twist, the host might invite you to write more posts.

Author Rank

Author rank is yet another step by Google towards a better web. The idea is to rank authors based on the authority and quality of their content as determined by Google’s algorithms. So, how does one go about building author rank?

Google+ comes in handy here. Go to the “About” section of your Google+ profile and add links to the webpages you regularly contribute content to. Remember that Google+ plays an integral part in building your author rank, and Google considers many factors to compute it: the number of people in your circles, the number of people who have you in their circles, how frequently you publish content, and the level of social activity on your Google+ profile.

This is what you should do to increase your author rank:

1. Create great content regularly
2. Be active on social media
3. Share useful content with people in your network

Co-Citation

Co-citation is yet another hotly debated topic among SEOs, and there is some disagreement about what exactly co-citation is and how it differs from co-occurrence. Some see co-citation as a step by Google towards a more intelligent, semantic web. The concept itself is simple to understand, and there are real examples of co-citation helping websites rank for keywords that appeared neither in their title nor in their description tags.

Co-citation is best explained with an example. Suppose Rob’s Ice Cream Parlor makes great “Blueberry Ice Cream.” Search engines cannot make a connection between the phrases “Rob’s Ice Cream Parlor” and “Blueberry Ice Cream” unless the two are mentioned together across various websites. Once they are, Google can spot the connection between two keywords or key-phrases, such as the correlation between a company and its products.

There are numerous examples of co-citation helping websites rank for keywords they aren’t even attempting to rank for. SEOmoz’s well-known Open Site Explorer ranks for the search term “backlink analysis,” even though the phrase appears in neither its title, its description, nor the page itself. Google pulled the search snippet from another article elsewhere on the web, and that article mentions “backlink analysis” and “Open Site Explorer” together.
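The underlying signal is easy to sketch. This is a toy illustration of counting co-mentions across a document collection; the pages and phrases are hypothetical examples, and Google’s real algorithm is of course far more sophisticated:

```python
# Toy sketch of the co-citation signal: count pages that mention two
# phrases together. Sample pages are invented for illustration.
pages = [
    "Rob's Ice Cream Parlor serves amazing blueberry ice cream daily.",
    "Best blueberry ice cream in town? Rob's Ice Cream Parlor, easily.",
    "This review covers vanilla and chocolate flavors only.",
]

def co_mentions(docs, phrase_a, phrase_b):
    """Count documents that mention both phrases (case-insensitive)."""
    return sum(
        phrase_a.lower() in d.lower() and phrase_b.lower() in d.lower()
        for d in docs
    )

print(co_mentions(pages, "Rob's Ice Cream Parlor", "blueberry ice cream"))  # → 2
```

The more often two phrases appear together across independent pages, the stronger the inferred association between them.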

Press Releases

Some things never get old. Press releases are one of the oldest tricks in an SEO’s arsenal, but they still work. A well-positioned press release with useful information about your company’s products or services not only drives traffic to your website but also raises awareness of your brand among the public. Apart from the usefulness of your content, you can also insert relevant anchor text pointing to your home and inner pages where appropriate. This gets you backlinks and traffic simultaneously.

How you distribute your press release is also very important. You can choose to email it to journalists and online news outlets manually, or use the services of a company like PRWeb or PR Newswire. Most of these companies guarantee syndication of your press release to hundreds of online news outlets, so if you know how to write a great press release, it can prove to be a great tool (especially when you’re launching a new product, service or campaign).

It is important to note that Matt Cutts of Google has stated that backlinks from press releases do not contribute towards PageRank, although some recent tests have suggested otherwise.

Social Media Signals

A couple of years ago, Matt Cutts denied that social signals affect rankings, but that denial has since turned into an admission. Not only do social signals affect rankings; Google made them an integral part of its ranking algorithm with the introduction of Google+.

So, what’s all the fuss about? It’s quite simple, really. Social signals, as we mentioned earlier, are an indicator of human activity. The number of times something gets liked, tweeted or commented on becomes an indicator of social value. The more popular something is, the more authoritative it becomes. This means social signals not only have the ability to affect rankings, but also the perceived value and authority of a website in the eyes of search engines.

Google and Bing, both major search engines, actively consider social signals to help rank websites and make SERPs more relevant to the user. Therefore, it is more important than ever to create share-worthy content, which can even go viral if you’re lucky, and focus your efforts on creating value rather than brute link building.

Understanding the Value of Latent Semantic Indexing (LSI) Content

LSI content is becoming an increasingly important factor in search engine optimization (SEO). LSI stands for “latent semantic indexing” and it is a method of analyzing the concepts connected with a document or a collection of documents.

Optimizing for LSI is not replacing traditional SEO, but instead, it works together with the older techniques. LSI content will likely become more important in the future as the major search engines try to deliver web content that provides closer matches for user queries.

What is latent semantic indexing?

The word “semantics” refers to the meaning of words and LSI attempts to discover the concepts associated with web pages by analyzing how words work in combination with other words. Many terms will have different meanings depending on the context of the document.

What LSI attempts to do is to map out the relationships between words in order to help decipher the meaning of the text. While the algorithms used to rank web pages cannot “think” like humans, they can compare the words used in a particular document along with categorizing the structure of that document.

Additionally, search ranking algorithms can analyze collections of documents to help in better determining the related concepts of the collection.

How LSI can change search engine result pages (SERPs)

One of the tangible results of latent semantic indexing is that, in many cases, it can have a marked impact on search engine listings.

Previously, a user query would only return pages containing all of the query’s keywords. With LSI, that is no longer the case. In most instances you will still find all the keywords included, but there can also be results that miss one keyword, or even all of the keywords in the query.

Such results would be more common with queries that contain multiple keywords that the search algorithm recognizes as LSI content. The algorithm can make a probabilistic guess on the concepts that the user wants, and then look for pages that best match those concepts even if they do not contain all the query keywords.

For example, if a user searches using the query “fast Italian cars,” a specific page may match the general concept of the query precisely. However, that page may not use any of the keywords contained in the query. For example, it might not use the words “car” or “cars,” but instead, replaces them with other words like “automobile,” “vehicle” and “roadster.” Words like “powerful” and “speedy” might appear instead of “fast.” Rather than the keyword “Italian,” the text would simply mention popular Italian sports car brands like Ferrari and Lamborghini.
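The mechanics behind this can be sketched with the classic LSI construction: build a term-document matrix, take a truncated singular value decomposition, and compare queries and documents in the resulting low-rank “concept” space. The tiny corpus, query, and rank below are all illustrative assumptions, not Google’s actual system:

```python
import numpy as np

# Toy LSI sketch. d0 is about Italian sports cars but never uses the
# words "fast", "italian", or "cars"; d2 and d3 are "bridge" pages
# whose co-occurring vocabulary links the query's terms to d0's.
docs = [
    "ferrari lamborghini speedy roadster",  # target page: no query words
    "toyota sedan reliable commuter",       # off-topic page
    "fast speedy roadster cars",            # bridge page
    "italian ferrari lamborghini cars",     # bridge page
]
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

def vectorize(text):
    """Bag-of-words term vector over the corpus vocabulary."""
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in index:
            v[index[w]] += 1
    return v

# Term-document matrix and rank-k truncated SVD (the LSI step)
A = np.column_stack([vectorize(d) for d in docs])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk = U[:, :k], s[:k]

def to_latent(term_vector):
    """Fold a term vector into the k-dimensional concept space."""
    return (Uk.T @ term_vector) / sk

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

query = vectorize("fast italian cars")
q_latent = to_latent(query)

for i, d in enumerate(docs):
    shared = int(query @ vectorize(d))  # keywords shared with the query
    sim = cosine(q_latent, to_latent(vectorize(d)))
    print(f"d{i}: shared keywords={shared}, concept similarity={sim:.2f}")
```

Note that d0 shares zero keywords with the query yet comes out with the highest concept similarity, while the off-topic d1 scores near zero, which is exactly the behavior described above.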

Why good writing is important for LSI content

One of the important features of latent semantic indexing is its ability to recognize what types of words to expect when dealing with specific concepts. In this way, it can detect, to some extent, the “meaning” of a particular document or collection of documents.

One of the main techniques of traditional SEO is to place a certain density range of keywords and related keywords throughout the text and particularly in certain parts of the text like the first sentence and the last paragraph.

While such techniques may still be important, they are not sufficient to give a document or collection of documents a high semantic score. That is because latent semantic indexing recognizes the types of words used in high quality documents for specific concepts.

An SEO specialist is unlikely to be able to guess these word combinations or to figure them out well using the Google Keyword Tool or other applications. These words along with the desired ratios tend to come out when a knowledgeable writer creates a well-written document.

Site structure and links

LSI does not just analyze single web pages, as was mostly the case with the old algorithms. The methodology places more emphasis on analyzing collections of documents.

Just how search ranking algorithms determine what constitutes a document collection is an important question. Many SEO specialists use the “silo” method to create a file structure on the site server that organizes web pages in “collections.”

For example, on a car site there may be a particular folder or silo that deals only with Italian sports cars. All the important topics on that car site will have their own silos, e.g., British cars, American cars, Japanese cars, etc.

In this way, the SEO team hopes the algorithm will recognize the structure and analyze the documents in the folders as collections. Of course, it is very important to ensure that all web pages and other documents in these folders are directly on topic.

Another way that algorithms may detect document collections is through links. For this reason, it is important to include links to other documents that deal with the same specific sub-topic. If you do have links that diverge from the particular focus of the collection, it is best to create these as no-follow links.

How to create LSI content

Most SEO experts use applications like the Google Keyword Tool to find related semantic terms. Obviously, it would seem best to use keywords that people search for commonly.

At the same time, though, it is good not to overuse related keywords in a way that is unnatural. The algorithms pay close attention to the density and positioning of words and, in most cases, will detect excessive keyword stuffing.
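As a rough illustration, keyword density is just the share of words in a text that are the target keyword. The metric below is my own toy version, not anything Google publishes:

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that equal `keyword` (toy metric)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical over-optimized sentence: 3 of 10 words are "SEO"
sample = "SEO tips: great SEO content beats keyword-stuffed SEO spam."
print(round(keyword_density(sample, "SEO"), 2))  # → 0.3
```

A density that high on a real page would read unnaturally, which is precisely the pattern the algorithms are tuned to notice.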

The best idea is to create content that reads well and covers the important concepts related to the targeted keyword.

Remember that search engines are constantly trying to provide higher quality results to their users. They do not want to deliver results from spammy, poorly written pages. LSI is part of the effort to analyze content that reads like other highly regarded documents for the same concepts.

One question to ask yourself when analyzing LSI content is: does it explain the topic well or answer a specific question? If the content leaves you better informed after reading it, then it will likely pass the LSI test.

However, if the content leaves you confused or unsatisfied, then it probably will need more work.

How Google’s Panda and Penguin Changed the SEO Landscape

The world of search engine optimization has evolved in the past few years. Each time Google rolls out an update, panic spreads like wildfire across forums, blogs, and social networks. For marketers, knowing the date of these updates can help explain changes in organic website traffic and rankings. Google’s recent Penguin and Panda algorithm updates have dramatically affected search results. No one knows what Google’s next move will be.

Why the Updates?

Achieving high search engine rankings has become critical to business success, and Google aims to deliver the best user experience. The Panda and Penguin updates are a boost for marketers who have worked hard to provide quality content: their goal is to identify websites that use questionable SEO strategies and to improve search engine results. It is important to understand how these updates affect your website so that you can optimize your pages and make sure you don’t lose out.

Since Panda and Penguin target two different issues, it is essential to know the exact algorithm that hit your website. Penguin targets unnatural backlinks and spam, while Panda is all about low quality content. Both updates will be rolled out periodically. Therefore, you have to continuously improve your SEO strategy and provide the highest quality content. In order to be successful and maintain your search engine rankings, it is necessary to make a few changes to your website, including:

– Remove poor quality content
– Create fresh, interesting content
– Build relevant links
– Eliminate unnatural back links
– Vary your anchor text
– Ensure proper navigation
– Optimize pages for multiple keyword phrases
– Remove duplicate content
– Avoid over optimization and keyword stuffing

The best way to determine whether you have been hit by Penguin or Panda is to open Google Analytics and check your stats; you must identify the problem in order to address it. Because Google rolled out these changes in quick succession, many marketers are confused about which update hit their websites. Ultimately, these updates force people to keep producing quality content and to use ethical SEO techniques.
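One way to make that check concrete is to compare average daily organic traffic before and after a known update date. A minimal sketch with hypothetical visit numbers; the 30 percent threshold is an arbitrary illustration, not an official figure:

```python
from datetime import date, timedelta

PENGUIN_LAUNCH = date(2012, 4, 24)  # Penguin's release date

# Hypothetical daily organic-visit counts (e.g. exported from analytics):
# steady traffic before the update, a sharp step down afterwards.
visits = {PENGUIN_LAUNCH + timedelta(days=d): (1000 if d < 0 else 550)
          for d in range(-14, 14)}

before = [v for day, v in visits.items() if day < PENGUIN_LAUNCH]
after = [v for day, v in visits.items() if day >= PENGUIN_LAUNCH]
drop = 1 - (sum(after) / len(after)) / (sum(before) / len(before))

print(f"traffic change after update: {drop:.0%} drop")
if drop > 0.3:  # arbitrary illustrative threshold
    print("likely hit by an algorithm update around this date")
```

A sudden drop aligned with a published update date points at an algorithmic penalty; a gradual decline usually points elsewhere.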

Google Panda Update Overview

Google Panda was released in February 2011. This update aimed to lower the rank of “thin” and low quality sites that provide little or no value to users. At the same time, it provided better rankings for high quality websites with original content and fresh information. This change to Google’s ranking algorithm had a great impact on content farms: websites loaded with hundreds of low quality, keyword-stuffed articles.

According to Google, over 12 percent of searches in the United States were affected by Panda. This update is just one of the more than 200 factors that Google uses to rank pages. In the past year, Panda refreshes moved to a roughly monthly schedule. As a result, many websites have tried to look beyond search engines for traffic and to create better quality articles. To lose the penalty, webmasters must remove or improve low quality content.

This update made people realize that some widely used practices actually went against Google’s recommended best-practice guidelines. Panda targets web pages that have no relevant content and simply exist to push users to cloned sites, landing pages, parked pages, and the like. Many websites hit by Panda are loaded with intrusive ads or duplicate content. Just one or two pages of poor quality content can get your entire website penalized.

How to Recover from Google Panda

The best thing you can do to recover from Google Panda is to constantly update your website with fresh, relevant, and informative content. Webmasters should also avoid over optimization. Try to keep the optimization ratio around 70 percent. Every time you write a new post, ask yourself a few questions. Is it well researched? Is it useful for readers? Are you providing information from reputable sources? Focus on creating unique content. For example, if you have an ecommerce website or an online store, write unique product descriptions.

What Is Google Penguin?

Google Penguin was released on April 24, 2012. This update affected over three percent of English-language search queries. It aimed to decrease rankings of websites that violate Google’s Webmaster Guidelines by using black hat SEO tactics, including content spinning, link schemes, cloaking, and keyword stuffing.

Many SEO experts agree that Penguin was just the official word that Google is taking action against people who are trying to cheat the system and use unnatural linking practices. Website owners should check their Google Webmaster accounts for any messages from Google warning about a potential penalty. Some contributing factors to Penguin may include:

– Keyword stuffing in internal/external links
– Blog spam
– Low quality article marketing
– Overuse of exact-match domains
– Excessive links from poor quality websites
– Aggressive internal linking
– Unnatural inbound links
– Paid links
– Link exchanges

Google Penguin cracks down on a common black hat SEO strategy: abusing links to achieve higher search engine rankings. If your website has links from low quality directories, link schemes, or dubious sites, you may receive a penalty. It is estimated that 94 percent of Google Penguin victims did not fully recover, and overly-optimized websites will continue to be targets. For now, the best approach is to focus on the quality of your site in terms of content value, link value, and visitor interaction.

How to Recover from Google Penguin

The first step to recover from Google Penguin is to analyze your website’s traffic data and identify the problem. Check the backlinks pointing to your site and determine which of them are contributing to your penalty. Remove as many bad links as you can. Send out link removal requests to webmasters and then start a new link building campaign. Safe post-Penguin link-building anchor types include hybrid-branded anchors, universal anchors, and branded anchors.
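The anchor types mentioned above can be made concrete with a toy classifier. The brand name and the classification rules are my own illustrative assumptions, not an industry tool:

```python
# Hypothetical brand used for illustration
BRAND = "rob's ice cream parlor"

def classify_anchor(text):
    """Toy heuristic for the anchor types named above."""
    t = text.lower().strip()
    if t == BRAND:
        return "branded"
    if t in {"click here", "here", "this website", "read more"}:
        return "universal"
    if BRAND in t:
        return "hybrid-branded"
    return "exact-match/other"

for anchor in ["Rob's Ice Cream Parlor", "click here",
               "blueberry ice cream from Rob's Ice Cream Parlor",
               "blueberry ice cream"]:
    print(f"{anchor!r} -> {classify_anchor(anchor)}")
```

A healthy post-Penguin backlink profile leans heavily on the first three categories, with exact-match commercial anchors kept to a small minority.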

Focus on link quality. Work on collaborative projects with reputable webmasters. Offer your services in exchange for relevant backlinks. Publish guest posts on high quality sites. Spend your time building connections with industry influencers. Keep your bounce rate low and write great content. Make sure your website has a diversified link building strategy.
