Category Archives: Penalty Updates

Google Penguin Update Recovery Tips & Advice

This entry was posted in Penalty Updates, SEO.

The importance of a Penguin recovery service is becoming clearer to most online marketers and brand professionals. However, quite a few people still have questions and find the whole process confusing. Here are a few quick points and tips that can help clear up some of that confusion.

What in the World is Penguin?

Google Penguin was designed to adjust the search engine results pages by demoting sites that use spam and poor-quality links to manipulate their rankings. With Penguin in force, users benefit from a drastically reduced amount of “cloaking” and “keyword stuffing” in the world of web content.

How Can I Recover from Penalties?

If your website has been penalized by Google Penguin for one offense or another, this does not necessarily mean that it is the end of the road for you. The first step you should take is to get rid of all signs of spam that may exist within your site. Keep in mind that it was likely the spam activity that led you to consider a Penguin recovery service in the first place, which is why it is the first thing you want to eliminate from your online presence.

What Happened with the Parked Domains Error?

For a period of time, Google incorrectly classified a substantial number of active websites as parked domains, causing them to lose a considerable amount of traffic as a result. The issue was resolved relatively quickly, but the lapse was noticed firsthand by consumers and critics alike.

Is There a Way to Avoid Google Penguin?

The short and simple answer to this question is a resounding “no.” Keep in mind that organic exposure happens within search engines – especially Google. Therefore, if you are serious about boosting your traffic and increasing the bottom-line revenue of your business or brand, it is in your best interest to work within the boundaries Google has established. Trying to fight against them or work around them could very well end badly for you and your online presence.

Top Tools for Recovering from a Google Penalty

This entry was posted in Penalty Updates, SEO.

In today’s heavily digital marketplace, online traffic is a must for any business hoping to secure steady profit. That need is what drives search engine optimization – or SEO – and creates demand for websites that push the boundaries of what is possible in generating that traffic. However, pushing those limits is exactly what can land some sites in trouble and lead to the need for a Google (or other search engine) penalty recovery service.

What Exactly is a Google Penalty?

One of the most common problems faced by websites trying to make the most of their online traffic is the Google penalty. Google has guidelines for proper use of its index, and any site that violates them can be penalized by being demoted or removed from Google’s rankings. This penalty can be either manual or algorithmic, and it can be extremely damaging to your company’s business either way. You can check Google Webmaster Tools (now Search Console), including features like Fetch as Google, to see how Google views your site and whether you have received a manual action, either before or after a penalty. If your site is in violation, the best course of action may be to seek prompt assistance from an SEO professional with penalty recovery experience.

Getting Out of Hot Water

One of the primary reasons for receiving a penalty from Google or another major search engine is your site’s use of links. Bad backlinks create major issues because they are often neglected once they are in place. If one or more of the sites you are linked with is in violation of Google’s guidelines, your site can be penalized as well. Fixing or deleting these links is a good first step in recovering your rankings.

Some other steps to consider include:

  • Find bad backlinks and monitor the rest consistently with Ahrefs and Monitor Backlinks.
  • Run your site through CopyScape to avoid plagiarism issues.
  • Choose a scanner like Screaming Frog Web Crawler to search your site for any other issues that might trigger a penalty.

If none of these steps fixes the problem you’re facing, it’s probably in your best interest to consult a penalty recovery service. By seeking professional help, you can rest assured that the problem will be addressed promptly and correctly, giving you more time to focus on your business and on bouncing back from your penalty. Bad things happen to the best of businesses, but a penalty doesn’t have to spell the end of online traffic for yours!

Steps to Remove Google Penalty from Website

This entry was posted in Penalty Updates, SEO.

Receiving a Google penalty can come as a shock if you run a quality site, but there are a variety of reasons it occurs. Here are the steps to take to remove penalties.

Understanding Penalties

There are two types of penalties that you can be slapped with by Google that will require action toward SEO penalty recovery:

  • Algorithmic: A drop in ranking that occurs when Google’s own algorithms change, through Panda or Penguin. Panda picks up on subpar content, excessive ads, and slow site speed. Penguin addresses link-related red flags – including over-use of links, over-optimization of content, and links from sketchy sources that point back to your site.
  • Manual: A penalty applied after live human reviewers assess your site and decide that it is spam-filled or otherwise deserves to be penalized. Because a person has made the call, this is the more severe type of penalty and should be taken seriously.

 

Identifying and Understanding the Penalties

Manual penalties will appear clearly as messages in your Google Webmaster Tools (now Search Console) dashboard. Algorithmic penalties, by contrast, require reviewing recent changes to Google’s algorithms and how your site’s ranking has suffered as a result. Manual penalties are the ones to watch out for, and they require more involved action.

 

Action to Be Taken

Identify the culprits: pages that contain too many links to your site, or links that originate from suspicious, low-ranked sites that have been identified as spam-oriented. You’ll need to go back and ask for those links to be removed from the offending sites, which can be time-consuming. Using SEO professionals for this is much easier than trying to compile the information yourself and contacting individual webmasters one by one, in addition to addressing other problems. After you’ve made the fixes, you’ll need to submit a reconsideration request to Google.
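As a rough illustration of the “compile the information” step (not anyone’s actual process), the sketch below groups a backlink export by referring domain so you know which webmasters to contact first. The CSV file name and its single “link” column are assumptions; adjust them to match whatever your backlink tool exports.

```python
# Hypothetical example: group backlinks by referring domain from a CSV export.
import csv
from collections import Counter
from urllib.parse import urlparse

def referring_domains(csv_path: str) -> Counter:
    """Count how many backlinks each domain sends, given a CSV with a 'link' column."""
    domains = Counter()
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            domains[urlparse(row["link"]).netloc] += 1
    return domains

# The ten domains sending the most links are a natural starting point for
# removal requests (or for closer manual review).
for domain, count in referring_domains("backlinks.csv").most_common(10):
    print(f"{domain}: {count} links")
```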

 

By taking these simple steps, the penalty can be resolved. It’s all about closely monitoring the status of your Google metrics and activity, and ensuring that you’re paying close attention to how your site ranking is performing. Regardless of how large your website is, no one can afford to ignore Google results in this day and age.

Why You Need to Focus More on Page Speed

This entry was posted in Content, Penalty Updates, SEO, Web Design Posts.

Speed is the reason why people take Interstates over back-roads, shop at convenience stores instead of supermarkets and skip commercials with a DVR remote instead of watching them in real-time. Most consumers want to get from Point A to Point B as quickly as they can, which is exactly why your website’s page loading speed is critical. A web development services agency in Los Angeles and Ventura would tell you that “less is more” – especially if it allows you to quickly convert your short-term guests into long-term customers.

Delayed Page, Lost Customers

Numerous studies have shown that delays in page loading times lead to abandoned online shopping carts and dissatisfied customers. One study cited by Business 2 Community found that 47 percent of consumers expect websites to load within a maximum of 2 seconds.

The same report claims that 40 percent of consumers will leave a webpage if they must wait longer than 3 seconds for it to load completely. The “it’s worth the wait” concept clearly does not apply here, especially when visitors can leave your website and get what they want from a competitor’s site in a fraction of the time.
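For a rough sense of where your site stands, the sketch below (an illustration only, using Python’s standard library and a placeholder URL) times how long a single page takes to download. A real audit should rely on browser-based tools, since raw download time ignores rendering, scripts, and images.

```python
# Rough illustration: time how long one page takes to download (HTML only).
import time
import urllib.request

URL = "https://www.example.com/"  # placeholder -- substitute your own page

start = time.perf_counter()
with urllib.request.urlopen(URL, timeout=10) as response:
    body = response.read()
elapsed = time.perf_counter() - start

print(f"Downloaded {len(body)} bytes of HTML in {elapsed:.2f} seconds")
if elapsed > 3:
    print("Over the ~3 second mark the report says many visitors won't tolerate")
```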

Raising Your Awareness of Loaded Content

Another key reason to pay close attention to your page speed is that it helps you monitor your loaded content effectively. Most delays, freezes, and website crashes are caused by an excessive amount of data trying to load at the same time. If your pages carry a lot of integrated media – embedded images, videos, and social media widgets – those items end up competing for the same bandwidth, to the point where nothing finishes loading promptly. While designing the site, you may think all of this loaded content is a perk; you will soon discover it is not as advantageous as it seems once you realize that few visitors stick around long enough to access it.
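If you want a quick, rough inventory of how much embedded content a page asks the browser to fetch, a small parser like the sketch below can count media tags in the HTML. This is a simplified illustration using Python’s standard library, not a full audit; the sample HTML string is made up.

```python
# Count embedded media tags in a page's HTML -- a rough proxy for extra requests.
from html.parser import HTMLParser

class MediaCounter(HTMLParser):
    MEDIA_TAGS = {"img", "video", "audio", "iframe", "script"}

    def __init__(self):
        super().__init__()
        self.counts = {tag: 0 for tag in self.MEDIA_TAGS}

    def handle_starttag(self, tag, attrs):
        if tag in self.MEDIA_TAGS:
            self.counts[tag] += 1

sample_html = "<html><body><img src='a.jpg'><video src='b.mp4'></video></body></html>"
counter = MediaCounter()
counter.feed(sample_html)
print(counter.counts)  # e.g. {'img': 1, 'video': 1, 'audio': 0, 'iframe': 0, 'script': 0}
```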

Conclusion

It is true that you should want to present quality content on your company’s website. However, you also need to strike a balance between quality content and expeditious loading times. To engage your target audience, keep pages lean and focused rather than loading them with everything you can.

Website Maintenance: How to Close a Site for a Day

This entry was posted in Penalty Updates, Web Design Posts.

There are always times – dire times, but these exist for everyone – when a website needs to take a few hours or days of me-time. That’s not so much personal time, as it is maintenance time – time for you or your webmaster to make important changes, fix graphical errors and grammatical mistakes, use a Penguin recovery service from SeoTuners, update outdated information and generally polish the website up to be a bit better than it was yesterday.

One way or another, simply disappearing off the face of the Earth or cutting off all information on your site is a bad idea for two simple reasons: 1.) you’ll alienate visitors by presenting them with absolutely nothing when you’ve promised them content, and 2.) Google will still crawl your page if you’re not careful, and rank you according to your nonexistent, downed page.

To avoid that and make sure you get a clean, temporary break without majorly affecting your overall traffic numbers or your ranking online, take the following steps:

Get Your 503 Right

The HTTP 503 status code is a rather common one you may have encountered several times throughout your surfing life; it specifically means “Service Unavailable,” as per Lifewire. Be sure to offer both some type of interstitial or temporary maintenance page (and perhaps a pop-up) and a 503 response.
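As a minimal sketch of what that can look like in practice – assuming, purely for illustration, a small Python/Flask app (any server or framework can do the same) – every request during the maintenance window gets a short maintenance page with a 503 status and a Retry-After header:

```python
# Illustrative maintenance mode: serve a temporary page with a 503 status.
from flask import Flask, make_response

app = Flask(__name__)

MAINTENANCE_PAGE = "<h1>Down for scheduled maintenance</h1><p>We'll be back shortly.</p>"

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maintenance(path):
    # 503 tells Google the outage is temporary; Retry-After hints when to come back.
    response = make_response(MAINTENANCE_PAGE, 503)
    response.headers["Retry-After"] = "3600"  # seconds, or an HTTP date
    return response

if __name__ == "__main__":
    app.run()
```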

End Crawling

In your site’s robots.txt file or through a robots meta tag, you should be able to add a quick clause that tells Google there’s nothing worth crawling here at the moment, as per Google. This is done through a “disallow” directive in the file itself.
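To see how crawlers interpret that directive, here is a small illustrative check using Python’s standard-library robots.txt parser; the two-line robots.txt contents are a hypothetical temporary file for the maintenance window, not something from the article:

```python
# Illustrative check: how a crawler reads a temporary "Disallow: /" directive.
from urllib import robotparser

maintenance_robots_txt = [
    "User-agent: *",
    "Disallow: /",  # hypothetical maintenance-window rule: nothing to crawl right now
]

parser = robotparser.RobotFileParser()
parser.parse(maintenance_robots_txt)

# While the directive is in place, Googlebot is blocked from every path.
print(parser.can_fetch("Googlebot", "/any-page.html"))  # -> False
```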

Announce the Maintenance

For regular users of your service, few things are more frustrating than unannounced maintenance. Most people can live with being hit with a 503 on a nonessential service as long as they get a succinct, rational explanation through Facebook or your mailing list.

No matter how you choose to announce the maintenance, it’s important to do so at least a business week in advance and with the right explanation – regular maintenance, site update, and so on. As long as you keep your customers in the loop and Google happy, taking your site down for a day shouldn’t hurt you at all.

Steps to Avoid Penalty on Your Website

This entry was posted in Content, Penalty Updates, SEO.

If your website hasn’t been gaining visitors or has seen a sudden dip, it could be because you are using outdated SEO practices and received a penalty from one of the major search engines. An organic SEO company can help you to spot and prevent these penalties, but there are some steps you can take yourself to help them out:

  1. Use Keywords Wisely

Keywords do still work, but Your Story says that keyword density has become more important, as search engines will penalize websites that stuff their content with poor keywords or shove them in without thinking. They suggest instead repeating shorter, highly relevant keywords only a few times per article, and within high-quality content. (A quick way to check density is sketched after this list.)

 

  2. Stop Abusing Links

Inbound and outbound links have traditionally been an integral part of driving relevant traffic to a website, but it is also easy to abuse the number of links you use. A high volume of links may seem like a good idea, and perhaps it once was, but search engines will now flag them as spam, or even as paid links, which have been banned. A good organic SEO company will therefore generate relevant links within the search engines’ parameters rather than filling up the entire internet with irrelevant ones. It is also best to check that any outbound links on your site don’t point to spam, adult content, or suspicious sites; some people will try to post these links in your comments or on your forum. It is important to include at least some outbound links, perhaps to the websites of guest bloggers, to avoid suspicion.

 

  3. Keep Up to Date

Even if you leave some content archived, it is best to perform a regular audit to check that everything still works and hasn’t become outdated. Kissmetrics notes, for example, that broken links or now-banned paid links aren’t favored by search engines and could result in penalties.
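As a quick, rough way to sanity-check the keyword advice above, the short sketch below computes a simple density figure for a single-word keyword. The sample article text is made up, and real tools weigh many more factors than raw density:

```python
# Rough keyword-density check: what share of the words is the keyword?
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

article = "Affordable SEO services help small businesses grow. SEO is not magic."
print(f"'seo' makes up {keyword_density(article, 'seo'):.1f}% of the words")
```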

 

 

Keeping yourself informed on new SEO strategies and avoiding outdated ones will help you to bring in a constant stream of relevant website visitors and avoid penalties. A trusted local SEO firm will help you with this and eliminate any problems that you may already have been experiencing.

How Old Off-Page SEO Optimization Techniques Penalize Your Website

This entry was posted in Content, Penalty Updates, SEO.

A modern website is not the type of marketing tool that is built once and then left alone. It is an ever-changing organism that needs to grow and adapt to changes and SEO updates. But it can be difficult to break old habits, especially ones that used to work. Some archived articles are fine as they are, but many companies don’t realize how much older SEO techniques are dragging the entire website down in the search rankings. Betaout explains that this is because building inbound links was once essential for SEO – and still is, in other forms – but the rules changed drastically after recent Google updates designed to remove spam and keep junk content out of the rankings. Here are some of those outdated techniques, and the ways the right Los Angeles and Ventura SEO services can help you:

  1. Paid Links

Search Engine Journal says that paid links used to be common practice in the SEO industry but are now treated by Google as spam. Yet some firms still rent or buy links for their clients in violation of Google’s guidelines, costing them money and resulting in dropped traffic.

 

  2. Permanent Links

These were everywhere in the 2000s as a ‘high-quality alternative’ to rented links. They were links that were bought and used permanently rather than rented temporarily. They were similarly banned by Google and now do a website no good. Good Los Angeles and Ventura SEO services will never offer you permanent paid links.

 

  3. Keywords

These are still an important part of SEO, but only when used correctly. The trouble is that SEO firms used to stuff content full of weak keywords in a desperate attempt to get them picked up by search engines. The search engines picked up on this and recognized these articles as junk content. A much better approach now is to use one or two particularly strong keywords per article and focus on producing quality content instead.

 

 

If you are still holding onto any of these outdated practices, it is best to remove them as soon as possible so that they won’t drag your website down. You should also avoid any SEO firms or individuals who try to use them in your SEO campaign.

What is Google Penguin? Understanding Google’s Engine

This entry was posted in Penalty Updates, SEO.

In the world of SEO, content may be king, but Google keeps the keys to the kingdom. Among the major search engines, Google is the undisputed leader. As per Statista, for the past five years Google has held well over 80 percent of the global search engine market share. When you want to look something up, you “Google it.”


That doesn’t mean capturing other search engine demographics isn’t important, but even then, there are similarities across all search engines in the way that they work. In fact, as per SEO Book, Yahoo! Search is powered by Bing.

 

Still, following and understanding Google’s updates is the biggest challenge – and the main focus of research – for SEO professionals. Knowing how Google works is key to a good SEO service, and it’s no easy job.

 

Google’s search algorithm has existed for years, but it has only recently been given a name. As of 2013, the company’s 15th anniversary, Google’s algorithm is called Hummingbird, according to Search Engine Land. Within Hummingbird, however, the company employs several different programs to optimize the way it works and categorizes websites.

 

These are programs like Google Panda, Pigeon, PageRank, and Penguin, and these programs – or ingredients – number in the hundreds. No one in the SEO world really knows how Google works a hundred percent of the time, but we do know enough to optimize websites. So the question is this: what are these programs, and how do they affect your company’s SEO?

 

Tackling Google Penguin

 

Google Penguin was first announced in 2012, and since then, it’s been assigned the job of crawling through websites and penalizing those that were buying links or utilizing other black hat methods to generate inbound links.

 

To understand what Google Penguin does, you have to understand link building. Link building is the activity of building your website’s reputation and search engine rank through inbound links. Inbound links are links on other websites leading to your own website.

 

How Link Building Works

 

In the past, just having inbound links was enough to earn your website a great rank. Inbound links are used as a ranking factor for a simple reason: if people link to your website, it’s a sign that there’s something there worth reading or visiting. Basically, your inbound links are a sign of popularity and quality.

 

The problem, in this situation, is the way marketers and SEO specialists abused inbound links to artificially inflate their SEO rankings. Through private blog networks and rewritten content, websites could get quick, cheap, and often self-generated inbound links that helped boost their ranking for little legwork.

 

Getting Help

 

Today, link building is a little more complex. You can’t buy inbound links, and inbound links from less reputable or unknown websites don’t count for much. In fact, they can even get you penalized. The holy grail these days is being mentioned in mainstream news media. Second to that are educational websites, like university blogs and article entries, which are also highly reputable. If you’re looking for affordable SEO solutions in Los Angeles and Ventura, you’ll need the services of a qualified local provider, like SeoTuners.

 

While some search engine specialists argue that website reputation doesn’t matter much, the issue remains a little controversial. The point, however, is this: spamming links no longer works, and probably never will again. Google’s algorithms are updated regularly to weed out quick and easy ways to raise your company’s ranking, placing value on good content and organic ranking techniques.

Google Says Penguin Refresh Months Away From Happening

This entry was posted in Advertising, Marketing, Penalty Updates, SEO.

Google reminds us that the Penguin algorithm refresh is still months and months away, as Google works to make Penguin run in real time.


Gary Illyes from Google’s Webmaster Trends Analyst team said over the weekend on Twitter that the Google Penguin refresh won’t be happening for some time.

Gary Illyes said the Penguin update/refresh is “months away.”

This is not completely shocking, given that at SMX Advanced, Gary told us that Penguin currently runs slowly but that they are working on making it run faster. Back then, Illyes said this kind of change was months away, and that is what Google has been working toward.

Google is working toward making Penguin run in real time, with continuously fresh link data.

But again, Gary Illyes from Google said that getting there would take months, and he just reiterated that on Twitter this weekend.

Here are dates of all Penguin releases:

  • Penguin 1.0 on April 24, 2012 (impacting ~3.1% of queries)
  • Penguin 1.1 on May 26, 2012 (impacting less than 0.1% of queries)
  • Penguin 1.2 on October 5, 2012 (impacting ~0.3% of queries)
  • Penguin 2.0 on May 22, 2013 (impacting ~2.3% of queries)
  • Penguin 2.1 on October 4, 2013 (impacting ~1% of queries)
  • Penguin 3.0 on October 17, 2014 (impacting ~1% of queries)

Will it be another year until we see a Penguin refresh? It is totally feasible at this rate.

Google’s New Pigeon Algorithm Launches

This entry was posted in Penalty Updates, SEO.

Experts Weigh In On Google’s “Pigeon” Update Aimed at Improving Local Search Results

Last week, Google made new search waves when it rolled out updates to its local search algorithm.

The “Pigeon” update (the name Search Engine Land gave it in absence of an official name from Google) aims to deliver improved local search results, with enhanced distance and location ranking parameters.

According to Google, the new local search algorithm ties deeper into the company’s web search capabilities, leveraging hundreds of ranking signals along with search features like spelling correction, synonyms, and Google’s Knowledge Graph.

Search Engine Land reported last week on how the “Pigeon” update solved Google’s “Yelp problem,” with local directory sites already experiencing improved visibility in Google search results:

It looks like Yelp and other local directory-style sites are benefiting with higher visibility after the Pigeon update, at least in some verticals. And that seems logical since, as Google said, this update ties local results more closely to standard web ranking signals. That should benefit big directory sites like Yelp and TripAdvisor — sites that have stronger SEO signals than small, individual restaurants and hotels are likely to have.

Now that we’re a week out, we asked a few local search experts what they have seen since Google set its “Pigeon” update free. Here’s what they had to say:

David Mihm, Director of Local Search Strategy at Moz

Overall, this update seems like an amplification of the previous silent Hummingbird update from last fall.  Just like last time, I would argue that the quality of the SERPs has been downgraded, with “search results within search results” (i.e. directories) getting rewarded relative to their pre-Pigeon position.

Directories with strong brands (like Yelp, as Matt McGee already pointed out) often show up multiple times for the same search, especially on recovery searches for specific small businesses – many of which occur when the searcher clicks a Carousel result.  But they’re even prevalent on far less-specific discovery searches, and on searches performed on mobile devices (in my own limited testing).

I fail to see how this is an improved, let alone a good, experience for searchers.

The outcome of Pigeon unfortunately rewards Yelp’s recent “whining,” and with the EU antitrust settlement largely behind them, it seems an odd time to move the SERPs in this direction.

A number of folks have commented in places like Max Minzer’s Local Search community, and Casey Meraz highlighted it as well, that there seem to be many more two and three-packs than there were before, which takes even more real estate away from small businesses and increases the relative opportunity for directories.

Perhaps in competitive industries where most companies have already maximized citations, reviews and user-generated content about their businesses, it is simply getting harder and harder for Google to identify the best of the best businesses by place-related signals?  That defeatism seems decidedly un-Google, however.

I’m at a bit of a loss as to any economic benefit this boost to directories (with easier-to-reach, larger Adwords budgets) might provide Google, but I’m looking forward to hearing what other commenters have to say.

Greg Gifford, Director of Search and Social at Autorevo

In the automotive niche, we seem to be isolated a bit from the crazy effects we’re hearing about elsewhere. In some cities, we haven’t even seen a significant shift in map pack rankings (any more than what we’d see on a monthly basis anyway).

We’re still seeing map packs on auto dealer related search queries, and the vast majority of results are mostly the same.

We have noticed a few random anomalies though. In the past, “used cars CITY” always brought up a map pack. We’ve seen a few isolated cities where the map pack has disappeared for that query. For example, in Louisville, Kentucky, in an incognito search with location set to Louisville, we saw:

  • “used cars” = seven-pack
  • “used cars louisville” = no map pack
  • “used cars louisville ky” = three-pack

Before Pigeon, those would have all resulted in seven-packs. Other than a few random map pack switcheroos like that, we’re not seeing much difference.

Nicole Hess, Senior SEO Strategist at Delphic Digital

After reading about the potentially spammy results being brought in by the newest Google local algorithm update, I immediately began wondering about its effect on several of my clients.

First and foremost, a national client of mine has hundreds of locations that conduct business independently and need organic traffic to produce valuable business leads.  I began digging into the data to spot any trends that already may be happening or developing and take action on it.

In reviewing the local rankings pack, I did not find spammy results creeping into listings, although I have seen this in some searches – such as “Casino” and “Interior Design” – but not in this client’s space.

My three primary observations:

  1. Locations not appearing in local results: There were a few locations that are not appearing in the local pack of results, though at some previous point did appear there. The average drop in traffic for a location that is no longer in the local pack is 16% less traffic month over month (and this is in a good season where overall organic traffic is increasing).
  2. Locations appearing in local results, less traffic: Of the 50 locations I reviewed, seven are receiving less organic traffic month over month, though still rank in the Local results and have the same organic rankings. Five of the seven locations rank second in a pack of seven local results and for each of these, there are paid ads with star ratings that appear above the local pack.
  3. Locations getting more traffic: Ten of the 50 locations I reviewed are receiving more organic traffic, on average 24% more organic traffic than the same week of the previous month.  Each location ranks in the local pack and most rank No. 1 or No. 2 in the local pack. Their organic rankings have also maintained steady positions month over month, so that factor can be eliminated.

Also, while there were still paid ads, most listings had paid ads that didn’t have star ratings to detract from the organic results. Noting that this is a good season for the client where organic traffic is improving in general, I’m not ascribing all the lift to the local pack rankings, though the lift in traffic for these locations is greater than the month over month lift in organic traffic overall.

So it appears there have been some favorable shifts caused by Pigeon, driving more organic traffic.

From what I have witnessed, some local ranking shift has occurred and is driving more organic traffic to several locations. Being out of the local pack correlates with a loss of organic traffic for a few locations. A loss of organic traffic is also occurring where listings are competing against paid ads that have star ratings.

Andrew Shotland, Local Search Engine Optimization Consultant at LocalSEOGuide.com

We are really interested in how this update moved Google further in the direction of hyperlocal search. Something that has been flying under the radar with this update is that the neighborhood-specific location settings, which previously seemed to be just a test, are now live everywhere as far as I can tell.

A number of the local directory-type sites I work with have also seen five to ten percent increases in organic traffic since the update. This lines up with the contraction and elimination of many of the local pack results that others are reporting; directories would be one of the beneficiaries of this.

We are waiting to see if this holds over the next week before publishing any of the data. It’s highly likely there will be a fair amount of algo “tuning” so I wouldn’t be surprised if the results we are talking about change dramatically over the next few days or weeks.

Mike Blumenthal, Search Expert and author of Google Places and Local Search blog Blumenthals.com

To a large degree, the jury is still out on the what, whys and outcomes of the recent local algo update. Things have been changing since the rollout Thursday evening and are just now stabilizing.

Things we do know: there seem to be fewer seven-pack results than before, although the drop is not as big as first reported, since Google seems to have changed the impact of some local query modifiers. It was originally reported as a sixty percent drop in MozCast, and by their metric it was; however, many of their search queries no longer seem to function the same way.

Things that seem to be “more so” since the change include:

  • Localization of geo search results appears to have increased based on the user’s location.
  • Brands appear to have benefited with additional listings in the pack results and more three-packs.

The update does appear to have reduced duplication between the organic and local results. After the October 2013 update that ended blended results, a number of sites were seeing both organic and local pack results. Those seem to have been reduced to one or the other.

The directories, at least anecdotally, appear to have benefited from the change.

On many searches the radius of the “view port” of the map has changed. This obviously leads to an effective ranking shake-up, as the businesses visible within the view have changed. On some searches we are seeing cross-border expansion of the view port, and on others a reduction in the radius that totally excludes the locations in the burbs.

Whether this is a cause or effect, we simply can’t yet tell but it does lead to turmoil in the rankings.

One could group this update with a number of other recent Google updates that have reduced visual “distractions” from the main search results: the loss of video snippets, the loss of author photos, a reduction in the number of review stars shown, and so on.

The impact is still unclear; we will have to wait for analytics data to accumulate to assess the net of the change both specifically and more broadly.

Mary Bowling, Co-founder at Ignitor Digital

I think it’s too soon to tell what may be temporary and what might stick, but overall I think Google may be trying to hyper-localize desktop results more.

Google has made several moves lately for the purpose of better aligning desktop and mobile results. Google’s interpretation of the searcher’s location may now be playing more into which results they see on their desktop, just the way it has been playing into which results they see on smartphones.

Some of the things people are reporting are a reduction in the number of local packs seen in the SERPs and a widespread reduction from 7 results in the local packs to 3 results. This may also be an attempt to better mirror on the desktop what mobile searchers see.

Chris Smith, President and Strategist at Argent Media

It’s actually still early to definitively state precisely what all Google may have changed to produce the results we’re seeing. While it is very clear that a significant number of local search queries have stopped displaying local search results, some of the anecdotal reports have been a bit too all-encompassing in declaring particular search queries as “no longer displaying local packs.”

For instance, while the term “house rentals” appears to invoke the local pack in far fewer cases, there are still significant markets where that query continues to invoke local pack results (at least, when I test the search in combo with city names). Searching for “house rentals estes park” or “house rentals gatlinburg” still has good seven-packs of local listings embedded in the SERP.

This suggests that the part of the search results page composition algorithm that determines when to serve local pack results has undergone a revision rather than an elimination for many of these affected terms. The dial has been turned back some, if you will, and other qualifying elements have been introduced into how it functions.

Specificity of the query is an additional element. When Google first began displaying the local pack, they inferred locality intent associated with queries like “house rentals” or “pizza”, etc. For whatever reason, the assumption of local intent has now been dialed back in a number of cases, most likely based upon some sort of usability testing, or out of desire to further reduce “clutter” in search results.

Overall, the news that this update bumps up web search ranking signals more so than some of the local factors doesn’t necessarily pose a huge fear factor for local businesses. On the other hand, local companies that were enjoying good local pack rankings, despite having an SEO-weak website presence, will now have to step up their game in order to recover.

Some have reported spammy local companies have enjoyed better rankings since the update; but, I don’t think the dust has altogether settled. These companies may have a lot more to fear after another few weeks.

Finally, some directory sites appear to have benefited. To me, the recent shift has heavily benefited Yelp (I think they likely need to Shut-The-Front-Door on whining about Google mistreatment). Yellowpages.com also appears quite prominently in my sampling, as do some vertical directories.

Some of the more marginal, less-popular online yellow pages and business directories are not all that visible or prominent these days. In some business category and market combinations, the organic search results are more populated by these directory sites than by the websites of local businesses – which will necessitate a bit of a shift in local companies’ online strategies.

If these ranking changes for local-intent queries were intentional upon Google’s part, it seems clear that they feel that there are many cases where searchers desire to perform comparative research to decide upon businesses prior to selecting listings. Businesses will have to adjust their strategic approaches accordingly.