RICK REA: Helping You Grow Through Online Marketing

SEO Marketing News

Unfiltered: How to Show Up in Local Search Results

10/31/2017


Posted by sherrybonelli

If you're having trouble getting your local business' website to show up in the Google local 3-pack or local search results in general, you're not alone. The first page of Google's search results seems to have gotten smaller over the years – the top and bottom of the page are often filled with ads, the local 7-pack was trimmed to a slim 3-pack, and online directories often take up the rest of page one. There is very little room for small local businesses to rank on the first page of Google.

To make matters worse, Google has a local "filter" that can strike a business, causing their listing to drop out of local search results for seemingly no reason – often, literally, overnight. Google's local filter has been around for a while, but it became more noticeable after the Possum algorithm update, which began filtering out even more businesses from local search results.

If you think about it, this filter is not much different from websites ranking organically in search results: In an ideal world, the best sites win the top spots. However, the Google filter can have a significantly negative impact on local businesses that often rely on showing up in local search results to get customers to their doors.

What causes a business to get filtered?

Just like the multitude of factors that go into ranking high organically, there are a variety of factors that go into ranking in the local 3-pack and the Local Finder.


Here are a few situations that might cause you to get filtered and what you can do if that happens.

Proximity matters

With mobile search becoming more and more popular, Google takes into consideration where the mobile searcher is physically located when they're performing a search. This means that local search results can also depend on where the business is physically located when the search is being done.

A few years ago, if your business wasn't located in the large city in your area, you were at a significant disadvantage. It was difficult to rank when someone searched for "business category + large city" – simply because your business wasn't physically located in the "large city." Things have changed slightly in your favor – which is great for all the businesses who have a physical address in the suburbs.

According to Ben Fisher, Co-Founder of SteadyDemand.com and a Google Top Contributor, "Proximity and Google My Business data play an important role in the Possum filter. Before the Hawk Update, this was exaggerated and now the radius has been greatly reduced." This means there's hope for you to show up in the local search results – even if your business isn't located in a big city.

Google My Business categories

When you're selecting a Google My Business category for your listing, select the most specific category that's appropriate for your business.

However, if you see a competitor outranking you, find out what category they are using and select the same category for your business (but only if it makes sense). Then look at all the other things they are doing online to increase their organic ranking, and emulate and outdo them.

If your category selections don't work, it's possible you've selected too many categories. Too many categories can confuse Google to the point where it's not sure what your company's specialty is. Try deleting some of the less-specific categories and see if that helps you show up.

Your physical address

If you can help it, don't have the same physical address as your competitors. Yes, this means if you're located in an office building (or worse, a "virtual office" or a UPS Store address) and competing companies are also in your building, your listing may not show up in local search results.

When it comes to sharing an address with a competitor, Ben Fisher recommends, "Ensure that you do not have the same primary category as your competitor if you are in the same building. Their listing may have more trust by Google and you would have a higher chance of being filtered."

Also, many people think that simply adding a suite number to your address will differentiate your address enough from a competitor at the same location — it won't. This is one of the biggest myths in local SEO. According to Fisher, "Google doesn't factor in suite numbers."

Additionally, if competing businesses are located physically close to you, that, too, can impact whether you show up in local search results. So if you have a competitor a block or two down from your company, that can lead to one of you being filtered.

Practitioners

If you're a doctor, attorney, accountant or are in some other industry with multiple professionals working in the same office location, Google may filter out some of your practitioners' listings. Why? Google doesn't want one business dominating the first page of Google local search results. This means that all of the practitioners in your company are essentially competing with one another.

To offset this, each practitioner's Google My Business listing should have a different category (if possible) and should be directed to different URLs (either a page about the practitioner or a page about the specialty – they should not all point to the site's home page).

For instance, at a medical practice, one doctor could select the family practice category and another the pediatrician category. Ideally you would want to change those doctors' landing pages to reflect those categories, too:


Another thing you can do to differentiate the practitioners and help curtail being filtered is to have unique local phone numbers for each of them.

Evaluate what your competitors are doing right

If your listing is getting filtered out, look at the businesses that are being displayed and see what they're doing right on Google Maps, Google+, Google My Business, on-site, off-site, and in any other areas you can think of. If possible, do an SEO site audit on their site to see what they're doing right that perhaps you should do to overtake them in the rankings.

When you're evaluating your competition, make sure you focus on the signals that help sites rank organically. Do they have a better Google+ description? Is their GMB listing completely filled out but yours is missing some information? Do they have more 5-star reviews? Do they have more backlinks? What is their business category? Start doing what they're doing – only better.

In general, Google wants to show the best businesses first. Compete toe-to-toe with the competitors that are ranking higher than you, with the goal of eventually taking over their highly coveted spot.

Other factors that can help you show up in local search results

As mentioned earlier, Google considers a variety of data points when it determines which local listings to display in search results and which ones to filter out. Here are a few other signals to pay attention to when optimizing for local search results:

Reviews

If everything else is equal, do you have more 5-star reviews than your competition? If so, you will probably show up in the local search results instead of your competitors. Google is one of the few review sites that encourages businesses to proactively ask customers to leave reviews. Take that as a clue to ask customers to give you great reviews not only on your Google My Business listing but also on third-party review sites like Facebook, Yelp, and others.

Posts

Are you interacting with your visitors by offering something special to those who see your business listing? Engaging with your potential customers by creating a Post lets Google know that you are paying attention and giving its users a special deal. Having more "transactions and interactions" with your potential customers is a good metric and can help you show up in local search results.

Google+

Despite what the critics say, Google+ is not dead. Whenever you make a Facebook or Twitter post, go ahead and post to Google+, too. Write semantic posts that are relevant to your business and relevant to your potential customers. Try to write Google+ posts that are approximately 300 words in length and be sure to keyword optimize the first 100 words of each post. You can often see some minor increases in rankings due to well-optimized Google+ posts, properly optimized Collections, and an engaged audience.
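As a rough illustration of the length guideline above, here is a small checker for a draft post. The word-count targets simply restate the article's numbers (roughly 300 words, keyword in the first 100); the function name and sample keyword are hypothetical, not part of any Google+ API:

```javascript
// Check a draft post against the ~300-word / keyword-in-first-100-words guideline.
// Purely illustrative: the 250-350 band is an assumed tolerance around "approximately 300".
function checkPostDraft(text, keyword) {
  const words = text.trim().split(/\s+/);
  const first100 = words.slice(0, 100).join(" ").toLowerCase();
  return {
    wordCount: words.length,
    nearTargetLength: words.length >= 250 && words.length <= 350,
    keywordInFirst100: first100.includes(keyword.toLowerCase()),
  };
}
```

Running a draft through a check like this before posting is a quick way to keep the habit consistent across authors.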

Here's one important thing to keep in mind: Google+ is not the place to post content just to try and rank higher in local search. (That's called spam and that is a no-no.) Ensure that any post you make to Google+ is valuable to your end-users.

Keep your Google My Business listing current

Adding photos, updating your business hours for holidays, utilizing the Q&A or booking features, etc. can help you show up in rankings. However, don't add content just to try and rank higher. (Your Google My Business listing is not the place for spammy content.) Make sure the content you add to your GMB listing is both timely and high-quality. By updating and adding content, Google knows that your information is likely accurate and that your business is engaged. Speaking of which...

Be engaged

Interacting with your customers online is not only beneficial for customer relations, but it can also be a signal to Google that can positively impact your local search ranking results. David Mihm, founder of Tidings, feels that by 2020, the difference-making local ranking factor will be engagement.

(Image source: The Difference-Making Local Ranking Factor of 2020)

According to Mihm, "Engagement is simply a much more accurate signal of the quality of local businesses than the traditional ranking factors of links, directory citations, and even reviews." This means you need to start preparing now and begin interacting with potential customers: use GMB's Q&A and booking features, instant messaging, and Google+ posts; respond to Google and third-party reviews; and ensure your website's phone number is "click-to-call" enabled.

Consolidate any duplicate listings

Some business owners go overboard and create multiple Google My Business listings with the thought that more has to be better. This is one instance where having more can actually hurt you. If you discover that for whatever reason your business has more than one GMB listing, it's important that you properly consolidate your listings into one.

Other sources linking to your website

If verified data sources, like the Better Business Bureau, professional organizations and associations, chambers of commerce, online directories, etc., link to your website, that can have an impact on whether or not you show up on Google's radar. Make sure that your business is listed on as many high-quality and authoritative online directories as possible – and ensure that the information about your business – especially your company's Name, Address, and Phone Number (NAP) – is consistent and accurate.
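A NAP-consistency audit boils down to normalizing each listing's name, address, and phone number before comparing them. The sketch below is an assumption about how such a check might work (the normalization rules and function names are illustrative; real citation-audit tools use much fuzzier matching):

```javascript
// Naive NAP normalization: lowercase, strip punctuation, abbreviate "street",
// and reduce phone numbers to digits so formatting differences don't count as mismatches.
function normalizeNap(nap) {
  return {
    name: nap.name.toLowerCase().replace(/[.,]/g, "").trim(),
    address: nap.address
      .toLowerCase()
      .replace(/\bstreet\b/g, "st")
      .replace(/[.,]/g, "")
      .trim(),
    phone: nap.phone.replace(/\D/g, ""), // digits only
  };
}

function napMatches(a, b) {
  const x = normalizeNap(a);
  const y = normalizeNap(b);
  return x.name === y.name && x.address === y.address && x.phone === y.phone;
}
```

The point of normalizing first is that "(555) 010-2030" and "555-010-2030" describe the same business and shouldn't be flagged as an inconsistency.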

So there you have it! Hopefully you found some ideas on what to do if your listing is being filtered on Google local results.

What are some tips that you have for keeping your business "unfiltered"?







SEO

via SEOmoz Blog https://moz.com/blog

October 31, 2017 at 03:43PM

Now You Can Share Links to Snapchat from Third Party Apps on iOS by @MattGSouthern

10/31/2017



Snapchat recently updated its iOS app with a feature that was previously only available on Android — link sharing from third-party apps.

When you’re viewing content in another app and tap the share button at the bottom of the screen, you will now see Snapchat listed among the apps you can share to.

A link to the content can then be sent as a private message in Snapchat to one person or to as many people as you like.

This opens up another source of social media referral traffic for publishers, which is becoming ever more elusive with major social networks tweaking their algorithms to favor paid content.

Along with this update, the latest version of Snapchat has another new feature that is worth mentioning.

Ghost Mode

Want to go off the grid and share Snaps without alerting others of your location? Turn on Ghost Mode.

Enabling Ghost Mode from the settings menu allows users to stop sharing their location for a duration of 3 hours, 24 hours, or indefinitely until turned off.

This will be useful for people who want to share pictures and videos without broadcasting their exact location at the time.

Both of these updates are available now in the latest version of Snapchat for iOS.





via Search Engine Journal http://ift.tt/1QNKwvh

October 31, 2017 at 03:31PM

SearchCap: Google service ads, local search & Halloween

10/31/2017


Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google service ads, local search & Halloween appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.




via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

October 31, 2017 at 03:01PM

Google: Improving Low Quality Content is Better Than Deleting It by @MattGSouthern

10/31/2017



Google’s John Mueller has cleared up any confusion around what to do with low quality content when he was asked about it today during a Google Webmaster Hangout.

First, Mueller gives a decidedly non-technical definition of low quality content:

“So, in general, when it comes to low quality content that’s something where we see your website is providing something but it’s not really that fantastic.”

What should you do if you happen to come across low quality content on your site? There are two approaches, Mueller says, so don’t jump to the conclusion that the content should be deleted.

The best solution is to keep the content where it is and improve upon it.

“There are two approaches to tackling this. On one hand you can improve your content, and from my point of view if you can improve your content that’s probably the best approach possible.”

Really take a look at the content and think about why you published it in the first place. If it has a purpose then make sure it adds value.

“You clearly had a reason for putting this out, now be serious about the content you put out and make sure it’s useful.”

Mueller adds that there may be extenuating circumstances where deleting the content is a better option.

Sometimes you may not be able to improve on low quality content because there’s just so much, or maybe it was all auto-generated at one point.

Rather than improving some pieces of content and not improving others, Mueller recommends being consistent and cleaning it all up.

Noindexing low quality content is an acceptable way of “cleaning it up,” Mueller says.

So there you have it. If there’s low quality content on your site that can be improved upon — do it. If you can’t, get rid of it.





via Search Engine Journal http://ift.tt/1QNKwvh

October 31, 2017 at 02:27PM

Customer Experience in the Age of Social Media

10/31/2017


Join our social media and CX experts as they explain how social customer service tools can help brands provide winning digital customer experiences. They’ll discuss how to manage that experience across multiple social touchpoints, leverage evolving social customer service tools and platforms to deliver long-term value, and act on real-time customer insights to drive social ROI.

Attend this webinar and learn:

  • social strategies that drive loyalty and advocacy throughout the customer journey.
  • social customer service response techniques that meet – and exceed – customer expectations.
  • how global brands use social networks and communities to grow their customer bases.

Register today for “CX in the Age of Social Media,” produced by Digital Marketing Depot and sponsored by Lithium.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Digital Marketing Depot is a resource center for digital marketing strategies and tactics. We feature hosted white papers and E-Books, original research, and webcasts on digital marketing topics – from advertising to analytics, SEO and PPC campaign management tools to social media management software, e-commerce to e-mail marketing, and much more about internet marketing. Digital Marketing Depot is a division of Third Door Media, publisher of Search Engine Land and Marketing Land, and producer of the conference series Search Marketing Expo and MarTech. Visit us at http://ift.tt/XKa9gM.





via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

October 31, 2017 at 02:10PM

Oh, no! AdWords can now spend double your budget. Or not…

10/31/2017


In case you hadn’t already heard, AdWords can now spend up to double your campaign’s daily budget… which is pretty darned irritating!

Fortunately, your favorite PPC superhero is here to save the day.

Yep, here I am! So let’s see if we can’t script our way out of this mess.

For 99 percent of campaigns, I’d normally recommend not using budget caps at all — I like to “tap it, not cap it,” which basically means it’s better to control spend through bids (and ROI) rather than closing up shop with budgets.

However, there are certain instances where budgets are not just useful, but essential — for example, if a client has a specific budget attached to a particular campaign. Yes, Google, some people actually have limited marketing budgets!

At the very least, you should know when the overspend is happening, so you can judge for yourself whether said overspend should continue.

If you’d really like to keep a close eye on costs, have a look at our script to track your account’s spend every hour. For those who only want to be alerted when campaigns are over their budgets, this is where the new script comes in!

This latest script from Brainlabs (my employer) checks each campaign’s spend and budget. All you need to do is set a multiplier threshold — if the spend is larger than the budget multiplied by the threshold, then the campaign is labeled. You’ll get an email listing the newly labeled campaigns, along with their spend and budgets. And if you want, you can set another threshold so that if the spend gets too far over your budget, the campaign will be paused.


The script also checks if the campaign’s spend is under your labeling and pausing thresholds, so it can unlabel and unpause them. That means that when it’s a new day and nothing has been spent yet, the labels will be removed, and anything the script has paused will be reactivated. It also means that if a campaign is over budget, but you increase its budget, the labeling and status will reflect the new, increased budget.

To use the script, copy the code below into a new AdWords Script and change the settings at the top:

  • campaignNameContains and campaignNameDoesNotContain filter which campaigns the script will look at. For example, if campaignNameContains is [“Generic”, “Competitor”], then only campaigns with names containing “generic” or “competitor” will be labeled or paused. If campaignNameDoesNotContain is [“Brand”, “Key Terms”], then any campaigns containing “brand” or “key terms” in the name will be ignored (and can overspend as they like).
    • This is not case-sensitive.
    • Leave blank, [], to include all campaigns.
    • If you need to put a double quote into campaignNameContains or campaignNameDoesNotContain, put a backslash before it.
  • email is a list of addresses that will be emailed when campaigns are labeled or paused.
    • Note that emails will be sent even when the script is being previewed and not making changes.
  • currencySymbol, thousandsSeparator and decimalMark are used to format the budget and spend numbers in the email.
  • labelThreshold determines how much the campaign must spend, compared to its budget, for the script to label it as overspending.
    • For example, if you set labelThreshold to 1, then campaigns will be labeled and you will be emailed if the spend is equal to the budget. If you set it to 1.2, then the campaign is labeled and email sent if spend is 120 percent of the budget.
  • labelName is the name of the label that will be applied to overspending campaigns.
  • Set campaignPauser to true if you want campaigns too far over their budgets to be paused. Set it to false if you do not want the script to pause campaigns, no matter how much they spend (the script will still label and email you according to the labelThreshold).
  • pauseThreshold determines how much the campaign must spend, compared to its budget, for the script to pause it (if campaignPauser is true).
    • This works the same as labelThreshold: If it is 1.2, then campaigns will be paused if their spend is 120 percent of the budget.
    • This must be greater than or equal to the labelThreshold. The script needs the paused campaigns to be labeled so it knows which to reactivate when their spend becomes lower.
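The threshold logic the settings above describe can be sketched in plain JavaScript. To be clear, this is an assumption about how such a script might evaluate a campaign, not the actual Brainlabs script (which would use the AdsApp API to read spend, apply labels, and pause campaigns); the setting names simply mirror the ones explained above:

```javascript
// Illustrative sketch of labelThreshold / pauseThreshold behavior.
// labelThreshold = 1 means "label when spend reaches 100% of budget";
// pauseThreshold = 1.2 means "pause when spend reaches 120% of budget".
const labelThreshold = 1.0;
const pauseThreshold = 1.2;

function evaluateCampaign(spend, budget) {
  // The article notes pauseThreshold must be >= labelThreshold,
  // since paused campaigns must also carry the label to be reactivated later.
  if (pauseThreshold < labelThreshold) {
    throw new Error("pauseThreshold must be greater than or equal to labelThreshold");
  }
  if (spend >= budget * pauseThreshold) return "pause";
  if (spend >= budget * labelThreshold) return "label";
  return "ok"; // under both thresholds: any label/pause would be reverted
}
```

With these settings, a campaign on a 100-unit budget would be labeled at 100 spent and paused at 120, and would be unlabeled/reactivated once a new day resets spend below the thresholds.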

Preview the script to make sure it’s working as expected (and check the logs in case there are any warnings). Then set up a schedule so the script runs hourly.

A few things to note:

  • The script only works with search and display campaigns! It can’t help with video, shopping or universal app campaigns.
  • This script can’t completely prevent your going over budget. The script only runs hourly, so the campaign can go over the spend threshold between runs. And there’s a 15- to 20-minute lag in the spend data.
  • Scheduled scripts don’t run on the hour, so campaigns will not be reactivated as soon as a new day begins. Instead, they’ll be reactivated when the script first runs on the new day, sometime between midnight and 1:00 a.m. Most campaigns receive little traffic at this time anyway — but if that’s not the case for you, you might want to set up automated rules to unpause things exactly as midnight strikes.
  • You can set labelThreshold to be less than 1. For example, if you set it as 0.9, you’ll get an email when the campaign reaches 90 percent of its budget.

Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Daniel Gilbert is the CEO at Brainlabs, the best paid media agency in the world (self-declared). He has started and invested in a number of big data and technology startups since leaving Google in 2010.





via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

October 31, 2017 at 01:56PM

The ever-growing local search universe

10/31/2017


For those who missed it, Whitespark’s overhaul of the US Local Search Ecosystem interactive tool was recently released, and it does a fantastic job of showing how vast and complex the search industry has become. The ecosystem visualizes the web of search engines, data providers, publishers, directories and other businesses that use local data about businesses to power one simple action that people do every day: search online.

For example, the infographic identifies Infogroup, Acxiom, Neustar/Localeze and Factual as the primary data aggregators, which collect and validate location data from businesses and share that data with publishers such as Apple, Bing, Foursquare and Google. (I refer to data aggregators and large publishers collectively as data amplifiers because they share a business’s location data not just directly with searchers, but also with other apps, tools, websites and businesses that, in turn, reshare that data to people across the digital world.)

In Whitespark’s words, the ecosystem “shows how business information is distributed online, who the primary data providers are, how search engines use the data, and how it flows.” The interactive tool helps you understand the importance of sharing accurate location data and the consequences of maintaining inaccurate data.

For example, because data aggregators influence a web of businesses across the ecosystem, it’s imperative that businesses meet the data formatting requirements of each aggregator. And as you can see, the ecosystem is complex:

The Local Search Ecosystem

Local search expert David Mihm originally developed this infographic in 2009, and over the years, the ecosystem has changed dramatically to reflect the rich palette of destinations that people weave together throughout the process of discovery, as well as the number of companies that influence whether a business’s location data appears as it should when, say, a searcher finds them on Facebook, Yelp or Uber.

A post on the Whitespark blog by Nyagoslav Zhekov dramatizes this evolution, tracing some of the businesses that have joined and departed. For instance, back in 2009, Apple did not even appear on the ecosystem, and Myspace did. In 2017, Apple is one of the principal data amplifiers, and Myspace is not a factor. You can tell with a quick glance at the 2009 version of the infographic how far the industry has grown:

2009 Local Search Ecosystem

Now, here’s the interesting part: As far-reaching as the new infographic is, it’s just the tip of the iceberg. The infographic does not come close to identifying all the companies that license business information from data amplifiers or use it as a starting point to build out their own curated business directory. For instance, a quick glance at the following three lists of local citation sources shows dozens of additional places where business information exists:

Many of the businesses that appear on these lists overlap with those on Whitespark’s local search ecosystem, and they have the same role: receiving and sharing location data that influences which locations appear in search results. But many names on the top citations lists didn’t make the cut and are not part of the infographic. Why? Because of two factors that influence each other:

  • Consumers are using search in more far-reaching and sophisticated ways. They’re using apps, social media sites, websites, search engines and a host of other touch points to do increasingly refined searches for things to do, places to go, services to use and things to buy. They expect the digital world to provide instant access to restaurants, plumbers, museums, tattoo parlors, places for Magic the Gathering meet-ups, places to find spoken poetry and so on. Because of this behavior, the thousands of mobile app platforms, ad networks, navigation systems, data services, social media companies, search engines, directories and so on currently using business information provided by the data amplifiers would make the infographic difficult to comprehend — similar to the Marketing Technology Landscape.
  • At the same time, the ephemeral nature of many of these tools means that the infographic would rapidly be out of date as the various startups or branches of larger organizations either sunset or consolidate into a larger entity. I find it interesting that the fundamental reason the infographic can never be a truly representative look at the scale of the local search ecosystem is the exact reason that focusing your location data management on the data amplifiers is so critical today — something the infographic illustrates well.

The 2017 local search ecosystem is a brilliant foundation to get businesses grounded in the most influential sources of location data. But as the above examples demonstrate, the scope of location data companies far exceeds the Whitespark infographic. Put another way: Consider each wedge on the infographic to be a gateway to even more specialty sites by category.

The scope of location data directories, publishers and aggregators can seem overwhelming. But if you manage multiple brick-and-mortar storefronts, don’t despair. You need not have a presence on every directory on the lists I’ve cited. It’s far more important to focus your efforts on building relationships with data amplifiers. When you share your data with the core aggregators and publishers, you create two advantages for yourself:

  • Amplifiers do the heavy lifting for you by disseminating your data among all the places that require it, however obscure, where your data appears.
  • You stay up-to-date on the emerging technologies and products that the data amplifiers create. Google alone constantly updates its algorithms and products to improve search. By having a relationship with Google — such as publishing your data on Google My Business — you are on the ground floor when product updates happen and when Google launches new products.

Understand the scope and richness of the location data ecosystem. Make sure you are constantly optimizing your data and content to be found everywhere. And let the data amplifiers help you succeed across the ecosystem.


Some opinions expressed in this article may be those of a guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Adam Dorfman is the Senior Vice President of Product & Technology at SIM Partners, where he leads the teams responsible for the best-in-class local automation platform, Velocity. Follow him on Twitter @phixed.





via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

October 31, 2017 at 10:47AM

6 Reasons Why Some People Think Black Hat SEO Tactics Work by @benjarriola

10/31/2017



Everybody has their own definition of “black hat SEO”.

Put simply, black hat SEO includes any techniques that are against Google’s guidelines.

Some people view them as a fast track to achieve higher rankings.

In fact, many SEO practitioners believe black hat SEO tactics are useful and they encourage others to use them.

Some common black hat SEO practices involve:

  • cloaked content
  • automated link spamming
  • keyword stuffing
  • use of private blog networks (PBNs)
  • forum and social bookmarking
  • profile link spamming
  • and more.

Make no mistake, however:

If your site uses black hat SEO, and Google finds out algorithmically or manually, you will be penalized.

Sometimes, you may notice a site that ranks well and clearly uses these black hat techniques without penalty.

This can mislead (and has misled) some SEO pros into reconsidering whether it’s worth a move to the dark side of black hat SEO.

Instead of immediately believing that these black hat techniques are good ideas just because you find them working on a certain website, let’s try to understand why situations like this happen.

Here are six reasons why a site using black hat SEO tactics may still rank well in search engine results.

1. The SEO Campaign Is Still in an Early Stage

SEO is a marathon, not a sprint.

Many black hat SEO techniques will cause a sudden boost in rankings.

This allows black hat SEO practitioners to provide much faster short-term results.

But these quick results will also disappear quickly.

If the sudden improvement in rankings was shown in a presentation, a conference, or a meeting to someone who only saw the rise in rankings, and who never looked at how those rankings progressed afterward, that person will never see the long-run performance. They will tend to believe only what they saw in that short window and assume the tactic is safe for the long haul.

Many black hat SEO practitioners are even aware of this yet do it anyway.

Why?

Some of them do the churn-and-burn type of SEO: they are typically ad publishers or affiliate marketers* doing SEO on their own websites and their own domain names. If a site gets a sudden boost, makes a profit, then crashes in the rankings and gets banned, they simply throw away the domain and start over with a new one.

It’s great that some random black hat SEO guy did something that gave quick, immediate gains.

But that doesn’t mean you can do it with your SEO clients.

You want to make sure your clients remain in the search results for the long run.

Slow and steady progress is better than quick and unstable growth that may give risky results.

*Not all affiliate marketers and ad publishers are black hat SEOs. But many SEOs that do the churn and burn are often affiliate marketers or ad publishers because they are not invested in the long-term survival of a domain name.

2. Some Industries Are More Competitive & Spammier Than Others

Many industries use aggressive marketing strategies.

You’ve likely seen some of the largest offenders in your email spam folder.

Oftentimes, people in industries known for aggressive black hat SEO use every digital marketing channel just as aggressively.

Take, for example, industries such as adult entertainment, online casino, prescription medicine, and even mortgage and loans. These fields often include black hat SEO as part of their online marketing arsenal.**

So many of the sites within these particular industries use black hat SEO tactics that Google doesn’t really have much of a choice other than to include black hat sites in the top rankings – there is simply no one else to choose.

SEOs that see this in practice may initially believe the black hat practices work, especially when every one of the top results in a competitive market is using black hat strategies.

Fortunately, most of the tactics used in industries saturated with black hat practices simply won’t work in less aggressive industries.

What are the chances of you ever getting close to the top-ranking sites using spammy tactics (as defined by Google)? Slim – especially in the long run.

**Sometimes the companies in these industries aren’t the ones doing black hat SEO, but it is the resellers, dealers, distributors, or affiliates of these companies.

3. The Targeted Market Is in a Less Competitive Geographic Area

Whether in casual conversations with SEO professionals from other countries, in online forums and chat rooms, or even in formal conference presentations, you’ll sometimes notice one or two SEOs preaching about how they believe black hat techniques work.

They will show complete case studies with actual results, so you cannot deny that it did work.

But, don’t be tricked by this.

Not all countries are at the same SEO maturity level.

Thus, the lack of competition alone provides an environment in which a site doing any kind of SEO will rank higher than those with no SEO at all, even if the SEO is black hat.

There are still some countries that can get away with these techniques, not because Google has a different algorithm in each country (although they do have small tweaks here and there), but because there are far fewer websites competing for rankings.

When listening to other people about their black hat successes, try to consider all factors of their situation, including geographic location of the target audience.

4. They Are Targeting a Very Long-Tailed Keyword

Black hat or white hat SEO, if the keyword phrase is long enough, it is more than likely unique to the site you are optimizing; therefore, you will probably rank for the term regardless of the type of SEO you utilize.

If someone is preaching how great black hat SEO is – and is presenting their ranking for an eight-word keyword phrase that includes a word or two that is totally unique – that is not impressive at all.

Almost anyone doing SEO can achieve this.

Most SEOs are knowledgeable enough to not fall for this sham, but I know there are still many SEO newbies that might go down the wrong path if they are not fully aware.

5. Even If They Use Some Black Hat SEO Tactics, They Are Overpowered by Their White Hat SEO Tactics

Many SEO campaigns involve a variety of tactics and strategies.

SEO practitioners doing multiple things simultaneously can get confused about which tactics are actually helping SEO performance the most.

If an SEO did a hundred white hat tactics and one black hat tactic, resulting in outstanding rankings, the white hat SEO may be overpowering the black hat SEO on a large scale.

But, depending on who is interpreting these ranking factors, some might put too much focus on the black hat tactic they identified and fail to look at the white hat tactics implemented, thereby falsely attributing the ranking success to the black hat tactics.

When looking at a site’s SEO performance, even if you have identified some black hat SEO going on, try to assess all the ranking factors involved.

6. Google Hasn’t Detected It Yet and No One Has Reported It

Black hat SEO tactics tend to be very old.

Google is well aware of these methods.

They don’t work anymore in the long term.

Yet, there are times when SEO practitioners discover a new black hat tactic and seem to get away with it.

The tactic’s newness is like a loophole that Google just doesn’t know how to handle yet.

Other black hat SEOs may notice it, see it’s working, and start doing it as well.

When it starts to become a popular black hat SEO tactic, more SEOs find out about it and some may submit it through Google’s spam report form.

As more reports are submitted, and Google becomes aware of it, they might start taking manual actions and begin working on an algorithm update to prevent these tactics from working.

This is the whole basis of many of the biggest Google algorithm updates.

Google’s algorithms today help prevent many identified black hat SEO tactics that were rampant in the past.

Consider All Factors of the Situation Before Deciding If an SEO Technique Works

This advice applies not only to assessing black hat techniques but to any technique in general.

Conduct a good scientific analysis and look at all factors possible to be able to draw the right conclusions.

If a theory comes out of your observations, just remember a good theory is a repeatable one.

If you think a specific factor is the main cause of a ranking change, you can confirm that by keeping the other factors as constant as possible while changing the factor in question and observing its effect.






SEO

via Search Engine Journal http://ift.tt/1QNKwvh

October 31, 2017 at 10:25AM
0 Comments

One of Google’s Biggest Mistakes Was Naming Algorithm Updates by @askreinhart

10/31/2017

0 Comments

 
http://ift.tt/2z0Skon


http://ift.tt/2ij7du3


When I began my career in digital marketing, I had a tough time describing what I did to other people. SEO wasn’t a known profession back then, and it certainly wasn’t something that you could easily bring up in conversation.

I remember the weird looks, the fake head nods, and the smiles from people acknowledging that I did something they hadn’t heard of, but politely not wanting to admit it.

As the years went on, things changed. People know what SEO is now, they care about it, they are worried about it, they are held against numbers that depend on it — it’s a thing. All because it was given a name.

When something doesn’t have a name, it’s not real, and like most things at their humble beginnings, SEO was no different. No one was worried about it; it was just this thing they had heard about that may or may not affect them sometime in the distant future.

Now it has a name. And everyone is interested in this set of rules and activities that result in “free” traffic to their websites and helps their bottom line.

What’s in a Name?

So why the diatribe about the name of the profession?

I want to get the point across that when you give something a name you make it real, which gives it power — and the largest mistake Google made in SEO was naming their algorithm updates.

Why?

Because it made algorithm names a thing, which gave them an inordinate amount of power over this profession that still exists today.

Going back to the Boston and Florida updates in 2003, up until Penguin 4 in 2016, we have enjoyed a host of memorable names that have been applied to updates, and with every update came confusion, anger, and message boards filled with hate for Google.

Matt Cutts, the former head of the web spam team at Google, stepped in and became the scapegoat for all the ire that webmasters felt toward Google. Cutts did his best to answer questions about updates, what they were, why they were doing this — to no avail.

People were mad at these updates.

They ruined businesses.

SERPs became less predictable.

In short, Google had made organic search harder.

SEOs lashed out at Panda and Penguin in comment sections across the internet, spent days and months digging into each nuance to find the reason their sites dropped, and cursed Google every step of the way.

People got lost in that anger and over-analysis, and ultimately spent more time in search of silver bullets than they did accepting the real reason they got hit by the shift:

Their site probably sucks.

When you think about the amount of time and energy that is wasted on a daily, weekly, monthly, and annual basis investigating Google’s algorithms and how they may or may not have affected a site, the picture is worrisome.

What everyone should be doing is taking that time and energy and harnessing it into making their sites better, not complaining in comment sections.

We’re only hurting ourselves because while analysis into traffic drops is necessary, we need to control how far we follow that rabbit down the hole — and for the past few years, we’ve all been Alice.

We need a different perspective on algorithm changes.

Looking at It from a Different Perspective

One of the more impactful statements that I have heard in the last couple of years on this subject comes from the CEO of my company, Seth Besmertnik.  He showed the featured image at the top of this post and said:

“There is only one algorithm that really matters. It’s the heart, mind, and soul of your customers.”

This really stuck with me. He has used it in several presentations since then, and I have even included it in a few of my monthly webinars, but that message has changed the way that I look at Google updating its algorithm:

Every algorithm update is a quality update.

Google hasn’t given an official name to an algorithm in quite some time, but folks in the industry have.

Pigeon, Possum, Fred, Columbus, etc. were never named by Google. They were all just labeled “Quality Updates.”

Folks in our industry took it upon themselves to give these updates a name, which gave them power and started the cycle of complaining and analysis paralysis all over again.

What if we all just thought about all algorithm updates as “quality updates” and not these separate entities that are working against us?

What if we took the changes in stride, and when we saw a decline we took action instead of going to comment sections, social media, or forums to complain?

A good example of this out in the real world:

I had a conversation with a friend the other day who was becoming increasingly worried about the shifts he was reading about from Glenn Gabe, president of G-Squared Interactive. He follows Gabe religiously and emailed me for advice on what he could do because he was seeing some fluctuations with his traffic.

I dug through his analytics and rankings and any other data I could get my hands on and realized that he had led me down the rabbit hole again. I then thought to myself, “this is an opportunity to change someone’s mind,” and I wrote him the following response:

I don’t see anything alarming, we are still up and pretty comparable to the previous weeks, so while tomorrow could look different, today we seem to be fine.

Have to look at your site through some blinders sometimes. Your site is not any other site on the web.  All the work you have done has been towards upping the quality of the site, which is why you have done well. All of these algorithm shifts are quality updates, regardless if they have a name or specific purpose or not. Try not to obsess too much over them, it’s out of our control and all we can control is how well the site is taken care of.

As long as you keep down that road, you’ll be fine. Ups and downs will happen, but algo shifts are like death and taxes, there’s nothing we can do about them, so we might as well not worry about it.

His response?

Well put. Thanks.

I caught up with him a few days later and asked him how it was going. He told me his fears were dispelled and he wound up writing five really killer blog posts that are already seeing a lift for a few niche products he offers in his space.

He stopped worrying about an algorithm update, and he instead spent that time in a more productive manner which improved the quality of his site.

This is how we should be spending our time. This is how I am telling my clients to spend their time.

If you are seeing declines when these shifts happen, you only need to do one thing:

Take a good hard look at your site and make it better.

Every update is a quality update, regardless of whether it has a name or specific function.

Stop spending your time searching for silver bullets and start making your site better.


Image Credit

Featured Image by Conductor. Used with permission.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

October 31, 2017 at 10:25AM
0 Comments

8 Ways to Measure Social with Google Analytics by @coreydmorris

10/31/2017

0 Comments

 
http://ift.tt/2xGO5wz


http://ift.tt/2gWSeJI

The maturity and widespread acceptance of social media marketing, combined with the expectation of being able to track everything in the era of big data, has created a lot of expectations.

It has also raised deeper questions about performance.

Using Google Analytics, we have the power to go deeper to prove the impact and value of our marketing efforts.

We should never start the answer to a question months into a social media campaign with “I think” when we have the capacity to know the impact of digital marketing activities for sure.

Google Analytics can be a great source of deeper insights for the social media marketer.

This post will show you eight specific reports and areas to leverage for painting a clearer picture of how social media is involved in driving website traffic and conversions.

What is often overlooked when using Google Analytics is the full customer journey. By default, Google Analytics applies “last click” attribution to conversion reporting, which doesn’t show the full picture of how visitors are interacting with our websites.

1. Audience > Demographics & Interests

When running a social media campaign, we need to make sure we’re getting the right target personas to the website.

Not all clicks are the same.

When we look only at our social media platform analytics (Facebook, Twitter, LinkedIn, etc.), we’re just seeing engagement and clicks from their perspective. We don’t see what the visitors ultimately do when they land on our site.

Even with our best attempt at demographic and interest targeting, we aren’t guaranteed that the right people are clicking through.

Google Analytics allows us to see the basic demographic information, as well as interests, of our visitors. We can filter it down to social media traffic as a segment.

This is powerful information to ensure that our social media targeting is actually driving the traffic we expect and can help with insights for related audiences to expand our targeting.
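One practical way to make that segmentation reliable is to tag the links you share on social networks with UTM parameters, so Google Analytics can bucket the visits by source, medium, and campaign. A minimal sketch using only the Python standard library (the URL and campaign name here are hypothetical examples):

```python
from urllib.parse import urlencode, urlparse, urlunparse, parse_qsl

def tag_url(url, source, medium="social", campaign=None):
    """Append UTM parameters to a URL so GA can attribute the visit."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve any existing params
    query.update({"utm_source": source, "utm_medium": medium})
    if campaign:
        query["utm_campaign"] = campaign
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_url("https://example.com/landing", "facebook", campaign="fall_promo"))
# https://example.com/landing?utm_source=facebook&utm_medium=social&utm_campaign=fall_promo
```

Using consistent utm_source and utm_medium values keeps social sessions from being lumped into referral or direct traffic.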

2. Audience > User Explorer

To validate what we expect and assume in the customer journey, the audience user explorer is a great tool for drilling down to see how a sampling of visitors encountered our website.

While the users are anonymous and we don’t get full details, it’s still helpful to see how repeat visitors entered the site, navigated through it, and when they returned.

Using this info you can find patterns and challenge assumptions about how visitors engage with your content.

User Explorer

3. Conversions

Conversions can be configured for e-commerce sales, lead form submissions, email signups, sessions that include visits to a specific page, and other self-defined goals.

Like most reports in Google Analytics, we can isolate social media traffic as a segment to view conversions.

What is critical to understand is that, by default, when we look at conversions in Google Analytics, we’re seeing them classified by what source a visitor entered the site (direct, SEO, PPC, social, email, referral, etc.) when they met the conversion criteria.

This is the “last click” attribution model: credit goes to the source of the session in which the conversion actually happened. It doesn’t take into consideration whether the user came to the site three times prior to converting, or what the sources of the first and second visits were.

In a lot of cases, we see that social media drives early visits in the customer journey even when it ultimately isn’t the source of the “last click” to the site that led to the conversion.

When we only look at last click we aren’t seeing the full picture and impact of all of the marketing channels.

4. Conversions > Model Comparison Tool

To help combat the short-sighted data view of last click attribution that Google Analytics defaults to, we can use the attribution model comparison tool.

It allows you to drill down into your conversion data by channel and switch the model from “last click” to “first click,” “linear,” and other options.

You can even import models from others or create your own custom model.

You don’t have to be a statistician to learn about attribution model concepts, and Google links to some helpful content in this area so you can better understand how the different models impact your reporting.

Attribution Model Comparison Tool
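The difference between the models is easy to see in code. This sketch distributes one conversion’s worth of credit across the channels in each visitor’s session path under last click, first click, and linear rules; the paths are made-up illustrations, not Google Analytics data:

```python
from collections import defaultdict

def attribute(paths, model="last"):
    """Distribute one conversion credit per path across its channels."""
    credit = defaultdict(float)
    for path in paths:
        if model == "last":
            credit[path[-1]] += 1.0          # all credit to the final touch
        elif model == "first":
            credit[path[0]] += 1.0           # all credit to the first touch
        elif model == "linear":
            for channel in path:             # equal share to every touch
                credit[channel] += 1.0 / len(path)
    return dict(credit)

# Three converting visitors and the channel that brought each session:
paths = [["social", "email", "direct"],
         ["social", "direct"],
         ["organic"]]

print(attribute(paths, "last"))    # {'direct': 2.0, 'organic': 1.0}
print(attribute(paths, "first"))   # {'social': 2.0, 'organic': 1.0}
print(attribute(paths, "linear"))
```

Note how social gets zero credit under last click but the most credit under first click: same data, very different story, which is exactly why comparing models matters.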

5. Conversions > Assisted Conversions

Assisted conversions is an underutilized statistic and report. It has been around for years, but wasn’t as prominent until more recently when attribution became a digital marketing industry buzzword. It can be helpful in painting a complete picture of the impact of different channel sources.

Assisted conversions are credited when Google Analytics has tracked multiple sessions for a user prior to the conversion.

Even though last click attribution is applied by default, this report credits the sources through which a user entered the site in sessions prior to the final one in which they took a conversion action.

For channels like social media, this is another valuable area where you can track how much social contributes as a source earlier in the customer journey.

If you’re running an e-commerce site and have revenue data in Google Analytics, you can see an actual dollar amount attributed to the source as an “assist.”

Assisted Conversions
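Conceptually, an assist goes to any channel that brought a visitor in before the session in which they converted. A small sketch of that counting logic, again with hypothetical session paths rather than real GA data:

```python
from collections import Counter

def assisted_conversions(paths):
    """Count an assist for each channel appearing before the converting session."""
    assists = Counter()
    for path in paths:
        # Each distinct channel among the earlier sessions earns one assist;
        # the final channel gets the last-click conversion instead.
        for channel in set(path[:-1]):
            assists[channel] += 1
    return assists

paths = [["social", "email", "direct"],
         ["social", "direct"],
         ["organic"]]
print(assisted_conversions(paths))  # Counter({'social': 2, 'email': 1})
```

Here social assisted two of the three conversions even though it never got last-click credit, which mirrors the story this report tells for social traffic.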

6. Conversions > Top Conversion Paths

While assisted conversions is an aggregated stat for channels that drove visits earlier in the customer journey, it’s also important to know the specific journeys – or, as Google Analytics labels them, paths to conversion – based on sessions.

This report is great at showing the specific combinations of sources driving visits to the site and frequency of the combinations.

You can really see how the customer journey is playing out in this report.

Conversion Paths
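Under the hood, this report is essentially a frequency count over source combinations. A quick sketch with hypothetical paths shows the idea:

```python
from collections import Counter

# Each tuple is one converting visitor's sequence of entry channels:
paths = [("social", "direct"), ("social", "direct"),
         ("organic",), ("social", "email", "direct")]

# Count how often each exact combination occurs, most frequent first:
top_paths = Counter(paths).most_common()
for path, count in top_paths:
    print(" > ".join(path), "x", count)
# social > direct x 2
# organic x 1
# social > email > direct x 1
```

Seeing "social > direct" as a frequent path is a common signal that social introduces visitors who later return directly to convert.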

7. Acquisition > Social Reports

Google Analytics has added some reports exclusively focusing on social media. While you can find all of this information in other areas of Google Analytics by going into the separate reports and selecting social media as a filtered segment, it’s nice to have all of these preset and tailored to social traffic.

The reports include:

  • An overview.
  • Drill-down into specific social media sites that are sending traffic.
  • The top landing pages for traffic coming from social.
  • Conversions specific to social media networks.
  • The flow of users within the site showing how they engage with content once they are on your site.

These specific reports can provide valuable insight into whether your social traffic is meeting your goals once those visitors are on your site.

While some social strategies focus on engagement and brand awareness that stays on the social media platforms, if you’re focusing on getting users to your site, you’ll want to know as much as possible about what they do when they get there.

Social Reports

8. Benchmarking

The benchmarks report in Google Analytics is great because you can compare your traffic metrics to those in your industry.

You can even change the dropdown to different industries and segments and see how the numbers change.

Whether you operate in a niche or broad category, this tool is incredibly helpful for setting expectations and goals for improvement across different channels including social media traffic.

Benchmarking

Takeaways

The key to reporting and showing our successes in social media marketing relies heavily on showing the full picture.

The true impact of social media can only be seen within the full customer journey.

By focusing on the whole journey, and steps of consideration before a prospect becomes a lead or customer, we can get attribution right.

When we know what we’re measuring and where to get the data, we can create a reporting plan that is right for our needs: one that keeps meaningful data coming to us and integrated into our mindset, rather than leaving it somewhere in the big data world where it seems hard to find or too overwhelming to piece together.



Image Credits
Screenshots by Corey Morris. Taken August 2017.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

October 31, 2017 at 10:25AM
0 Comments

All content copyrighted (C) 2010 ~ 2020
​All Photos & Content Used Under Creative Commons
​www.RickRea.com 701-200-7831
Privacy Policy