RICK REA: Helping You Grow Through Online Marketing

SEO Marketing News

SearchCap: Google audiobooks, local spam & SEO metrics

11/30/2017

1 Comment

 
http://ift.tt/2kAEqFY

SearchCap: Google audiobooks, local spam & SEO metrics

http://ift.tt/2i6jdzC

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google audiobooks, local spam & SEO metrics appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.




SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

November 30, 2017 at 03:00PM
1 Comment

Daily Search Forum Recap: November 30 2017

11/30/2017

0 Comments

 


Daily Search Forum Recap: November 30, 2017

http://ift.tt/2AJEb2H

Here is a recap of what happened in the search forums today...





SEO

via Search Engine Roundtable http://ift.tt/1sYxUD0

November 30, 2017 at 03:00PM
0 Comments

Law firms spamming Google My Business: Don't trust your money or your life to them!

11/30/2017

0 Comments

 
http://ift.tt/2jAMvqy

Law firms spamming Google My Business: Don’t trust your money or your life to them!

http://ift.tt/2jy9PF2

Last year, I wrote a piece addressed to SEO companies showing how much they were spamming Google Maps and giving the industry a bad reputation. If I worked at Google, this type of stuff would make me hate SEO companies and have no desire to help them.

Lately, I’ve been seeing this same level of spam (or worse) in the legal industry. If you’re an attorney or a marketing agency that works with attorneys, this article is for you.

Personally, if I were looking to hire an attorney and trust my money and my life to someone, the last place I would look is Google, due to my knowledge about how unreliable the information is and how fabricated the reviews are. Let’s get into some specifics.

Fake reviews

Attorneys often complain about how hard it is to get their clients to leave reviews. I get it. Someone rarely wants to publicize who they hired to help them with their divorce or admit that they had to hire a criminal lawyer. This does not, however, excuse what attorneys are doing to get reviews in spite of this.

One common trend amongst attorneys currently is review swapping. Although sites like Avvo might have sections that encourage peer reviews, they do a good job of separating them so that consumers realize they are not reviews from clients.

Google has no such distinction and is very clear in their guidelines that reviews should be about the customer experience. Attorneys you are friends with all around the country do not count as customer reviews. I say this because so far, every review that fits this scenario that I’ve reported to Google has been removed.

In addition to violations of Google’s guidelines, quid pro quo attorney review circles may violate attorney ethics rules. According to Gyi Tsakalakis, a digital marketer with a focus on law firms:

Per the ABA Model Rules, with limited exceptions, lawyers aren’t supposed to give anything of value to a person for recommending the lawyer’s services. The quid pro quo nature of some of these review circles could be construed as a violation of this rule. At the very least, these communications could be interpreted as misleading, which is also prohibited by most states’ rules of professional responsibility.

There also could be legal implications to review swapping. In addition to it being against Google’s guidelines, it could also get you in trouble with the FTC. In an article I wrote on fake reviews earlier this year, Brandon J. Huffman, attorney at Odin Law, mentioned:

The FTC looks at whether you got something of value in exchange for your review. The thing of value is usually cash or a free product of some kind, but the positive review you receive is also something of value. So, this is really no different than a typical paid-for review under the regulations. Businesses would need to disclose that they received a positive review in exchange for their positive review.

Review swaps aren’t the only thing that can get lawyers in trouble with their state Bar Associations. A variety of fake review tactics are likely to lead to sanctions, such as having your employees pose as clients to leave reviews or paying someone to write fake reviews. Indeed, many law firms are just flat-out getting fake reviews posted.

Recently, in looking at the top 20 listings that ranked for personal injury lawyers in a major city in the USA, I found eight that had fake reviews (40 percent).

Fake listings

The most common practice for attorneys who want to rank in several cities is to create listings at virtual offices. When these are reported, Google has been pretty good at removing them. However, attorneys (and their marketing companies) are getting smart at this stuff and have found ways to trick Google My Business support into thinking their fake locations are real locations.

These are also clearly false, or at least misleading, communications about the lawyer's services — a clear violation of attorney ethics rules.

Fake photos

I have experienced this one many times. An attorney will submit photos on their listing that “prove” they exist there, even though the address belongs to a virtual office service provider. These photos are often:

• photoshopped images.
• photos of signs that were taped to a wall, only to be removed after the photo was taken.
• photos of a completely different location.

I actually visited an office recently that an attorney was using for a listing on Google. The photos of the signs that he posted did not exist there in real life. So he was willing to actually show up at the office and tape signs to the wall just to “show” Google that he is really at that location. There is a word we use in my circles to describe this type of thing — and it’s called lying.

As business author Stephen Covey says:

The more people rationalize cheating, the more it becomes a culture of dishonesty. And that can become a vicious, downward cycle. Because suddenly, if everyone else is cheating, you feel a need to cheat, too.

Using other attorneys’ addresses

This is another tactic I’m seeing on the rise in the attorney world. One attorney will get another attorney to accept the postcard from Google My Business so they can get an “address” in that town. Usually, they aren’t competition and practice different types of law, so there isn’t any negative impact on either party. This is also against the guidelines, and when caught, will be removed by Google.

I’m seeing more and more videos being used as evidence on the Google My Business forum to help prove businesses don’t exist at the address they are using. User Garth O’Brien posted another clever idea as a comment on an article by Mockingbird Marketing:

I was aware of a local law firm that did this in Washington. Their competitors called up each city and pointed out that law firm had a physical presence within their city. They inquired if that law firm was paying B&O tax in each city. The law firm was not, so each city called up and asked them to fork over some tax money. That law firm quickly erased each profile for every city [where] they did not have a physical presence.

Keyword stuffing

The final tactic I see being used frequently is keyword stuffing. It’s an old trick that still works well. If you want to rank higher on Google, just shove “Best Attorney Ever City Name” into your business name field in Google My Business.

The problem is that Google will remove the keywords when they catch you. I have also seen them recently suspend a listing for an attorney who wouldn’t stop doing it. Currently, this guy has no ability to edit or control his listing on Google.

Summary

If you are sick of the spam you see in the legal industry, please continue to report it on the Google My Business forum. I urge you not to let these people get away with the tactics they are using. Also, no matter how tempting it is — never join them!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Joy Hawkins is a Local SEO expert who is a Google My Business Top Contributor. She regularly contributes to many online communities in the Local SEO world, including the Google My Business forum (Top Contributor), the Local Search Forum (Top Contributor), and the Local University Forum (Moderator). She is also a contributor to the Moz Local Search Ranking Factors survey. Joy is the owner of Sterling Sky in Canada and is the author of the Expert's Guide to Local SEO, which is an advanced training manual for people wanting a detailed look at what it takes to succeed in the Local SEO space.





SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

November 30, 2017 at 12:36PM
0 Comments

Get the most out of Data Studio Community Connectors

11/30/2017

0 Comments

 


Get the most out of Data Studio Community Connectors

http://ift.tt/2isPeWf

Data Studio Community Connectors enable direct connections from Data Studio to any internet-accessible data source. Anyone can build their own Community Connector or use any available ones.

Try out the new Community Connectors in the gallery

We have recently added additional Community Connectors to the Data Studio Community Connector gallery from developers including: DataWorx, Digital Inspiration, G4interactive, Kevpedia, Marketing Miner, MarketLytics, Mito, Power My Analytics, ReportGarden, and Supermetrics. These connectors will let you access data from additional external sources, leveraging Data Studio as a free and powerful reporting and analysis solution. You can now use more than 50 Community Connectors from within the Gallery to access all your data.

Try out these free Community Connectors: Salesforce, Twitter, Facebook Marketing.

Find the connector you need

In the Data Studio Community Connector gallery, it is possible for multiple connectors to connect to the same data source. There are also instances where a single connector can connect to multiple data sources. To help users find the connector they need, we have added the Data Sources page where you can search for Data Sources and see what connectors are available to use. The connector list includes native connectors in Data Studio as well as verified and Open Source Community Connectors. You can directly use the connectors by clicking the direct links on the Data Sources page.

Vote for your data source

If your data source is not available to use through any existing connector, you can Vote for your data source. This will let developers know which Data Sources are most in demand. Developers should also let us know which Community Connector they are building. We will use this information to update the Data Sources page.

Tell us your story

If you have any interesting connector stories or ideas, or if you'd like to share some amazing reports you've created using Community Connectors, please let us know by giving us a shout or sending us your story at [email protected].

Posted by Minhaz Kazi, Data Studio Developer Relations Team




SEO

via Google Analytics Blog http://ift.tt/1Yd8Id0

November 30, 2017 at 12:32PM
0 Comments

My 12 most important SEO metrics to monitor

11/30/2017

0 Comments

 
http://ift.tt/2Bor6rP

My 12 most important SEO metrics to monitor

http://ift.tt/2jzoi3C

As a digital marketer, you can measure the success of your work in several ways. One of those ways is by examining key SEO metrics.

Fortunately, there are plenty of tools that provide you with easy-to-read reports so you can check those metrics. Two of the best utilities, Google Search Console and Google Analytics, are not only offered for free, but most of the metrics you need to focus on can be gathered from either one of those tools.

But which metrics are the most important to track? Here are 12 that stand out from the pack.

1. Organic traffic

Organic traffic is defined as traffic you earn from appearing in the search engine results pages (SERPs) without paying for placement.

That’s the essence of SEO, after all. You want your site to rank for keywords related to your niche.

It’s important to track your overall organic traffic so that you can see how many people are visiting your site as a result of your SEO strategy.

By landing page

Overall organic traffic is sitewide. You also need to track organic traffic by landing page. Why? Because that’s how you can determine where you need improvement.

If you find that some pages are ranking on page 1 while others are on page 7, you know that you need to direct your SEO efforts towards those pages that are ranking poorly.

Additionally, if you’re using different SEO strategies for different pages, you’ll get an idea of which strategies work best when you compare rankings.

By location

It’s important to track where your organic traffic comes from. This is especially true if your SEO efforts are meant to target specific geographic locations or if you’re planning to expand your business into new markets.

First, you should track organic traffic by country. You might be surprised to learn that you have a strong fan base overseas. If that’s the case, then you may want to consider updating your marketing strategy to include expansion into these markets. (Time for some international SEO!)

Alternatively, if you’re seeing heavy organic traffic from countries that aren’t profitable for your business, you may want to figure out why that is. It’s possible that you may need to adjust your SEO strategy to focus more on your target countries.

Even if the vast majority of your organic traffic comes from within the US, it’s possible that your product or service appeals to people in some states more than others. The only way you can know that is by tracking organic traffic by state.

If you find that people in certain states like your brand better than people in other states, you can divert more marketing resources into those states so that you can improve sales. If states that are important to your business aren’t performing well, that may be a sign that you need to tweak your website experience to better target this audience segment.

Drilling down even further, it might be the case that your brand appeals to people in metropolitan areas. That’s why it’s good to examine organic traffic by city.

Again, allocate your resources where you’re likely to get the best ROI.

2. Organic bounce rate

The bounce rate tells you how many people “bounced” away from your site after only viewing one page. It’s measured as a percentage of visitors, with a lower number being better.

If you see that you have a high bounce rate, that may mean you need to do some on-site work to keep people around. For example, you could show links to related posts or other items of interest in the right-hand sidebar.

By landing page

It’s also a good idea to inspect the bounce rate by landing page. That way, you can see which landing pages tend to turn away visitors and which ones keep them hanging around for more.

If a landing page has a high bounce rate, that could indicate that the content on the page didn’t match the keyword the visitor plugged into the search engine. (It could also mean the person quickly found what they needed and left, so be careful here.)

3. Organic conversion rate

Remember: Organic traffic only gets people to your website — it doesn’t mean you’ve made the sale. That’s why you need to measure the conversion rate as well.

You’ll want to check your aggregate conversion rate for organic traffic. That way, you’ll get an idea of how well you’re appealing overall to people who arrive at your site from the search results. However, you’ll also want to drill down into various segments to see what factors are impacting conversion rates.

By landing page

You may wish to measure conversion rate by landing page. Why? Because conversions are usually won or lost on the page itself. If you find that one page has a much higher conversion rate than another, that could mean the lower-performing page doesn't have an effective marketing message.

By location

By tracking organic conversions by geographic location, you might find that your messaging appeals to people in specific areas. If you do find that your message resonates with people in one or more locations, follow basic principles of Business 101 and push more marketing dollars into those regions.

By device

It’s almost impossible to capture a healthy market share unless you appeal to a mobile audience. To check how well your site appeals to people on mobile devices, you need to check the conversion rate by device for organic traffic.

If you find that your conversions for desktop users are noticeably higher than conversions for smartphone or tablet users, then your site probably isn't optimized for a mobile audience. Run some tests and contact your development team to improve the mobile experience.

By browser

Your job would be a lot easier if there were only one browser and everybody used it. Unfortunately, that’s not the case.

That’s why you need to check conversion rate by browser for organic traffic.

If you find that people on one browser convert much higher than people on other types of browsers, that usually means that your site is user-hostile to people using those other browsers. Contact your development team and ask them to ensure that the site works across all popular browsers.

I recently worked with a client and found their site didn’t work on Samsung Galaxy phones. When we fixed it, they started making an extra $50,000 a month.

4. Top exit pages for organic traffic

Exit pages are the last pages that people visit before they leave your site. It’s important that you track the top exit pages. Why? Because those pages are probably your “problem children.”

They’re pages that cause people to lose interest in your site and go elsewhere. See what you can do to improve those pages so that visitors hang around for a little longer.

5. Breakdown of organic traffic from Bing and Google

Although Google is the most popular search engine, it’s not the only search engine. Many of your customers use Bing, too.

That’s why you should examine your organic traffic breakdown between those two search engines.

If you find that you’re not pulling in the expected traffic you think you should from one search engine or the other, it’s probably a great idea to update your SEO strategy.

I often see that people do not focus enough on Bing when looking at this report.

6. Keywords ranked in Google

You may wish to use a keyword tracking tool like SEMrush to determine the total number of keywords for which your site ranks in Google. Once you know what keywords your site is ranking for, there are numerous ways you can use that data to inform your SEO strategy.

Take note of which keywords you want to rank for but aren’t yet — these are the keywords you may want to focus on in your SEO campaigns.

It’s also a good idea to capitalize on your existing success. If you find that your site ranks in the top 10 for some high-converting keywords, continue using those keywords in your content marketing campaigns to ensure that you stay there. Your top-ranking keywords are likely bringing you the most traffic, so make sure that the landing pages associated with those keywords are relevant to keep your bounce rate low.

7. Local visibility

If your business has one or more physical locations that local customers can visit directly, it’s very important that you keep track of your local visibility.

Specifically, is your site appearing in the local 3-pack for keywords related to your niche? Is it appearing when people type the name of your town or city plus the name of your industry? If not, it’s time to work on some local SEO.

8. Click-through rate (CTR)

Google Search Console offers a Search Analytics report that shows the average percentage of people who click on one of your links after seeing it in the search results. That percentage is called the click-through rate (CTR). It’s a stat you should pay attention to because it tells you more than just how well your pages rank in the SERPs. It also tells you how much the content appeals to people.

If people like what they see of your content in the search results, they’ll click the link. If not, they’ll move on to another result.

By landing page

Examining CTR by landing page will show you your money-makers from an SEO perspective. Those are the pages that get the most attention from the search results.

You should also look at the pages with the lowest CTRs and optimize them.
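For those who prefer to pull this data programmatically rather than through the Search Console interface, here is a hedged sketch using the Search Analytics query method of the webmasters v3 API; the site URL, dates, and credential path are placeholders.

    # Sketch only: CTR by landing page from the Search Console (webmasters v3) API.
    # The site URL and key file path are placeholders.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    credentials = service_account.Credentials.from_service_account_file(
        "service-account-key.json",
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("webmasters", "v3", credentials=credentials)

    report = service.searchanalytics().query(
        siteUrl="https://www.example.com/",
        body={
            "startDate": "2017-11-01",
            "endDate": "2017-11-30",
            "dimensions": ["page"],
            "rowLimit": 25,
        },
    ).execute()

    # Each row carries clicks, impressions, ctr, and average position for one page;
    # sorting by ctr surfaces the pages most in need of title/description work.
    for row in sorted(report.get("rows", []), key=lambda r: r["ctr"]):
        print(row["keys"][0], round(row["ctr"] * 100, 1), "% CTR,", row["impressions"], "impressions")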

By top keywords

Another stat to check is the CTR of your top search terms in Google Search Console. If you see that a term is getting you a lot of clicks, you should determine which pages are ranking for those keywords and ensure that your page content accurately reflects searcher intent. It might be a good idea to test conversion optimization elements on these pages, too.

On the flip side, if you observe a low CTR for a valuable search term, you should look at the page(s) optimized for that term and find out why. It might be that the title or description associated with the page isn’t relevant or enticing.

9. Pages indexed in Google Search Console

One thing is certain: Nobody is going to find a webpage in the search results if it isn’t indexed. That’s why you need to pay attention to the number of pages on your website that have been indexed.

If you find that it takes an unusually long time for your pages to get indexed, you can always submit them manually using the Crawl>Fetch as Google option in the Search Console.

You should also take note of how many pages are indexed relative to how many pages have been submitted. Again, if you find that a small percentage of your submitted pages are indexed, you might need to manually request indexing via the Search Console.

10. Pages crawled per day

The Google Search Console will also show you how many pages have been crawled every day for the last 90 days.

If you have thousands of pages, and only a small percentage of them are getting crawled, that could point to a problem with your crawl budget. Google won’t crawl your entire site if it looks like its bot will consume too many of your system resources in doing so.

11. Duplicate titles and descriptions

You can also use Google Search Console to check the number of duplicate titles and descriptions on your site. As a rule of thumb, duplicate content is a no-no. When multiple pages have the same title tags and meta descriptions, that tells search engines that all those pages are about the same topic; this can dilute your topical authority and limit your ability to rank well for those terms.

If you find that you’ve got duplicate content on your site, it’s a good idea to update it so that it’s unique or block it.

12. Crawl errors

Google Search Console also provides you with crawl errors. Although the default report shows sitewide errors, you can also use a filter to view errors by segment. Any crawl errors you find should be addressed right away.

Follow your SEO metrics closely

I find it fascinating how many SEO metrics there really are. And the ones I mentioned here are just the start.

The longer I work in digital marketing, the more I learn. I encourage you to really dive deep into your analytics and get good at determining which data is most helpful for measuring SEO success.


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

John Lincoln is CEO of Ignite Visibility, a digital marketing teacher at the University of California San Diego, and author of the book Digital Influencer, A Guide to Achieving Influencer Status Online. Throughout his career, Lincoln has worked with hundreds of websites, ranging from start-ups to household names, and has won awards in SEO, CRO, analytics, and social media. In the media, Lincoln has been featured on sites such as Forbes, Entrepreneur Magazine, Inc. Magazine, CIO magazine, and more.





SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

November 30, 2017 at 11:58AM
0 Comments

Safe or Risky SEO: How Dangerous Is It REALLY to Change Your Article Dates? by @ab80

11/30/2017

0 Comments

 
http://ift.tt/2ALciqT

Safe or Risky SEO: How Dangerous Is It REALLY to Change Your Article Dates? by @ab80

http://ift.tt/2iqOIZ9

Fresh content is a powerful currency when it comes to search.

Not only does fresh content keep your brand relevant and help you engage with your audience’s current interests and pain points, but ever since Google’s Freshness update, the more recent and relevant your content is, the better it’s likely to rank in SERPs.

But what about the appearance of freshness? What happens when a website tweaks old content or changes the date on an article from 2014 to today? What are the SEO benefits – and the consequences?

Most importantly, how do you keep “evergreen content” – content that’s meant to withstand the test of time – current and relevant year after year?

In this article, I’ll answer these questions by examining how key influencers in the SEO and digital marketing industry treat date stamps, and I’ll discuss my strategy for combating outdated content.

Why Might You Change Article Dates?

Conrad O’Connell, digital marketing strategist and consultant, has recently encountered an interesting “fake freshness” case.  He noticed a discrepancy between the date on one of Airbnb’s property listings and the number of reviews it had received.

Specifically, a cabin listing stamped with the current date had somehow accrued more than 5 million reviews.

Airbnb’s property listing with date discrepancy

Upon further examination, it appears that various Airbnb pages auto-generate fake dates that roughly correspond to the last time Google crawled them.

The benefit of changing article dates like this is subtle, and it doesn’t obviously improve your SEO (the date listed in your SERP meta description won’t inherently impact your freshness).

Instead, what date manipulation does is appeal to user bias. Users are naturally drawn to the most current, up-to-date information, and there’s a good chance that a user who sees a cabin listing from 2012 and another from 2017 in their SERPs will click the most recent result.

Conrad confirms that he's seen some pages with a CTR as high as 55 percent after a simple update to the month and year in their title tag. And since date manipulation in SERP meta descriptions isn't currently penalized, this benefit comes at little personal risk.

The deceitfulness of the date manipulation did leave some people wondering, however:

Shady or smart? Kind of a fine line there…

— Mike Archer (@MrMikeArcher) November 15, 2017

Will Changing Article Dates Impact Your SEO Negatively?

While changing dates in article snippets seemed like a shortcut to higher CTR, it leaves one wondering about the potential future ramifications. After all, just because date manipulation isn’t currently penalized doesn’t mean it won’t be in the future.

Jennifer Slegg from TheSEMPost shared my concerns. In October 2017, she asked Google’s Gary Illyes whether auto-generating dates might have consequences, such as triggering Google’s spam filter:

“From our perspective, from Core Ranking perspective, I’d like to believe that in some way that will hurt you.  At least from, let’s say, we will not believe your dates anymore.

Typically when you search something, especially if it is newsy content, or your query is newsy, then I found that those date bylines in the search results can be very helpful in determining if it is relevant to your query – the result – or not.

Imagine if you were a news publisher and suddenly your byline dates would be gone overnight because we believe you were abusing them, you probably don’t want that.  So I would advise against that.”

The answer seems to be that while you might get away with changing a date a few times, there could certainly be consequences down the road. Consequences could even include the complete removal of dates for your site, which would be a huge blow to news sites.

Furthermore, ShoutMeLoud’s Harsh Agrawal published a recent case study, the results of which seem to contradict Conrad’s findings that recent dates always improve CTR.

While Harsh’s blog posts have always included a “last updated” tag, he has typically not included dates in his site’s snippets. Upon the reintroduction of snippet dates, his blog traffic dropped by almost 40 percent.

Despite the fact that dates should enhance user experience, snippet dates had a significant negative influence on ShoutMeLoud's keyword rankings and blog traffic, and only removing the dates allowed him to recover in the SERPs.

How Do You Safely Keep Your Content Fresh?

Keeping your content fresh has little to do with the date on your article, at least as far as Google’s concerned.

There are many factors that affect freshness, including the:

  • Frequency of your updates.
  • Amount of content changed.
  • Rate of new link growth.

The date an article is published is only one of these factors.

In other words, what really matters is the quality of the additions you've made to an existing page.

There are three main strategies for breathing fresh life into your old content, and all of them hinge on one simple principle: your content needs to be timeless, relevant, and valuable.

 1. Use the Same URL but Refresh the Date

The most common strategy you'll see is to add even more value to posts that were proven top performers. Typically, you do this by supplementing the article's original publication date with a "last updated" date stamp, or by displaying the updated date beneath it.

Search Engine Journal uses this strategy, so I reached out to Danny Goodwin, SEJ’s Executive Editor, to learn what makes this strategy so effective:

“Search Engine Journal has been around since 2003 and published thousands of posts through the years. In an industry that moves as quickly as SEO and digital marketing, information can quickly become outdated – sometimes in as little as a year or sometimes even a few months.

Outdated information is bad for users – which will reflect back on you as a brand/business. If you have a post ranking #1 that was written in 2013, it makes perfect sense to update it.

Like cars, content typically loses value as time goes by. Traffic declines. And if you’re ranking well, those rankings tend to go away as fresher and more up to date (or more thorough) content is published by your competitors.

Here’s one example. When I was editing for Search Engine Watch, we had a popular post, How to Use HTML Meta Tags. I believe it was originally written in 2005 (if not earlier), ranked number one at the time, and typically drove more than 1,000 pageviews to the site every day.

But by 2012, it needed a refresh. SEO had changed quite a bit in those many years. We kept the title and URL, and changed everything else. After republishing (maintaining the same URL), it maintained its top spot and in fact surged for a while.

Here’s another example. Over the years, SEJ had published probably 4-5 posts about optimizing your URL structure. None of them were driving much traffic, and all of them had been published at least five years ago.

So, earlier this year, one of our authors rewrote it, turning it into a comprehensive post on the topic, 8 SEO Tips to Optimize Your URL Structure. After publishing, our developer 301 redirected all the old/outdated posts to the new post.

The final result: Traffic increased by 8x!

seo friendly url post - analytics screenshot

While all these posts existed before, we updated them. That included the publication date. I would never recommend just changing dates for the heck of it – you need to make some significant changes.

Also, with Google showing dates posts were published in the SERPs, I can pretty much guarantee you that will impact which post searchers will click on. If I have the choice of reading content published in 2017 that ranks #3 or something published in 2011 that ranks #1, I often find myself clicking on the newer result. Most of us humans are biased toward the ‘new.’”
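One hedged companion tactic (not something prescribed in the article or in the quote above) is to make both dates machine-readable with schema.org Article markup, which carries separate datePublished and dateModified properties. The sketch below just prints a JSON-LD block; the headline and dates are placeholders echoing the meta tags example Danny mentions.

    # Sketch only: emit schema.org Article JSON-LD carrying both the original
    # publication date and the refresh date. All values here are placeholders.
    import json

    article_markup = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": "How to Use HTML Meta Tags",
        "datePublished": "2005-03-01",
        "dateModified": "2012-11-12",
    }

    print('<script type="application/ld+json">')
    print(json.dumps(article_markup, indent=2))
    print("</script>")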

2. Add Live Updates to a Single Page

Another strategy that's been gaining steam in recent years is to publish news as it happens, updating a single page with live coverage. On these pages, you'll typically time stamp each new entry as you post it.

Examples of this type of post include FiveThirtyEight’s live coverage of the U.S. elections in March 2016 and the BBC’s live coverage of the solar eclipse in August 2017.


Live updating a single page and time stamping each new addition to the page is also one of the methods Illyes recommended to Slegg – in addition to adding an updated date to old articles:

“I know that especially the news team are working with lots of news publishers, for example the BBC, on trying to figure out how to put content online that is better for the users, and BBC has these very interesting live coverage pages and basically they just time stamp every single addition to the page – that works too.”

3. Create New Landing Pages with Distinct URLs

At the SEO PowerSuite blog, we use a third method to refresh valuable, old content.

When we find a blog post that’s in need of an update, and we can see that it’s performed well in the past, we prefer to create a new landing page altogether. That means creating a new URL that shares some of the same content, with extra added value for our users.

We then specify to Google that the new page’s URL is canonical, to avoid posting duplicate content. A crucial final step is to support the new page in social media.
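As an illustrative sketch (not part of the original post), one quick way to verify that the canonical annotation is live on the refreshed page is to fetch it and read the link element back; this assumes the requests and beautifulsoup4 packages and a placeholder URL.

    # Sketch only: confirm which URL a page declares as canonical.
    # Assumes the requests and beautifulsoup4 packages; the URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup

    response = requests.get("https://www.example.com/new-landing-page", timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href"):
        print("Canonical URL:", canonical["href"])
    else:
        print("No rel=canonical link element found on the page.")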

Conclusion

My own preference is not to change article dates. But naturally the path you choose is up to you. No matter which strategy you choose, you need to remember that not every article is worth updating.

The best types of content to breathe more life into are those that are evergreen – content that is just as relevant now as it was when you first published it. The content should also be detailed enough to have resonated with your audience when it was first published, and there should be enough new, helpful information about that topic to add new value without derailing your original article’s subject.

And, just as you would with any other new article, don’t forget to encourage new social shares. Use social media to tell your users that you’ve updated your content, and actively engage with them through social media and your comments section to keep them interested in your update. That’s the best way to ensure your content stays fresh.



Image Credits

Featured Image and In-Post screenshots: Created by Aleh Barysevich, November 2017.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

November 30, 2017 at 11:09AM
0 Comments

Google book search now includes audiobook results

11/30/2017

0 Comments

 
http://ift.tt/2zThHvW

Google book search now includes audiobook results

http://ift.tt/2ipKh0r

Google has added an audiobook option to its book search feature.

Now, if you search for a specific book title, the Google book search feature includes an “Audiobook” button under the “Get Book” tab that will display different audiobook platforms offering the title.

The book search update was announced via the following tweet:

Have audiobook, will travel. Book the perfect holiday road trip read with new audiobook options, now in Search. http://pic.twitter.com/GHPmAaDWjX

— Google (@Google) November 29, 2017

To actually listen to the audiobook, users must select their preferred audiobook app.


About The Author

Amy Gesenhues is Third Door Media's General Assignment Reporter, covering the latest news and updates for Search Engine Land and Marketing Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more of Amy's articles.





SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

November 30, 2017 at 10:27AM
0 Comments

LaterPay offers first paywall platform for AMP pages

11/30/2017

0 Comments

 
http://ift.tt/2AKud0P

LaterPay offers first paywall platform for AMP pages

http://ift.tt/2zDxif1

AMP (accelerated mobile pages) is designed to deliver publishers’ pages quickly on mobile devices, but the stripped-down format lacks functionality in some areas.

This week, the German-Swiss online payment infrastructure provider LaterPay is releasing what it says is the first AMP-enabled paywall and subscription platform, called AMP Access.

While there are other custom solutions, such as from The Washington Post, LaterPay CEO and founder Cosmin Ene told me he is unaware of any other out-of-the-box offering.

[Read the full article on MarTech Today.]


About The Author

Barry Levine covers marketing technology for Third Door Media. Previously, he covered this space as a Senior Writer for VentureBeat, and he has written about these and other tech subjects for such publications as CMSWire and NewsFactor. He founded and led the web site/unit at PBS station Thirteen/WNET; worked as an online Senior Producer/writer for Viacom; created a successful interactive game, PLAY IT BY EAR: The First CD Game; founded and led an independent film showcase, CENTER SCREEN, based at Harvard and M.I.T.; and served over five years as a consultant to the M.I.T. Media Lab. You can find him at LinkedIn, and on Twitter at xBarryLevine.





SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/1BDlNnc

November 30, 2017 at 10:09AM
0 Comments

A Complete Guide to the Google Penguin Algorithm Update by @TaylorDanRW

11/30/2017

0 Comments

 
http://ift.tt/2irMbO6

A Complete Guide to the Google Penguin Algorithm Update by @TaylorDanRW

http://ift.tt/2kaM1eG

Editor’s note: This post is part of an ongoing series looking back at the history of Google algorithm updates. Enjoy!


In 2012, Google officially launched the “webspam algorithm update,” an algorithm specifically targeting link spam and manipulative link building practices.

The webspam algorithm later became known (officially) as the Penguin algorithm update via a tweet from Matt Cutts, who was then head of the Google webspam team. While Google officially named the algorithm Penguin, there is no official word on where this name came from.

The Panda algorithm name came from one of the key engineers involved with it, and it’s more than likely that Penguin originated from a similar source. One of my favorite Penguin naming theories is that it pays homage to The Penguin, from DC’s Batman.

Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches. Context: http://t.co/ztJiMGMi

— Matt Cutts (@mattcutts) May 26, 2012

Prior to the Penguin algorithm, link volume played a larger part in determining a webpage’s scoring when crawled, indexed, and analyzed by Google.

This meant when it came to ranking websites by these scores for search results pages, some low-quality websites and pieces of content appeared in more prominent positions of the organic search results than they should have.

Why Google Penguin Was Needed

Google’s war on low-quality started with the Panda algorithm, and Penguin was an extension and addition to the arsenal to fight this war.

Penguin was Google’s response to the increasing practice of manipulating search results (and rankings) through black hat link building techniques. Cutts, speaking at the SMX Advanced 2012 conference, said:

We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that.

The algorithm’s objective was to gain greater control over, and reduce the effectiveness of, a number of black hat spamming techniques.

By better understanding and processing the types of links websites and webmasters were earning, Penguin worked toward ensuring that natural, authoritative, and relevant links rewarded the websites they pointed to, while manipulative and spammy links were downgraded.

Initial Launch & Impact

When Penguin first launched in April 2012, it affected more than 3 percent of search results, according to Google’s own estimations.


Penguin 2.0, the fourth update (including the initial launch) to the algorithm, was released in May 2013 and affected roughly 2.3 percent of all queries.

Key Google Penguin Updates & Refreshes

There have been a number of updates and refreshes to the Penguin algorithm since it was launched in 2012, and possibly a number of other tweaks that have gone down in history as unknown algorithm updates.

Penguin 1.1: May 26, 2012

This wasn’t a change to the algorithm itself, but a refresh of the data within it.

In this instance, websites that had initially been affected by the launch and had been proactive in cleaning up their link profiles saw some recovery, while others that hadn't been caught by Penguin the first time around saw an impact.

This was the first data refresh.

Penguin 1.2: October 5, 2012

While this was another data refresh, I feel it is worth mentioning because it affected not only English-language queries but also international queries.

Weather report: Penguin data refresh coming today. 0.3% of English queries noticeably affected. Details: http://t.co/Esbi2ilX

— Scary Matt Cutts (@mattcutts) October 5, 2012

Penguin 2.0: May 22, 2013

This was a more technically advanced version of the Penguin algorithm and changed how the algorithm impacted search results.

Penguin 2.0 impacted around 2.3 percent of English queries, as well as other languages proportionately.

This was also the first Penguin update to look deeper than the website's homepage and top-level category pages for evidence of link spam being directed at the website.

The first refresh to Penguin 2.0 (2.1) came on October 24 of the same year. It affected a further 1 percent of queries.

While there was no official explanation from Google, data suggests that the 2.1 data refresh also looked deeper into websites, crawling further and conducting additional analysis of whether spammy links were present.

Penguin 3.0: October 17, 2014

While this was named like a major update, it was, in fact, another data refresh. It allowed those impacted by previous updates to recover, while many others who had continued to use spammy link practices, but had escaped the radar of the previous updates, now saw an impact.

Googler Pierre Far confirmed this through a post on his Google+ profile, noting that the update would take a "few weeks" to roll out fully.

Far also stated that this update affected less than 1 percent of English search queries.

Penguin 4.0: September 23, 2016

Almost two years after the 3.0 refresh, the final Penguin algorithm update was launched.

The biggest change with this iteration was that Penguin became a part of the core algorithm.

When an algorithm becomes part of the core, it doesn't mean that its functionality has changed or may change dramatically again. It means that Google's perception of the algorithm has changed, not the algorithm itself.

Now running concurrently with the core, Penguin evaluates websites and links in real time. This means you can see (reasonably) instant impacts of your link building or remediation work.

The new Penguin also wasn't closed-fisted in handing out link-based penalties, but rather devalued the links themselves. This is a contrast to previous Penguin iterations, where the website as a whole was punished for bad links.

That being said, both published studies and personal experience suggest that algorithmic penalties relating to backlinks still do exist.

Data released by SEO professionals (e.g., Michael Cottam), as well as algorithmic downgrades being lifted through disavow files after Penguin 4.0, reinforce this belief.

Penguin Algorithmic Downgrades

Soon after the Penguin algorithm was introduced, webmasters and brands who had used manipulative link building techniques or filled their backlink profiles with copious amounts of low-quality links began to see decreases in their organic traffic and rankings.

Not all Penguin downgrades were site-wide – some were partial and only affected certain keyword groups that had been heavily spammed and over-optimized, such as key products and, in some cases, even brand terms.

A website impacted by a Penguin penalty, which took 17 months to lift.

The impact of Penguin can also pass between domains, so changing domain and redirecting the old one to the new can cause more problems in the long run.

Experiments and research show that using a 301 or 302 redirect won't remove the effect of Penguin, and in the Google Webmasters Forum, John Mueller confirmed that using a meta refresh from one domain to a new domain could also cause complications.

In general, we recommend not using meta-refresh type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect).

Google Penguin Recovery

The disavow tool has been an asset to SEO practitioners, and this hasn’t changed even now that Penguin exists as part of the core algorithm.

As you would expect, there have been studies and theories published claiming that disavowing links doesn't, in fact, do anything to help with link-based algorithmic downgrades and manual actions, but this theory has been publicly shot down by Google representatives.

That being said, Google recommends that the disavow tool should only be used as a last resort when dealing with link spam, as disavowing a link is a lot easier (and a quicker process in terms of its effect) than submitting reconsideration requests for good links.

Monitoring backlinks is also an essential task, as sometimes the industry we work in isn't entirely honest and negative SEO attacks can happen. This can also mean using the disavow feature even without a clear sign of an algorithmic penalty or a notification of a manual action.

Interestingly, however, a poll conducted by SEJ in September found that 38 percent of SEOs never disavow backlinks. Going through a backlink profile, and scrutinizing each linking domain as to whether it’s a link you want or not, is not a light task.

Google recommends that you attempt to outreach to websites and webmasters where the bad links are originating from first and request their removal before you start disavowing.

While this is probably the most effective way to recover from a link-based penalty, it isn’t always necessary. The Penguin algorithm also takes into account the link profile as a whole, and the volume of high-quality, natural links versus the number of spammy links.

While in the instances of a partial penalty (impacting over-optimized keywords) the algorithm may still affect you, the essentials of backlink maintenance and monitoring should keep you covered.

Some webmasters even go as far as including “terms” within the terms and conditions of their website and actively outreaching to websites they don’t feel should be linking to them:


Website terms and conditions regarding linking to the website in question.

No Recovery in Sight?

Sometimes after webmasters have gone to great lengths to clean up their link profiles, and even after a known Penguin refresh, they still don’t see an increase in traffic or rankings.

There are a number of possible reasons behind this, including:

  • The initial traffic and ranking boost seen prior to the algorithmic penalty was unjustified (and likely short-term) and came from the bad backlinks.
  • When links have been removed, no efforts have been made to gain new backlinks of greater value.
  • Not all of the negative backlinks have been disavowed, or a high enough proportion of them has not been removed.
  • The issue wasn't link-based to begin with.

Penguin Myths & Misconceptions

One of the great things about the SEO industry and those involved in it is that it's a very active and vibrant community, with new theories and experiment findings published online daily.

Naturally, this has led to a number of myths and misconceptions being born about Google’s algorithms. Penguin is no different.

Here are a few myths and misconceptions about the Penguin algorithm we’ve seen over the years.

Myth: Penguin Is a Penalty

One of the biggest myths about the Penguin algorithm is that people call it a penalty (or what Google refers to as a manual action).

Despite the fact that an algorithmic change and a penalty can both cause a big downturn in website rankings, there are some pretty drastic differences between them.

A penalty (or manual action) happens when a member of Google’s webspam team has responded to a flag, investigated and felt the need to enforce a penalty on the domain. You will receive a notification through Google Search Console relating to this manual action.

When you get hit by a manual action, not only do you need to review your backlinks and submit a disavow for the spammy ones that go against Google’s guidelines, but you also need to submit a reconsideration request to the Google webspam team.

If successful, the penalty will be revoked, and if unsuccessful it’s back to reviewing the backlink profile.

A Penguin downgrade happens without any involvement of a Google team member. It’s all done algorithmically.

Previously, you would have to wait for a refresh or algorithm update, but now Penguin runs in real time so recoveries can happen a lot faster (if enough remediation work has been done).

Myth: Google Will Notify You if Penguin Hits Your Site

Another myth about the Google Penguin algorithm is that you will be notified if it has been applied.

Unfortunately, this isn’t true. The Search Console won’t notify you that your rankings have taken a dip because of the application of the Penguin.

Again, this shows the difference between an algorithm and a penalty – you would be notified if you were hit by a penalty. However, the process of recovering from Penguin is remarkably similar to that of recovering from a penalty.

Myth: Disavowing Bad Links Is the Only Way to Reverse a Penguin Hit

While this tactic will remove a lot of the low-quality links, it is extremely time-consuming and a potential waste of resources.

Google Penguin looks at the percentage of good quality links compared to those of a spammy nature.

So, rather than focusing on manually removing those low-quality links, it may be worth focusing on increasing the number of quality links your website has. This will have a better impact on the percentage Penguin takes into account.

Myth: You Can’t Recover From Penguin

Yes, you can recover from Penguin.

It is possible, but it will require some experience in dealing with the fickle nature of Google algorithms.

The best way to shake off the negative effects of Penguin is to stop worrying about the existing links pointing to your website and begin to earn original, editorially given links.

The more of these quality links you gain, the easier it will be to release your website from the grip of Penguin.


Image Credits
Featured Image: Shutterstock, modified by Danny Goodwin
Screenshots taken by author





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

November 30, 2017 at 02:57AM
0 Comments

The Complete Guide to Direct Traffic in Google Analytics

11/29/2017

0 Comments

 
http://ift.tt/2Aggldx

The Complete Guide to Direct Traffic in Google Analytics

http://ift.tt/2AjhP7g

Posted by tombennet

When it comes to direct traffic in Analytics, there are two deeply entrenched misconceptions.

The first is that it’s caused almost exclusively by users typing an address into their browser (or clicking on a bookmark). The second is that it’s a Bad Thing, not because it has any overt negative impact on your site’s performance, but rather because it’s somehow immune to further analysis. The prevailing attitude amongst digital marketers is that direct traffic is an unavoidable inconvenience; as a result, discussion of direct is typically limited to ways of attributing it to other channels, or side-stepping the issues associated with it.

In this article, we’ll be taking a fresh look at direct traffic in modern Google Analytics. As well as exploring the myriad ways in which referrer data can be lost, we’ll look at some tools and tactics you can start using immediately to reduce levels of direct traffic in your reports. Finally, we’ll discover how advanced analysis and segmentation can unlock the mysteries of direct traffic and shed light on what might actually be your most valuable users.

What is direct traffic?

In short, Google Analytics will report a traffic source of "direct" when it has no data on how the session arrived at your website, or when the referring source has been configured to be ignored. You can think of direct as GA’s fall-back option for when its processing logic has failed to attribute a session to a particular source.

To properly understand the causes and fixes for direct traffic, it’s important to understand exactly how GA processes traffic sources. The following flow-chart illustrates how sessions are bucketed — note that direct sits right at the end as a final "catch-all" group.

Broadly speaking, and disregarding user-configured overrides, GA’s processing follows this sequence of checks:

AdWords parameters > Campaign overrides > UTM campaign parameters > Referred by a search engine > Referred by another website > Previous campaign within timeout period > Direct

Note the penultimate processing step (previous campaign within timeout), which has a significant impact on the direct channel. Consider a user who discovers your site via organic search, then returns via direct a week later. Both sessions would be attributed to organic search. In fact, campaign data persists for up to six months by default. The key point here is that Google Analytics is already trying to minimize the impact of direct traffic for you.
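To make that fall-back order concrete, here is a deliberately simplified, hypothetical sketch of the decision chain in Python. The real processing is internal to Google Analytics and far more involved; the session fields shown are assumptions for illustration only.

    # Simplified, hypothetical model of GA's source-attribution fall-back order;
    # "session" is an assumed dict of whatever attribution signals were captured.
    def attribute_source(session, campaign_timeout_days=180):
        if session.get("gclid"):                      # AdWords parameters
            return "google / cpc"
        if session.get("campaign_override"):          # campaign overrides
            return session["campaign_override"]
        if session.get("utm_source"):                 # UTM campaign parameters
            return session["utm_source"] + " / " + session.get("utm_medium", "(not set)")
        if session.get("search_engine"):              # referred by a search engine
            return session["search_engine"] + " / organic"
        if session.get("referrer"):                   # referred by another website
            return session["referrer"] + " / referral"
        previous = session.get("previous_campaign")   # previous campaign within timeout
        if previous and previous["age_days"] <= campaign_timeout_days:
            return previous["source_medium"]
        return "(direct) / (none)"                    # the final catch-all

    # Example: a session with no signals at all falls through to direct.
    print(attribute_source({}))                       # -> (direct) / (none)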

What causes direct traffic?

Contrary to popular belief, there are actually many reasons why a session might be missing campaign and traffic source data. Here we will run through some of the most common.

1. Manual address entry and bookmarks

The classic direct-traffic scenario, this one is largely unavoidable. If a user types a URL into their browser’s address bar or clicks on a browser bookmark, that session will appear as direct traffic.

Simple as that.

2. HTTPS > HTTP

When a user follows a link on a secure (HTTPS) page to a non-secure (HTTP) page, no referrer data is passed, meaning the session appears as direct traffic instead of as a referral. Note that this is intended behavior. It’s part of how the secure protocol was designed, and it does not affect other scenarios: HTTP to HTTP, HTTPS to HTTPS, and even HTTP to HTTPS all pass referrer data.

So, if your referral traffic has tanked but direct has spiked, it could be that one of your major referrers has migrated to HTTPS. The inverse is also true: If you’ve migrated to HTTPS and are linking to HTTP websites, the traffic you’re driving to them will appear in their Analytics as direct.

If your referrers have moved to HTTPS and you’re stuck on HTTP, you really ought to consider migrating to HTTPS. Doing so (and updating your backlinks to point to HTTPS URLs) will bring back any referrer data which is being stripped from cross-protocol traffic. SSL certificates can now be obtained for free thanks to automated authorities like LetsEncrypt, but that’s not to say you should neglect to explore the potentially-significant SEO implications of site migrations. Remember, HTTPS and HTTP/2 are the future of the web.

If, on the other hand, you’ve already migrated to HTTPS and are concerned about your users appearing to partner websites as direct traffic, you can implement the meta referrer tag. Cyrus Shepard has written about this on Moz before, so I won’t delve into it now. Suffice to say, it’s a way of telling browsers to pass some referrer data to non-secure sites, and can be implemented as a <meta> element or HTTP header.
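
As a rough illustration, here's one way to add the tag at runtime with plain TypeScript in the browser. The "origin-when-cross-origin" policy shown is just one of several valid values, and in most setups you would simply hard-code the <meta> element into your page template instead.

    // Sketch: inject a referrer policy meta tag at runtime.
    // In practice you would normally hard-code this in the <head> of your template.
    function setReferrerPolicy(policy: string = "origin-when-cross-origin"): void {
      const meta = document.createElement("meta");
      meta.name = "referrer";
      meta.content = policy;   // pass the origin (only) on cross-origin requests
      document.head.appendChild(meta);
    }

    setReferrerPolicy();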

3. Missing or broken tracking code

Let’s say you’ve launched a new landing page template and forgotten to include the GA tracking code. Or, to use a scenario I’m encountering more and more frequently, imagine your GTM container is a horrible mess of poorly configured triggers, and your tracking code is simply failing to fire.

Users land on this page without tracking code. They click on a link to a deeper page which does have tracking code. From GA’s perspective, the first hit of the session is the second page visited, meaning that the referrer appears as your own website (i.e. a self-referral). If your domain is on the referral exclusion list (as per default configuration), the session is bucketed as direct. This will happen even if the first URL is tagged with UTM campaign parameters.

As a short-term fix, you can try to repair the damage by simply adding the missing tracking code. To prevent it happening again, carry out a thorough Analytics audit, move to a GTM-based tracking implementation, and promote a culture of data-driven marketing.
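
As a quick diagnostic, a sketch along these lines can be run in the browser console (or wired into an automated crawl) to flag pages where no familiar GA globals appear to have loaded. Which globals exist depends entirely on your implementation (analytics.js, gtag.js, GTM), so treat the property names below as assumptions to adapt, not a definitive test.

    // Rough diagnostic sketch: flag pages where common GA/GTM globals are missing.
    // The globals checked here depend on your setup; adapt them to your implementation.
    function looksTracked(win: any = window): boolean {
      const hasAnalyticsJs = typeof win.ga === "function";
      const hasGtag = typeof win.gtag === "function";
      const hasDataLayer = Array.isArray(win.dataLayer) && win.dataLayer.length > 0;
      return hasAnalyticsJs || hasGtag || hasDataLayer;
    }

    if (!looksTracked()) {
      console.warn("No Google Analytics / GTM globals detected on", location.href);
    }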

4. Improper redirection

This is an easy one. Don’t use meta refreshes or JavaScript-based redirects — these can wipe or replace referrer data, leading to direct traffic in Analytics. You should also be meticulous with your server-side redirects, and — as is often recommended by SEOs — audit your redirect file frequently. Complex chains are more likely to result in a loss of referrer data, and you run the risk of UTM parameters getting stripped out.

Once again, control what you can: use carefully mapped (i.e. non-chained) server-side 301 redirects to preserve referrer data wherever possible.
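
For illustration, here's a minimal Node sketch (TypeScript, using only the built-in http module) of a single-hop 301 for a vanity path that preserves the query string, and with it any UTM parameters. The path, hostname, and campaign values are placeholders.

    import { createServer } from "http";

    // Minimal sketch: a single-hop 301 for a vanity URL that preserves the
    // query string (and therefore any UTM parameters). Values are placeholders.
    const REDIRECTS: Record<string, string> = {
      "/tv": "https://www.example.com/landing/?utm_source=offline&utm_medium=tv&utm_campaign=2017_brand",
    };

    createServer((req, res) => {
      const [path, query = ""] = (req.url ?? "/").split("?");
      const target = REDIRECTS[path];
      if (target) {
        // Carry over any extra parameters the visitor arrived with.
        const location = query ? `${target}&${query}` : target;
        res.writeHead(301, { Location: location });
        res.end();
      } else {
        res.writeHead(404);
        res.end("Not found");
      }
    }).listen(8080);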

5. Non-web documents

Links in Microsoft Word documents, slide decks, or PDFs do not pass referrer information. By default, users who click these links will appear in your reports as direct traffic. Clicks from native mobile apps (particularly those with embedded "in-app" browsers) are similarly prone to stripping out referrer data.

To a degree, this is unavoidable. Much like so-called “dark social” visits (discussed in detail below), non-web links will inevitably result in some quantity of direct traffic. However, you also have an opportunity here to control the controllables.

If you publish whitepapers or offer downloadable PDF guides, for example, you should be tagging the embedded hyperlinks with UTM campaign parameters. You’d never even contemplate launching an email marketing campaign without campaign tracking (I hope), so why would you distribute any other kind of freebie without similarly tracking its success? In some ways this is even more important, since these kinds of downloadables often have a longevity not seen in a single email campaign. Here’s an example of a properly tagged URL which we would embed as a link:

http://ift.tt/2ifQkVi?..._medium=offline_document&utm_campaign=201711_utm_whitepaper

The same goes for URLs in your offline marketing materials. For major campaigns it’s common practice to select a short, memorable URL (e.g. moz.com/tv/) and design an entirely new landing page. It’s possible to bypass page creation altogether: simply redirect the vanity URL to an existing page URL which is properly tagged with UTM parameters.

So, whether you tag your URLs directly, use redirected vanity URLs, or, if you think UTM parameters are ugly, opt for some crazy-ass hash-fragment solution with GTM, the takeaway is the same: use campaign parameters wherever it's appropriate to do so.
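
If you tag a lot of URLs by hand, a small helper keeps the parameter names consistent. This is just a sketch using the standard URL API; the values passed in are examples, not a recommended naming convention.

    // Sketch of a helper for consistent UTM tagging. The parameter values in the
    // usage example below are illustrative only.
    function tagUrl(
      base: string,
      utm: { source: string; medium: string; campaign: string; content?: string }
    ): string {
      const url = new URL(base);
      url.searchParams.set("utm_source", utm.source);
      url.searchParams.set("utm_medium", utm.medium);
      url.searchParams.set("utm_campaign", utm.campaign);
      if (utm.content) url.searchParams.set("utm_content", utm.content);
      return url.toString();
    }

    // e.g. a link embedded in a downloadable PDF guide:
    tagUrl("https://www.example.com/learn/seo-guide/", {
      source: "whitepaper",
      medium: "offline_document",
      campaign: "201711_whitepaper",
    });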

6. “Dark social”

This is a big one, and probably the least well understood by marketers.

The term “dark social” was first coined back in 2012 by Alexis Madrigal in an article for The Atlantic. Essentially it refers to methods of social sharing which cannot easily be attributed to a particular source, like email, instant messaging, Skype, WhatsApp, and Facebook Messenger.

Recent studies have found that upwards of 80% of consumers’ outbound sharing from publishers’ and marketers’ websites now occurs via these private channels. In terms of numbers of active users, messaging apps are outpacing social networking apps. All the activity driven by these thriving platforms is typically bucketed as direct traffic by web analytics software.

People who use the ambiguous phrase “social media marketing” are typically referring to advertising: you broadcast your message and hope people will listen. Even if you overcome consumer indifference with a well-targeted campaign, any subsequent interactions are affected by their very public nature. The privacy of dark social, by contrast, represents a potential goldmine of intimate, targeted, and relevant interactions with high conversion potential. Nebulous and difficult to track though it may be, dark social has the potential to let marketers tap into the elusive power of word of mouth.

So, how can we minimize the amount of dark social traffic which is bucketed under direct? The unfortunate truth is that there is no magic bullet: proper attribution of dark social requires rigorous campaign tracking. The optimal approach will vary greatly based on your industry, audience, proposition, and so on. For many websites, however, a good first step is to provide convenient and properly configured sharing buttons for private platforms like email, WhatsApp, and Slack, thereby ensuring that users share URLs appended with UTM parameters (or vanity/shortened URLs which redirect to the same). This will go some way towards shining a light on part of your dark social traffic.
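
As one possible implementation of that first step, the sketch below builds email and WhatsApp share links around UTM-tagged URLs so that the resulting sessions stop falling into direct. The endpoints and parameter values shown are assumptions to adapt to your own tagging scheme and chosen platforms.

    // Sketch: share links for "dark social" channels, each wrapping a URL tagged
    // with its own utm_source. Channel endpoints and parameter values are
    // assumptions to adapt.
    function shareLinks(pageUrl: string, title: string) {
      const tagged = (source: string) => {
        const url = new URL(pageUrl);
        url.searchParams.set("utm_source", source);
        url.searchParams.set("utm_medium", "social");
        url.searchParams.set("utm_campaign", "share_buttons");
        return url.toString();
      };
      return {
        email: `mailto:?subject=${encodeURIComponent(title)}&body=${encodeURIComponent(tagged("email"))}`,
        whatsapp: `https://wa.me/?text=${encodeURIComponent(`${title} ${tagged("whatsapp")}`)}`,
      };
    }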

Checklist: Minimizing direct traffic

To summarize what we’ve already discussed, here are the steps you can take to minimize the level of unnecessary direct traffic in your reports:

  1. Migrate to HTTPS: Not only is the secure protocol your gateway to HTTP/2 and the future of the web, it will also have an enormously positive effect on your ability to track referral traffic.
  2. Manage your use of redirects: Avoid chains and eliminate client-side redirection in favour of carefully-mapped, single-hop, server-side 301s. If you use vanity URLs to redirect to pages with UTM parameters, be meticulous.
  3. Get really good at campaign tagging: Even amongst data-driven marketers I encounter the belief that UTM begins and ends with switching on automatic tagging in your email marketing software. Others go to the other extreme, doing silly things like tagging internal links. Control what you can, and your ability to carry out meaningful attribution will markedly improve.
  4. Conduct an Analytics audit: Data integrity is vital, so consider an audit an essential part of assessing the success of your marketing. It’s not simply a case of checking for missing tracking code: a good audit involves a review of your measurement plan and rigorous testing at page and property level.

Adhere to these principles, and it’s often possible to achieve a dramatic reduction in the level of direct traffic reported in Analytics. One such reduction involved an HTTPS migration, a GTM migration (as part of an Analytics review), and an overhaul of internal campaign tracking processes over the course of about six months.

But the saga of direct traffic doesn’t end there! Once this channel is “clean” — that is, once you’ve minimized the number of avoidable pollutants — what remains might actually be one of your most valuable traffic segments.

Analyze! Or: why direct traffic can actually be pretty cool

For reasons we’ve already discussed, traffic from bookmarks and dark social is an enormously valuable segment to analyze. These are likely to be some of your most loyal and engaged users, and it’s not uncommon to see a notably higher conversion rate for a clean direct channel compared to the site average. You should make the effort to get to know them.

The number of potential avenues to explore is infinite, but here are some good starting points:

  • Build meaningful custom segments, defining a subset of your direct traffic based on their landing page, location, device, repeat visit or purchase behavior, or even enhanced e-commerce interactions.
  • Track meaningful engagement metrics using modern GTM triggers such as element visibility and native scroll tracking. Measure how your direct users are using and viewing your content (see the sketch after this list).
  • Watch for correlations with your other marketing activities, and use it as an opportunity to refine your tagging practices and segment definitions. Create a custom alert which watches for spikes in direct traffic.
  • Familiarize yourself with flow reports to get an understanding of how your direct traffic is converting. By using Goal Flow and Behavior Flow reports with segmentation, it’s often possible to glean actionable insights which can be applied to the site as a whole.
  • Ask your users for help! If you’ve isolated a valuable segment of traffic which eludes deeper analysis, add a button to the page offering visitors a free downloadable ebook if they tell you how they discovered your page.
  • Start thinking about lifetime value, if you haven’t already — overhauling your attribution model or implementing User ID are good steps towards overcoming the indifference or frustration felt by marketers towards direct traffic.
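
To illustrate the engagement-tracking point above, here's a rough sketch that pushes a custom event into the GTM data layer when a key element scrolls into view, using the standard IntersectionObserver API. The event and element names are placeholders, and in many setups GTM's built-in element visibility trigger achieves the same thing without any code.

    // Rough sketch: push a custom engagement event into the GTM data layer when a
    // key element becomes visible. Event and element names are placeholders.
    const dataLayer: Array<Record<string, unknown>> =
      ((window as any).dataLayer = (window as any).dataLayer || []);

    const target = document.querySelector("#pricing-table");
    if (target) {
      const observer = new IntersectionObserver((entries) => {
        if (entries.some((entry) => entry.isIntersecting)) {
          dataLayer.push({ event: "content_visible", element: "pricing-table" });
          observer.disconnect(); // report once per page view
        }
      });
      observer.observe(target);
    }

A custom GTM trigger listening for the "content_visible" event can then fire a GA event tag, which you can segment by channel to compare how direct users engage with that content.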

I hope this guide has been useful. With any luck, you arrived looking for ways to reduce the level of direct traffic in your reports, and left with some new ideas for how to better analyze this valuable segment of users.

Thanks for reading!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!





SEO

via SEOmoz Blog https://moz.com/blog

November 29, 2017 at 04:46PM
0 Comments