RICK REA: Helping You Grow Through Online Marketing

SEO Marketing News

Evidence of the Surprising State of JavaScript Indexing

5/29/2017

0 Comments

 
http://img.youtube.com/vi/0wvZ7gakqV4/0.jpg

Evidence of the Surprising State of JavaScript Indexing

http://ift.tt/2qxWjmk

Posted by willcritchlow

Back when I started in this industry, it was standard advice to tell our clients that the search engines couldn’t execute JavaScript (JS), and anything that relied on JS would be effectively invisible and never appear in the index. Over the years, that has changed gradually, from early work-arounds (such as the horrible escaped fragment approach my colleague Rob wrote about back in 2010) to the actual execution of JS in the indexing pipeline that we see today, at least at Google.

In this article, I want to explore some things we've seen about JS indexing behavior in the wild and in controlled tests and share some tentative conclusions I've drawn about how it must be working.

A brief introduction to JS indexing

At its most basic, the idea behind JavaScript-enabled indexing is to get closer to the search engine seeing the page as the user sees it. Most users browse with JavaScript enabled, and many sites either fail without it or are severely limited. While traditional indexing considers just the raw HTML source received from the server, users typically see a page rendered based on the DOM (Document Object Model) which can be modified by JavaScript running in their web browser. JS-enabled indexing considers all content in the rendered DOM, not just that which appears in the raw HTML.
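To make the distinction concrete, here is a minimal sketch; the markup and the `render` function are invented for illustration:

```javascript
// What the server sends: an empty shell, which is all that
// traditional (raw-HTML) indexing would see.
const rawHtml = '<div id="app"></div>';

// Stand-in for a client-side framework filling in the DOM on load.
function render(html) {
  return html.replace(
    '<div id="app"></div>',
    '<div id="app"><h1>Product list</h1></div>'
  );
}

const renderedDom = render(rawHtml);

console.log(rawHtml.includes('Product list'));     // false: invisible to raw-HTML indexing
console.log(renderedDom.includes('Product list')); // true: visible to JS-enabled indexing
```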

There are some complexities even in this basic definition (answers in brackets as I understand them):

  • What about JavaScript that requests additional content from the server? (This will generally be included, subject to timeout limits)
  • What about JavaScript that executes some time after the page loads? (This will generally only be indexed up to some time limit, possibly in the region of 5 seconds)
  • What about JavaScript that executes on some user interaction such as scrolling or clicking? (This will generally not be included)
  • What about JavaScript in external files rather than in-line? (This will generally be included, as long as those external files are not blocked from the robot — though see the caveat in experiments below)
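A toy decision function summarizing those bracketed answers (the 5-second figure is the reported limit, not a documented one):

```javascript
// Toy summary of the cases above. RENDER_TIMEOUT_MS is the reported
// (not official) limit on how long JS execution is given.
const RENDER_TIMEOUT_MS = 5000;

function wouldBeIndexed({ addedAtMs, requiresInteraction, scriptBlocked }) {
  if (scriptBlocked) return false;        // external file blocked from the robot
  if (requiresInteraction) return false;  // scroll- or click-triggered content
  return addedAtMs <= RENDER_TIMEOUT_MS;  // on-load and async content within the limit
}

console.log(wouldBeIndexed({ addedAtMs: 200 }));                          // true: request on load
console.log(wouldBeIndexed({ addedAtMs: 10000 }));                        // false: too late
console.log(wouldBeIndexed({ addedAtMs: 0, requiresInteraction: true })); // false: needs a click
```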

For more on the technical details, I recommend my ex-colleague Justin’s writing on the subject.

A high-level overview of my view of JavaScript best practices

Despite the incredible work-arounds of the past (which always seemed like more effort than graceful degradation to me) the “right” answer has existed since at least 2012, with the introduction of PushState. Rob wrote about this one, too. Back then, however, it was pretty clunky and manual and it required a concerted effort to ensure both that the URL was updated in the user’s browser for each view that should be considered a “page,” that the server could return full HTML for those pages in response to new requests for each URL, and that the back button was handled correctly by your JavaScript.

Along the way, in my opinion, too many sites got distracted by a separate prerendering step. This is an approach that does the equivalent of running a headless browser to generate static HTML pages that include any changes made by JavaScript on page load, then serving those snapshots instead of the JS-reliant page in response to requests from bots. It typically treats bots differently, in a way that Google tolerates, as long as the snapshots do represent the user experience. In my opinion, this approach is a poor compromise that's too susceptible to silent failures and falling out of date. We've seen a bunch of sites suffer traffic drops due to serving Googlebot broken experiences that were not immediately detected because no regular users saw the prerendered pages.

These days, if you need or want JS-enhanced functionality, more of the top frameworks have the ability to work the way Rob described in 2012, which is now called isomorphic (roughly meaning “the same”).

Isomorphic JavaScript serves HTML that corresponds to the rendered DOM for each URL, and updates the URL for each “view” that should exist as a separate page as the content is updated via JS. With this implementation, there is actually no need to render the page to index basic content, as it's served in response to any fresh request.
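Here is a minimal sketch of that pattern; the router and the fake history object are invented stand-ins, not any particular framework's API:

```javascript
// Minimal illustration of the isomorphic pattern: the same render
// functions serve full HTML per URL on the server and update the DOM
// (plus the URL, via pushState) on the client.
function makeRouter(history) {
  const views = {};
  return {
    register(path, renderHtml) { views[path] = renderHtml; },
    // Client side: update the URL for each "view", then render via JS.
    navigate(path) {
      history.pushState({ path }, '', path);
      return views[path]();
    },
    // Server side: answer a fresh request for the same URL with full HTML.
    renderOnServer(path) { return views[path](); },
  };
}

// A fake history object stands in for the browser History API here.
const pushed = [];
const router = makeRouter({ pushState: (_state, _title, path) => pushed.push(path) });
router.register('/shoes', () => '<h1>Shoes</h1>');

console.log(router.navigate('/shoes') === router.renderOnServer('/shoes')); // true
console.log(pushed); // ['/shoes']: the URL bar tracks each view
```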

I was fascinated by this piece of research published recently — you should go and read the whole study. In particular, you should watch this video (recommended in the post) in which the speaker — who is an Angular developer and evangelist — emphasizes the need for an isomorphic approach:

Resources for auditing JavaScript

If you work in SEO, you will increasingly find yourself called upon to figure out whether a particular implementation is correct (hopefully on a staging/development server before it’s deployed live, but who are we kidding? You’ll be doing this live, too).

To do that, here are some resources I’ve found useful:

  • Justin again, describing the difference between working with the DOM and viewing source
  • The developer tools built into Chrome are excellent, and some of the documentation is actually really good:
    • The console is where you can see errors and interact with the state of the page
    • As soon as you get past debugging the most basic JavaScript, you will want to start setting breakpoints, which allow you to step through the code from specified points
  • This post from Google’s John Mueller has a decent checklist of best practices
  • Although it’s about a broader set of technical skills, anyone who hasn’t already read it should definitely check out Mike’s post on the technical SEO renaissance.

Some surprising/interesting results

There are likely to be timeouts on JavaScript execution

I already linked above to the ScreamingFrog post that mentions experiments they have done to measure the timeout Google uses to determine when to stop executing JavaScript (they found a limit of around 5 seconds).

It may be more complicated than that, however. This segment of a thread is interesting. It's from a Hacker News user who goes by the username KMag and who claims to have worked at Google on the JS execution part of the indexing pipeline from 2006–2010. It’s in relation to another user speculating that Google would not care about content loaded “async” (i.e. asynchronously — in other words, loaded as part of new HTTP requests that are triggered in the background while assets continue to download):

“Actually, we did care about this content. I'm not at liberty to explain the details, but we did execute setTimeouts up to some time limit.

If they're smart, they actually make the exact timeout a function of a HMAC of the loaded source, to make it very difficult to experiment around, find the exact limits, and fool the indexing system. Back in 2010, it was still a fixed time limit.”

What that means is that although it was initially a fixed timeout, he’s speculating (or possibly sharing without directly doing so) that timeouts are programmatically determined (presumably based on page importance and JavaScript reliance) and that they may be tied to the exact source code (the reference to “HMAC” is to do with a technical mechanism for spotting if the page has changed).

It matters how your JS is executed

I referenced this recent study earlier. In it, the author found:

Inline vs. External vs. Bundled JavaScript makes a huge difference for Googlebot

The charts at the end show the extent to which popular JavaScript frameworks perform differently depending on how they're called, with a range of performance from passing every test to failing almost every test. For example here’s the chart for Angular:

[Chart: Angular test results]

It’s definitely worth reading the whole thing and reviewing the performance of the different frameworks. There's more evidence of Google saving computing resources in some areas, as well as surprising results between different frameworks.

CRO tests are getting indexed

When we first started seeing JavaScript-based split-testing platforms designed for testing changes aimed at improving conversion rate (CRO = conversion rate optimization), their inline changes to individual pages were invisible to the search engines. As Google in particular has moved up the JavaScript competency ladder through executing simple inline JS to more complex JS in external files, we are now seeing some CRO-platform-created changes being indexed. A simplified version of what’s happening is:

  • For users:
    • CRO platforms typically take a visitor to a page, check for the existence of a cookie, and if there isn’t one, randomly assign the visitor to group A or group B
    • Based on either the cookie value or the new assignment, the user is either served the page unchanged, or sees a version that is modified in their browser by JavaScript loaded from the CRO platform’s CDN (content delivery network)
    • A cookie is then set to make sure that the user sees the same version if they revisit that page later
  • For Googlebot:
    • The reliance on external JavaScript used to prevent both the bucketing and the inline changes from being indexed
    • With external JavaScript now being loaded, and with many of these inline changes being made using standard libraries (such as jQuery), Google is able to index the variant and hence we see CRO experiments sometimes being indexed
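A simplified sketch of that bucketing logic (the cookie name and the 50/50 split are invented for illustration):

```javascript
// Toy version of CRO-platform bucketing: returning visitors keep their
// variant via a cookie; new visitors are randomly assigned.
function assignVariant(cookies) {
  if (cookies.cro_variant) return cookies.cro_variant; // revisit: same version
  const variant = Math.random() < 0.5 ? 'A' : 'B';     // new visitor: random bucket
  cookies.cro_variant = variant;
  return variant;
}

// Googlebot arrives without cookies but now executes the external JS,
// so it gets bucketed like any visitor, which is how variant content
// can end up in the index.
const cookies = {};
const first = assignVariant(cookies);
console.log(assignVariant(cookies) === first); // true: the cookie keeps the version stable
```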

I might have expected the platforms to block their JS with robots.txt, but at least the main platforms I’ve looked at don't do that. With Google being sympathetic towards testing, however, this shouldn’t be a major problem — just something to be aware of as you build out your user-facing CRO tests. All the more reason for your UX and SEO teams to work closely together and communicate well.

Split tests show SEO improvements from removing a reliance on JS

Although we would like to do a lot more to test the actual real-world impact of relying on JavaScript, we do have some early results. At the end of last week I published a post outlining the uplift we saw from removing a site’s reliance on JS to display content and links on category pages.

[Chart: additional organic sessions following the test]

A simple test that removed the need for JavaScript on 50% of pages showed a >6% uplift in organic traffic — worth thousands of extra sessions a month. While we haven’t proven that JavaScript is always bad, nor understood the exact mechanism at work here, we have opened up a new avenue for exploration, and at least shown that it’s not a settled matter. To my mind, it highlights the importance of testing. It’s obviously our belief in the importance of SEO split-testing that led to us investing so much in the development of the ODN platform over the last 18 months or so.

Conclusion: How JavaScript indexing might work from a systems perspective

Based on all of the information we can piece together from the external behavior of the search results, public comments from Googlers, tests and experiments, and first principles, here’s how I think JavaScript indexing is working at Google at the moment: I think there is a separate queue for JS-enabled rendering, because the computational cost of trying to run JavaScript over the entire web is unnecessary given the lack of a need for it on many, many pages. In detail, I think:

  • Googlebot crawls and caches HTML and core resources regularly
  • Heuristics (and probably machine learning) are used to prioritize JavaScript rendering for each page:
    • Some pages are indexed with no JS execution. There are many pages that can probably be easily identified as not needing rendering, and others which are such a low priority that it isn’t worth the computing resources.
    • Some pages get immediate rendering – or possibly immediate basic/regular indexing, along with high-priority rendering. This would enable the immediate indexation of pages in news results or other QDF results, but also allow pages that rely heavily on JS to get updated indexation when the rendering completes.
    • Many pages are rendered async in a separate process/queue from both crawling and regular indexing, thereby adding the page to the index for new words and phrases found only in the JS-rendered version when rendering completes, in addition to the words and phrases found in the unrendered version indexed initially.
  • The JS rendering also, in addition to adding pages to the index:
    • May make modifications to the link graph
    • May add new URLs to the discovery/crawling queue for Googlebot
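That guess can be sketched as a toy two-phase pipeline; this is entirely speculative, and the helper functions are crude stand-ins for the real systems:

```javascript
// Speculative sketch of the two-phase pipeline guessed at above.
const extractText = (html) => html.replace(/<[^>]+>/g, ' ').replace(/\s+/g, ' ').trim();
const needsRendering = (page) => page.rawHtml.includes('<script'); // heuristic gate
const render = (page) => ({
  text: extractText(page.rawHtml) + ' ' + page.jsContent, // words only in the rendered DOM
  newUrls: page.jsLinks || [],                            // links added by JS
});

function crawlAndIndex(page, index, renderQueue) {
  index.set(page.url, extractText(page.rawHtml)); // phase 1: index the unrendered version now
  if (needsRendering(page)) renderQueue.push(page); // phase 2: queue JS rendering separately
}

function processRenderQueue(renderQueue, index, crawlQueue) {
  for (const page of renderQueue.splice(0)) {
    const rendered = render(page);
    index.set(page.url, rendered.text);   // re-index with JS-only words and phrases
    crawlQueue.push(...rendered.newUrls); // rendering can feed URL discovery
  }
}

const index = new Map(), renderQueue = [], crawlQueue = [];
crawlAndIndex({ url: '/a', rawHtml: '<script></script>', jsContent: 'lazy words', jsLinks: ['/b'] }, index, renderQueue);
processRenderQueue(renderQueue, index, crawlQueue);
console.log(index.get('/a')); // now includes 'lazy words'
console.log(crawlQueue);      // ['/b']
```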

The idea of JavaScript rendering as a distinct and separate part of the indexing pipeline is backed up by this quote from KMag, who I mentioned previously for his contributions to this HN thread (direct link) [emphasis mine]:

“I was working on the lightweight high-performance JavaScript interpretation system that sandboxed pretty much just a JS engine and a DOM implementation that we could run on every web page on the index. Most of my work was trying to improve the fidelity of the system. My code analyzed every web page in the index.

Towards the end of my time there, there was someone in Mountain View working on a heavier, higher-fidelity system that sandboxed much more of a browser, and they were trying to improve performance so they could use it on a higher percentage of the index.”

This was the situation in 2010. It seems likely that they have moved a long way towards the headless browser in all cases, but I’m skeptical about whether it would be worth their while to render every page they crawl with JavaScript given the expense of doing so and the fact that a large percentage of pages do not change substantially when you do.

My best guess is that they're using a combination of trying to figure out the need for JavaScript execution on a given page, coupled with trust/authority metrics to decide whether (and with what priority) to render a page with JS.

Run a test, get publicity

I have a hypothesis that I would love to see someone test: That it’s possible to get a page indexed and ranking for a nonsense word contained in the served HTML, but not initially ranking for a different nonsense word added via JavaScript; then, to see the JS get indexed some period of time later and rank for both nonsense words. If you want to run that test, let me know the results — I’d be happy to publicize them.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!





SEO

via SEOmoz Blog https://moz.com/blog

May 28, 2017 at 07:10PM
0 Comments

Shopify SEO: How to Optimize Your Store for Success by @SEOBrock

5/28/2017

0 Comments

 
http://ift.tt/2qpreGn

Shopify SEO: How to Optimize Your Store for Success by @SEOBrock

http://ift.tt/2r1EJuM

Want your Shopify store to rank in Position 1 on Google? No matter the platform, the same SEO principles apply if you want your e-commerce store to be found in search results.

With some simple tweaks and optimizations to your Shopify store, you can improve your visibility, rank on the first page for your targeted keywords, and make the sale.

While you may not outrank Amazon on day one, this Shopify SEO guide will walk you through everything you need to position your store for success.

Launching Your Shopify Store

Your store theme should match your brand style and make sense for your inventory. But also think about the impact on speed and usability, as these factors will greatly affect SEO.

Before you buy or install a theme, verify that it won’t hurt your speed and performance.

  1. Run the theme URL through a tool like GTMetrix or Web Page Test. While the numbers will be different on your store, it will give you an idea of the resource demand of the core theme you’re building on. [Screenshots: a fast and a slow Shopify theme scanned in GTMetrix]
  2. Run the theme URL through PageSpeed Insights. Since Google’s analysis of your page is ultimately what matters when you want to rank, verify that Google evaluates the code positively.

Screenshot of fast Shopify theme in PageSpeed Insights

Screenshot of slow Shopify theme in PageSpeed Insights

When it comes to speed, the Turbo Shopify theme from Out of the Sandbox is my favorite right now. This theme is optimized for speed without sacrificing form or functionality.

Installing this theme on a client site cut loading time by 75 percent. We routinely see clients keep load times under five seconds when they launch with a lean, clean theme.

A smart theme selection will set you on the right path in terms of branding, user experience, and SEO.

Technical SEO

Without a strong foundation, SEO is just wainscoting on a pile of rubble. But when you build, reinforce, and maintain a strong foundation, SEO tweaks will add that decorative flourish to set you apart from the competition.

Always optimize your site for customers, first and foremost.

On-page SEO

The same SEO elements that apply to a regular site apply to Shopify (here’s a great SEO checklist):

  • Upload a robots.txt file so bots can crawl your site.
  • Ensure you have an XML sitemap to guide Google through the architecture of your site.
  • Install Google Analytics.
  • Validate HTML and CSS.
  • Purchase an SSL certificate and upgrade to HTTPS.
  • Correct any crawl errors and redirect issues.
  • Include target keywords for a given page in the title and H1.
  • Optimize meta descriptions, as this can affect click-through rate (CTR).
  • Optimize headings (H1s, H2s, etc.) in a natural way.
  • Optimize images (file name, alt text, and image size).

Optimize all future pages as you add them, including collections and product pages.

You’ll also need to re-optimize your content periodically to better target the keywords that convert the most.

SEO is an ongoing process. Tweaks will always be needed because search algorithms are constantly changing – and so are your customers’ wants, needs, and behaviors.

Run your site through an auditor like WooRank. Check back in every few months to make sure your corrections have taken hold and identify any new issues.

On-page Optimizations in Shopify

Shopify makes it easy to set your basic on-page details directly in the backend.

Navigate to Sales Channels > Online Store > Preferences. On this page, you can update your homepage title and meta description. This is also where you can link your Google Analytics account.

Enter your store's title and meta description in these fields

Individual product title and meta descriptions can be set directly on the page:

Enter your product page's title and meta description in fields provided

Need to set up redirects? This can be done via Sales Channels > Online Store > Navigation. Click the “URL Redirects” button in the top right. Here, you can manage redirects and add new ones.

Screenshot of URL redirects in Shopify backend

Shopify automatically generates an XML sitemap of your store. You can find it at yourstore.com/sitemap.xml.

While Shopify isn’t quite as SEO-friendly as sites on WordPress with a plugin like Yoast, you can optimize almost everything directly through the dashboard.

Apps

Shopify has a pretty awesome collection of free or low-cost app integrations to take your store to the next level.

Our clients have had great success capturing abandoned carts and re-engaging past customers with Conversio (formerly Receiptful). Bold Apps can help with sales and upsells, not to mention Sweet Tooth Loyalty Program and MailChimp integration for your email marketing efforts.

The Shopify Reviews App and Yotpo are great additions, as they add review schema right onto the product page, which can help click-through rate in the SERPs.

Apps can help with everything from driving newsletter sign-ups to running sales. However, apps can slow down stores and cause some conflicts, so choose wisely.

Screenshot of product review page

Speed

Speed should always come before cool features. Why? Forty percent of users will abandon a page if it takes longer than 3 seconds to load. In fact, a single second delay in page loading can drop conversions by 7 percent.

For an e-commerce store, that could cost thousands in lost sales each year. Mobile users are impatient. A slow store is bad business.
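To put that 7 percent figure in context, here is the back-of-envelope math with invented store numbers:

```javascript
// Back-of-envelope impact of a one-second delay, using made-up numbers:
const monthlyRevenue = 50000; // $/month for a hypothetical store
const dropPercent = 7;        // relative conversion drop per extra second of load time

const lostPerYear = monthlyRevenue * dropPercent / 100 * 12;
console.log(lostPerYear); // 42000: easily "thousands in lost sales each year"
```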

Optimizing your Shopify store for speed is almost exactly the same as any other website. Small improvements like the ones listed below should help improve load times:

  • Minify code (CSS & JavaScript)
  • Compress images
  • Minimize redirects
  • Enable browser caching
  • Use a content delivery network (CDN)
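As a toy illustration of the first item, here is what minification does to CSS (real minifiers such as cssnano or Terser do far more, including renaming and dead-code removal):

```javascript
// Naive CSS minifier, for illustration only.
function naiveMinify(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip comments
    .replace(/\s*([{}:;,])\s*/g, '$1') // drop whitespace around punctuation
    .replace(/\s+/g, ' ')              // collapse remaining whitespace
    .trim();
}

const css = `
/* product card */
.card {
  margin: 0 auto;
  color: #333;
}
`;
console.log(naiveMinify(css)); // ".card{margin:0 auto;color:#333;}"
```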

You can see a full list of recommendations and see how your store measures up when you run your domain through a tool like GTMetrix.

Speed matters for user experience, but it’s also a Google ranking factor. The faster your site can be without stripping elements that provide an optimal user experience, the better.

User Experience

Ask yourself these questions to evaluate whether your Shopify store offers an excellent user experience:

  • Does the homepage explain what your store is all about? Is it easy to understand?
  • Is the site structured well?
  • Is the page layout clear, consistent, and visually appealing?
  • Are the graphics welcoming and consistent?
  • Do your product images show high-quality graphics of what you are selling?
  • Is there good use of white space? Is the content too dense?
  • Are navigation buttons and tabs consistent and intuitive?
  • Is the relationship between the page and navigation clear?
  • Is there a clear “path” for users to follow from first landing on the site to buying a product?
  • Is the content unique, well-written, and accurate?
  • Are there ads, interstitials, pop-ups, or any other obtrusive elements on the page? If so, are they displayed in a professional manner?
  • Are there calls-to-action (CTAs) on the page, and are they clear and intuitive?

If you can say an emphatic “yes” to all of these questions, you’re on the right track to delight your users and Google.

If you’re hesitant about any of the above elements, go back to the drawing board. You can draw in visitors through paid ads, organic traffic, or social, but the page has to serve their needs. Otherwise, metrics will confirm your site just isn’t cutting it and everything will suffer — including search rankings.

User testing can help you get a more accurate sense of how customers interact with your store and shine a spotlight on problem areas.

Optimizing Your Store for Conversions

Conversion optimization is all about getting visitors to engage with your content, like you, trust you, and ultimately buy your product.

Each of these actions at one level or another is associated with a site-level metric that influences your site’s Google rank. After all, users who take an action on your site contribute to a lower bounce rate, longer time on site, higher pages per session, and more.

Without worrying about a single technical detail, shifting your mindset to associate conversion optimization with rankings benefits will pay tremendous dividends.

If Google sees that users click to your store and buy something, that can positively influence your rankings.

Google wants to show the most relevant and useful results to users. Make your site exactly what your customers want and need.

Content

A beautiful website without quality content is like a top-line Viking range with no ingredients to cook — a total waste.

All the little optimizations and tweaks I detailed earlier mean nothing if they’re not building on a rich, deep, and insightful foundation of high-quality content.

Homepage

Your Shopify homepage represents who you are, what you do, what you offer, and your unique value proposition.

Harris Farm Markets is one site that does a great job with their homepage layout and content:

An example of a good eCommerce homepage

It’s hyper-visual and rich in content. From the first look at the page to the final scroll, you get a total sense of who they are, what they do, and what makes them different from a supermarket.

Muse is another great example:

An example of a good eCommerce homepage

Muse makes it clear what the product is and how it can help, from that first hero image to the research partners that provide credibility.

Product Pages

Ensure your product pages are fully optimized. Pasting the same generic sentence on each page won’t cut it.

  • Write a unique, captivating product description.
  • To avoid duplicate content, add variants to each product. This way, you won’t have to come up with new content for red and blue versions of the same shirt.
  • To better serve your customers, have shipping, return, and sizing information available on the page.
  • Ensure your images are optimized with a descriptive file name and relevant alt text.
  • Consider enabling reviews on the page. Not only does it provide more value/info for customers, but it’s free content!
  • Don’t use manufacturer descriptions! Write your own.

Duplicate Product Pages

If duplicate products are unavoidable, apply the rel=canonical tag to the product you want to take priority. If this is an ongoing issue (for example, your store duplicates products across collection pages), you can install an app like the NoFollow and NoIndex Manager to take control over what gets followed and indexed.

Deleting Products

When a model becomes obsolete or you move on to a different stock, do you just delete the product? This can cause big SEO issues.

Ensure you set up 301 redirects for all deleted products. Link to the most similar live product if you can, or the most relevant collections page. If nothing applies, redirect to the homepage.
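That fallback order can be sketched as a small helper (the data shapes are invented):

```javascript
// Decision order for a deleted product's 301 target (toy data model).
function redirectTarget(deleted, liveProducts, collections) {
  const similar = liveProducts.find((p) => p.type === deleted.type);
  if (similar) return similar.url;                         // 1. most similar live product
  const collection = collections.find((c) => c.handle === deleted.type);
  if (collection) return collection.url;                   // 2. most relevant collection
  return '/';                                              // 3. homepage as a last resort
}

const live = [{ type: 'jeans', url: '/products/skater-jeans' }];
const cols = [{ handle: 'shirts', url: '/collections/shirts' }];

console.log(redirectTarget({ type: 'jeans' }, live, cols));  // '/products/skater-jeans'
console.log(redirectTarget({ type: 'shirts' }, live, cols)); // '/collections/shirts'
console.log(redirectTarget({ type: 'hats' }, live, cols));   // '/'
```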

The Perfect Product Page

Here’s an example of an exceptional product page:

An example of a strong eCommerce product page

It has:

  • Tons of content broken up into different chunks for readability and flow.
  • Variants all on the same page to avoid duplicate content issues.
  • Visual, informative content about features and specs, and even includes videos of the product in use.
  • A review app, giving 800 words of additional content.

Tabbed content allows you to add depth to the page without cluttering it. Here’s a great example:

An example of tabbed content on a product page

Collections

Shopify automatically creates collections pages. You don’t have to make the pages visible on your site, but they represent a huge ranking opportunity for category keywords if you choose to use them.

If your collections pages are just product lists, you’re missing an opportunity.

We’ve been doing content projects for a lot of our e-commerce clients to grow their collections pages, and we’ve seen great results.

For instance, instead of just a generic “jeans” or “men’s jeans” page, make a “skater jeans” or “baggy jeans” landing page building on a collection. These hyper-targeted keywords convert really well.

An example of good collections page content

Blog

A blogging strategy is really important, especially for e-commerce. Maintaining a blog will:

  • Provide value to users.
  • Help you target long-tail keywords.
  • Improve your organic search visibility.
  • Give you more content to promote on social.

Use tools like Google Trends and BuzzSumo to identify keyword opportunities and stay ahead of the curve on evergreen and trendy topics.

Write long-tail content that has your product as the answer to a commonly asked question. Browse social media, Quora, Reddit, and niche forums to find these questions.

Remember, a blog can be so much more than pure text:

  • Post videos.
  • Do a Q&A.
  • Post how-tos.
  • Create an infographic.
  • Post gift guides, style guides, and lookbooks.

Integrate buy buttons and strong calls-to-action (CTAs) to make blogs an important part of your sales funnel.

An example of Shopify's buy button used in a blog

Off-page Optimizations

All of your on-page optimizations are just one part of the puzzle. Your store’s off-page presence is just as important, if not more.

In other words, you need to market your Shopify store. Get the word out there.

  • Put effort into a comprehensive, strategic social media presence.
  • Do outreach and online PR.
  • Build a robust backlink profile from trustworthy, authoritative domains.

All of these efforts will help build brand awareness and affinity, which will ultimately increase search demand.

Conclusion

From choosing an SEO-friendly theme to core optimizations (on-page, speed, UX, CRO, content, and off-page), the recommendations in this guide should help steer your Shopify store to improved and sustained rankings success.

Plan and deploy your SEO strategy using Shopify’s built-in features; monitor your results through keyword tracking software and Google Analytics, and adjust and experiment until you strike gold.

Just remember: SEO alone isn’t enough for Shopify success. Search engines want to satisfy users, so always optimizing for your customers will delight two birds with one stone.

Image Credits

Featured Image: Shutterstock
All Screenshots by Brock Murray. Taken May 2017.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

May 28, 2017 at 02:07AM
0 Comments

SearchCap: Search trademark issues & search pictures

5/26/2017

0 Comments

 
http://ift.tt/1WuzysF

SearchCap: Search trademark issues & search pictures

http://ift.tt/2qrYQPd

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Search trademark issues & search pictures appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.




SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/fN1KYC

May 26, 2017 at 08:01AM
0 Comments

Search in Pics: Google drum set, a deep blue room & Google sneakers

5/26/2017

0 Comments

 
http://ift.tt/2qmOMaN

Search in Pics: Google drum set, a deep blue room & Google sneakers

http://ift.tt/2roJpvN

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more. Google deep blue meeting room: Source: Twitter Google elephant march...

Please visit Search Engine Land for the full article.




SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/fN1KYC

May 26, 2017 at 01:45AM
0 Comments

Everything You Need to Know About Facebook Ad Relevance Score by @SusanEDub

5/26/2017

0 Comments

 
http://ift.tt/2r459u9

Everything You Need to Know About Facebook Ad Relevance Score by @SusanEDub

http://ift.tt/2rGjXBQ

Terms like “Relevance Score” and “Quality Score” can seem vague and mysterious to people who just want to put some money in the PPC slot machine and have sales come out. Facebook introduced its Ad Relevance Score in 2015, but many advertisers still struggle to understand it, or how to fix it if it’s struggling.

If you’re one of those advertisers, read on. This is everything you need to know about Facebook Ad Relevance Score.

What Is the Facebook Ad Relevance Score?

With the explosion of Facebook Ads and the News Feed getting ever more crowded, it made sense for the social network to create an ad quality metric. This also added a layer of complexity: advertisers had another “thing” to investigate when ads were under-serving or costs got really high.

Your Facebook Ad Relevance Score is a rating of 1-10 after it has at least 500 impressions (yes, that’s pretty quick). The score is calculated daily based on, as Facebook says, “positive and negative feedback we expect from people seeing it, based on how the ad is performing.”

Facebook goes on to define “positive” as things like shares, likes, or other actions that help you achieve your objective. Meaning that yes, the criteria for your Relevance Score can change a bit depending on whether you’re running a campaign with an objective of video views vs. one for link clicks.

“Negative” feedback is anything like when people hide your ads. Though Facebook doesn’t explicitly say so, it’s also safe to assume that anything not meeting your objective (i.e., people not clicking, etc.) also contributes to negative feedback.

This isn’t surprising. After all, Facebook is a social network. Facebook rewards you for generating more interaction and interest — it’s their value proposition and they have to protect it (in the same way AdWords protects the quality of search results by having a Quality Score).

Does Relevance Score Really Have an Impact?

I was skeptical of Facebook’s Ad Relevance Score when it came out. A lot of AdWords advertisers didn’t consistently see more favorable CPCs as they improved Quality Score, and the score itself became another dreaded metric that can distract from overall account goals.

However, it’s pretty easy to see with Facebook that when you improve your ad relevance, you also reduce your costs.

I decided to specifically test this in an account that was doing well socially already.

To understand this method, it’s important to understand one nuance first.

Ad IDs & Sharing Them

When you create an ad on Facebook, it automatically generates an Ad ID.

Copying and pasting that ad into another Ad Set creates a new ID. Even though it’s the same ad, Facebook will treat the copies as different, and the copy won’t retain the original Ad ID.

Social interaction on an ad is tied to the ID level. What that ultimately means is each ad unit will keep the social interaction to just that ID — it doesn’t share it with the otherwise identical ads because the ID is different. Each ID also has its own Relevance Score:

Each Facebook ad ID has its own relevance score

You can get around this by creating an ad and then pasting its ID into the option for “Use Existing Post” when you create a new ad. This will share that ID, and all the social proof it accumulates will display on that ad for every ad set that it’s used within:

Pasting the ad ID into the "Use Existing Post" option on Facebook

 

Important: As with any ad, if you update the copy, link, or anything like that, it will reset all of that social proof. This carries even more weight once you share an ID because it will affect multiple ad sets with the stroke of a key.

Pushing That Relevance Score

I decided to test what happened with a focus on Relevance Score. This meant we wanted to try to house all the social proof on one ad ID in order to maximize impact. Otherwise, we’d have a bunch of disparate ads, each with its own social proof.

I took a well-performing ad that had duplicate versions with different IDs running in different ad sets. The average Relevance Score of them all was around 8. The social proof on them all was similar and their Relevance Scores were the same, so we just picked one at random.

I took that ID and pasted it into the other ad sets. This would ensure that moving forward, any new social proof and any change in Relevance Score would be concentrated in this one unit. It would amass the social proof faster because it wasn’t being distributed among disparate Ad IDs.

The shared post ID started to run, and then I did one more thing: I added a Page Post Engagement campaign (now just called “Engagement” objective type), and for the creative I used the same Ad ID. (I threw a couple hundred bucks towards it only because the population size warranted it, but the same methodology can be used with just a few bucks a day.)

Engagement objective type for paid post engagement campaigns in Facebook

This means that while that ID was running in the ad sets focused on converting to sales, it was also running in the PPE campaign, accruing social results simultaneously. Within a few days, the Relevance Score for the shared ID was hitting a 10.

Test Results

I pulled the CTR, CPC, and CPA data for the ads when they ran in different groups (called Non-PPE version) vs. when I previously ran the single ID with the extra dough put towards accruing some social proof (called the PPE version). Other than as outlined above, nothing else changed, including the targeting.

CTR results for PPE vs non-PPE ads

It appears the additional social proof gave us a leg up here.

CPC results for PPE vs non-PPE ads

Again, the additional social proof seemed to help here. (I didn’t analyze it at the time, but it would have been interesting to see if the offset in CPC saved enough money that it paid for the social proof.)

Finally, the ultimate number that matters:

CPA results for PPE vs non-PPE ads

Wow. That’s a huge difference, and definitely statistically significant with the audience size we had.

I’ve run this test in a few other accounts, and the results were similar. Sometimes the result wasn’t as marked, but it was still usually there.

Clickable, Shareable, Likable

What I liked best about running this test was that it took the hypothesis behind Relevance Score and proved it. It’s easy to say “create great content,” but it’s another thing when you see how it directly impacts the bottom line of what you’re selling. It’s worth the extra effort to test that video, or to do something more creative than just throwing a stock image up there and hoping for the best.

Like every ad creative on Facebook, eventually it ran its course and it was time to swap out.

The beauty of this method, however, is that it allows you to launch a new creative and secure a higher Relevance Score faster. It helps alleviate the peaks and valleys that can come with launching new creatives, and gets Relevance Score working in your favor, faster!

Image Credits

In-post images: Screenshots by author. Taken May 2017.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

May 26, 2017 at 01:33AM
0 Comments

Exploit These 3 Powerful Motivators for Better PPC Ad Copy by @clarkboyd

5/26/2017

0 Comments

 
http://ift.tt/2qmYqKF

Exploit These 3 Powerful Motivators for Better PPC Ad Copy by @clarkboyd

http://ift.tt/2r3vUP7

Here’s a typical search engine results page:

Flower Delivery

Wow. There’s a lot more going on here than meets the eye. Do you see it?

Every ad here is vying for a click. So how will people decide which ad to click on – and ultimately make a purchase decision?

The best PPC ad here is the one that ultimately motivates a searcher to click.

We rely on data to guide us toward that all important click. After all, we have so much data at our disposal today.

But pure, hard data isn’t the answer in isolation. Actually, it’s the interplay between the rational and the emotional where we can have the most impact.

Why?

Humans Aren’t Rational Creatures!

People make snap judgments all the time and with good reason. To apply our full critical faculties to every decision would be exhausting and inefficient.

Rational PPC

Moreover, rationality itself is a nuanced concept. We are rational to different extents and in different ways at various times of each day.

The level of rationality applied to decisions will depend on the product. For example, the behavioral factors behind choosing a pension will differ from those that drive our choice of Mexican restaurant for dinner tonight.

So if we agree humans aren’t rational, then how can you ensure as many of them as possible click on your PPC ads?

With some motivation.

Here are three powerful motivators you can exploit for better PPC ad copy that converts.

Motivator 1: Incentives

Incentives are behind pretty much everything we do. They are the reason we go to work, choose certain brands over others, and eat at particular restaurants.

Incentives typically come in two forms:

  • Extrinsic incentives: These relate to factors outside of the self. For example, I go to work because I need the paycheck to pay rent and I enjoy the status my title affords me.
  • Intrinsic incentives: These come from within and can often be more powerful motivators. For example, I go to work because I feel like I am contributing to society and I enjoy what I do.

Advertising has always played to these incentives.

In other words: What’s in it for me? How will I feel, look, or live better if I buy your product?

How to Use Incentives in PPC Ad Copy

Incentives should play a significant part in any PPC campaign. You’re fighting for attention in a crowded space. The quicker your offer can demonstrate incentives, the better.

The line between extrinsic and intrinsic incentives is often blurred, as we can see in the ad copy for [charity donation]:

Charity PPC

All worthy causes. All deserving of support. All competing for our attention.

The intrinsic incentives are clear: “Children Need Help”, “End Childhood Cancer”, “Support Cancer Research”, to cite just three examples. We would all like to contribute to these causes.

However, there’s an awareness that perhaps donations are not always purely selfless acts. There are extrinsic incentives behind donations, too. Note the mention of “100% Tax-Deductible” in there.

These examples of ad copy demonstrate a sophisticated understanding of just how muddled our incentives can become.

At other times, the distinction is clear.

Mens Jeans PPC

For an overtly commercial search like [mens jeans], we can see which tactics are employed. These ads play to extrinsic incentives, with nods to our desire to be seen as fashionable (“The Seasons Best Looks”) but also to find value within our financial range (“At Affordable Prices”).

There are, however, numerous attempts to appeal to intrinsic incentives. Of note are phrases like “Made Ethically”, “Personal Freedom”, and “Stop Paying Retail Markup”.

Consumers want more from a product than just the latest styles. They want to feel ethically responsible, they want a sense of buying into something greater than just the material, and they also don’t want to feel like they are being ripped off.

Key takeaway: We should always be asking what is most likely to motivate our audience. We should also understand that this will differ by product set, by time of day, and by demographic.

Motivator 2: Herding

People are conditioned to learn from the experience of others. Deep down, there’s an assumption that the wisdom of crowds will guide you toward a quick, safe decision.

Parallels of this form of decision-making are found everywhere in nature.

Penguins in the Antarctic face a daily dilemma, as they are in the middle of the food chain. Should they dive into the water in the hope of finding krill, knowing they risk being eaten by a seal? There’s no way to be certain.

One penguin takes the plunge and the others make assumptions based on the outcome. Should their flightless friend emerge unscathed with a beak full of beautiful krill, the others will follow suit. If not, well, they hold back the hunger pangs a little longer until the coast is clear.

Penguins

This is a slightly more important decision than choosing which PPC ad to click on, but the underlying principle is the same. People are suspicious of brands they’ve never heard of and that have no customer reviews. People also want someone else to take the plunge and report back to base.

Although most people still trust media outlets, in this cynical age many people are more likely to trust their fellow consumers.

How to Use Herding in Ad Copy

Remember that people want all the necessary information to make a decision at their disposal, as effortlessly as possible.

Reviews matter. Use them in your ads if they are available.

Make full use of ad extensions to include your company’s USPs and reassure the consumer that you’re a reputable provider. This provides a sense of security in the knowledge that many other consumers have used and enjoyed your services.

We can see this in action if we look at the results for [red sox tickets]:

Red Sox Ticket

There are not only reviews but also quotes from sources like the Washington Post. Ad extensions are used to include consumer ratings on service and website quality, too.

People are sometimes concerned about buying event tickets online, but these ads show that they are in safe hands. Others have entered the waters and returned unscathed.

Motivator 3: Availability Bias

People reach for the information that is most readily available when making most decisions.

We all have a repository of past experiences and knowledge that we use to cut through the noise and reach snappy conclusions. This is known as the availability bias. It is considered a bias because it tends to lead us to irrational choices based purely on the first relevant piece of information we think of.

The availability bias is related to psychological phenomena known as primacy and recency. These concepts state that we tend to recall the information we heard first and last, but rarely the information in between.

The applications of such a theory for search are self-evident.

How to Use Availability Bias in Ad Copy

Make decisions as easy as possible for customers. You can do this by demonstrating how close your store is to their location or how simple your shipping process is.

Also consider how you communicate with existing consumers. If they have shopped with you before, they are more likely to do it again. This should be a central consideration as you try to attract repeat customers.

Looking to primacy and recency, you can test your ad positions to see whether it makes the most sense to rank first in both PPC and SEO. You may find, for example, that you perform more cost-effectively in the fourth position for PPC while ranking first for SEO.

We could also go against the grain. Availability sometimes leads consumers into decisions that are against their best interests. This is particularly true of more cumbersome decisions, like switching cell phone carriers.

We can see this in action for the query [sprint]:

Sprint PPC

The second ad is focused on convincing consumers to make the switch. They may be overpaying for service with another carrier out of habit, so Sprint makes overt reference to the savings it provides and also the ‘Waived Activation Fee’. Mentioning a limited time offer also plays into the scarcity effect – or FOMO (fear of missing out) in modern parlance – to add a sense of urgency to proceedings.

We can learn from this that while cognitive biases can be used to guide users on an unconscious path, we can also awaken them from the slumber of irrationality if we have a better deal to offer.

Bringing It All Together

Now, let’s go back to the earlier example of search results for [flower delivery brooklyn] and apply what we’ve learned.

Flowers PPC Analysis

Now it’s clear to see all the motivators at play.

Sometimes we simply know these things intuitively and include them within ad copy. However, familiarizing yourself with these concepts can add more rigor to your testing.

Adding some powerful motivation can significantly improve your ad copy and PPC campaign performance.

Image Credits

Featured image: Pixabay

In-post image 1: Unsplash

In-post image 2: Unsplash

Screenshots taken by Clark Boyd, May 2017.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

May 26, 2017 at 12:03AM
0 Comments

Should SEOs Care About Internal Links? - Whiteboard Friday

5/26/2017

0 Comments

 
http://ift.tt/2rFJ47W

Should SEOs Care About Internal Links? - Whiteboard Friday

http://ift.tt/2r3W0DH

Posted by randfish

Internal links are one of those essential SEO items you have to get right to avoid getting them really wrong. Rand shares 18 tips to help inform your strategy, going into detail about their attributes, internal vs. external links, ideal link structures, and much, much more in this edition of Whiteboard Friday.

Should SEOs Care About Internal Links?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat a little bit about internal links and internal link structures. Now, it is not the most exciting thing in the SEO world, but it's something that you have to get right and getting it wrong can actually cause lots of problems.

Attributes of internal links

So let's start by talking about some of the things that are true about internal links. Internal links, when I say that phrase, what I mean is a link that exists on a website, let's say ABC.com here, that is linking to a page on the same website, so over here, linking to another page on ABC.com. We'll do /A and /B. This is actually my shipping routes page. So you can see I'm linking from A to B with the anchor text "shipping routes."

The idea of an internal link is really initially to drive visitors from one place to another, to show them where they need to go to navigate from one spot on your site to another spot. They're different from external links only in that, in the HTML code, you're pointing to the same fundamental root domain. In the initial early versions of the internet, that didn't matter all that much, but for SEO, it matters quite a bit because external links are treated very differently from internal links. That is not to say, however, that internal links have no power or no ability to change rankings, to change crawling patterns and to change how a search engine views your site. That's what we need to chat about.



1. Anchor text is something that can be considered. The search engines have generally minimized its importance, but it's certainly something that's in there for internal links.

2. The location on the page actually matters quite a bit, just as it does with external links. With internal links it's almost more so, in that navigation and footers specifically have attributes around internal links that can be problematic.

Those problems essentially arise when Google in particular sees manipulation in the internal link structure: things like stuffing anchor text into all of the internal links, trying to get this shipping routes page ranking by putting a little link down here in the footer of every single page and then pointing over here, trying to game and manipulate the engines. They hate that. In fact, there is an algorithmic penalty for that kind of stuff, and we can see it very directly.



We've actually run tests where we've observed this: jamming these anchor text-rich links into footers or into navigation hurts rankings, and removing them gets a site, well let's not say indexed, let's say ranking well again; put them back and it ranks poorly once more. Google reverses that penalty pretty quickly too, which is nice. So if you are not ranking well and you're like, "Oh no, Rand, I've been doing a lot of that," maybe take it away. Your rankings might come right back. That's great.



3. The link target matters obviously from one place to another.

4. The importance of the linking page, this is actually a big one with internal links. So it is generally the case that if a page on your website has lots of external links pointing to it, it gains authority and it has more ability to sort of generate a little bit, not nearly as much as external links, but a little bit of ranking power and influence by linking to other pages. So if you have two very well-linked pages on your site, you should make sure to link out from those to pages on your site that a) need it and b) are actually useful for your users. That's another signal we'll talk about.



5. The relevance of the link, so pointing to my shipping routes page from a page about other types of shipping information, totally great. Pointing to it from my dog food page, well, it doesn't make great sense. Unless I'm talking about shipping routes of dog food specifically, it seems like it's lacking some of that context, and search engines can pick up on that as well.

6. The first link on the page. So this matters mostly in terms of the anchor text, just as it does for external links. Basically, if you are linking in a bunch of different places to this page from this one, Google will usually, at least in all of our experiments so far, count the first anchor text only. So if I have six different links to this and the first link says "Click here," "Click here" is the anchor text that Google is going to apply, not "Click here" and "shipping routes" and "shipping." Those subsequent links won't matter as much.

7. Then the type of link matters too. Obviously, I would recommend that you keep it in the HTML link format rather than trying to do something fancy with JavaScript. Even though Google can technically follow those, it looks to us like they're not treated with quite the same authority and ranking influence. Text is slightly, slightly better than images in our testing, although that testing is a few years old at this point. So maybe image links are treated exactly the same. Either way, do make sure you have that. If you're doing image links, by the way, remember that the alt attribute of that image is what becomes the anchor text of that link.
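To make the comparison above concrete, here is a hypothetical sketch of the three link formats Rand mentions (the URL, filename, and anchor text are made up for illustration):

```html
<!-- Plain HTML text link: "shipping routes" is the anchor text -->
<a href="/shipping-routes">shipping routes</a>

<!-- Image link: the img's alt attribute serves as the anchor text -->
<a href="/shipping-routes">
  <img src="/images/routes-map.png" alt="shipping routes">
</a>

<!-- JavaScript-driven "link": Google can technically follow this, but per the
     testing described above it may not carry quite the same weight as an href -->
<span onclick="window.location='/shipping-routes'">shipping routes</span>
```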

Internal versus external links

A. External links usually give more authority and ranking ability.

That shouldn't be surprising. An external link is like a vote from an independent, hopefully independent, hopefully editorially given website to your website saying, "This is a good place for you to go for this type of information." On your own site, it's like a vote for yourself, so engines don't treat it the same.

B. Anchor text of internal links generally have less influence.

So, as we mentioned, me pointing to my page with the phrase that I want to rank for isn't necessarily a bad thing, but I shouldn't do it in a manipulative way. I shouldn't do it in a way that's going to look spammy or sketchy to visitors, because if visitors stop clicking around my site or engaging with it or they bounce more, I will definitely lose ranking influence much faster than if I simply make those links credible and usable and useful to visitors. Besides, the anchor text of internal links is not as powerful anyway.



C. A lack of internal links can seriously hamper a page's ability to get crawled + ranked.

It is, however, the case that a lack of internal links, like an orphan page that doesn't have many internal or any internal links from the rest of its website, that can really hamper a page's ability to rank. Sometimes it will happen. External links will point to a page. You'll see that page in your analytics or in a report about your links from Moz or Ahrefs or Majestic, and then you go, "Oh my gosh, I'm not linking to that page at all from anywhere else on my site." That's a bad idea. Don't do that. That is definitely problematic.

D. It's still the case, by the way, that, broadly speaking, pages with more links on them will send less link value per link.

So, essentially, you remember the original PageRank formula from Google. It said basically like, "Oh, well, if there are five links, send one-fifth of the PageRank power to each of those, and if there are four links, send one-fourth." Obviously, one-fourth is bigger than one-fifth. So taking away that fifth link could mean that each of the four pages that you've linked to get a little bit more ranking authority and influence in the original PageRank algorithm.

Look, PageRank is old, very, very old at this point, but at least the theories behind it are not completely gone. So it is the case that if you have a page with tons and tons of links on it, that tends to send out less authority and influence than a page with few links on it, which is why it can definitely pay to do some spring cleaning on your website and clear out any rubbish pages or rubbish links, ones that visitors don't want, that search engines don't want, that you don't care about. Clearing that up can actually have a positive influence. We've seen that on a number of websites where they've cleaned up their information architecture, whittled down their links to just the stuff that matters the most and the pages that matter the most, and then seen increased rankings across the board from all sorts of signals, positive signals, user engagement signals, link signals, context signals that help the engines rank them better.
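The intuition above corresponds to the classic, simplified PageRank formula, where $d$ is the damping factor (conventionally around 0.85), $N$ is the total number of pages, $B_p$ is the set of pages linking to page $p$, and $L(q)$ is the number of links on page $q$:

```latex
PR(p) = \frac{1-d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)}
```

Because each linking page $q$ contributes $PR(q)/L(q)$, trimming a fifth link from $q$ raises the share passed to each of the remaining targets from $PR(q)/5$ to $PR(q)/4$, exactly the effect described above.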

E. Internal link flow (aka PR sculpting) is rarely effective, and usually has only mild effects... BUT a little of the right internal linking can go a long way.

Then finally, I do want to point out that what was previously called — you probably have heard of it in the SEO world — PageRank sculpting. This was a practice that I'd say from maybe 2003, 2002 to about 2008, 2009, had this life where there would be panel discussions about PageRank sculpting and all these examples of how to do it and software that would crawl your site and show you the ideal PageRank sculpting system to use and which pages to link to and not.



When PageRank was the dominant algorithm inside of Google's ranking system, yeah, it was the case that PageRank sculpting could have some real effect. These days, that is dramatically reduced. It's not entirely gone because of some of these other principles that we've talked about, just having lots of links on a page for no particularly good reason is generally bad and can have harmful effects and having few carefully chosen ones has good effects. But most of the time, internal linking, optimizing internal linking beyond a certain point is not very valuable, not a great value add.

But a little of what I'm calling the right internal linking, that's what we're going to talk about, can go a long way. For example, if you have those orphan pages or pages that are clearly the next step in a process or that users want and they cannot find them or engines can't find them through the link structure, it's bad. Fixing that can have a positive impact.


Ideal internal link structures

So ideally, in an internal linking structure system, you want something kind of like this. This is a very rough illustration here. But the homepage, which has maybe 100 links on it to internal pages. One hop away from that, you've got your 100 different pages of whatever it is, subcategories or category pages, places that can get folks deeper into your website. Then from there, each of those has maybe a maximum of 100 unique links, and they get you 2 hops away from a homepage, which takes you to 10,000 pages that do the same thing.



I. No page should be more than 3 link "hops" away from another (on most small-->medium sites).

Now, the idea behind this is that basically in one, two, three hops, three links away from the homepage and three links away from any page on the site, I can get to up to a million pages. So when you talk about, "How many clicks do I have to get? How far away is this in terms of link distance from any other page on the site?" a great internal linking structure should be able to get you there in three or fewer link hops. If it's a lot more, you might have an internal linking structure that's really creating sort of these long pathways of forcing you to click before you can ever reach something, and that is not ideal, which is why it can make very good sense to build smart categories and subcategories to help people get in there.
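The arithmetic behind "up to a million pages": with at most 100 links per page, the number of reachable pages grows by a factor of 100 per hop, so within three hops of the homepage you can reach roughly

```latex
\underbrace{100}_{\text{1 hop}} \;+\; \underbrace{100^2}_{\text{2 hops}} \;+\; \underbrace{100^3}_{\text{3 hops}} \;=\; 1{,}010{,}100 \text{ pages.}
```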

I'll give you the most basic example in the world, a traditional blog. In order to reach any post that was published two years ago, I've got to click Next, Next, Next, Next, Next, Next through all this pagination until I finally get there. Or if I've done a really good job with my categories and my subcategories, I can click on the category of that blog post and I can find it very quickly in a list of the last 50 blog posts in that particular category, great, or by author or by tag, however you're doing your navigation.



II. Pages should contain links that visitors will find relevant and useful.

If no one ever clicks on a link, that is a bad signal for your site, and it is a bad signal for Google as well. And I don't just mean literally no one: if very, very few people ever click it, and many of those who do hit the back button because it wasn't what they wanted, that's also a bad sign.

III. Just as no two pages should be targeting the same keyword or searcher intent, likewise no two links should be using the same anchor text to point to different pages. Canonicalize!

For example, if over here I had a shipping routes link that pointed to this page and then another shipping routes link, same anchor text pointing to a separate page, page C, why am I doing that? Why am I creating competition between my own two pages? Why am I having two things that serve the same function or at least to visitors would appear to serve the same function and search engines too? I should canonicalize those. Canonicalize those links, canonicalize those pages. If a page is serving the same intent and keywords, keep it together.
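Canonicalizing the pages themselves is typically done with a link element in the head of the duplicate page; the path here is a hypothetical stand-in for Rand's example:

```html
<!-- In the <head> of the duplicate page (e.g. page C),
     point search engines at the preferred version -->
<link rel="canonical" href="https://example.com/shipping-routes">
```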

IV. Limit use of the rel="nofollow" to UGC or specific untrusted external links. It won't help your internal link flow efforts for SEO.

Rel="nofollow" was sort of the classic way that people had been doing PageRank sculpting that we talked about earlier here. I would strongly recommend against using it for that purpose. Google said that they've put in some preventative measures so that rel="nofollow" links sort of do this leaking PageRank thing, as they call it. I wouldn't stress too much about that, but I certainly wouldn't use rel="nofollow."
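As a sketch of the appropriate use described above, rel="nofollow" belongs on untrusted or user-generated external links rather than on internal links (the URL is made up):

```html
<!-- Appropriate: an untrusted, user-submitted external link -->
<a href="http://example.org/some-submission" rel="nofollow">a user's link</a>

<!-- Not recommended: nofollow on internal links for PageRank sculpting;
     as noted above, the withheld PageRank "leaks" rather than being
     redistributed to your other links -->
```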

What I would do is if I'm trying to do internal link sculpting, I would just do careful curation of the links and pages that I've got. That is the best way to help your internal link flow. That's things like...



V. Removing low-value content, low-engagement content and creating internal links that people actually do want. That is going to give you the best results.

VI. Don't orphan! Make sure pages that matter have links to (and from) them. Last, but not least, there should never be an orphan. There should never be a page with no links to it, and certainly there should never be a page that is well linked to that isn't linking back out to portions of your site that are of interest or value to visitors and to Google.

So following these practices, I think you can do some awesome internal link analysis, internal link optimization and help your SEO efforts and the value visitors get from your site. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!





SEO

via SEOmoz Blog https://moz.com/blog

May 25, 2017 at 07:18PM
0 Comments

Google’s AMP (Accelerated Mobile Pages) Gains Support From Facebook by @MattGSouthern

5/25/2017

0 Comments

 


Google’s AMP (Accelerated Mobile Pages) Gains Support From Facebook by @MattGSouthern

http://ift.tt/2rmMyfV


Facebook is rolling out support for AMP as part of its open source Instant Articles software development kit.

The company’s new SDK will have an extension that allows publishers to create content in the Instant Articles, AMP, and Apple News formats.

Support for Google’s AMP will be rolled out first, with support for Apple News coming in a few weeks.

Facebook’s SDK will work by building AMP and Apple News pages with the same markup used to build Instant Articles. In addition, it will include the unique customization options offered by each publishing format.

As the announcement explains: “With an easy way to get from one markup format to another, publishers can then plug-and-play the markup in their content management systems or third party publishing tools.”

This is an interesting and unexpected move from Facebook, which was previously pushing its own Instant Articles format as the only fast-loading article format allowed on the network. Instead of forcing publishers to adopt Instant Articles, Facebook is now giving them more freedom.

Having to create multiple versions of the same piece of content to share in different places is a challenge that Facebook is hoping to address with this update. This can also be seen as a way to attract publishers back to Instant Articles.

As publishers have begun to abandon Facebook’s Instant Articles format due to a lack of monetization options, this could be the company’s way of winning them back. Whether that will work remains to be seen, but introducing a solution for creating three article formats at once sounds promising.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

May 25, 2017 at 12:21PM

Google Issues a Warning About Guest Posting to Build Links by @MattGSouthern

5/25/2017


 
http://ift.tt/2qlrDWi

Google Issues a Warning About Guest Posting to Build Links by @MattGSouthern

http://ift.tt/2rW69R1


Google has issued a warning to remind site owners about the dangers of publishing content on other sites for the purpose of building inbound links.

The company doesn’t frown on guest posts or syndicated posts in general, but lately there has been an increase in spammy links stuffed into these types of posts. That’s the reason behind this sudden warning from Google.

Distributing content on a large scale when the main intention is to build links back to your own site is strictly prohibited under Google’s guidelines on link schemes.

What Google does allow are guest posts and syndicated posts which “inform users, educate another site’s audience or bring awareness to your cause or company.”

Google goes on to explain other article writing and distribution practices that are against its guidelines.

  • Stuffing keyword-rich links to your site in your articles.
  • Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites.
  • Using or hiring article writers that aren’t knowledgeable about the topics they’re writing on.
  • Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel=”canonical”, in addition to rel=”nofollow”, is advised).

This probably goes without saying, but Google reminds site owners that being caught publishing articles with spammy links could affect the perceived quality of a site and, in turn, its search rankings. Site owners should be vigilant in vetting guest posts, and nofollow any links that appear questionable.
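In markup terms, Google's advice boils down to two attributes mentioned above: rel="canonical" for syndicated copies and rel="nofollow" for links you can't vouch for. A minimal sketch of how that might look on a syndicated guest post (all URLs here are placeholders, not from the article):

```html
<!-- In the <head> of a syndicated copy: point search engines at the original -->
<link rel="canonical" href="https://example.com/original-article">

<!-- In the article body: nofollow any author link you can't fully vouch for -->
<p>
  This post was contributed by
  <a href="https://example.org/guest-author" rel="nofollow">Guest Author</a>.
</p>
```

Used together, the canonical tag consolidates duplicate content back to the source, and the nofollow attribute tells Google not to pass link equity through the questionable link.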

Google will also take action against websites that create content in violation of its guidelines. The company notes that site owners being harassed about publishing such content can submit a complaint via Google’s spam report form.





SEO

via Search Engine Journal http://ift.tt/1QNKwvh

May 25, 2017 at 12:21PM

SearchCap: Google link warning, in-store sales & more SEM

5/25/2017


 
http://ift.tt/1WuzysF

SearchCap: Google link warning, in-store sales & more SEM

http://ift.tt/2rm5F9G

Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Google link warning, in-store sales & more SEM appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.




SEO

via Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/fN1KYC

May 25, 2017 at 08:00AM


All content copyrighted (C) 2010 ~ 2020
All Photos & Content Used Under Creative Commons
www.RickRea.com 701-200-7831
Privacy Policy