Leaning into SEO as Google shifts from search engine to portal
Google’s SERP is almost unrecognizable compared to what it looked like just a few years ago. The changes aren’t just on the surface, either: Google is becoming less search engine, more portal, said Jessica Bowman, CEO of SEO In-house and Search Engine Land editor at large, during her keynote at SMX Advanced this month.
This evolution is fundamentally altering the customer journey from search, with Google owning the process by enabling users to bypass clicks to websites to get information, take action and even transact. This will have repercussions for just about every company. Bowman offered several plans of action for SEOs preparing for these changes and said investments in SEO will be more important than ever.
Build and train your SEO army
“When I evaluate an organization, I find that every role has activities they do that affect SEO, and SEO needs to be integrated into those activities,” Bowman told Search Engine Land. “The SEO team has to figure out what those are and then train people to do that.”
Larger companies should incorporate SEO into their daily vernacular, said Bowman. This way, you can conscript dozens, if not hundreds, of staff members into your “SEO army,” get them advocating for it, quoting best practices, involving the dedicated SEO team and flagging missing requirements on a day-to-day basis.
Although non-SEOs aren’t expected to be authorities on the topic, their 20% of effort stands to make 80% of the impact on your brand’s overall optimization, Bowman said. It will be up to your main SEO team as well as upper management to empower them.
Expand writing competencies
Product information, news stories, how-to guides and various other types of content may receive higher visibility on SERPs if they appear as a knowledge panel, within a carousel or as a featured snippet. Your writers, be they bloggers, copywriters, social media managers or anything in between, need to be creating content that is comprehensive and authoritative enough to compete for organic visibility, said Bowman.
Understanding and correctly implementing schema on your site can help crawlers make sense of your content and, consequently, increase the odds that it gets displayed as a featured snippet. Featured snippets and other rich results, of course, illustrate the double-edged sword nature of Google’s portal-like interface: They increase your content’s visibility and yet users may not click through to your site because the information they need has already been presented to them.
Event, FAQ, speakable content and much more — Google now supports dozens of markups for various content types, making schema a valuable tool for modern SEO. If your site runs on the WordPress CMS, Yoast has revamped its schema implementation to streamline structured data entry, but it’s still important for your development team to be able to verify the quality of your code.
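As a concrete sketch of what such markup looks like, here is a minimal FAQPage structured-data object built in Python using the schema.org vocabulary. The question and answer text are made-up placeholders, and real markup should always be validated with Google’s Rich Results Test before deploying:

```python
import json

# A minimal FAQPage structured-data sketch using the schema.org
# vocabulary. The question/answer text below is placeholder content.
faq_jsonld = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is structured data?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "Machine-readable markup that helps crawlers interpret page content.",
            },
        }
    ],
}

# Emit the JSON-LD that would go inside a
# <script type="application/ld+json"> tag in the page <head>.
print(json.dumps(faq_jsonld, indent=2))
```

The output is what your development team would embed and then verify against Google’s structured data testing tools.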
“Particularly for large, global companies, they need to think about these smaller search engines that are less sophisticated than Google but still drive a decent amount of traffic in international markets,” Bowman emphasized.
Monitor and study mobile SERPs
“The problem is, a lot of us work on our computers, and so we’re checking things out on the desktop interface,” Bowman pointed out. Beginning on July 1, all new sites will be indexed using Google’s mobile-first indexing, with older sites getting monitored and evaluated for mobile-first indexing readiness. Since the majority of searches now happen on mobile, brands need to closely examine the mobile SERP and account for updates and changes in order to create content that’s optimized for the devices their audiences are using.
“I think the reason that we, as an industry, have not been talking about this is because of that — we’re not really studying the search results on a mobile interface to truly see they’re [Google] taking it over, and as mobile takes over, they’re going to gobble up some of our traffic. I think once they’ve got it [the mobile SERP] mastered and they know it’s a strong user experience, it’s only a matter of time before they do that to desktop as well.”
Take advantage of big data
“Hiring a data scientist is better than hiring an SEO to study the data,” Bowman stated simply. Data scientists are better equipped to identify commonalities and trends that you can use to improve your optimization efforts, inform your content strategy and enhance user experience (UX).
During her keynote, Bowman also recommended that brands make use of the Google Chrome User Experience Report to compare site speed to the competition as well as reference UX metrics from popular destinations across the web. You can then be more proactive.
Google’s search results interface has changed dramatically, but brands and agencies that can shake the inertia, rally their staffs and reorient their processes will be the first to spot new opportunities and novel ways to reach their audiences.
About The Author
George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.
via Search Engine Land https://selnd.com/1BDlNnc
June 18, 2019 at 02:04PM
Google to add attribution to licensed lyrics providers
Earlier this week, Genius, a song lyrics web site, accused Google of stealing its lyrics without a proper licensing agreement. Google responded to those accusations in a blog post Tuesday, saying, again, that it “licenses the lyrics text from third parties” and does “not crawl or scrape websites to source these lyrics.” However, it is going to begin attributing licensed content to its partners.
Why we should care. Google said it will “soon include attribution to the third party providing the digital lyrics text.” This is something Google does not often do when it comes to content it licenses to show in search results. It does show the source of information for featured snippets and other forms of content but typically not for licensed content. Now users and site owners will know for certain where licensed content was sourced from.
Payment for content. Because “music publishers often don’t have digital copies of the lyrics text,” Google said, “In these cases, we—like music streaming services and other companies—license the lyrics text from third parties.”
Google said it licenses this content to “ensure that the songwriters are paid for their creative work.” Google wrote, “To do that, we pay music publishers for the right to display lyrics, since they manage the rights to these lyrics on behalf of the songwriters.”
LyricFind. LyricFind is a Google licensing partner, and may be the source of the Genius content appearing in Google’s search results. LyricFind published an explanation on its web site Monday, saying, “Some time ago, Ben Gross from Genius notified LyricFind that they believed they were seeing Genius lyrics in LyricFind’s database. As a courtesy to Genius, our content team was instructed not to consult Genius as a source. Recently, Genius raised the issue again and provided a few examples. All of those examples were also available on many other lyric sites and services, raising the possibility that our team unknowingly sourced Genius lyrics from another location. As a result, LyricFind offered to remove any lyrics Genius felt had originated from them, even though we did not source them from Genius’ site. Genius declined to respond to that offer. Despite that, our team is currently investigating the content in our database and removing any lyrics that seem to have originated from Genius.”
via Search Engine Land https://selnd.com/1BDlNnc
June 18, 2019 at 01:43PM
Why search needs to be combined with awareness for maximum impact
No marketer today should be relying on a single channel to drive conversions. That’s obvious; what’s less obvious is figuring out what’s actually driving conversions. The market is awash in data but many marketers are no closer to a precise understanding of ROI than at any time in the past.
As the consumer journey becomes ever more circuitous and complex, marketers need to think and act holistically in their media planning and ROI/ROAS analysis. How are the multiple channels in a campaign interacting, and how might they also reflect the influence of other ad exposures on the path to online or offline transactions?
Branded queries reflect other media influence. Search is a great example. It’s both a starting point for research at the top of the funnel and a tool that reflects the impact of other media on brand or product awareness. This is especially true in the case of branded or product-specific queries, a point that Yahoo actually made in 2005 before the launch of Google Trends.
Specifically, Yahoo argued back then that search can help measure the effectiveness of TV and online display campaigns. Marketers have also been including some version of “Google XYZ” as a call-to-action in TV ads, out-of-home and other media, off and on for years.
Digiday now reports that direct-to-consumer brands are increasingly looking to search traffic to evaluate the performance of their TV campaigns. Although this isn’t novel, it’s smart to combine these channels: TV to develop awareness and prompt branded queries for lower-cost and more efficient traffic acquisition and, potentially, conversions. Search queries and volumes then reveal the efficacy of the TV campaigns.
What’s behind ‘direct traffic.’ According to Episerver’s new B2C Dot-Com Report, based on 1.3 billion shopping sessions “across 159 unique retail and consumer brand websites,” 48% of website visits come from direct traffic (“a person who types in the retailer’s website directly into their browser”). Direct traffic is one of the most efficient and highest converting channels. But what’s behind it?
Direct traffic is often, if not mostly, a function of some other stimulus or awareness mechanism. Indeed, Episerver asserts that direct traffic is “the culmination of a brand’s marketing efforts” as much as the result of any single channel or campaign.
Search, both organic and paid, is the next most common traffic referrer and accounts for a 37% share according to Episerver. The report doesn’t break down branded vs. non-branded keywords. I suspect, however, that a meaningful percentage of search-referral traffic identified in the report is ultimately branded.
Episerver says that, overall, “paid and organic search are the highest performing traffic sources,” taking a variety of factors into consideration. Episerver advises that “retailers and brands should ensure they’re seeing similar strength in paid and organic search traffic and if not, optimize accordingly.” Beyond paid and SEO best practices, “optimizing” search traffic may require driving awareness through other channels such as display, social and video.
Marketers need to understand ‘the big picture.’ Chamber.Media’s Bryant Garvin, during an SMX Advanced presentation on ROAS and attribution this month, discussed the need to step back and look at the full customer journey to understand the role of multiple channels in driving conversions. He argued that marketers should look at — and question — the data, not just rely on Google Analytics or Facebook pixel data. Often these tools are “too myopic,” he argued.
Data from any single channel may mislead or tell only part of the story. Marketers need to look at the “big picture,” he said.
via Search Engine Land https://selnd.com/1BDlNnc
June 18, 2019 at 01:27PM
SMX Overtime: When to use automation to improve account performance
PPC expert Duane Brown of Take Some Risk was scheduled to present in “Automation – The Next Generation” at SMX Advanced, but he was ill and, with much regret, missed the session. He was happy, however, to answer some questions from attendees as well as share a helpful resource for Google Ads scripts.
How long should it take for a third-party platform to start improving your account performance? When should we cut it?
This is no different than running an A/B test on your ad account. You need to know what success looks like before you go into the process of picking a third-party tool. You also need to understand why you are using the tool (is it to lower your CPA or to grow revenue?). If you don’t know what success looks like, then you cannot determine whether you should cut a tool. To start, I would test a new tool. The first month shouldn’t bring a huge change, but as we get into months two and three, if I don’t see a performance boost in our account, I will start to ask why. Odds are it is time to cut. Any tool should ideally save you time and money while also growing your revenue.
If you work in an environment where frequent testing is the norm, how do you get over the fluctuations you’ll see due to the learning period of an automated bidding strategy and have stable performance quicker?
You could test less. Sometimes running six tests in a month is better than trying to run ten tests. The other option would be to figure out how you get more data. Most bidding strategies do better with more data to help Google understand what is going on. With any testing, you have to deal with fluctuations and accept that as part of doing business. Even when something fails, you would have learned something from the experiment.
What are your thoughts on smart shopping campaigns? And how to make it perform better?
Smart shopping is great when you have tons of data to back the algorithm. The challenge we face is that you grow revenue and maybe lower your CPA, but then plateau in performance. Smart shopping is not always worth the limits it puts on your performance and growth potential. Also, the black box of data and ads showing outside your target region do not always make it the best option for an e-commerce brand.
It’s the end of the quarter and my boss says: “We need to increase conversions during this last week.” Now that I’m using Target CPA, I find I can’t as quickly ramp conversions. Any suggestions for getting faster results when using machine learning bid strategies?
If you are asking this question, does that mean it happens a lot? If so, it sounds like a communication and education issue. You are not always going to be able to ramp up performance as you see fit. Your boss should know they can’t just come to you in the last week and ask you to make it rain for them, especially if another area of the business is hurting performance-wise. I would find out sooner than the last week of the month whether we need to see higher performance.
How do you know the right third-party automation tool to pick? Are there any studies doing a side-by-side of the top services?
Most studies are either going to be biased and/or take into account nuances that may not apply to you when it comes to using any tool or even picking a rule or scripts. I always start with why am I doing this job and what am I hoping to get out of it. Then I look for tools that can help do the job and maintain the quality of work that we want. It’s also ok to only automate 70% of your work and still have a few manual steps. Automation does not need to be 100%.
What should we consider when determining the right attribution model? DDA does not provide any data.
The right attribution model should move your business in the right direction. There is no single right attribution model; it’s a test-and-learn approach as you come to understand how your attribution model, and using that data, affects your business. This is not a set-it-and-forget-it process, either.
Need some help with Google Ad scripts? Here’s a good resource.
Here is a helpful resource to check out – a library of references for 142 unique scripts for Google Ads.
About The Author
Wendy Almeida is Third Door Media's Community Editor, working with contributors for Search Engine Land, Marketing Land and MarTech Today. She has held content management roles in a range of organizations from daily newspapers and magazines to global nonprofits.
via Search Engine Land https://selnd.com/1BDlNnc
June 18, 2019 at 12:50PM
Google Search Console drops preferred domain setting
Preferred domain setting. The preferred domain setting is an old Google Search Console feature that has been part of the toolset since it was named Google Webmaster Tools. It let you tell Google which domain you would like used to index your site’s pages, also referred to as the canonical domain.
What it looked like. Here is a screen shot of that feature in the old version of Google Search Console:
No longer needed. Google is removing this feature because Google believes it is no longer needed. Google said it is able to pick the preferred domain for you based on various signals it is picking up.
Current setting won’t be respected. Google said with this feature going away, Google will not look at the current configuration and setting. Google said “that with the deprecation we will no longer use any existing Search Console preferred domain configuration.”
What do I do now? You can now communicate to Google your preferred domain through good site architecture. Google said you can use these four methods or read this help document to help Google determine your canonical domain.
(1) Use rel=”canonical” link tag on HTML pages
(2) Use rel=”canonical” in the HTTP header
(3) Use a sitemap
(4) Use 301 redirects for retired URLs
Why it matters. If you are currently depending on Google to select your preferred domain through this setting, you can no longer depend on that. You will need to ensure that your canonical URL listed in Google has not changed with this announcement. Google did not say if it will communicate in any way to webmasters, publishers or developers through Search Console if this resulted in your site switching its canonical URL in Google’s index. Instead, you need to audit your Google results to ensure no changes were made.
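One simple starting point for such an audit is checking which canonical URL each page actually declares. The sketch below uses only the Python standard library, and the embedded HTML is a stand-in for a real fetched page:

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects href values of <link rel="canonical"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))


# Placeholder page HTML; in practice you would fetch each URL you audit.
html = """<html><head>
<link rel="canonical" href="https://www.example.com/page/">
</head><body>...</body></html>"""

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonicals)  # prints ['https://www.example.com/page/']
```

Comparing the declared canonical against the URL Google actually shows in its results is one way to spot an unintended switch.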
via Search Engine Land https://selnd.com/1BDlNnc
June 18, 2019 at 09:35AM
Bye Bye Preferred Domain setting
As we progress with the migration to the new Search Console experience, we will be saying farewell to one of our settings: preferred domain.
It's common for a website to have the same content on multiple URLs. For example, it might have the same content on http://example.com/ as on http://bit.ly/1sPJHUz. To make things easier, when our systems recognize that, we'll pick one URL as the "canonical" for Search. You can still tell us your preference in multiple ways if there's something specific you want us to pick (see paragraph below). But if you don't have a preference, we'll choose the best option we find. Note that with the deprecation we will no longer use any existing Search Console preferred domain configuration.
You can find detailed explanations on how to tell us your preference in the Consolidate duplicate URLs help center article. Here are some of the options available to you:
Posted by Daniel Waisberg, Search Advocate
via Google Webmaster Central Blog http://bit.ly/1Ul0du6
June 18, 2019 at 09:10AM
SEO Tactics: Black Hats, White Hats, Gray Hats & ‘Asshats’ via @schachin
If you have been in the SEO industry for any length of time, you know there is a definitive line drawn in the industry sand.
On one side are black hats.
On the other? White (or gray) hats.
Nothing can draw a group of SEO professionals into a debate faster than mentioning to the other side that your way is better.
But when we talk about the multiple “hats of SEO,” just what are we talking about?
Hats. Hats. All the Hats.
First, let’s make sure we’re all on the same page: the types of “hats” in SEO are neither good nor bad.
Hats simply represent philosophies used by SEO professionals and how closely the tactics they use follow Google’s Webmaster Guidelines.
You are free to choose which hat you think works best.
Well except for “asshats,” but we’ll get to that later.
The color of an SEO “hat” is just an indicator of what tactics you use to rank sites. Nothing more.
White Hat SEO
So, what is “white hat” SEO?
Do you 100% follow Google’s Webmaster Guidelines, playing only within their rules to get your site to the top spot in Google?
Then you practice white hat SEO. You wear a “white hat.”
“White hats do everything Google tells them to and play within Google’s rules (the Webmaster Guidelines). They are simple unimaginative people who can never rank as well as a Black Hat” – or at least that is what a black hatter might tell you.
However, this is not true.
White hats just love to play inside the rules of the algorithms.
They want to be far from the long arm of Google’s penalty team and their manual actions.
White hats love the challenge of playing within Google’s Webmaster Guidelines and getting big wins.
And, as any white hat can tell you, when you get those wins, those wins stick around – even if you stop doing the work.
Now, getting those wins usually takes longer for a white hat than it will for a black hat, but once there, the white hat SEO reaps the rewards of their work: the gains aren’t usually going anywhere any time soon.
That is not necessarily true for black hat SEO methods.
White hat SEO is not going to be penalized by Google. Once rewarded by an update, those gains are not likely to go away unless changes are made that affect those ranking factors.
Now, this does not mean it is better. Again, white hat is simply an SEO philosophy.
But, if done well, white hat SEO is at little risk of getting “hit” by a Google update or a manual action. This makes it a much safer philosophy in general.
Is anyone really always pure white hat? Are there SEO professionals who follow Google’s rules 100% of the time?
In reality, virtually no one is really a white hat.
All effective “white hats” typically wear a hat that is some shade of gray, some a little lighter or darker than others. These are known as “gray hats.”
Gray Hat SEO
So, what is “gray hat” SEO?
Let’s be real. Nobody who is successfully ranking their sites is a pure white hat, according to Google’s rules.
If you were, it would mean you did nothing to attract links and simply waited for them to arrive naturally, because Google’s rules say so.
The issue is, if SEO pros followed that suggestion, nothing we did would rank competitively for years, because that is how long it would take to accumulate enough links to rank properly, and few companies have that kind of time to wait for rankings and traffic.
So to get rankings, an SEO professional will have to help the site content acquire links.
The minute you do something to attract or acquire links, you are no longer a white hat.
So, most people in SEO who say they are “white hat” are really playing in shades of gray.
Now, being gray hat doesn’t mean the person is a “black hat,” either.
Generally speaking, a gray hat won’t buy links from link sellers like a black hat might, but they will try to amplify their content specifically to get links, or they will find placements on legitimate sites willing to place their links.
Once an SEO has crossed this line, they are no longer purely white hat.
Yet, except for link acquisition, for the most part gray hat SEO pros do follow the rules Google has laid out. This is not the same for someone who uses black hat tactics.
Black Hat SEO
So, what is a “Black Hat”?
Do you look for ways around Google’s guidelines to quickly position your sites?
Do you buy links?
Do you look for ways to get the algorithms to boost your site, outside of Google’s guidelines?
Then you’re using “black hat” SEO tactics. You wear a black hat.
“Black hats are unethical. Black hats manipulate the search results and sites have to fight for placements against people that are just breaking the rules. Black hats aren’t SEO professionals, they just create SPAM.”
Well, at least that is how you will hear some white hat SEO folks describe the black hats.
But is this fair?
No, it isn’t.
Do you know what black hats will tell you about that “SPAM”? It is just a “Site Positioned Above Mine” and you are jealous.
They might be right.
Black hats are SEO practitioners, too. Black hat is just a tactic they use that falls outside of Google’s Webmaster Guidelines in order to rank sites.
This is not without controversy, though.
Why is it controversial?
Because black hat techniques go against Google’s Webmaster Guidelines. They can get a site hit with a Google “penalty”, which means that site might fall off an algorithmic cliff – literally overnight.
So unlike white hat SEO, there is an ethics component to black hat SEO.
Most people of all colored hats agree that you must tell your client about all the risks of using these strategies.
Meaning, black hat is a perfectly legitimate strategy – as long as the client knows what could happen when implemented.
Another reason for the controversial nature of black hat SEO?
Some mistake black hats for hackers. While, by definition, a black hat is breaking Google’s rules, they are not breaking any laws – unlike many hackers.
Black hats just use a different set of SEO tactics to get their sites ranked. Just ones that Google doesn’t like. Not ones that have men in black showing up at your door for a chat.
There are people who use SEO and hacking to promote their sites or attack competitors, but that is not in the scope of this discussion and not a strictly black hat SEO technique.
Although people who use black hat SEO techniques are often criticized and looked down on, due to the risks involved, there are times when a black hat approach might even be preferred to a white hat strategy.
Let’s say a client needs to rank a site for the Thanksgiving and Christmas holidays, but they don’t come to you until a few weeks before.
As a white hat, you would have to tell that client that you are sorry, but you cannot get a site to rank that quickly.
However, the black hat will tell you – sure! No problem. I will have you there before Thanksgiving!
So the black hat will build a microsite and get it ranking to help the company grab some of that holiday cash.
Just like white hats, black hats have their place in the SEO ecosystem as well.
“Asshat” SEO is the real issue.
So what is an “asshat”?
Asshat SEO generally falls into two categories.
Black hat is risky. Risking other people’s livelihoods on it, even when informed, is where lines usually need to be drawn.
If you are using these tactics ethically, you must tell the client that you are using strategies that could get them a manual action, and one that could take months to recover from if the site was penalized.
Another ethical question: if you are a Black Hat using high-risk tactics on a client’s bread-and-butter site, is that site covering payroll for the company’s employees?
If it is, then you have to ask yourself whether it is okay to put them in a risky position, even if they are informed.
Many would say, no.
It is not okay to risk a client’s bread and butter site for rankings. If people’s jobs are dependent on that site, then that site should not be used for Black Hat SEO.
Even the best Black Hat can make a mistake or get caught by an unexpected update or manual action. So Black Hat techniques should be reserved for microsites or other web platforms not dependent on the main cash register site.
Same goes for someone who claims to do White Hat SEO just for the money without being competent in SEO. You can damage a site just as badly by doing White Hat SEO poorly as you can by doing Black Hat.
So what is an Asshat SEO?
If you are not good at SEO, whether black or white hat, yet do it for the money; if you are a Black Hat and do not inform the client of the risks; or if you endanger your client’s main cash register site even when they know the risks – you’re what might be called an Asshat SEO.
Disregarding the potential damage to the people on the other end of the site isn’t being a risk taker; it is being careless with others’ livelihoods, and that can never be okay.
Never Be This SEO
No one should ever lose their job because the SEO jeopardized the client site because, in the end, we are there to protect the company, not hurt it.
We can stand between a client hiring people and firing people. Though sometimes it is out of our control, SEO professionals should never be the reason for the latter, if they can help it.
In the end, wear whatever color SEO hat you want. Black, white, gray – it doesn’t matter.
Just don’t be an asshat SEO, where someone can’t pay their rent tomorrow because of something you did yesterday.
via Search Engine Journal http://bit.ly/1QNKwvh
June 18, 2019 at 08:58AM
12 EASY Signs You Need to Rethink Your Paid Search Strategy via @adamproehl
Running paid search campaigns, but wondering if you have the right strategy in place?
Here are a dozen signs to look out for that may indicate the need to rethink your current approach.
1. You’re Constantly Underspending Your Budget
You have a budget of $5,000 per month to spend, but on a consistent basis, you aren’t coming anywhere close to that.
I’m not talking about a one-month seasonal aberration. There could be multiple reasons you’re underspending, including:
2. You’re Down to Buying Branded Terms Only Because You Don’t See Conversions From Any Other Source
Your paid search campaign is so limited. You or your agency has culled your keyword list down to just branded terms because everything else just seems like a waste of money.
Either every non-branded (but relevant) term is way too expensive to justify the cost of a click, or it’s priced fairly but never gets clicks or conversions. If this is the case, here are a few potential root causes to examine:
And on the flip side to this extreme…
3. You Refuse to Buy Branded Terms Because You ‘Already Rank for That Organically’ … ‘So Why Waste Our Money?’
This tired argument is bound to come up when you have an executive who is new to the world of search.
Some of the easiest answers to this question include:
4. You’re Expecting the Same Type of Conversion Results From All Keywords
If you have this expectation, you’re set up to fail from the beginning. It just doesn’t work this way.
The stage of the buying cycle plays a very heavy role in which keywords get searched on. For example, let’s look at “virtual meetings”:
5. You’re Using Cost per Click as Your Sole Determining Factor in Whether You Bid on a Term or Not
It can certainly be a factor, but not the only factor. Remember this is an auction. At auctions, the price is determined by what a bidder is willing to pay.
Logically, if someone is willing to pay a higher price, that bidder deems the cost to be worth it.
I’m not saying you should automatically be willing to pay what you feel is an outrageous price, but you owe it to yourself to dig deeper and ask a few questions:
6. You’re Only Measuring the Success of Your Campaign on Overall Return on Ad Spend (ROAS)
I understand why you would do this. It’s a simple trap to fall into:
Here’s why using only ROAS for a success measurement is a bad idea:
7. You’re Spending Your Entire Paid Media Budget on Text-Based PPC Ads – You See It as Your Only Paid Media Channel
Stop! Just Stop!
Don’t get me wrong. I love PPC. I’ve made a good living doing it, but if you’re relying on it as your only paid channel to drive revenue, you’re in trouble.
Remember that statement back in #1 about search being great for responding to demand but unable to help you build it?
That’s all you really need to know if you ever start thinking it’s a good idea to make PPC your only paid media channel. It’s not!
8. You’re Measuring Your Program’s Success on Impressions & Clicks
It’s fine to measure impressions and clicks to add context to a comprehensive analysis that includes more important success metrics like post-click activity and conversions.
If the report you get doesn’t go any deeper than impressions and clicks, what can you actually tell me about the success of your paid search program?
Answer: Nothing. You need additional context for both of those metrics; otherwise there is nothing actionable in that data.
9. You’re Sending Every Single Paid Search Ad to the Home Page
It’s fine to send some traffic to the home page if it’s relevant to the searcher. Otherwise, don’t do it.
In your ad copy, you’re making a promise to that searcher. Whatever it is your ad copy promises (specific info, third-party validation, discount, how-to, case study, etc.), your landing page needs to deliver. That’s asking a lot of a website’s home page.
In the Zoom meetings example above, the headline links to the home page (which, again, is fine if that page is relevant to the ad), but look at the four extensions below it.
They all point to relevant landing pages that keep the promise they make to the searcher:
10. You Aren’t Using Any Ad Extensions
Here’s how that Zoom ad above would look without the extensions:
The ad seen in #9 looks like it would generally be an effective one. Without the ad extensions, it becomes a lot less compelling, doesn’t it?
11. You Haven’t Touched Your Ad Creative in at Least a Year
For the record, I’m in favor of “what works”, not “what’s new.” It’s important to always keep that perspective.
As Responsive Search Ads become more widely adopted, the old ways of “one static creative” will be less and less of a standard practice.
However, Responsive Search Ads still require inputs. The combinations of outputs that result from a machine learning system assembling the best mix of headlines and copy are only as good as the inputs.
How we optimize ad creatives has dramatically evolved in recent years beyond simple A/B testing, but you still need to have the type of mindset where you’re always looking to find the best recipe for success.
That means challenging your biases and assumptions to take a chance that you can improve your campaign performance.
For more insights into Responsive Search Ads, check out SEJ’s post covering the original announcement.
12. You Don’t Know What You Should Be Paying for a Conversion
Calculating the Cost Per Conversion is a simple equation an average second grader could do: Total campaign spend divided by the total conversions tracked (back to the campaign in some form).
The hard part occurs away from the Ad Platform interface – calculating what you should be willing to pay for the conversion.
This is where the PPC manager will need help from stakeholders. Some of the things that need to be taken into account are:
The point is, understand what you SHOULD be willing to pay for a conversion within your market and you’ll have an easier time setting up your paid search campaign for success!
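That two-sided arithmetic can be sketched like this; the order value, margin, and close rate are hypothetical stakeholder inputs, not numbers from the article.

```python
# Platform side: what you ARE paying per conversion.
total_spend = 5_000.00
conversions = 125
actual_cpa = total_spend / conversions  # 40.00

# Stakeholder side: what you SHOULD be willing to pay.
# Hypothetical inputs your sales/finance team would supply.
avg_order_value = 300.00   # revenue per closed sale
gross_margin = 0.40        # portion of revenue that is profit
close_rate = 0.50          # share of tracked conversions that become sales

break_even_cpa = avg_order_value * gross_margin * close_rate  # ~60.00

print(f"paying ${actual_cpa:.2f}, can afford up to ${break_even_cpa:.2f}")
```

In this sketch the campaign has headroom to bid up; if actual_cpa were above break_even_cpa, the fix isn’t necessarily lower bids, since better close rates or order values also move the break-even number.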
Start With This
There’s more to running a successful paid search program than just the 12 items listed here, but this is a good place to start re-examining your strategy and will ultimately set you on a better path.
All screenshots taken by author, June 2019
via Search Engine Journal http://bit.ly/1QNKwvh
June 18, 2019 at 08:58AM
How to Manage & Maximize Visual Content Creation on a Large Scale via @lorenbaker
Producing content is a challenge for businesses of all sizes – but especially for large organizations.
With so many moving parts, enterprises must find a way to streamline the content management process while ensuring quality output.
So how can content teams achieve success and make their content stand out in a crowded marketplace?
On June 12, I moderated a sponsored SEJ ThinkTank webinar presented by Lauren Klein, Senior Account Executive, Enterprise at ScribbleLive, and Makayla Millington, Campaign Development Specialist at Autodesk.
The presenters shared how to overcome the challenges of managing and maximizing the visual content creation process on a large scale.
Here’s a recap of the webinar presentation.
Visual content is essential to any marketing organization. According to the latest stats:
Marketing organizations need to focus an equal amount of effort on the visual components of their content.
In addition to creating white papers, blog posts, and research reports, you should also consider:
The Challenges of Creating Content on a Large Scale
However, creating content on a large scale comes with a unique set of challenges:
To solve these pain points, you should aim to streamline as much of your content creation as possible, and ensure that your team is on the same page.
Lack of Time
According to a CMI report, 51% of marketers cite lack of time and bandwidth as a major hurdle.
You may solve this pain point by:
Example: How Autodesk Saved Time & Simplified Workflow
Autodesk also encountered this pain point but found a way to save time and simplify their workflow by partnering with Visually, which is part of the ScribbleLive Content Cloud.
By using the platform, Autodesk was able to:
Lack of Resources
Demand Metric’s 2018 Benchmark Study on Content Experience Impact and the Buyer’s Journey reveals that 48% of marketers view staffing constraints as their biggest barrier to content marketing.
Some of the solutions to address this pain point include:
Example: How Autodesk Maximized Their Resources
Through Visually, Autodesk managed to maximize its resources when taking on content creation projects. The partnership enabled them to:
Lack of Communication
Salesforce reports that 86% of employees and executives cite lack of collaboration or ineffective communication as the cause of workplace failures.
Companies can overcome this challenge by:
Example: How Autodesk Improved Team Communication
With the help of Visually, Autodesk improved its overall team communication.
Autodesk’s Campaign Results
Through a well-managed, large-scale content creation process on Visually, Autodesk produced various content marketing collaterals such as:
[Video Recap] How to Manage and Maximize Content Creation on a Large Scale
Watch the video recap of the webinar presentation and Q&A session below.
Or check out the SlideShare below.
Join Us for Our Next Webinar!
Join our next live webinar on Wednesday, June 26 at 2 p.m. ET as Directive’s Garret Mehrguth shares actionable tips on how to “pay for SEO” in order to drive qualified leads.
What Type of Links Does Google Really Prefer?
We all know that links help rankings. And the more links you build, the higher you’ll rank.
But does it really work that way?
Well, the short answer is links do help with rankings and I have the data to prove it.
But, you already know that.
The real question is what kind of links do you need to boost your rankings?
Is it rich anchor text links? Is it sitewide links? Or what happens when the same site links to you multiple times? Or when a site links to you and then decides to remove the link?
Well, I decided to test all of this out and then some.
Over the last 10 months, I ran an experiment with your help. We all know link building isn’t easy, and the experiment ended up taking 6 months longer than planned.
Roughly 10 months ago, I emailed a portion of my list and asked if they wanted to participate in a link building experiment.
The response was overwhelming… 3,919 people responded, but of course, it would be a bit too hard to build links to 3,919 sites.
And when I say build, I’m talking about manual outreach, leveraging relationships… in essence, doing hard work that wouldn’t break Google’s guidelines.
Now out of the 3,919 people who responded, we created a set of requirements to help us narrow down the number of sites to something more manageable:
We decided to cap the experiment at 200 sites. But eventually, many of the sites dropped off due to busy schedules, or because they didn’t want to put in the work required. And as people dropped off, we replaced them with other sites that wanted to participate.
How the experiment worked
Similar to the on-page SEO experiment that we ran, we had people write content between 1,800 and 2,000 words.
Other than that, we didn’t set any requirements. We just wanted a minimum length so that people would naturally include keywords within their content. We did, however, set a maximum length because we didn’t want people writing 10,000-word blog posts, which would have skewed the data.
Websites had 2 weeks to publish their content. And after 30 days of it being live, we looked up the URLs within Ubersuggest to see how many keywords the article ranked for in the top 100, top 50 and top 10 spots.
Keep in mind that Ubersuggest has 1,459,103,429 keywords in its database from all around the world and in different languages. Most of the keywords have low search volume, such as 10 searches a month.
We then spent 3 months building links and then waited 2 months after the links were built to see what happened to the rankings.
The URLs were then entered back into the Ubersuggest database to see how many keywords they ranked for.
In addition to that, we performed this experiment in batches; we just didn’t have the manpower and time to do this for 200 sites all at once, which is why it took roughly 10 months to complete.
We broke the sites down into 10 different groups. That’s 20 sites per group. Each group only leveraged 1 link tactic as we wanted to see how it impacted rankings.
Here’s each group:
Now before I share what we learned, keep in mind that we didn’t build the links to the domain’s homepage. We built the links to the article that was published. That way we could track to see if the links helped.
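The batching and grouping described above can be sketched roughly like this; the site list is a placeholder and the tactic labels are inferred from the sections that follow, not the experiment’s actual roster.

```python
import random

# Placeholder pool standing in for the 200 participating sites.
sites = [f"site-{i}.example" for i in range(200)]

# One tactic per group; labels inferred from the article's sections.
tactics = [
    "control_no_links", "rich_anchor_irrelevant_site", "sitewide_footer",
    "in_content_relevant", "multiple_links_same_site", "one_relevant_link",
    "sidebar", "nofollow", "high_authority", "built_then_removed",
]

random.seed(42)        # reproducible assignment
random.shuffle(sites)

# 10 groups of 20 sites each, one link tactic per group.
groups = {t: sites[i * 20:(i + 1) * 20] for i, t in enumerate(tactics)}
```

Random assignment like this is what lets each group’s ranking changes be compared against the control group rather than against each other’s site quirks.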
Do you really need links to rank your content? Especially if your site has a low domain score?
Based on the chart, the older your content gets, the higher you rank. And based on the data, even if you don’t do much, over a period of 6 months you can rank for roughly 5 times more keywords, even without link building.
As they say, SEO is a long game and the data shows it… especially if you don’t build any links.
They say anchor text links really help boost rankings. That makes sense because the link text contains a keyword.
But what if the rich anchor text link comes from an irrelevant site? Does that still help boost rankings?
It looks like anchor text plays a huge part in Google’s rankings, even if the linking site isn’t too relevant to your article.
Now, I am not saying you should build spammy links and shove keywords into the link text; rather, it’s worth keeping in mind that anchor text matters.
So if you haven’t already, go put your domain in here to see who links to you. Then look for all of the non-rich anchor text links and email each of those site owners.
Ask them if they will adjust the link and switch it to something that contains a keyword.
This strategy is much more effective when you ask people to switch backlinks that use your brand name as the anchor text to something more keyword-rich.
They say sitewide links are spammy… especially if they are shoved in the footer of a site.
We built one sitewide footer link to each article to test this out.
Although sites that leverage sitewide links showed more of an increase than the control group, the results weren’t amazing, especially for page 1 rankings.
Do relevance and the placement of the links impact rankings? We built 3 in-content links that were relevant to each article.
Now the links were not rich in anchor text.
Compared to the baseline, rankings moved up at a similar rate to the sites that built rich anchor text links from irrelevant sites.
Multiple site links
I always hear SEOs telling me that if you build multiple links from the same site, it doesn’t do anything. They say that Google only counts one link.
For that reason, I thought we would put this to the test.
We built 3 links to each article, but we did something a bit different compared to the other groups. Each link came from the same site, although we did leverage 3 different web pages.
For example, if 3 different editors from Forbes link to your article from different web pages on Forbes, in theory, you have picked up 3 links from the same site.
Even if the same site links to you multiple times, it can help boost your rankings.
Is more really better? How does one relevant link compare to 3 irrelevant links?
It’s not as effective as building multiple links. Sure, it is better than building no links, but the articles that built 3 relevant backlinks instead of 1 had roughly 75% more keyword placements in the top 100 positions of Google.
So if you have a choice when it comes to link building, more is better.
Similar to how we tested footer links, I was curious to see how much placement of a link impacts rankings.
We looked at in-content links, footer links, and now sidebar links.
Shockingly, they have a significant impact on rankings. In order of effectiveness for placement, in-content links help the most, then sidebar links, then sitewide footer links.
I wish I had tested building 3 sitewide footer links to each article instead of 1, as that would have given me a more accurate conclusion about which placements Google prefers.
Maybe I will be able to run that next time.
Do nofollow links help with rankings?
Is Google pulling our leg when they say they ignore them?
From what it looks like, Google tends not to count nofollow links. Based on the chart above, you can see that rankings did improve over time, but so did almost every other chart, including the control group.
But here’s what’s funny: the control group had a bigger percentage gain in keyword rankings even though no links were built.
Now, I am not saying that nofollow links hurt your rankings; instead, I am saying they have no impact.
High authority link
Which one do you think is better:
Having one link from a high domain site (70 or higher)?
Having 3 links from sites with an average or low domain score?
Even though the link from the authority site wasn’t rich in anchor text, and we only built 1 per site in this group, it still had a bigger impact than the links built by the sites in the other group.
That means high authority links carry more weight than irrelevant links with rich anchor text, or even 3 links from sites with a low domain score.
If you are going to spend time link building, this is where your biggest ROI will be.
Build and removed links
This was the most interesting group, at least that is what the data showed.
I always felt that if you built links and got decent rankings you wouldn’t have to worry too much when you lost links.
After all, Google looks at user signals, right?
This one was shocking. At least for sites with a low domain score, if you gain a few links and then lose them fairly quickly, your rankings can tank below where they originally were.
I didn’t expect this one, and if I had to guess, maybe Google has something programmed into its algorithm so that if a site loses a large portion of its links quickly, it assumes people don’t find value in the site and that it shouldn’t rank.
Or that the site purchased links and then stopped purchasing the links…
Whatever it may be, you should consider tracking how many links you lose on a regular basis and focus on making sure the net number is increasing each month.
I wish I had put more people behind this experiment, as that would have enabled me to include more sites.
My overall sample size for each group is a bit too small, which could skew the data. But I do believe it is directionally accurate: building links from high domain score sites has the biggest impact.
Then shoot for rich anchor text links from relevant sites, placed within the content.
I wouldn’t make all of your link text keyword-rich; if you are using white hat link building practices, it naturally won’t be, and you won’t have to worry much about this.
But if you combine all of that, you should see a bigger impact on your rankings, especially if you are a new site.
So, what do you think about the data? Has it helped you figure out what types of links Google prefers?
via Neil Patel https://neilpatel.com
June 18, 2019 at 08:17AM