On Friday we reported on an issue where SEOs began noticing that Google was removing large numbers of pages from its index. It seems to have started on Thursday, but Google did not confirm the issue until Saturday morning. The issue is now into day four or so and is not yet fully resolved.
In short, some sites were seeing a sizable chunk of their web pages de-indexed, removed from the Google index, and thus not showing up in the Google search results. This can hurt big time for sites that depend on Google to send them traffic. That traffic may convert to leads, ad clicks, e-commerce checkouts, and other conversion metrics.
It is hard to say how big this was; I asked, but Google won't say. The issue does seem big, at least from within the SEO community.
Here is what we know so far.
Google thought they fixed the issue Saturday morning:
Sorry — We had a technical issue on our side for a while there — this should be resolved in the meantime, and the affected URLs reprocessed. It’s good to see that the Inspect URL tool is also useful for these kinds of cases though!
— 🍌 John 🍌 (@JohnMu) April 6, 2019
SEOs before this knew that they could use the URL Inspection tool to submit URLs back into the Google index. But for sites with thousands of pages removed from the Google index, this is not practical.
Google won’t explain what the technical issue is/was:
That’s unlikely :-). Our internal systems are pretty unique.
— 🍌 John 🍌 (@JohnMu) April 6, 2019
Then Sunday morning, after most SEOs were saying, "No, Google, it isn't fixed," Google came back to confirm once again that it isn't fixed. Google said it is still working on it, and even when it is done, don't expect all your URLs to be indexed, because Google doesn't index everything:
One thing to add here – we don’t index all URLs on the web, so even once it’s reprocessed here, it would be normal that not every URL on every site is indexed. Awesome sites with minimal duplication help us recognize the value of indexing more of your pages.
— 🍌 John 🍌 (@JohnMu) April 7, 2019
Google will fix it, and you don't need to do anything, but John said feel free to use the URL Inspection tool to expedite certain pages:
Yep! It looks like it’s still catching up, things will settle back like before automatically. People seem to have success with the submit-to-indexing tool, if there’s something specific you’re missing and don’t want to wait. (I know, nobody wants to wait :))
— 🍌 John 🍌 (@JohnMu) April 7, 2019
Google won’t say how much of their index was impacted by this and I have not seen estimates from outside sources yet:
The way search works, a single number isn’t that representative, nor useful for context. Eg, if we dropped all calendar pages from the years 2020+, that might be a ton of URLs, but it probably wouldn’t interest you much. So, I’m unsure we’d have a number to share in the end.
— 🍌 John 🍌 (@JohnMu) April 7, 2019
On Sunday, Google also shared an update on its Search Liaison account:
We’re aware of indexing issues that impacted some sites beginning on Friday. We believe the issues are mostly resolved and don’t require any special efforts on the part of site owners. We’ll provide another update when the issues are considered fully resolved.
— Google SearchLiaison (@searchliaison) April 7, 2019
As of right now, 8am eastern time in New York, we do not have any further updates from Google. The last we heard was from yesterday on this topic.
Forum discussion at Twitter, WebmasterWorld, Black Hat World & Reddit.
We have seen audience reviews in the Google search results panel for TV and movies, but now Google is expanding it to music. Or maybe Google has had it for music for some time and we just haven't seen examples of it until now. In fact, it doesn't come up for all queries, but you can trigger it for some.
Here is a screenshot I was able to replicate based on the example shared by Mordy Oberstein on Twitter – you can click to enlarge:
Again, I’ve never seen it for music results but that doesn’t mean it hasn’t been around.
Have you seen it before?
Forum discussion at Twitter.
Want to learn more about PPC and keep up on the constant changes and news?
Engage in the industry with paid search specialists who have a passion for everything in search and social that is pay-per-click advertising.
There are several amazing people who specialize in PPC and have become experts in paid search marketing for many years.
The majority have certifications, "top" list recognition, and awards, and have contributed to the development of Google's and Bing's ad platform offerings by providing technical feedback.
They share strategies, tactics, tips, tools, data, and so much more on social media and at conferences – as well as in articles, research, and blog posts!
You will definitely learn something new every day from this list of paid search experts. Follow them to get tips and stay on top of industry news.
Here were the basic criteria for the list research:
Are they currently doing PPC or contributing directly to the industry?
Do they present PPC at conferences, webinars, podcasts, etc.?
Do they share relevant PPC content or insights on social media?
Do they write useful content about PPC for books, ebooks, blogs, publications?
The main idea of this post is to help you find interesting people who have PPC knowledge and are willing to share what they know.
This is just one way Search Engine Journal is able to direct you to PPC professionals who can help you improve at your job and advance your career.
Rather than go in alphabetical order by last name, I decided to use the same tool used for the SEO list: this list randomizer to give everyone a fair shot at where they appear.
After checking out these experts, download and share our PPC guide.
Partner & Co-Founder of NordicClick Interactive, Adam has shared his knowledge speaking at more conferences than can be counted.
Adam is a stellar manager, but also hands-on in digging into the data, and helping smart people grow their business.
For Adam’s best tips, attend his conference sessions and read his articles on Search Engine Journal.
Read Adam on Search Engine Journal
Daniel shares his industry knowledge through speaking at industry conferences and his columns for search publications. He shares PPC news and conference takeaways on Twitter.
Follow @dangilbertppc on Twitter
Read Daniel on Search Engine Journal
Pauline Jakober is a speaker and the founder of Group Twenty Seven. As well as a regular contributor to Search Engine Journal, she regularly shares PPC tips and news on Twitter.
Follow @GrpTwentySeven on Twitter
Read Pauline on Search Engine Journal
As a former Google Ads evangelist and founder of Optmyzr, Frederick Vallaeys is an expert on scripts, reports, and automations. He is also a long-time industry international speaker and writer.
Follow @siliconvallaeys on Twitter
Read Frederick on Search Engine Journal
The premier Google Ads seminar leader and trainer, Brad Geddes has spoken at hundreds of conferences all over the world.
He is a book author and avid blogger, sharing his inside knowledge and predictions on PPC and Google Ads.
Follow @bgtheory on Twitter
Elizabeth Marsten is an industry speaker, writer, and book author.
She is currently the Senior Director of e-Commerce Growth Services and specializes in ecommerce PPC, product ads, feeds, and is a leading authority on the inner workings of Amazon ads.
Follow @ebkendo on Twitter
Read Elizabeth on Search Engine Journal
As a long time expert and current PPC Consultant, Amy shares her knowledge by writing for several search industry publications, including Search Engine Journal.
You’ll find the most recent paid media news on her Twitter feed along with her own weekly PPC roundup. You will find her on webinars and conference speaking engagements, including SMX.
Read Amy on Search Engine Journal
Frances has a passion for the search industry, which shows in her role promoting Bing Ads and the larger community.
She writes for the Bing Ads blog, runs #BingAdsConnect, #BingAdsWebinars, and #BingAdsNext. Along the way, she educates the industry about Bing and beyond.
As co-founder of Janes of Digital, she celebrates women who work in the search and digital space.
Christi is busy serving Bing Ads (and the search industry!) as an evangelist, keynote speaker, and published columnist on digital marketing, search and AI.
She shares insightful, actionable tips on Twitter and her articles on Search Engine Journal.
Read Christi on Search Engine Journal
Julie F. Bacchini
Julie shares her knowledge of PPC and digital marketing through speaking events and organizing topics and contributors for a popular Twitter chat, PPCchat.
Her unique experiences in web design and mentoring offer fresh perspectives to being successful in paid search.
Purna is a keynote speaker, writer, and one of the most influential PPC experts in the world. She travels globally to educate audiences on PPC, AI, machine learning, and voice search technologies.
Follow @purnavirji on Twitter
Read Purna on Search Engine Journal
Probably the most knowledgeable person on B2B paid search marketing on the planet, Melissa Mackey shares her insights via search industry speaking, blogging, and her Twitter feed and Twitter chats.
She has a reputation for growing ROI in both an in-house and agency setting.
Director of Midas Media, a digital agency in the UK, Ed Leake leads PPC ads analytics and shares his no-nonsense news and views on social media.
Creator of an ad testing and automation tool, he has great insight into this specific area of PPC.
Host and Client Services VP at 3Q Digital, Joe Kerschbaum, talks to some of the industry’s biggest names on a variety of topics in his PPC podcast.
Joe is a long-time speaker and author, making his podcast an interesting and informative listen for PPC geeks.
Listen to Joe Kerschbaum’s Podcast
A regular and popular industry speaker and avid blogger, Michelle Morgan regularly tweets with an emphasis on the nerdy nuts and bolts of search ads.
Don’t miss her Twitter feed and articles as you will find secret tips on PPC.
Follow @michellemsem on Twitter
Read Michelle on Search Engine Journal
As an industry writer and speaker, Aaron Levy frequently takes deep dives in PPC and recent trends using his keen eye for consumer psychology and buyer behavior and its impact on paid media.
Follow @bigalittlea on Twitter
Read Aaron on Search Engine Journal
Samantha is a keynote, conference speaker, and judge for several industry awards. She shares industry news and events on her Twitter feed.
As founder of Digital Females group in the UK, she brings together like-minded females in the digital marketing industry.
Digital marketer, speaker, and reporter of PPC trends. Author of Major Trends in Paid Search, a research report on paid media, including industry expert input and research.
Marc stays on top of the latest developments in AI and in Internet marketing. As co-founder and CEO at Acquisio, he has personally shared his knowledge with the industry and insights from his platform focused on helping SMBs thrive in the digital economy.
He is an award winner, writer, and international speaker at search engine marketing conferences.
Follow @marcpoirier on Twitter
Kirk is a regular conference speaker, avid tweeter, industry writer and owner at ZATO. He is actively involved in sharing his knowledge in detailed blogging and Twitter chats.
Follow @PPCKirk on Twitter
Read Kirk on Search Engine Journal
David Szetela is the host of the world’s longest running podcast on PPC, PPC Rockstars.
He is an author and contributor to PPC industry news, trends, and a regular conference speaker.
Follow @Szetela on Twitter
Listen to David’s Podcast
Robert specializes in SMB paid search and shares his insights through writing, speaking at conferences, and collaboration with industry practitioners.
He posts his real-time point-of-view and tips on his Twitter feed.
Read Robert on Search Engine Journal
John Lee is a speaker, writer, and paid search geek with a current focus on training for Bing Ads.
As a former entrepreneur, he is well-versed in numerous verticals and can rarely be stumped by any PPC question.
Follow @john_a_lee on Twitter
Read John on Search Engine Journal
If SEO is your thing, check out 140 of Today’s Top SEO Experts to Follow.
Featured Image: Paulo Bobita
What is the Loading Attribute?
HTML Elements and Attributes
HTML elements are the major components of a web page, like an image, a paragraph, or a link. That's analogous to what an engine, a tire, or a window is to a car.
Attributes add more meaning to, or modify, those elements. Continuing with the car analogy, an attribute could be comparable to the color of a fender, the size of an engine, or the air pressure specs of a tire.
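To make the distinction concrete, here is a simple sketch (the file name and values are placeholders, not from the original article):

```html
<!-- The img element is the component; src, alt, and width are
     attributes that add meaning to or modify it -->
<img src="red-car.jpg" alt="A red car parked outside" width="600">
```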
The new “loading” attribute will provide a signal to browsers that an image or iframe is not to be loaded until the user scrolls close to the image or iframe. That makes the web page appear to load faster for the user. This will be especially useful for users on mobile devices.
How Does Native LazyLoad Work?
The loading attribute is a simple attribute that can be added to the image or iframe elements. A web browser will not download an image or iframe that has the loading attribute until the user scrolls near to it.
Here is the example for lazy loading an image that was given in the informal announcement:
<img src="celebration.jpg" loading="lazy" alt="…" />
This is an example of the loading element in use for a video that is contained within an iframe:
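A minimal sketch of what such an embed might look like, assuming a hypothetical video URL:

```html
<!-- The iframe (and the video inside it) is not fetched until the
     user scrolls near it; the URL here is a placeholder -->
<iframe src="https://www.example.com/embed/video" loading="lazy"></iframe>
```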
It's easy to implement. Just add loading="lazy" to the code and you're done.
It will be even easier in WordPress should a plugin be created to add it to the image attachment screen. The option to add the loading attribute could be included as part of inserting an image.
When Will Chrome Feature the Loading Attribute?
A Google Chrome engineer informally announced that this feature may be arriving in Chrome 75. Chrome 75 is tentatively scheduled to be released on June 4, 2019.
Loading Attribute has Compatibility Issues
Search marketing expert Edward Lewis has been involved with web development and search marketing since around 1995. He’s also a web standards expert whose opinion I have great respect for. So I asked him for his thoughts on the loading attribute.
He pointed out that there are serious compatibility issues with the browser functions for printing and saving web pages.
“I work with a lot of HTML documents that are saved and/or printed. We would have to add logic to extend the functionality of the loading=”lazy” attribute so those Save/Print functions would work properly.”
Edward is right. The documentation on the loading attribute standard states:
“Compatibility with features that expect the whole page to be loaded: Chrome features such as “Print” and “Save Page As” currently expect all elements on the page to be loaded before printing or saving the page. One way to mitigate this issue would be to automatically load in any deferred elements on the page when “Print” or “Save Page As” are clicked, then wait for everything to load before continuing, but that could introduce user-noticeable delay which might require some UX changes with those features.”
On whether this is a good way to handle lazy loading, Edward offered:
Loading Attribute May Help Publishers
Anything that makes a web page download faster and improves the user experience is good for web publishers. It’s well known that a fast user experience correlates with more sales and conversions, including advertising revenues.
In Google’s recently released Mobile Speed Playbook, Google stated that a one second delay could negatively impact conversions on mobile devices by up to 20%.
According to Google, “…a one-second delay in mobile load times can impact conversion rates by up to 20%”
Loading Attribute Could Negatively Affect Ad Revenues
There is a potential for this to negatively affect publisher revenues. For example, if advertisers begin to use the loading attributes on their iframes, a publisher will not be paid for displaying the advertisement until the user scrolls to the advertisement.
Advertisers currently pay for advertising when the advertisement loads on the page, regardless of whether or not the user sees the ad.
Chrome for Android may also choose not to load an advertisement that is in an iframe.
The details are unclear at this time, but the official documentation states that on Android Chrome with the "Data Saver" feature turned on, images and iframes whose loading attribute is unset will be lazy loaded anyway, even though no loading attribute has been assigned to them.
Documentation Warns About Revenue Loss
The documentation linked to by the Google engineer warns that there may be a negative impact to publisher advertising revenue:
Compatibility risks: Counting impressions. Ad networks that currently record an impression every time the ad is loaded instead of every time the user actually sees the ad (e.g. using the visibility API) could see a change in their metrics, mainly because LazyFrames could defer ads that would have otherwise been loaded in but never seen by the user. This could also affect the revenue of site owners that use these ads as well. Note that the distance-from-viewport threshold will be tuned such that a deferred frame will typically be loaded in by the time the user scrolls to it.
That same document states the following about automatically lazy loading advertisements that are in iframes:
“On Android Chrome with Data saver turned on, elements with loading=”auto” or unset will also be lazily loaded if Chrome determines them to be good candidates for lazy loading (according to heuristics). Setting loading=”eager” on the image or iframe element will still prevent it from being lazily loaded though.”
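Based on that documentation, a publisher who wants an iframe (for example, an ad slot) to always load immediately could opt out of lazy loading explicitly. A rough sketch, with a placeholder URL:

```html
<!-- loading="eager" asks the browser to load this frame right away,
     even when Data Saver heuristics would otherwise defer it -->
<iframe src="https://ads.example.com/slot" loading="eager"></iframe>
```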
Firefox is also developing the addition of the loading element for a future version of their browser. In a discussion about this feature, someone noted how this could negatively affect publisher revenues:
“Has anyone thought about the privacy implications of this yet, especially for e.g. 3rd party content? Or the negative economic impact for webmasters in the case of ads (e.g. being marked as a non-load while the page was loaded). I mean, it’s cool if you can save a few KB of bandwidth, but I see plenty of potential abuse here too.”
Google engineer Addy Osmani tweeted that he hopes for an end to publishers being paid for ads that were loaded but never seen:
“The third-party embeds discussion will be interesting e.g we’ve seen a very small % of sites adopt JS-based lazy-loading for their ads/embeds over the years. Pushback has been “but marketing says we still get paid for those offscreen views”. Hoping some of those practices change.”
The Impact on Publisher Revenue is Difficult to Estimate
Even Google’s engineer has no idea how all this is going to play out for publishers who rely on advertising revenue, according to this tweet:
“I’m curious to see if this encourages more lazy-loading of offscreen video players, embeds & ads.”
A web developer responded by acknowledging the negative impact to publishers:
“…there’s also a lot to consider for traditional publisher revenue models and of potential impact to revenue.”
Will Lazy Load Attribute Negatively Affect Ad Revenues?
At this point it is unknown. It depends on how Chrome and Firefox handle images and iframes that do not have the loading attribute while in “data saver” mode.
If advertisers and advertising brokers begin to add the loading attribute to their iframes, then yes, this will have a negative impact on publisher advertising revenue.
On the other hand, a better user experience will benefit publishers as more users will stay on a page that loads faster, increasing the amount of people that see the ads (when they load).
Read the unofficial announcement by Google engineer Addy Osmani.
Read Addy Osmani’s Twitter announcement.
Read the official Chrome overview.
Read the “explainer” of the loading attribute.
Addy Osmani’s LinkedIn page.
Images by Shutterstock, Modified by Author. Screenshots by Author, Modified by Author.
Looks like some of the Googlers in the Dublin office had a Google Impact Challenge event, where it appears Googlers put on shows from singing to dancing. Here are some photos I found on Instagram from that night's event.
This post is part of our daily Search Photo of the Day column, where we find fun and interesting photos related to the search industry and share them with our readers.
On a recent Google Webmaster Hangout, John Mueller gave tips on how to fix a home page ranking for a keyword phrase when an inner page is the better page.
Signals for Web Page to Rank over Home Page
“You said Google’s algorithm doesn’t automatically favor the homepage ranking above other pages. What should we do to let Google know that a blog post for example should be ranking for a certain page term rather than the home page.
If we have a small website, how do we present clear signals to show Google that this blog post is the better page for certain search terms even though the home page probably has most internal links pointing to it?”
This is a problem of a home page outranking an inner page for a search term. It's a strange problem because presumably the home page, particularly on a blog, should feature only a limited amount of content from the inner pages.
The Power of Custom Content Excerpts
Although the person asking the question didn’t mention how much content from the inner pages is being shown, it could be that the article excerpt used on the home page is from the article itself and possibly too much of it is being used.
Unless a custom excerpt is provided, many themes will automatically display the first few sentences from an article or even the entire article on the home page. This is a setting that you can control in WordPress.
I find that it’s better to create a unique excerpt that describes what the article is about, thereby encouraging the site visitor to click through to the page.
The excerpt can be crafted in the same way a meta description is (a description plus a call to action), and you can even use it for your meta description if you like. A custom-crafted excerpt plays a role similar to the meta description's in getting a user to visit a page.
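As a rough sketch, a home page entry with a custom excerpt might look like this (the markup, URL, and wording are illustrative, not from the article):

```html
<!-- A custom excerpt describes the article and invites a click,
     instead of repeating the article's opening sentences -->
<article>
  <h2><a href="/rank-inner-page/">How to Rank an Inner Page Instead of the Home Page</a></h2>
  <p>Learn the signals that tell Google an inner page is the better
     match for a query, with advice from Google's John Mueller.</p>
</article>
```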
Google’s John Mueller Explains How to Rank a Page
John offered the following advice for how to rank a web page instead of a home page:
“The best thing that you can do in a case like this is to make sure that you really have that content covered well on those blog posts and maybe have it a little bit clearer on the home page that this page is not about that content.”
Google’s John Mueller explains how to rank web pages.
In general, the home page should not be ranking for a very specific keyword phrase. If it does then that could mean that the home page is lacking in focus.
John Mueller’s advice on making it clearer what the home page is about is good advice. The home page should in most cases be optimized for what the entire site is about.
So if your site is about widgets, then the home page should be optimized to communicate that you sell all kinds of widgets. If the business is local, then the home page should clearly say it's a Mexican restaurant, or a plumber in the San Francisco Bay Area, etc. Then let the inner pages carry the burden for the menu in the case of the restaurant and garbage disposals in the case of the plumber.
John Mueller on How Internal Linking Helps Pages Rank
John Mueller went on to explain best practices for internal anchor text to help web pages rank. Anchor text is the words you use when you link from one page to another; it's what the user clicks on (like "click here").
“You mentioned internal linking, that’s really important. The context we pick up from internal linking is really important to us… with that kind of the anchor text, that text around the links that you’re giving to those blog posts within your content. That’s really important to us.”
What that means is, rather than use non-descriptive words like “more info” or “read more,” it is better to use anchor text that is meaningful, that uses words that describe what the content is about.
Of course, that’s hard to do in the context of a blog home page. But it can be done from within the content of other pages.
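A minimal sketch of the difference (the URL and wording are placeholders):

```html
<!-- Non-descriptive anchor text: tells Google little about the target page -->
<a href="/garbage-disposal-repair/">read more</a>

<!-- Descriptive anchor text: the clickable words describe what the
     linked page is about -->
<a href="/garbage-disposal-repair/">garbage disposal repair in the Bay Area</a>
```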
Then he discussed the value of standard SEO practices such as titles and headings:
“Additionally, of course the content, like I mentioned is really important. So, making sure you have clear titles on those pages, you use clear headings, you could structure content in a way that’s easily readable that’s in a way that is really clear that this is about this topic without… resorting to keyword stuffing.
…be reasonable about… putting keywords on your pages. Write your pages in a way that they would work well for users rather than in a way that you think search engines might pick that up.”
John Mueller then reiterated that repeating keywords "in all variations" is a 20-year-old spam method that is outdated, so don't try it.
What About External Links?
Google is about ranking specific pages for a search query. Search queries that are specific about size or color tend to return product pages that are specific about size or color. Google seems to prefer ranking pages for detailed phrases, not home pages.
If a home page is ranking instead of the inner page, then that could be a symptom that the site does not have enough useful links overall and that a majority of the links tend to go to the home page instead of inner pages.
In my opinion, a weak link profile could work against inner pages to rank. But the other factors discussed above related to the proper use of excerpts, good site architecture and a clear focus of what the home page is about can overcome a disadvantage from a weak link profile.
Watch Google’s John Mueller answer how to rank an inner page.
Images by Shutterstock, Modified by Author. Screenshots by Author, Modified by Author.
As PPC marketers, there are certain A/B tests we are always running.
Landing page, ad and audience tests are all important.
But if you have been regularly testing these items and are looking for new features and ideas for testing, look no further!
Below is a list of less obvious PPC tests that you should try out.
Let’s hop in.
1. Test Increasing Brand Awareness with Target Impression Share Bidding
Smart bidding strategies are a way to incorporate some automation while not relinquishing complete control of your campaigns.
Google has a smart bidding type that is perfect for increasing brand awareness: Target Impression Share.
Using this smart bidding model, you can choose how often your ads will appear in the auction, in conjunction with page position and the maximum CPC you are willing to pay.
How to Get Started
This bidding strategy works best when you want to increase or stabilize visibility and brand awareness.
If your competitors are encroaching on your brand terms, it’s a great time to test IS bidding.
If you’re seeing cost per leads that are trending too high and want to decrease the impression share your ads are receiving, you could set the IS at a lower percentage to be less competitive.
We ran a test for a client who also wanted to increase leads while decreasing costs by decreasing IS, which we found to be a delicate balance.
If you have a new brand that has a core group of highly competitive terms, this could also be a useful bid model, but you’d want to be careful and potentially only test a few core terms to begin.
2. Test Competitor Keyword Campaigns
Some advertisers are opposed to competitor campaigns and some are all for it. In today’s landscape, I always recommend at least testing competitor campaigns.
Competitor campaigns are a delicate balance, as they can inflate CPCs and ultimately CPAs. But when done right, they can also let you gain brand awareness amongst an audience that has shown intent.
How to Get Started
Head to Auction Insights in the search channels to check out who your biggest competitors are currently.
When drafting ads, make sure not to use the competitor’s name in the ads – those ads will get disapproved. Instead, use this as a starting point to highlight your brand, call out what’s different and how you stand out.
Starting with a smaller scope can allow you to test the waters before jumping in and potentially spending a great deal on folks who are looking for your competitors.
There are a few ways to do that:
You can start with a limited set of keywords, perhaps long-tail keywords like “chewy.com dog food” instead of just “chewy.com”.
Test a limited geographic area. Instead of targeting the entire state of California, try testing 1 DMA (designated market area) to start.
Keeping bids low but competitive can also be helpful in long-term success.
3. Test Dynamic Search Ads
Feel like you have plateaued with traffic and can’t seem to find additional ways to grow?
Dynamic Search Ads (DSA) may be able to help you uncover additional keywords and search trends.
DSA has gotten a bad rap in the past. But this campaign type has seen a lot of improvements since its initial release.
How to Get Started
I always recommend having a page feed in place. This will allow you to have the most control over a campaign that has the potential to wreak havoc.
Always make sure to add every search keyword you are bidding on in other campaigns as a negative for DSA.
The ultimate goal of DSA is to render it useless in the sense that you are either:
Gathering new keywords that are outside of your current list that match your site content.
Or are negating the terms because they are outside of your scope.
Using both of these techniques, your DSA may (and most likely will) eventually be producing so little traffic that you pause the campaign.
Keep a close eye on queries and traffic – this is not a set-it-and-forget-it campaign type!
Google rolled out some cool new features in late 2018 and Bing recently introduced page feeds so both of these channels are a great starting point.
4. Test Broad Match Keywords (Gasp!)
If you have killer performance with some keywords and money to test, then give (monitored) broad match keywords a shot!
How to Get Started
Break your keywords out into their own campaigns so they don't interfere with other match types.
As with DSA, don’t leave this campaign to run without supervision.
Create a separate negative keyword list that includes negatives you have already identified along with modifiers of long-tail keywords that are included elsewhere in your account.
Start with low keyword bids to begin in case things go awry.
I would also recommend turning off eCPC (the default for new campaigns), since Google can now increase eCPC bids as much as it deems necessary to generate a conversion.
Set your initial daily budget to half of what you're OK with spending during testing, since Google can double your daily budget so you "don't miss out on valuable clicks." (I know, I know, it will probably average out – unless you're just testing and turn it off, in which case you'll likely be mad at the overspend.)
5. Test Audience Targeting + Demographic Data
There has been a lot of talk about layering In-Market audiences and Custom Affinity audiences into your campaigns for observation.
You can take audience targeting one step further and couple it with some basic demographic targeting for a next-level audience test.
How to Get Started
Run some initial tests on audience targeting and narrow down your desired audience demographics.
Make sure to exclude any demographics that you do not want to target.
For highly desired demographics, say the 65+ age group, consider setting a bid modifier to bid higher for these users. Targeting a larger audience to start may help prevent traffic from being too low to gather performance data.
Keep in mind that targeting and observation layers have different functions and limitations depending on the type of campaign. In order to limit the reach of ad groups in Search to users only in that audience, you’ll need to flip on Targeting.
Create a separate campaign or ad group that looks to target your desired criteria separate from your normal search ad groups.
Group by audience themes. These divisions allow you to personalize the ad copy and landing pages without throttling traffic from your normal search ad groups.
Additionally, when it comes to demographics, I rarely exclude the “unknown” categories.
These categories are often some of my best performers, which leads me to believe that there is likely a good chunk of my target audience hanging out in the shadows.
It can be easy to get into a testing rut after running many A/B landing page and ad copy tests, but there are a lot of great testing ideas out there!
Screenshots taken by author, April 2019
Bill Slawski and I had an email discussion about a recent algorithm update. Bill suggested that a specific research paper and patent might be worth looking at. What Bill suggested challenged me to think beyond Neural Matching and RankBrain.
Recent algorithm research focuses on understanding content and search queries. It may be useful to consider how these papers might help explain certain changes.
The Difference Between RankBrain and Neural Matching
These are official statements from Google on what RankBrain and Neural Matching are via tweets by Danny Sullivan (aka SearchLiaison).
— RankBrain helps Google better relate pages to concepts… primarily works (kind of) to help us find synonyms for words written on a page….
— Neural matching helps Google better relate words to searches…primarily works (kind of) to help us find synonyms of things you typed into the search box.
…”kind of” because we already have (and long have had) synonym systems. These go beyond those and do things in different ways, too. But it’s an easy way (hopefully) to understand them.
For example, neural matching helps us understand that a search for “why does my TV look strange” is related to the concept of “the soap opera effect.”
We can then return pages about the soap opera effect, even if the exact words aren’t used…”
Google’s Danny Sullivan described what neural matching is.
What is CLSTM and is it Related to Neural Matching?
The paper Bill Slawski discussed with me was called Contextual Long Short Term Memory (CLSTM) Models for Large Scale Natural Language Processing (NLP) Tasks.
The research paper PDF is here. The patent that Bill suggested was related to it is here.
That’s a research paper from 2016 and it’s important. Bill wasn’t suggesting that the paper and patent represented Neural Matching. But he said it looked related somehow.
To show what it does, the research paper uses the example of a machine trained to understand the context of the word “magic” in the following three sentences:
“1) Sir Ahmed Salman Rushdie is a British Indian novelist and essayist. He is said to combine magical realism with historical fiction.
2) Calvin Harris & HAIM combine their powers for a magical music video.
3) Herbs have enormous magical power, as they hold the earth’s energy within them.”
The research paper then explains how this method understands the context of the word “magic” in a sentence and a paragraph:
“One way in which the context can be captured succinctly is by using the topic of the text segment (e.g., topic of the sentence, paragraph).
If the context has the topic “literature”, the most likely next word should be “realism”. This observation motivated us to explore the use of topics of text segments to capture hierarchical and long-range context of text in LMs.
…We incorporate contextual features (namely, topics based on different segments of text) into the LSTM model, and call the resulting model Contextual LSTM (CLSTM).”
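To get a feel for what “incorporating contextual features” means in practice, here is a toy sketch of the CLSTM input idea. Everything here is invented for illustration (the embeddings, dimensions, and topic vector are made up, and this is not Google’s implementation): each word embedding is concatenated with a topic vector for the enclosing segment before it would be fed to the LSTM, so the model sees “word plus context” rather than the word alone.

```python
import numpy as np

# Toy sketch of the CLSTM input idea (assumption: made-up embeddings
# and dimensions; not Google's implementation). Each word embedding is
# concatenated with a topic vector for the enclosing text segment, so
# an LSTM consuming these inputs would see "word + context".

np.random.seed(0)
EMB_DIM, TOPIC_DIM = 8, 4

vocab = {"herbs": 0, "have": 1, "magical": 2, "power": 3}
word_emb = np.random.randn(len(vocab), EMB_DIM)

# Hypothetical vector for the segment's topic (e.g. "mysticism").
topic_vec = np.random.randn(TOPIC_DIM)

sentence = ["herbs", "have", "magical", "power"]

# Contextual input at each time step: word embedding + segment topic.
contextual_inputs = np.stack(
    [np.concatenate([word_emb[vocab[w]], topic_vec]) for w in sentence]
)

print(contextual_inputs.shape)  # (4, 12): 4 words, 8 word dims + 4 topic dims
```

A plain LSTM would only see the 8-dimensional word embeddings; the extra topic dimensions are what let the model prefer “realism” after “magical” in a literature context.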
This algorithm is described as being useful for:

Word Prediction: predicting what your next typed word will be, as when typing on a mobile phone.

Next Sentence Selection: a question-and-answer task, or generating “Smart Replies,” the templated replies in text messages and emails.

Sentence Topic Prediction: the research paper describes this as part of a task for predicting the topic of a response to a user’s spoken query, in order to understand their intent.
That last bit kind of sounds close to what Neural Matching is doing (“…helps Google better relate words to searches“).
Question Answering Algorithm
The following research paper from 2019 seems like a refinement of that algo:
A Hierarchical Attention Retrieval Model for Healthcare Question Answering
This is what it says in the overview:
“A majority of such queries might be non-factoid in nature, and hence, traditional keyword-based retrieval models do not work well for such cases.
Furthermore, in many scenarios, it might be desirable to get a short answer that sufficiently answers the query, instead of a long document with only a small amount of useful information.
In this paper, we propose a neural network model for ranking documents for question answering in the healthcare domain. The proposed model uses a deep attention mechanism at word, sentence, and document levels, for efficient retrieval for both factoid and non-factoid queries, on documents of varied lengths.
Specifically, the word-level cross-attention allows the model to identify words that might be most relevant for a query, and the hierarchical attention at sentence and document levels allows it to do effective retrieval on both long and short documents.”
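To see what sentence-level attention buys on long documents, here is a toy sketch. Plain query similarity stands in for the paper’s learned attention, and the vectors and sharpening factor are invented for illustration, so treat this as a sketch of the idea rather than the model itself:

```python
import numpy as np

# Toy sketch of query-aware sentence-level attention (assumption:
# cosine-style similarity replaces the paper's learned attention, and
# the vectors and temperature are made up for illustration).

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

query = np.array([1.0, 0.0])

# Sentence vectors for a long document: one relevant, the rest not.
sentences = np.array([
    [0.0, 1.0],
    [1.0, 0.0],   # the short relevant passage
    [0.1, 0.9],
    [0.0, 1.0],
])

# Weight each sentence by how well it matches the query.
sims = sentences @ query
weights = softmax(4.0 * sims)   # temperature sharpens the focus

# The document representation concentrates on the relevant sentence
# instead of averaging it away, which is what flat pooling would do.
doc_vec = weights @ sentences
print(doc_vec.round(2))  # [0.94 0.06]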
It’s an interesting paper to consider.
Here is what the Healthcare Question Answering paper says:
“2.2 Neural Information Retrieval
With the success of deep neural networks in learning feature representation of text data, several neural ranking architectures have been proposed for text document search.
…while the model proposed in  uses the last state outputs of LSTM encoders as the query and document features. Both these models then use cosine similarity between query and document representations, to compute their relevance.
However, in majority of the cases in document retrieval, it is observed that the relevant text for a query is very short piece of text from the document. Hence, matching the pooled representation of the entire document with that of the query does not give very good results, as the representation also contains features from other irrelevant parts of the document.”
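The failure mode described in that last paragraph is easy to demonstrate. Below is a toy sketch (assumption: hand-picked vectors stand in for real LSTM encoder outputs) showing that pooling a whole document into one representation dilutes the one relevant passage:

```python
import numpy as np

# Sketch of semantic matching as the paper describes it (assumption:
# toy vectors, not real encoder outputs): pool a document's sentence
# vectors into one representation, then score it against the query
# with cosine similarity.

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = np.array([1.0, 0.0, 0.0])

# One highly relevant sentence vector buried among off-topic ones.
doc_sentences = np.array([
    [1.0, 0.0, 0.0],   # relevant passage
    [0.0, 1.0, 0.0],   # irrelevant
    [0.0, 1.0, 0.0],   # irrelevant
    [0.0, 0.9, 0.1],   # irrelevant
])

pooled_doc = doc_sentences.mean(axis=0)

# Pooling the whole document dilutes the relevant passage's signal:
print(round(cosine(query, pooled_doc), 3))    # 0.326
print(round(cosine(query, doc_sentences[0]), 3))  # 1.0
```

The relevant passage matches the query perfectly on its own, but the pooled document scores poorly, which is exactly why the paper moves to interaction-based and hierarchical models.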
Then it mentions Deep Relevance Matching Models:
“To overcome the problems of document-level semantic-matching based IR models, several interaction-based IR models have been proposed recently. In , the authors propose Deep Relevance Matching Model (DRMM), that uses word count based interaction features between query and document words…”
And here it intriguingly mentions attention-based Neural Matching Models:
“…Other methods that use word-level interaction features are attention-based Neural Matching Model (aNMM) , that uses attention over word embeddings, and , that uses cosine or bilinear operation over Bi-LSTM features, to compute the interaction features.”
Attention Based Neural Matching
The citation of attention-based Neural Matching Model (aNMM) is to a non-Google research paper from 2018.
Does aNMM have anything to do with what Google calls Neural Matching?
aNMM: Ranking Short Answer Texts with Attention-Based Neural Matching Model
Here is a synopsis of that paper:
“As an alternative to question answering methods based on feature engineering, deep learning approaches such as convolutional neural networks (CNNs) and Long Short-Term Memory Models (LSTMs) have recently been proposed for semantic matching of questions and answers.
…To achieve good results, however, these models have been combined with additional features such as word overlap or BM25 scores. Without this combination, these models perform significantly worse than methods based on linguistic feature engineering.
In this paper, we propose an attention based neural matching model for ranking short answer text.”
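The core of that approach, scoring word-level interactions between a query and a candidate answer, can be sketched with toy embeddings. This is only an illustration of the interaction-matrix idea: the embeddings are invented, and a plain max-then-mean aggregation stands in for the paper’s learned, value-shared attention weights.

```python
import numpy as np

# Minimal sketch of the word-level interaction idea behind models like
# aNMM (assumption: toy embeddings; max-then-mean replaces the paper's
# learned attention over query words).

def embed(words, table):
    return np.stack([table[w] for w in words])

table = {
    "tv": np.array([1.0, 0.0]),
    "looks": np.array([0.0, 1.0]),
    "strange": np.array([0.7, 0.7]),
    "soap": np.array([0.9, 0.1]),
    "opera": np.array([0.1, 0.9]),
    "effect": np.array([0.6, 0.8]),
}

query = embed(["tv", "looks", "strange"], table)
answer = embed(["soap", "opera", "effect"], table)

# Normalize rows so the dot product is cosine similarity.
qn = query / np.linalg.norm(query, axis=1, keepdims=True)
an = answer / np.linalg.norm(answer, axis=1, keepdims=True)

# Interaction matrix: similarity of every query word to every answer word.
interactions = qn @ an.T   # shape (3, 3)

# Take each query word's best match, then average them (a stand-in for
# the learned attention that weights query words by importance).
score = interactions.max(axis=1).mean()
print(interactions.shape)  # (3, 3)
```

Notice that no query word appears verbatim in the answer; the matching happens entirely at the embedding level, which is the “super-synonym” flavor Danny Sullivan’s description hints at.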
Long Form Ranking Better in 2018?
Jeff Coyle of MarketMuse stated that in the March Update he saw high flux in SERPs that contained long-form lists (ex: Top 100 Movies).
That was interesting because some of the algorithms this article discusses are about understanding long articles and condensing those into answers. Specifically, that was similar to what the Healthcare Question Answering paper discussed (Read Content Strategy and Google March 2019 Update).
So when Jeff mentioned lots of flux in the SERPs associated with long-form lists, I immediately recalled these recently published research papers focused on extracting answers from long-form content.
Could the March 2019 update also include improvements to understanding long-form content? We can never know for sure because that’s not the level of information that Google reveals.
What Does Google Mean by Neural Matching?
In the Reddit AMA, Gary Illyes described RankBrain as a PR Sexy ranking component. The “PR Sexy” part of his description implies that the name was given to the technology for reasons having to do with being descriptive and catchy and less to do with what it actually does.
The term RankBrain does not communicate what the technology is or does. If we search around for a “RankBrain” patent, we’re not going to find it. That may be because, as Gary said, it’s just a PR Sexy name.
I searched around at the time of the official Neural Matching announcement for patents and research tied to Google with those explicit words in them and did not find any.
So… what I did was to use Danny’s description of it to find likely candidates. And it so happened that ten days earlier I had come across a likely candidate and had started writing an article about it.
Deep Relevance Ranking Using Enhanced Document-Query Interactions (PDF): http://www2.aueb.gr/users/ion/docs/emnlp2018.pdf
And I wrote this about that algorithm:
“Although this algorithm research is relatively new, it improves on a revolutionary deep neural network method for accomplishing a task known as Document Relevance Ranking. This method is also known as Ad-hoc Retrieval.”
In order to understand that, I needed to first research Document Relevance Ranking (DRR), as well as Ad-hoc Retrieval, because the new research is built upon that.
“Document relevance ranking, also known as ad-hoc retrieval… is the task of ranking documents from a large collection using the query and the text of each document only.”
That explains what Ad-hoc Retrieval is. But it does not explain what DRR Using Enhanced Document-Query Interactions is.
Connection to Synonyms
Deep Relevance Ranking Using Enhanced Document-Query Interactions is connected to synonyms, a feature of Neural Matching that Danny Sullivan described as like super-synonyms.
Here’s what the research paper describes:
“In the interaction based paradigm, explicit encodings between pairs of queries and documents are induced. This allows direct modeling of exact- or near-matching terms (e.g., synonyms), which is crucial for relevance ranking.”
What that appears to be discussing is understanding search queries.
Now compare that with how Danny described Neural Matching:
“Neural matching is an AI-based system Google began using in 2018 primarily to understand how words are related to concepts. It’s like a super-synonym system. Synonyms are words that are closely related to other words…”
The Secret of Neural Matching
It may very well be that Neural Matching is more than just one algorithm – that the term is a name given to describe a group of algorithms working together.
Don’t Synonym Spam
I cringed a little when Danny mentioned synonyms because I imagined that some SEOs might be encouraged to begin seeding their pages with synonyms. I believe it’s important to note that Danny said “like” a super-synonym system.
So don’t take that to mean seeding a page with synonyms. The patents and research papers above are far more sophisticated than simple-minded synonym spamming.
Focus on Words, Sentences, and Paragraphs
Another takeaway from those papers and patents is that they describe a way to assign topical meaning at three different levels of a web page. Natural writers can sometimes write fast and communicate a core meaning that sticks to the topic. That talent comes with extensive experience.
Not everyone has that talent or experience. So for the rest of us, including myself, I believe it pays to carefully plan and write content and learn to be focused.
Long-form Versus Short-form Content
I’m not saying that Google prefers long-form content. I am only pointing out that many of the new research papers discussed in this article are focused on better understanding long-form content by understanding what the topics of those words, sentences, and paragraphs mean.
So if you experience a ranking drop, it may be useful to review the winners and the losers and see if there is evidence of flux that might be related to long-form or short-form content.
The Google Dance
Google used to update its search engine once a month with new data and sometimes new algorithms. The monthly ranking changes were what we called the Google Dance.
Google now refreshes its index on a daily basis (what’s known as a rolling update). Several times a year, Google updates its algorithms in a way that usually represents an improvement to how Google understands search queries and content. These research papers are typical of those kinds of improvements. So it’s important to know about them so as not to be fooled by red herrings and implausible hypotheses.
An update to Google Assistant on Android devices will allow it to provide better visual responses and more complete information.
When answering queries, Assistant now returns a screen that more closely resembles what you would see in Google search results.
Here’s a before & after example of asking Assistant for events in Mountain View:
Another new thing Google Assistant will do is provide accompanying visuals when possible.
Here’s a before & after example of asking Assistant for cute cats:
There’s now a lot less empty screen space when returning search results.
In the example below, you’ll see that Assistant used to display results horizontally in a carousel, leaving nearly half the screen empty.
Now, Assistant fills the screen vertically which looks better and is more functional.
This update is rolling out now on Android phones only. Android typically receives updates to Google software before iOS.
It wasn’t explicitly stated, but an iOS version is likely not far behind.
In an update to Google My Business profiles, product catalogs will now appear in desktop and mobile search results.
Previously, product catalogs would only appear in mobile search results. Now we know this isn’t a mobile-only feature.
Product catalogs are a fairly new addition to Google My Business pages, as they were first seen in use in October 2018.
Businesses can add products to a catalog using a form in the ‘Products’ tab. This tool is called the product editor.
All items added through the product editor are eligible to appear in the product catalogs of Google My Business pages.
After uploading a product collection, when people view the GMB listing they will be able to browse through items in the new ‘Products’ tab.
What makes this feature even more valuable for retailers is that it’s free to use.
Small-to-medium-sized businesses can upload product catalogs whether they’re advertisers or not.
So any business can use product catalogs to make their listing more attractive and engaging.
Now that product catalogs appear on desktop they’ll reach even more searchers as well.
Prior to this update, product catalogs did not appear in Google Maps search results and I believe that is still the case.