It has been a while since we talked about the Google Medic Update, the big August 1st update that seemed to be one of the largest core ranking updates from Google this year. John Mueller from Google chimed in about it on Reddit, basically suggesting that you shouldn't look at things like H1 tags but rather look at the site overall.
The Reddit thread has an SEO talking about his struggle with a client who thinks all they need to do is change links, title tags, H1/H2 tags and other technical SEO issues and the problem will be solved. The client doesn't want to take a more global view of the site and see what can be improved with the content and overall site quality.
The SEO wrote:
How would you handle a client who claims Google’s search quality guidelines, John Mueller’s official statements, and other authoritative seo materials are all just speculation and not rooted in real data? I’m getting frustrated and having a hard time explaining that seo is not an exact science like paid search because quality is more subjective. Client is convinced these theories are all hocus pocus and with the right data-driven tweaks (# increase in backlinks, regardless of quality; # keyword in h1; position of keyword in title tag) that we can get back to #1, even with the page being 60% autogenerated and duplicative.
John Mueller from Google responded, and I wanted to highlight his responses:
At the point where they’re hand-picking what they consider to be “real data,” it’s really hard to make objective recommendations :-). Would comparing with other sites be useful?
One approach might be to do an A/B test based on their requests, update some part of the site accordingly, and see what happens (probably not much). Doing that the other way around wouldn’t work though, improving just a part of the site’s general quality (in whatever ways make sense for a site like that) is kinda hard without improving the whole site.
He then added later:
My thoughts are that if the site was ranking reasonably previously, then technically it’s unlikely to be that bad. There’s always something that can be tweaked & improved from a technical point of view, and these can give you incremental wins, and there’s also a clean-out of duplicate’y content, which is somewhere between technical & quality, which can help over the long run. However, if you’ve seen a significant, steady change around the core algorithm updates, then you probably want to go past incremental updates and instead rethink things overall.
The quality rater’s guidelines & the old Panda blog post (“More guidance on building high-quality sites”) are good places to get ideas. The important point (in my eyes) is that this is not a “tweak h1’s, inject keywords, get links” kind of traditional SEO work, but rather you’d want to step back, understand where the site’s audience is & where it’s going, and rethink how you’d like to position the site within the 2019+-web. As an SEO consultant, you’ve probably seen a lot of potential directions, and how they’ve evolved over the years, so you might be in a good place to make informed recommendations.
I hope this helps some of you.
Forum discussion at Reddit.