
4 Common Website Mistakes To Avoid


4 Common Web Mistakes That Will Make You INVISIBLE in Google.

Hi folks, Matt here,

I thought this info on common website mistakes was pretty good and might help save someone's business someday. I'm not saying there are problems on your site, but it helps to be informed.

(Updated Ranking tips for 2017)

1. Missing WWW Redirects
Many websites can be accessed by typing either www.domain.com or simply domain.com into a browser's location bar. That's fine as long as one variant is redirected to the other. But when there is no redirect, it can cause you major problems, especially with Google. This is perhaps the most common mistake made by website designers, and it takes less than a minute to fix. As professional SEOs, we believe that correct domain redirects should be part of standard website design procedure, but sadly that's not the case.

Why Having No WWW Redirects is BAD (very bad indeed)
Google HATES duplicate websites, pages, and content. Without the proper redirect, your website can be reached at both variants, www.domain.com AND domain.com. Google can treat every page of your website as existing on two separate sites, which splits your ranking signals and can seriously hurt your search engine rankings.

How to Test For Proper WWW Redirects
This is easy to test. Type your domain name into a browser address bar, first with www in front of the domain name, then without. If a redirect is installed, the www will either be stripped or added in your browser's address bar when you hit Enter to go to your website.
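If you'd rather check from code, here is a minimal Python sketch of the same test. The hostnames example.com and www.example.com are stand-ins for your own domain, and the live check at the bottom needs network access, so it's left commented out:

```python
# Fetch the FIRST response for a host without following redirects,
# so we can see whether a www redirect is actually installed.
import http.client

def first_hop(host, path="/"):
    """Return (status code, Location header) of the first response."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def is_canonical_redirect(status, location, canonical_host):
    """True when the response is a permanent redirect onto the canonical host."""
    return status in (301, 308) and location is not None and canonical_host in location

# Live usage (requires network; substitute your own domain):
#   status, location = first_hop("example.com")
#   print(is_canonical_redirect(status, location, "www.example.com"))
```

A 301 (permanent) redirect is what you want here, since it tells Google the two variants are the same site.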

Contact your webmaster and have them install the correct redirect in your website's .htaccess file (they may have to create the file in some cases). As previously stated, this should take less than a minute for them to do. I would mention that they should do this as standard procedure. If they insist on billing you for this one-minute fix, scold them appropriately.
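For an Apache server, the redirect usually looks something like this. This is only a sketch — the exact rules depend on your hosting setup, and domain.com stands in for your own domain:

```apache
# Force the www variant with a permanent (301) redirect.
# Assumes Apache with mod_rewrite enabled.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.domain.com/$1 [R=301,L]
```

To redirect the other way (strip the www instead), the condition and target are simply swapped.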


2. Boiler Plate MetaData (Duplicated Meta Tags)

Boiler-plate metadata is the practice of using identical page titles, descriptions and keywords on multiple pages of a website. Metadata is the "behind the scenes" code instructions for search engines that is built into most website pages. This data is not meant to be seen by human visitors to a webpage but can easily be seen by viewing the source of a website page. Metadata may include several elements such as:

  •  Meta Title
  •  Meta Description
  •  Meta Keywords (Not used in Google or most search engines now.)
  •  Meta Robots

Please note that metadata often varies between websites. In some cases it will be at the top of the page; sometimes it may be further down the webpage's source code. The metadata may contain other Meta fields such as Meta author, Meta revisit instructions, etc.

Why Having Boiler Plate MetaData is BAD
The major search engines such as Google, Yahoo, AOL and Bing have duplicate content filters that look for and penalize websites and/or webpages that duplicate content. These filters are immediately triggered by boiler-plate (in other words, duplicated) metadata, specifically in the Meta Title, Meta Description and Meta Keywords fields.

Here is a quote directly from Matt Cutts of Google: "it is better to have unique meta descriptions and even no meta descriptions in some pages, than to show duplicate meta descriptions across pages."

How to Test Meta Data
There are several ways to check for duplicated metadata. The easiest is to view your page source: navigate to a page on your website, move your mouse pointer somewhere on the page where there are no images, right-click, and choose "View Page Source". This will open a new tab in your browser showing the source code of your webpage.

In the image below, I have placed green boxes around the most important Meta Tags. These can vary greatly between different websites. The Title tag is the most important of all of the possible meta tags.

If you follow the instructions above on several pages of your website and discover identical Titles, Meta Descriptions and/or Meta Keywords on multiple pages, you are likely being penalized by Google. If your Meta Description, Meta Robots and/or Meta Keywords fields are missing or blank, you passed the test!
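If you want to automate that comparison, the snippet below shows one way to pull the Title and Meta Description out of a page's source with Python's built-in HTML parser. The sample HTML is made up for illustration:

```python
# Extract the <title> and meta description from raw HTML so the
# values can be compared across several pages of a site.
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

sample = ('<html><head><title>Widgets | Acme</title>'
          '<meta name="description" content="Buy hand-made widgets."></head></html>')
grabber = MetaGrabber()
grabber.feed(sample)
print(grabber.title)        # Widgets | Acme
print(grabber.description)  # Buy hand-made widgets.
```

Run this over each page of your site and any repeated value is boiler-plate that needs fixing.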

Google now ignores the Meta Keywords field and will use other content on the page instead. The Meta Robots field is primarily for telling a search engine NOT to index a page; in its absence, Google and other search engines will simply index the page.

If you discover that you have boiler-plate, duplicated metadata in your Meta Titles, Descriptions or Keywords, contact your webmaster ASAP! These three fields should be unique and NOT duplicated across different pages of your website. I would suggest setting diverse Titles and stripping (removing) the Meta Description and Meta Keywords fields. You will literally get better search engine rankings that way. If you do not have a webmaster to perform this fix, contact me today for a quote.
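Once you've collected the titles, duplicates are easy to spot. The page paths and titles below are hypothetical:

```python
# Count how often each title appears across the site; anything seen
# more than once is boiler-plate that should be made unique.
from collections import Counter

titles = {
    "/":        "Acme Widgets",
    "/about":   "Acme Widgets",   # duplicate of the home page title
    "/contact": "Contact Acme",
}
duplicated = [t for t, n in Counter(titles.values()).items() if n > 1]
print(duplicated)  # ['Acme Widgets']
```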


3. Duplicate Websites
We have seen many cases where our clients have multiple websites. This is actually a good thing, unless the websites are copies of each other or share the same textual and image content. Are you seeing a pattern here about duplicate content? It's bad! In the eyes of Google and the other major search engines, unique content is the golden rule, and duplicate content and/or websites will expeditiously bring the wrath of the almighty Google!

Make no mistake about it, Google expects some images, image names and textual content to be on more than one website and will tolerate such to a point. However, having the exact same sets of paragraphs on two sites will get your site heavily penalized.

This doesn't only apply to a website owner having multiple websites. It also applies to where the content on your website came from. If you buy a product from a wholesale source and copy the manufacturer's description to your website pages, you can safely bet that you won't get top Google rankings for those items. Oddly enough, Google does know when something was shared on social media and does not punish your site if your content is shared or goes viral. Google is no dummy.

If you want high rankings for your website and its pages, diverse and unique content is the golden rule. At the end of the day, it's better to have one unique website than two clones.
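A rough way to see how close two blocks of text are is Python's difflib. The sentences here are made-up stand-ins for a manufacturer's description and your own page copy:

```python
# Compare two text snippets; a ratio near 1.0 means near-duplicate content.
from difflib import SequenceMatcher

manufacturer_copy = "Hand-made widgets crafted from the finest oak."
your_page_copy    = "Hand-made widgets crafted from the finest oak."

ratio = SequenceMatcher(None, manufacturer_copy, your_page_copy).ratio()
print(f"{ratio:.0%}")  # 100% -- an exact copy, the worst case
```

Rewriting the copy in your own words drives that ratio down and gives Google something unique to rank.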
Last for today’s lesson is,

4. Purchased Backlinks
Everyone has likely received emails from overseas spammers boasting about their glorious services providing backlinks to your website. I received several today myself! Years ago, a website owner could boost their rankings just by having other websites link to theirs. Legitimate links are still a good thing and a major part of Google's rankings.

However, Google is a rather high-tech search engine these days and is smart enough to know the difference between natural, normal backlinks and those which spring up by the thousands overnight (the kind the spammers try to sell unsuspecting website owners). Google maintains a database of websites that sell links to other websites, and any website that gets a link from a known link-selling site will find itself falling from Google's graces into the abyss of search engine obscurity.

First of all, don't buy backlinks. Don't even allow yourself to be tempted. If you have already made the mistake, then speak to your webmaster about disavowing those links through Google's Webmaster Tools (now Search Console).
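A disavow file is just a plain text list, one domain or URL per line, uploaded through Google's disavow tool. The domains below are made up for illustration:

```
# disavow.txt -- lines starting with # are comments
domain:spammy-link-seller.example
https://link-farm.example/paid-links.html
```

Using `domain:` disavows every link from that site, which is usually what you want for a known link seller.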



I know there weren't a lot of pictures in this post and it was a little techie. It is by no means an exhaustive list, but it's important information to know if you don't want to lose traction in Google.

I hope this helped.

Warmly,  Matt Johnson