
SEO, the used car market of website advice

Dodginess and misinformation abound

published Monday, 05 Jun 2023

On comparing notes with other developers, we found we had all independently noticed the poor, and often incorrect, advice being given to clients by companies that specialise in SEO (Search Engine Optimisation), with one colleague describing SEO companies as the new 'used car salesmen' of the internet.

Read below for more details and how to avoid bad advice.

What's wrong with the advice?


Now to be clear, in some cases nothing is wrong with the advice: it can sometimes be useful, and on rare occasions it is accurate. But far more often the advice is poor or out of date.

Having worked with businesses of all sizes, from sole traders up to multinationals, I have regularly seen SEO agencies give advice that is incorrect and can sometimes cause a website to actually lose search ranking.

Incorrect advice


This advice is often rumour, or based on very old techniques that scam sites once used. Following it will usually result in your website either seeing no change or losing ranking.

Some of the most common outright incorrect recommendations I have seen in the last 2 years are:

Use keywords and the keywords meta tag to get a better ranking

In the 90s, scam sites used this technique to get high indexing on Yahoo and Google. Search engines quickly started lowering the rankings of sites where the keywords were not used in the content of the page, and later evolved to store the synonyms and meanings of the terms found on the pages themselves, ignoring the keywords tag altogether.

Google removed the keywords tag from its indexing entirely around 15 years ago!

Keywords can be useful, but mostly as something to focus an article around. Since search indexers often do not use the keywords tag at all, you have to use meaningful keywords and phrases within the content of your page. Use them in meaningful paragraphs and sentences that talk about your product or service, not just as a list on a page or in sentences that make no real-life sense. Google at least also detects synonyms of words, so if you have a target market you wish to hit, don't just repeat the same 'keyword'; use the phrases and terms within that market that may be searched for, as well as synonyms for the most-used terms. But do not over-egg your pudding: keep the information relevant and unrepeated to avoid Google deciding the content looks suspicious.
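As a minimal sketch (the business, phrases, and file contents here are invented for illustration), compare a keywords tag, which indexers ignore, with the same terms worked into real sentences:

```html
<!-- Ignored by Google's indexer: the keywords meta tag -->
<meta name="keywords" content="wedding video, videographer, LUT, rolling edit">

<!-- What actually gets indexed: the market's own terms in meaningful sentences -->
<p>We film wedding videos across the region, colour-grade every project with
   custom LUTs, and deliver a cinematic highlights reel within six weeks.</p>
```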

 

Put keywords in invisible text on your site

Essentially the same as the keywords advice above, this is also a really bad idea and will get your site ranked even lower than unrelated keywords would, because it is obvious you are trying to 'trick' people.

Google and other crawlers render the 'visible' page and compare the actual content of the page to what is visible; they will also follow every link and button on the page and determine whether the content is relevant.

If you have text that is never revealed, it adds to what I call your site's 'dodginess' score. I believe there may be one or more such scores, and once you reach a threshold your site will be deemed 'scammy' and ranked lower. Even before that point, the hidden text is not earning you any extra ranking on your targeted phrases.
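For illustration only, this is the kind of hidden-text pattern a rendering crawler can detect (do not do this; the text is invented):

```html
<!-- A crawler that renders the page can see this text is never shown to visitors -->
<div style="display: none;">
  cheap video cards best video cards video cards sale
</div>

<!-- Text matching the background colour is caught the same way -->
<p style="color: #fff; background: #fff;">graphics cards discount deals</p>
```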

 

Multiple pages about x will rank your site better

Again, this comes from the days of scam sites getting high results on Google. The idea was that the more pages you had containing the same words, the higher you would be ranked overall.

This is false. It is my belief, based on observed results of my own testing, that Google finds all associated pages with repeated data on them and may divide the score between those pages. So if you have pages x, y, and z that all carry roughly the same information, then together they may get a lower ranking in Google than you would get by having a single page about the subject.

This means each individual page will have a tougher time being seen as highly relevant whenever a search is done, which results in your website ranking lower overall.
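If you already have several near-duplicate pages, one standard remedy (my own suggestion, not part of the advice being quoted) is a canonical link telling search engines which single page should receive the ranking signal:

```html
<!-- Placed in the <head> of each near-duplicate page; example.com is a placeholder -->
<link rel="canonical" href="https://www.example.com/video-cards">
```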

Misleading advice


Keyword research

As mentioned under incorrect advice above, Google and other search engines understand synonyms, including whole-phrase synonyms. This analysis has only become more accurate in the last couple of years with the advent of LLMs like ChatGPT.

Just write your content for that page; Google's parsing of your text will pick up the common terms and phrases for the market you are targeting and adjust the score accordingly. Specific keyword research to find out what terms other people use most is therefore a little pointless. That said, if the market has specific terminology, such as 'rolling edit' or 'LUT' in videography, use those terms; otherwise word your content the way you would when talking to a client.

 

Content/Layout doesn't matter as much as your text (or keywords)

While the text is important, the content and layout of your site can affect your ranking significantly. If your site loads really slowly, most search engines will rank it lower than similar-scoring sites that load faster.

Likewise, if the web crawler from a search engine finds few links to a page, that page will be ranked lower within your site, so never hide your pages from navigation.

These sorts of issues also have the secondary effect of increasing your bounce rate, which in turn affects your ranking on search engines.

Ask yourself: how often have you waited on a slow website before going back to the search results and finding a competitor? How often have you tried to find a page that is hidden away behind other pages before giving up and going elsewhere?
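Two small, widely supported HTML attributes go a long way toward the load-speed side of this; a minimal sketch with placeholder paths:

```html
<!-- Defer non-critical scripts so they do not block the first render -->
<script src="/js/site.js" defer></script>

<!-- Let the browser delay off-screen images until the visitor scrolls to them;
     explicit width/height also stop the layout jumping as images load -->
<img src="/img/gallery-01.jpg" alt="Wedding ceremony" loading="lazy"
     width="800" height="600">
```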

 

You need landing pages that then feed visitors to more specific content pages

This is one of the more common bits of advice I see companies giving out.

You should think of every page as a landing page.

Forcing people to go through a particular page or sequence of pages (when it isn't a form) to get to another page is annoying; a visitor should be able to get where they need to go from anywhere within your site.

Hiding a page from navigation simply lowers the ranking of that page (and your site by extension) and annoys visitors, as mentioned above. Search engines will also rank your site lower for keywords on the 'hidden' pages if they find this, deeming the information less important because it cannot be reached from the rest of your site.
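A sketch of what 'reachable from anywhere' means in practice: one navigation block, linking to every top-level page, included on every page of the site (the page names are placeholders):

```html
<nav>
  <a href="/">Home</a>
  <a href="/services">Services</a>
  <a href="/gallery">Gallery</a>
  <a href="/blog">Blog</a>
  <a href="/contact">Contact</a>
</nav>
```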

 

You need to use social media to increase your page rankings and increase your customer base

While Google and other search engines will index Facebook, Twitter, and the like, they index those as pages on the social media site; this has little effect on the search ranking of your own website.

While setting up a new business, you may feel that going to social media platforms improves your chances of finding customers. This is true, but only to an extent: only initially, or while your competitors are not advertising on that platform. Why? Social media platforms make money in two ways: selling user data, and selling advertising directly from advertisers to the customers in a market.

If you are on social media and actively encourage your client base to find you there, you are essentially handing your clients over to Facebook, Twitter, or whatever site you are using. The platform will then sell advertising to the highest bidder in your market, who can advertise directly to those customers. So in a few short months you may have given away your clients to any competitor willing to pay more than you.

You can also use the above to your advantage: in markets where the majority of clients are on Facebook (horse industries, for example), it may make sense to have a presence there, but you need to be judicious and redirect clients to your website whenever possible. When adding blog entries, add them to your website and have the site make a post to the social platform containing just the image, description, and link to your blog page, so if someone is interested and clicks it, it takes them back to your website.
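That post-a-preview-and-link-back approach is what Open Graph tags were designed for; a minimal sketch with placeholder titles and URLs, which most social platforms will turn into a card linking back to your site:

```html
<meta property="og:title"       content="New blog post title">
<meta property="og:description" content="A short teaser, not the full article">
<meta property="og:image"       content="https://www.example.com/img/feature.jpg">
<meta property="og:url"         content="https://www.example.com/blog/new-post">
```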

Similarly, there are legal issues with using things like the Facebook pixel, in that it collects a lot of data about your customers and their contacts and subsequently tracks them. GTM also collects identifiable data about your site's visitors, so make sure to anonymise it on visitor request (or, if you are marketing to the EU, turn it off entirely).

 

You should use this 3rd-party traffic analysis tool

Again, you are giving your customers and their data to a 3rd party. This raises several ethical and legal issues, including GDPR if you operate in Europe. These are no joke: in one example, site owners were fined millions of euros because Google's tag JS was not anonymising data.

Besides the legal issues, you may have technical issues: 3rd-party cookies are easily blocked, or fed false information, by browsers and plugins. My own system, for example, feeds random data into these cookies, which causes all kinds of weird and incorrect data to show up in analytics results. Add to that the fact that many of these tools require lowering or removing security features on your website (GTM and HotJar both require allowing unsafe inline JS scripts).
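To make that trade-off concrete: a Content-Security-Policy like the sketch below blocks inline scripts entirely, and tools like GTM only work once you weaken it (for example with 'unsafe-inline'):

```html
<!-- A strict policy: only scripts from your own origin may run -->
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self'; script-src 'self'">
```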

3rd-party analytics can be good, but mainstream browsers are increasingly blocking 3rd-party cookies by default, meaning you take on full legal responsibility, as you have to set the cookies as 1st-party cookies now.

Using 1st-party analytics (where your own website creates and receives the cookie data) is a much better method. Better yet is a plugin for your CMS that monitors your website's visitors and builds up analytics on your own server, without sharing them remotely. These usually allow more advanced features such as A/B testing as well.

 

Get as many backlinks as possible

Backlinks are links from other websites to your website. This advice should always be amended to: get as many quality backlinks as you can.

Google and other search engines have a trust rating. This rating is high for universities and government bodies, so if you can get these bodies to link to your website, your trust rating will improve, and this dominoes through to your overall search result ranking.

But not all links are created equal: if you are linked to by a lot of spam sites, or other sites that are simply 'link sites', then your trust rating will drop, and your ranking will be affected negatively.

 

Use Google Tag Manager to set up tags on your website

This is OK insofar as it helps Google understand what your data is (as long as you use the GTM JavaScript on your site), but it means manually going in and adding the tags every single time you create or change a page.

There are also other search engines that never see this information, and Google looks at results from those as well when building rankings. Google's tags are also a bit limited and often do not meet (or require extras on top of) the official microdata standard found at schema.org.

Instead, we can make the site output the schema.org microdata itself, so ANY search engine or screen reader can read it and understand the type of data on that page.

Doing it this way means that when you make a new page, the site automatically knows it is a blog post, a new gallery, or a new product, along with the author, the date it was posted, the description, the feature image associated with it, and so on.
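As a sketch of what that output can look like (the names, date, and paths are placeholders), schema.org microdata for a blog post marks up the type, author, and date directly in the HTML the CMS already generates:

```html
<article itemscope itemtype="https://schema.org/BlogPosting">
  <h1 itemprop="headline">SEO, the used car market of website advice</h1>
  <meta itemprop="datePublished" content="2023-06-05">
  <span itemprop="author" itemscope itemtype="https://schema.org/Person">
    <span itemprop="name">Author Name</span>
  </span>
  <img itemprop="image" src="/img/feature.jpg" alt="Used cars on a forecourt">
  <div itemprop="articleBody">…</div>
</article>
```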

 

Pages should be low level in the URL

This means that instead of [site url]/computer parts/video cards/[brand] you would just have something more like [site url]/[brand].

There are two arguments I commonly hear in favour of this:

  1. 'lower level pages are ranked higher on google' 
  2. 'it's easier for visitors to get to when typing in the address manually'.

Argument 1 is outright false. The ranking of the page within your site may be lower if you do not change its importance, but it does not change your site's overall ranking (depending on the search term used). Flattening your URLs also defeats one of the main techniques most search engines use to understand your website: the relationship between the title, the URL, and the page data. In the first example above, Google knows that we are looking at computer parts, specifically video cards, and more specifically that this page is where [brand] sits.

So if someone enters the search term Nvidia 4070ti, the search engine (which already associates this term with a video/graphics card) knows that [url]/computer parts/video cards/Nvidia may be a more appropriate page to land on than a shallower one, meaning that page may be ranked higher than a competitor's page that just has [url]/Nvidia as its address.

The inverse is also true: if someone searches for computer components and your site is listed, the broader computer parts page will be ranked as a higher result for your website than the specific brand page for specific video cards, even if the deeper pages use the term more often.

Argument 2 is technically true, but how often have you typed in a website address along with the particular page you wished to go to? Was that site easy to navigate normally? Visitors having to do this is often a sign you need to optimise your site a little more.
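Putting the two signals together (the URL and shop name are placeholders), a deep page hands the search engine the whole category tree at once:

```html
<!-- URL: https://www.example.com/computer-parts/video-cards/nvidia -->
<title>Nvidia | Video Cards | Computer Parts | Example Store</title>
```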

What can I do?


Keep it simple, stupid. SEO can be a fine line to walk in order to get the most out of it, but by keeping the site structure as simple and easy to use as possible, you make it easy for both your visitors and the web crawlers from search engines to find what they need.

Here are the recommendations I follow:

  • Don't hide any pages from navigation.
  • Make navigation easy
  • Each page should have one focus and shouldn't be too long; do not duplicate information. (Visitors rarely scroll down unless they are already interested.)
  • Never autoplay music or video; hotspot maps of any page that does this show that nearly every visitor's first click is the stop button. This also seems to affect search rankings.
  • Use microdata, assistive tags, and metadata. These do more than Google's tags, and once set up you do not have to do much manually, while allowing search engines, assistive devices, and social platforms to see what the data is.
  • Don't hide URLs or titles: slashes in the URLs and pipes ( | ) in the titles of pages let search engines know which categories the data on those pages sits under.
  • Use headings: one h1 per page, which should be the title of the page, then h2, h3, h4 in descending order of importance. Search engines look at these when crawling your site.
  • Sensible content: use images that fit conceptually with what you are talking about, and make sure the images are labelled and have alt text.
  • Make a responsive layout. A truly responsive website should use the font size for breakpoints and sizing, because some devices alter the font size while pixel and percentage values do not change, and sizing from the font size allows assistive devices to zoom properly.
  • Crop images and use formats like WebP/WebM with fallbacks to JPG/PNG (see the sketch after this list). WebP and WebM are very efficient codecs for images and video respectively: a 100 kB JPG can often be displayed as a 30 kB, sometimes even 5 kB, WebP. They are supported in every browser except Safari on older systems (Apple machines pre Big Sur). CMS croppers are usually fairly efficient and often cache their output as part of the CMS, so you get responsive, good-looking images that load very quickly, which can improve your rankings.
  • HTTPS & HTTP/2, plus every security measure you can apply to stop XSS and other hijack attacks. These actually make your site run quicker as well, and plain HTTP is now flagged as insecure by default in most browsers.
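The WebP-with-fallback advice above maps onto the standard picture element; a minimal sketch with placeholder paths, where the browser uses the first source it supports:

```html
<picture>
  <source srcset="/img/hero.webp" type="image/webp">
  <!-- Older Safari falls back to the JPG -->
  <img src="/img/hero.jpg" alt="Video editing workstation"
       width="1280" height="720" loading="lazy">
</picture>
```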