8 Tips For Better SEO

This article offers 8 tips for better SEO: tips you may not have heard of, and not the generic talking points you’ll see everywhere on social media.

This list is compiled from SEO optimisations I have tested over the years.


Search volume isn't everything!

Why?

Search intent is far too variable to assign value to a keyword based on search volume alone.

Sure there are high volume, high click terms. But there are also many high volume, low click terms. So when doing keyword research, keep this in mind.

Here’s an example:

I’ve been ranking #1 for a 300-monthly-volume term for some time.

I’m averaging 4 clicks a month.

Australian baseline organic CTR metrics: Position #1 = 30.12% of clicks. Source: advancedwebranking.com/ctrstudy/

In theory, at that CTR, I should be getting around 90 clicks per month!
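To sanity-check a keyword like this, the maths is just volume multiplied by the baseline CTR for your position. A throwaway sketch using the numbers above:

```python
# Expected monthly clicks = monthly search volume x baseline CTR for the ranking position.
monthly_volume = 300
baseline_ctr_position_1 = 0.3012  # Australian #1 average, per advancedwebranking.com

expected_clicks = monthly_volume * baseline_ctr_position_1
actual_clicks = 4

print(f"Expected: {expected_clicks:.0f} clicks/month, actual: {actual_clicks}")
# A gap this big suggests the term attracts far fewer clicks than its volume implies.
```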

I’d call these types of keywords window shopping keywords.

This is where a user is gauging options and comparing results at a glance. Think:

  • generic category keywords for ecommerce such as: “shampoo”
  • generic service keywords, like: “plumber”

High volume, low click can also happen with service-type terms that include a location modifier, where purchase intent is low.

These keywords are normally higher in volume. A few examples:

  • marketing “city”
  • video production “city”

Some other examples where informational intent may still be very high, as a user is at the start of their journey:

  • hotels in “city”
  • things to do in “city”

With all these, expected CTR is likely lower than the industry averages.

Compare this to:

I’m also ranking #1 for a longtail product keyword with 20 searches per month.

This keyword is currently averaging 64 clicks per month.

What! How does that even work?

Well, the longtail is triggered in many “People Also Ask” results too.

So while the direct volume is low, the volume from the parent keyword set is high!

Don't solely rely on tools for technical SEO

By now you may be very familiar with “technical audits” within tools like Ahrefs and Semrush.

While these tools are fantastic all round, and their audits are very helpful, not everything they flag is a genuine issue.

These reports will list the issues but not tell you anything about the root cause.

Take duplicate meta descriptions as an example. This issue itself is a super low priority fix that often gets flagged as an error. This flag can make it seem dramatic to clients who don’t know the ins and outs of SEO.

But there may also be more severe reasons why duplicate descriptions appear, such as duplicate URLs existing with and without a trailing slash.
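If you want to check for that particular root cause yourself, a quick sketch like this will surface trailing-slash duplicates. It assumes you have a plain list of crawled URLs (the file name here is made up):

```python
# Find URLs that exist both with and without a trailing slash in a crawl export.
# Assumes crawled_urls.txt contains one URL per line.
with open("crawled_urls.txt") as f:
    urls = {line.strip() for line in f if line.strip()}

duplicates = sorted(u for u in urls if not u.endswith("/") and u + "/" in urls)

for url in duplicates:
    print(f"{url}  <->  {url}/")
```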

So when using these tools, you need to understand the why behind some of the issues. Good technical SEO is about understanding why issues occur and the impact they have on moving the needle for a website.

Cull pages on your website for better performance

This might sound counterintuitive, but trimming down your website can lead to SEO improvements.

There are 2 main reasons why this works:

  1. Query counts
  2. Click velocity

Query counting is based on the number of terms a single URL is ranking for.

Click velocity is a trended click metric over time, up or down.

When you have pages with declining or no click velocity, that’s a good indication of dead-weight content.

You want every page on your website to serve a purpose. So if there happen to be many dead-weight pages, deleting them entirely is often a good solution.

If the content is just dated, look to update or consolidate it into a newer piece.
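As a rough way to shortlist dead-weight candidates, you can compare clicks between two periods in a Search Console page export. A minimal sketch, where the file and column names are assumptions you’d map to your own export:

```python
import csv

# Flag pages whose clicks are zero or falling sharply - candidates for culling or consolidation.
with open("gsc_pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        current = int(row["clicks_last_3_months"])
        previous = int(row["clicks_previous_3_months"])
        if current == 0 or current < previous * 0.5:
            print(f"Review: {row['page']} ({previous} -> {current} clicks)")
```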

Quickly personalise reporting dashboards

While not directly an SEO tip, this one is still useful: every SEO needs to do reporting.

This is a super simple one. If you use Looker Studio, the platform allows you to apply a visual theme based on an image.

That means you can upload a client’s logo, or your own, and base the entire report’s visuals on it. Here’s how to do so:

  1. Click “Edit” in a Looker Studio report
  2. Click on “Theme and layout”
  3. Click on “Extract theme from image”

You can then either upload an image or paste in an image URL.

This is a fast way to brand up reports for your clients. Simple!

A better host will fix the majority of site speed issues

Chasing that 100/100 score in Core Web Vitals?

Firstly, don’t. It hardly has any impact on your SEO performance. Unless of course your site is so slow it’s unusable.

Secondly, a great host will fix the majority of load speed issues for you.

Cloud hosting with inbuilt CDNs, minification and caching can quickly optimise these technical SEO elements for you.

So before you spend time analysing Lighthouse reports and Core Web Vitals metrics, review your hosting.
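A quick way to see whether your host is already doing that heavy lifting is to look at the response headers it sends. A rough sketch using Python’s requests library (the URL is a placeholder):

```python
import requests

# Check whether a page is served compressed and cached (typical signs of a CDN-backed host).
response = requests.get("https://example.com/", headers={"Accept-Encoding": "gzip, br"})

for header in ("content-encoding", "cache-control", "age", "server"):
    print(f"{header}: {response.headers.get(header, '(not set)')}")
```

If content-encoding and cache-control come back empty, that’s a hint the hosting setup is leaving easy wins on the table.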

Still use pagination tags

For a while, Google told us they had deprecated pagination tags.

However, in 2024 they brought them back into their own search results.

Pagination tags are great for:

  • crawling bots
  • user navigation

Both are useful for SEO as they improve click depth and content discovery. This article on pagination tags will help you further understand best practices.
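If your platform doesn’t output pagination tags for you, they’re simple to generate. A minimal sketch, where the URL pattern is just an example:

```python
def pagination_tags(base_url: str, page: int, total_pages: int) -> list[str]:
    """Build rel prev/next link tags plus a self-referencing canonical for one paginated page."""
    url = base_url if page == 1 else f"{base_url}?page={page}"
    tags = [f'<link rel="canonical" href="{url}">']
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

print("\n".join(pagination_tags("https://example.com/shampoo/", page=2, total_pages=5)))
```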

If you have a large website, pagination is even more important.

The reason is that it helps search engines discover content more effectively.

Infinite scroll can be great for users, but it’s not great for product discovery, especially if you have products buried hundreds of results deep.

AI content can be very good and it's here to stay

By now people know AI is here to stay, as we’ve seen recently with Google’s AI Overview rollout.

Naturally, SEOs abused AI to pump out masses of content. As a result, many websites tanked.

I also tested this on a dummy website to learn first-hand. I created 250 articles on pet health in a few weeks. They ranked for a little while, until one day it was like Google turned my website off. Rankings dropped, the site was de-indexed, and now it’s dead. So be warned: pure AI content does get punished.

Here’s what to avoid with AI content:

  • Articles generated from a single bare prompt, with no original research
  • High publication rate. Do not publish 100 articles in a week. Google Bot is smart enough to know it’s not natural
  • YMYL (Your Money or Your Life) content, yes, even pet health

Here’s how to use AI for SEO content effectively:

  • Feed it datasets and get it to summarise
  • Dump in original research, such as customer feedback, and get it to summarise
  • Get AI to reformat existing content for better information gain rate
  • Use AI to summarise news
  • Get AI to programmatically insert internal links
  • Use it to analyse reports and ask for call outs
  • Ideate. AI is good at creating chunky lists very quickly

For me the key takeaway is to use AI as a supplement to your human copywriting, not a replacement. Let it do some heavy lifting with your first party data. Let it review and reformat original content. Use it for new ideas.
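The internal linking item above is the easiest one to sketch. AI’s job would be suggesting the keyword-to-URL map; the insertion itself is plain string handling (the map below is a made-up example):

```python
import re

# Hypothetical keyword -> URL map, e.g. suggested by an AI pass over your existing content.
link_map = {
    "technical SEO": "/blog/technical-seo-basics/",
    "core web vitals": "/blog/core-web-vitals/",
}

def insert_internal_links(html: str) -> str:
    # Naively link the first occurrence of each keyword.
    # A production version should skip text already inside <a> tags and headings.
    for keyword, url in link_map.items():
        pattern = re.compile(rf"\b{re.escape(keyword)}\b", re.IGNORECASE)
        html = pattern.sub(lambda m: f'<a href="{url}">{m.group(0)}</a>', html, count=1)
    return html

print(insert_internal_links("<p>Good technical SEO starts with crawling.</p>"))
```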

Content format & medium are important ranking benchmarks

The best way to gauge the kind of content you need to produce to rank for a keyword is by looking at the current Google SERP.

What Google is rewarding is more important than who Google is rewarding with top spots.

Things to look for:

  • SERP features, namely featured snippets, People Also Ask results and videos. These give you a quick insight into search intent
  • Content format. Are the high ranking results long form, list posts, videos, comparisons? Take note
  • Supplementary media like images, infographics and videos are key elements

All of these things play into the type of content you need to produce to meet competitive benchmarks.

If all top ranking results have a video, and there is a video pack on the SERP itself, then you need a video.

If all the top ranking content is 2,000+ words, you likely need a longform article.

Crawling is the #1 technical SEO thing to check

SEO all starts with how your site is crawled.

Crawling → Indexation → Rankings → Traffic, Leads & Revenue

Step 1 in any technical audit should be to review crawl bloat & indexation coverage.

I’ve seen too many sites with 50%+ of discoverable URLs as non-indexable bloat. Trim it!

Imagine someone hands you a book with 1,000 pages, but only 100 pages have text.

And in order to find those 100 pages, you must flick through every single page one by one. Wouldn’t that suck?

Don’t put GoogleBot through the same torture.

Use Google Search Console and Bing Webmaster Tools to validate crawling and indexation errors.
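To put a number on crawl bloat, a rough sketch like this works against a crawl export. The file and column names are assumptions to map onto whatever your crawler produces:

```python
import csv

# Work out how much of the discoverable site is actually indexable.
with open("crawl_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

indexable = sum(1 for row in rows if row["indexable"].strip().lower() == "true")
total = len(rows)

print(f"{indexable}/{total} URLs indexable ({indexable / total:.0%})")
print(f"Potential crawl bloat: {total - indexable} URLs")
```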

Conclusion

Get out there and start testing some of these SEO tips.

If you have further questions, please feel free to reach out to us.
