Five steps to clean up your on-page act
If you were to take a survey of blog posts on SEO in the last year, I’d be willing to hazard a guess that over 60 per cent concern themselves with content marketing and online PR – what we used to call off-page SEO. Attracting links to your website using good quality content is crucial to SEO, but it’s worth remembering that it’s immaterial if your site is not technically healthy.
By that I mean your website needs to be set up so it’s easy for Googlebot to crawl, while also being optimised for users. Often even the smallest on-site changes can have a significant impact on your search rankings, so they should be at the top of your to-do list. If you haven’t already, here are five steps you should take to optimise your on-page SEO.
Five steps to optimise your on-page SEO
- 1. Stop stuffing keywords
Stuffing keywords into your page titles, meta descriptions, menus, internal link titles or the body of your content is Black Hat 101, but amazingly it still occurs. Google does not take kindly to websites that appear to be purposely over-optimising for specific search terms, and by doing this you run the risk of having a keyword rankings filter applied to your site. This will prevent your website from ranking to its true potential.
Hidden text is another common form of over-optimisation. This Black Hat practice has moved on from white text on a white background to include hiding text using CSS techniques that reveal it behind drop-downs.
Doing this purely for SEO purposes should be avoided, as should hiding vast swathes of text behind drop-downs at the bottom of the page – it’s unnatural and of no use to the user.
Page titles at an individual level are probably the most significant on-page factor. Avoid repeating variations of your head terms in your title tags and consider how they will appear to the user. The page title usually appears in the SERPs as the hyperlink you click to visit the site so it should be conversion focused, logical and interesting for a human to read.
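As a hypothetical illustration (the product and brand names here are placeholders, not taken from any real site), a conversion-focused title tag might look like this:

```html
<!-- Descriptive and readable: the head term appears once, brand last -->
<title>Men's Waterproof Hiking Boots | Example Outdoors</title>

<!-- Avoid stuffed variations like this:
<title>Hiking Boots, Boots for Hiking, Cheap Hiking Boots, Buy Hiking Boots</title>
-->
```

The first version reads naturally as a SERP link a human would want to click; the commented-out version repeats variations of the same head term and risks the over-optimisation filter described above.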
- 2. Avoid duplicate content, avoid duplicate content
There are scenarios where duplicate content is required for legitimate reasons; your site may have localised language pages (e.g. UK and US English) or pages that are accessible by users and search engines via multiple complex product paths with different URLs.
Search engine bots attempting to index your site also have a problem with duplicate content that is caused by tracking parameters and session IDs appended to URLs.
Duplicate content will harm your search rankings, so you should identify all duplicate pages and, if they are required for genuine reasons, use canonical tags to highlight this.
Canonical tags give you complete control over the URL you wish to be returned in search results. Essentially, they inform Google where the original or preferred version of your content can be found and tell search engines to ignore any duplicated pages. They also ensure any link popularity is consolidated on your preferred page.
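For example, if the same product page is reachable via a tracking parameter or an alternative product path, a canonical tag in the `<head>` of each duplicate points search engines at the preferred version (the domain and paths below are placeholders):

```html
<!-- Placed on every duplicate URL, e.g.
     https://www.example.com/shop/boots?sessionid=123
     https://www.example.com/sale/boots -->
<link rel="canonical" href="https://www.example.com/shop/boots" />
```

Whichever route a crawler takes to the content, the canonical URL is the one that accumulates ranking signals and appears in the results.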
- 3. Check your back-link profile
This is the section where those people who have indulged in bad link building practices previously begin to feel uncomfortable. There is no escaping your past SEO indiscretions, however. Use Moz’s Open Site Explorer, Majestic SEO and Google Webmaster Tools to identify the sites that are linking to your website, and if you discover a large volume coming from low quality sites, link or article directories, you need to get these removed.
If you’ve been mixed up in Black Hat practices previously, then the chances are these kinds of links will number in the hundreds, probably thousands – so it’s not feasible to remove every single one. You should, however, document your attempts, and if all else fails you can submit a request via the Google Disavow Tool.
If your website is linked to from these low quality sites, you run the risk of being judged guilty by association by Google. If your backlink profile is not overly unhealthy, don’t panic. Far too many people have thrown the baby out with the bathwater when it comes to link removal, asking for perfectly good links to be taken down. This is only going to be detrimental to your rankings.
Avoid blanket emails to webmasters; instead, thoroughly investigate your backlink profile and accurately identify which links are bad and which are good. Links are not bad – only links from bad websites are bad!
- 4. Redirects and footer links
Links at the bottom of the page will be devalued pretty sharpish by Google, so avoid putting too many there. The footer of a page – usually reserved for privacy statements, legal information and the like – should only be used for secondary navigation. If a link is important, it should sit higher up the page, as this will be crawled first and will be more beneficial in terms of SEO.
Another common problem is redirects. When new pages are created to replace old ones, make sure each old page redirects traffic to its replacement with a 301 (permanent) redirect.
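As a sketch, on an Apache server a 301 redirect can be declared in the site’s `.htaccess` file using mod_alias (the paths and domain below are placeholders):

```apache
# Permanently redirect the retired page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page/
```

Because the redirect is permanent, search engines will transfer the old page’s ranking signals to the new URL rather than treating it as a dead end.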
- 5. Speed up your load times
We all know how annoying it is waiting for a slow website to load up, watching either that blue circle chase itself or being tormented by an egg-timer on your screen – well Google feels the same way.
Search engines use site speed as a ranking factor. Ultimately search engines want the user to have the best experience possible and site speed plays a big part in this.
To improve your load times, assess the file size of any images on your site and reduce these as far as possible. Also strip out any irrelevant ‘junk code’ and don’t use too much display advertising, as both will slow down your site.
Thanks to Jimmy McCann for sharing his advice and opinions in this post. Jimmy is Head of SEO at Search Laboratory in Leeds. You can follow him on Twitter or connect on LinkedIn.