
The Ultimate Guide to On-Page Optimization

We work with a great many very talented website designers and developers who create first-rate websites for their clients – aesthetically superb, but not always as search engine friendly as they could be. That can make the project more expensive for the client, because it means that we as the SEOs have to go back under the hood of their shiny new website to make tweaks for better on-page optimization.

This guide is intended to be a cheat sheet that allows you to really bake SEO into the projects you are working on – saving your clients time and money in the long run, making the process much more seamless for them, and making your life a heck of a lot less stressful. Well, at least there will be no more SEOs bugging you to make changes!


Website Factors & Site Architecture Factors

URL Structure – the golden rule when it comes to URLs is to keep ’em short, keep ’em pretty and include the page-level keyword. Never ever build on a platform that doesn’t allow for URL rewriting, because not only are ugly URLs, well, ugly, they are also a mistake SEO-wise. Short, sexy and search-friendly URLs make it easy for the user to share with their social networks or link to – not to mention how much easier a logical URL structure makes website management!
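
To illustrate, here is a minimal rewrite sketch, assuming an Apache server with mod_rewrite available – the path and parameter name are purely hypothetical:

```apache
# .htaccess (hypothetical) – serve a short, keyword-rich URL instead of a query string
# /services/web-design  ->  index.php?page=web-design
RewriteEngine On
RewriteRule ^services/([a-z0-9-]+)/?$ index.php?page=$1 [L,QSA]
```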

Website structure accessibility – inaccessible navigation is a real headache when it comes to SEO. A navigation wrapped in JavaScript is bad and a menu built in Flash is worse. Now I bet you are thinking, “JavaScript makes a website more user friendly because it creates things like drop-down menus, helping the user make better sense of the page options.” This might be true, but we need to balance usability with search engine friendliness. Firstly, we shouldn’t forget that a slick-looking menu/navigation bar could render a website unusable on certain devices and in certain browsers (try switching off JavaScript or Flash). From a strictly SEO perspective, it could also mean that pages deep within your site aren’t being indexed, because the only links to them sit in a menu wrapped in code that the spiders can’t decipher.
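
One hedge against this, sketched below with invented menu items, is to build the navigation as plain HTML links first and let JavaScript enhance it afterwards – the drop-down behaviour is a bonus, but the links remain crawlable even with scripting switched off:

```html
<!-- Plain, crawlable links first; a script can turn this into a drop-down later -->
<nav>
  <ul id="main-menu">
    <li>
      <a href="/services/">Services</a>
      <ul>
        <li><a href="/services/web-design/">Web Design</a></li>
        <li><a href="/services/seo/">SEO</a></li>
      </ul>
    </li>
    <li><a href="/portfolio/">Portfolio</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```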

Considered use of JavaScript – following on from the point above… whatever Google says, there is clear evidence that the search engine struggles to handle JavaScript. Reams and reams of unreadable code could mean Googlebot heads somewhere else rather than crawling any deeper into your site. It might also cause other issues, like crawl errors and damage to your website’s crawl rate – neither of which is a good thing!
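
One simple habit that helps, shown here as a hypothetical before/after, is keeping bulky scripts out of the page source altogether so the bot isn’t wading through code to reach your content:

```html
<!-- Before: reams of inline JavaScript pushing the real content down the source -->
<script>
  /* ...hundreds of lines of menu, slider and tracking code... */
</script>

<!-- After: the same behaviour, loaded from an external file the spider can ignore -->
<script src="/js/site.min.js"></script>
```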

Canonical URLs – use the rel="canonical" attribute to specify your preferred URL for a page. This is useful in situations where almost identical pages appear at different URLs because of something like a category choice or session ID being added. It is important to tell Google and Bing which page is the one they should index and pass all relevant link juice and authority to. Failure to implement canonical URLs can mean duplicate content issues but, more crucially, loss of rankings as search engines divide link juice and page authority between the copies of the page – something which could have been avoided if the correct page had been stated in the rel="canonical" tag.
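
As a quick illustration (the URLs are invented), the tag lives in the head of each duplicate version and points at the page you want indexed:

```html
<!-- On /shoes/?sessionid=12345 and /sale/shoes/ alike, declare the preferred URL -->
<link rel="canonical" href="http://www.example.com/shoes/" />
```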

Unique meta titles and descriptions – to many, on-page optimization is just about changing a meta title here or there… hopefully this list will show you otherwise. Whilst making meta title and description changes might feel like SEO from 1997, in my experience it is still a part of the bigger on-page optimization jigsaw. In my mind, it is quite a simple step in the on-page optimization process: a unique title and description for every page, front-loading page-level keywords in a natural, non-spammy way. There are of course other meta tags that you can include, e.g. ‘keywords’, and whilst I am sure some people will disagree with me on this, I only see value in optimizing the titles and descriptions – tags like the keywords meta tag have been abused to the point of being almost a complete waste of time. Google might not always use the title and description you give a page, but at least you’ve told the search engines what the page is about, and if Google does decide to use your title and description, you have some influence over encouraging a user to come to your website over the other choices in the SERPs.
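
A minimal sketch of what this looks like in the head of a page – the business and keyword are invented, but notice the page-level keyword is front-loaded and the copy still reads naturally:

```html
<head>
  <title>Web Design in Manchester | Example Studio</title>
  <meta name="description" content="Web design in Manchester from a small, experienced team – fast, search-friendly websites built around your business goals.">
</head>
```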

Robots.txt file – a good starting point for robots.txt best practice is this guide from SEOmoz. It is always worthwhile ensuring a robots.txt file doesn’t contain any unwanted directives for the search bots – even if you haven’t added anything yourself, someone or something working on the site before might have.
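
For reference, a deliberately simple robots.txt might look like the sketch below (the blocked paths are hypothetical) – the point is to check that nothing important has been disallowed by accident:

```
# Example robots.txt – allow everything except admin and internal search results
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: http://www.example.com/sitemap.xml
```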

XML Sitemap – fairly common practice nowadays but still worth a mention. An XML sitemap should always be available. It helps make the search engines aware of all the pages on your website and increases the likelihood of faster inclusion in the index for newer pages.
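
A bare-bones entry follows the sitemaps.org protocol and looks something like this (URL and date invented) – most CMSs or plugins will generate and update it for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/services/web-design/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```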

Website speed – I’m sure this issue is right at the forefront of your mind when it comes to building websites because it is a really hot topic right now. Google recently enabled all webmasters to monitor page loading speed directly from their Google Analytics dashboard; if they’ve made it that easy for you, you can bet they are using this data as part of their calculation as to where to rank your website. Google loves to improve user experience, and since a fast-loading page is definitely a better user experience, I can see this playing an increasing role in the SEO of the future, particularly in competitive markets. Also, Amazon.com conducted a study and found that for every 100 millisecond increase in page load time, their sales decreased by 1%, so the reasons for improving page speed go way beyond just SEO! There are multiple ways to improve site speed, so I won’t go through them all here; all I will say is code responsibly, choose a good host and set up a CDN (content delivery network) if your client is targeting users worldwide.
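
As one small example of ‘coding responsibly’, compression and browser caching for static assets can be switched on at server level – this sketch assumes an Apache host with mod_deflate and mod_expires enabled:

```apache
# Hypothetical .htaccess snippet – compress text assets and cache static files
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css  "access plus 1 week"
</IfModule>
```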

Content Factors

I was in two minds as to whether to include this section in the final guide because, as a designer, you might have limited control over content factors. But then again, in my experience designers certainly have some responsibility for either the content itself or for formatting and publishing it, so I feel these factors are worth mentioning.

Content language – Google uses the language the text content has been written in as a reference point for the relevance to the user making the search query. If you are targeting an English speaking country then content should be written in English. Obvious really but it does reinforce the need for localized websites if you are helping a client to target other countries that speak different languages.
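
If you are building localized versions of a site for a client, declaring the language explicitly gives the engines an extra hint – a small sketch with invented URLs:

```html
<html lang="en-GB">
<head>
  <!-- Point the engines at the equivalent pages for other languages/regions -->
  <link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
  <link rel="alternate" hreflang="fr-fr" href="http://www.example.fr/" />
</head>
```
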
Content uniqueness – one phrase I am sure you are bored of hearing is ‘create unique content’ if you want to do well in the search results. People keep saying it because it is true. Unique content sends the right kinds of quality signals to Google because more users engage with it and talk about it, they share it, and it generates more links. Encourage your clients to invest in useful, unique content that offers real value to the reader or, if necessary, take responsibility for this yourself.

Amount of content – the recent Google Panda algorithm update has had an impact on what could be considered the right ‘amount’ of content. My suggestion is that you encourage clients to consolidate existing content or target new content creation efforts towards smaller but higher quality hubs of content. Help and advise clients to remove pages that are basically just a carbon copy of another page on the site but with a few different keywords.

Unlinked content density – pages that contain a lot of links, particularly to external pages, never look good in the eyes of Google. It gives off a link farm/poor-quality directory/paid link operation type of vibe, which is damaging not just to the page but also to the website and to the pages it links to. Whilst there isn’t an optimum density, as a rule of thumb the number of links should feel natural and be well balanced with unlinked text or other types of content. If all the links are absolutely necessary, consider breaking them down into smaller categorized pages to improve the unlinked content density.

Is the content well-written? – there isn’t any direct evidence that Google penalizes a website for poor spelling or grammar. That being said, a badly written page is off-putting for the user and will therefore send the wrong kinds of signals to readers or potential customers, and since Google is incorporating user feedback like bounce rate into its algorithm, keeping the user happy is vital.

Expertise and depth of content – Google is smart, and since it is on a mission to organize the world’s information, I would be willing to bet that it has already hit the mark, or is close to it, when it comes to understanding how deep a piece of content goes and whether the author is an expert or not. Algorithmically it could probably quite easily detect whether key topics within the theme have been discussed and whether there are any factual inaccuracies, meaning it is more important than ever to really be the expert.

Keyword location and repetition – it is widely accepted that Google places more emphasis on content and links that appear higher up a page. This is based on the logic that if something is important, it is likely to be included first. My suggestion is always (provided it looks natural) to front-load the heading of the page with the keyword being targeted, then to mention the keyword within the first paragraph, and then, depending on the length of the page, at selected intervals throughout the text. The key is to keep it natural: there’s no optimum keyword density, but there certainly is such a thing as over-optimization and keyword stuffing, both of which will see the page, and possibly the site, subject to a penalty. Interweaving keywords into text so that it is good for both user and search engine can be quite challenging, but it is worthwhile.
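
As a quick sketch (the keyword and copy are invented), front-loading in practice looks something like this:

```html
<h1>Wedding Photography in Leeds – Relaxed, Natural Coverage</h1>
<p>Looking for wedding photography in Leeds? We capture the day as it happens,
   without the endless staged group shots – here’s how we work.</p>
```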

Spam keyword inclusion – if you run an adult-themed website then of course this is unavoidable, but be vigilant about quite innocent and accidental inclusion of these keywords on what would ordinarily be a very family-friendly website. This will be a real turn-off for the search engines, both because of safe-search filtering and because they may suspect your website has been violated by hackers who have injected spam keywords and links.


Internal Linking Factors

Number of internal links – one of the reasons that Wikipedia ranks so well is its internal linking structure. Of course, each of the pages wouldn’t hold so much weight if it weren’t for the overall authority of the website, but the online encyclopedia has still mastered internal linking best practices. It adds a link to another page on the site wherever it feels natural and will be useful to the user, allowing them to flow through the website. You can take this concept and apply it to your client’s website, helping them to increase pages per visit, improve user experience and ultimately improve page rankings through increased link volume. They may ‘only’ be internal links, but they will still serve to enhance your off-page link building efforts.

Anchor text of internal links – anchor text is still an important factor in link value. It will likely decrease in importance thanks to the abuse of it, but for now anchor text still rules. Use it with care, however, particularly if you are working on a very large website where internal link implementation could potentially result in hundreds if not thousands of links with the same anchor text – something easily detectable by Google that may result in a penalty. Just as with off-page link building, it is important to vary anchor text internally too. Consider making the header navigation a keyword anchor text link, the breadcrumb a variation of this, and in-content links something like “learn more about our services” – too many links with the same keyword anchor text can be overkill.
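
A hypothetical illustration of varying the anchors that point at the same page:

```html
<!-- Header navigation: keyword anchor -->
<a href="/services/seo/">SEO Services</a>

<!-- Breadcrumb: a variation on the keyword -->
<a href="/services/seo/">Search Engine Optimization</a>

<!-- In-content: a natural, non-keyword anchor -->
<a href="/services/seo/">learn more about our services</a>
```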

Source of internal links (content, breadcrumbs, navigation) – when it comes to link building campaigns it is always advisable to encourage links from a variety of sources, and the same applies to organizing internal links. Take care to ensure that links to internal pages are balanced; relying too heavily on, for example, breadcrumb navigation could dilute some of the power of your internal links.


Quality Factors

Google is making leaps and bounds towards making truly high-quality websites more visible in the search results. It is important to ensure you are helping clients give off the right kinds of ‘quality signals’: here are some factors worth considering.

A gorgeous design – Google can’t quite grade the looks of your website just yet, but it can gauge the reaction of visitors. Good-looking websites keep people engaged and stop them clicking away, which keeps the bounce rate low. Google utilizes user feedback metrics like bounce rate, so anything you can do to improve the user experience is going to be a big win in the SEO arena.

Custom(ised) design – it doesn’t have to be a completely custom design, but it is reasonable to assume that Google looks less favorably upon websites that use free or even premium themes and do absolutely nothing to make them their own. I’d imagine that Google takes this stance because it is quite reasonable to say that a webmaster who hasn’t bothered to get the basics of a website right is unlikely to be creating something high-quality in the long run. That might be an over-simplification and a sweeping generalization, but Google is trying to crunch vast swathes of data and web pages; it doesn’t have the time to individually review every page out there.

Address, privacy policy, TOS, and other brand signals – Google post-Panda is looking to promote ‘real’ businesses and brands. Adding an address, a privacy policy and the other basic housekeeping that reputable online operators would have on their website can make all the difference to how well a website performs in the search engines. This Google blog post offers some guidance on building high-quality websites, and one of the rhetorical questions asked is “Would I trust this website with my credit card details?” If the answer is no, then it would suggest there are some quality issues that need addressing.


On your radar for the future

Rich snippets and page markup – Schema.org is a collaboration between all the major search engines to allow webmasters to mark up their pages and provide more information about them. This markup (or at least some of it) will be incorporated into the search results pages more and more in the future. Read the recent Google Webmaster Central blog post about this.
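
As a taster, here is a small sketch using schema.org microdata to mark up a business address – the details are invented, and the same vocabulary covers reviews, products, events and plenty more:

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Web Design Studio</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 High Street</span>,
    <span itemprop="addressLocality">Manchester</span>
  </div>
  <span itemprop="telephone">0161 000 0000</span>
</div>
```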

There may well be other things that you feel are important in the on-page optimization process, but the above is simply my 22-point checklist for optimizing websites.