Baking a New Site With a Sprinkle of SEO Part 2



August 21, 2012
In part one of Baking a New Site With a Sprinkle of SEO, we covered some of the more important things a developer can do to help their sites find their way into the top positions of the search engines. This week, we look at more advanced steps you can take to really help catapult them up to the top.

Advanced: Canonical Tags

You may have heard about Google’s recent algorithm updates such as Panda and Penguin. Well, one of the things Panda concerned itself with was not how much sugar cane it was getting, but rather duplicated content. When I say ‘duplicated content’ I mean content that gets copied across internal pages. A typical example can commonly be found on e-commerce sites. Let’s say you have a product which we’ll call aeroplane model (I am an aviation obsessive) which sits on the URL www.i-really-like-planes.com/aeroplane-model. Now, like all good e-commerce sites, it lets me filter and pick colour, size and all those good things. But each time I do, only a tiny amount of content on the page changes, yet I get a completely different URL:
Original: www.i-really-like-planes.com/aeroplane-model
Colour: www.i-really-like-planes.com/aeroplane-model?colour=red
Colour and size: www.i-really-like-planes.com/aeroplane-model?colour=red&size=large
Essentially, all three pages (though in a typical case there will be many more – I’ve seen up to 200 in some instances!) are duplicates of one another. This confuses dear Google, and they’d prefer you to specify a single URL to display to users in their results pages.
We solve this problem by specifying a canonical tag. This is a <link> element which sits in the <head> of the page. For the example above, the following code would sit on all three pages:
<link rel="canonical" href="http://www.i-really-like-planes.com/aeroplane-model" />
If you can provide the ability to add canonical tags across a website in this way, you will greatly improve the effectiveness of the site as a whole by dramatically reducing issues with duplicate content. However, these must be implemented with care. Each canonical must be specific to each group of pages, so don’t go putting the same one on every page of the site – that will cause more problems than it solves!

Advanced: Next and Prev

Following nicely on from the canonical point above, we move on to another pair of <link> elements that deal with a further source of SEO issues – pagination. These tags enable you to consolidate paginated sections across a site, whether they are a series of posts or a category of products, by indicating to search engines that the pages are interconnected via a contextual relationship. Let’s check out an example:
An article series on your site may have a story divided into three parts, all linked via ‘next’ and ‘previous’ buttons. The URLs could be:
www.example.com/headline?page=1
www.example.com/headline?page=2
www.example.com/headline?page=3
On the first page you would include the following code in the <head> section:
<link rel="next" href="http:// www.example.com/headline&page=2" />
On the second page, you would implement this code onto the <head> section:
<link rel="prev" href=" www.example.com/headline&page=2" />
<link rel="next" href=" www.example.com/headline&page=3" />
And finally, on page 3 you would implement this:
<link rel="prev" href="http:// www.example.com/headline&page=2" />
Looking at the code above, you can see how this indicates the inter-page relationship to search engines. However, if you have a ‘view all’ option then there’s no need to implement the above solution. Google will endeavour to show the ‘view all’ page as the preferred page instead.
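If you do keep a ‘view all’ page, one approach Google recommended around this time is to add a canonical from each paginated page pointing at it, so that the view-all version collects the ranking signals. Assuming a hypothetical view-all URL of www.example.com/headline?view=all, each page in the series would also carry:
<link rel="canonical" href="http://www.example.com/headline?view=all" />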

Advanced: Redirects

Ah yes, redirects, the bread and butter of any SEO’s day-to-day. You’re probably well aware of the types of redirects (301, 302, etc.) so I won’t delve into that (ask in the comments if you’re not sure), but making sure you have the ability to implement 301 and 302 redirects will be very useful for your website. You may need to implement a 301 redirect when:
  • You move to a new website
  • You change the URLs of your site
  • You delete pages on your site
301 redirects are excellent for maintaining the integrity of a site. Imagine, if you will, a website with lots of links whose owner hasn’t put any redirects in place or updated the navigational destinations after a change. Many links across the site will result in a 404 error, leaving a dead end for users and search engines alike. To solve the issue, you would implement a 301 redirect to the correct URL (and in the case of navigation, fix it!). From an SEO perspective, this is also good for fixing incorrect links coming into the site from other websites, and for redirecting deleted pages that still carry link weight to another relevant page.
David Gitonga did a great post on redirects using Htaccess which I recommend. On a final note, I’ve always found it a challenge to find a suitable solution for implementing redirects on a Windows server, so it’s worth bearing that in mind. Additionally, you want to check that anything that should be a permanent redirect (via 301) isn’t actually redirecting via 302 (temporary redirect) instead, as this may impact how search engines view the page. A great tool for checking these redirects can be found here.
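To make this concrete, here is a minimal sketch of what 301 redirects can look like in an Apache .htaccess file. The paths and domains are purely illustrative, so adapt them to your own site:
# Redirect a single deleted page to a relevant replacement (301 = permanent)
Redirect 301 /old-aeroplane-model http://www.i-really-like-planes.com/aeroplane-model
# Redirect an entire old domain to a new one, preserving paths (requires mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
On a Windows server the equivalent usually lives in web.config rather than .htaccess, which is part of what makes it fiddlier.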

Advanced: Sitemaps

The XML sitemap on your site (and if you don’t have one, I recommend you get one) is the road map to your content. Having one in place will help Google find and index your pages so they show up in search results. It’s not essential, no, but recommended. Google may well find all your pages just through the navigation and external links, but why rely on that when you can explicitly tell them what you want them to look at and where to find it?
Besides having a sitemap.xml, I recommend you also do the following:
  • Make sure the sitemap.xml can be easily edited
  • If you have an automatically generated one, ensure you can make exceptions to avoid getting stuff listed that you don’t want in the sitemap.xml (like development pages, duplicate pages and pages with unfriendly URLs if you’ve implemented friendly ones)
  • Check that your sitemap.xml removes the pages as well as adds them if it is automated
  • Check that no URLs in the sitemap.xml clash with your robots.txt. In other words, make sure you’re not telling search engines not to index certain pages in the robots.txt and then telling them to in the sitemap.xml!
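For reference, a minimal sitemap.xml looks something like this (the URL and values are illustrative, borrowed from the earlier example):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to crawl -->
  <url>
    <loc>http://www.i-really-like-planes.com/aeroplane-model</loc>
    <lastmod>2012-08-21</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
Only <loc> is required; <lastmod>, <changefreq> and <priority> are optional hints to the crawler.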
As with all things web development, there are many, many facets, tactics and techniques that can be implemented to improve rankings and general site visibility. I hope to be back soon with more, but I’d love to hear your feedback and what else you may specifically be interested in learning more about in the comments below!
