Ensure the site is hosted on a reliable hosting platform
Ok, servers are not the most exciting thing to talk about, but your server being down is not only bad for users, it is bad for search engines as well: if they can't see your site, they can't index your site. Investing in a good quality dedicated hosting platform is worth its weight in gold. A dedicated platform also ensures you have complete control over what shares your bandwidth, because only your site will be on there. It is never good to share a server with sites that are blacklisted by the search engines! From a cost point of view, consider a virtual server instead; this will save money and have much the same effect.
Always think about hosting your site on a server in the country you are targeting. Sometimes it is not possible, but hey, we work with the tools we are given, and a challenge is always a good thing. When it comes to geo-targeting, I believe Google looks at it in three ways.
- Top-Level Domain – One way Google can establish where a site is from is by looking at the extension of the domain. For the most effective country-specific targeting, country-code domains should be used to cover the core countries targeted (particularly if these are non-US). This will help ensure that the site is included in ‘pages from country’ searches.
- Hosting Location – Where the site is physically hosted. This helps the search engine establish the locality of the site – particularly where .com sites are concerned. Hosting location can also have secondary knock-on effects in terms of site performance.
- Context/Language – Both in terms of inbound links and site content. This becomes very important in the current ranking model, especially if we are dealing with server hosting location problems. Best practice would be to have a site in English and in the native language of each country you are looking to target.
Server Response Codes and Redirects
You can never beat a good response code or two, and SEOs tend to spend time communicating with each other in them – I am guilty, and my wife and fellow Search Cowboy blogger Lisa Myers certainly is! It’s all 200!
Web server responses to user and search engine requests are a critical aspect of search engine optimisation and website usability. If you are returning the wrong response you can really “p*ss off” the search engine spiders and your customers alike.
I am not going to spend hundreds of words talking about the 200, 300, 500 and 404 families, so here is a link to a page with everything you need to know about response codes: http://en.wikipedia.org/wiki/List_of_HTTP_status_codes But let’s focus on the good old two of 301 and 302. You would think by now that everyone would have got their heads around these, but still, on a weekly basis, my team and I keep finding some classic ways of using them.
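As a quick sketch, Python’s standard library can resolve these codes to their official reason phrases – handy for sanity-checking what a server is actually sending (this is just an illustration using the stdlib, not part of any SEO toolkit):

```python
from http import HTTPStatus

# Resolve the status codes mentioned above to their official reason phrases.
for code in (200, 301, 302, 404, 500):
    print(code, HTTPStatus(code).phrase)
# 301 resolves to "Moved Permanently"; 302 to "Found" (i.e. a temporary move).
```

Note that 302 is officially just “Found” – the “temporary” semantics live in how clients and search engines are expected to treat it, which is exactly where the confusion below comes from.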
An absolute link is always the way for me, but this is how I see the 301 and 302. With a 301, the best way to look at it is that the requested content (or webpage) has been assigned a new permanent URL, and any future references to this page should use one of the returned URLs. Webmasters with link-editing capabilities ought to change the existing URL to the new URL if it happens to rank for a given keyword.
If the URL does not rank and the current URL structure has many dynamic parameters, it may be wise to rewrite the URLs (using ISAPI_Rewrite on IIS/ASP sites, or mod_rewrite on Apache for open source platforms such as PHP or Perl) to include keywords within the URL string. This sort of thing can be a bit arduous but is well worth doing to clean up the URLs and make them keyword rich!
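As a rough sketch of what that looks like on Apache with mod_rewrite (the paths, script name and parameters here are purely hypothetical – adapt them to your own URL structure):

```apache
# .htaccess sketch: serve a keyword-rich URL from a dynamic script.
# "widgets/blue-widgets", products.php, cat and id are all illustrative names.
RewriteEngine On
RewriteRule ^widgets/blue-widgets/?$ /products.php?cat=12&id=345 [L]
```

The user and the spider see the clean keyword-rich URL; the parameter-laden script only ever runs behind the scenes.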
When undertaking any redirects of URLs, you should use 301 redirects to ensure that the value of the existing URL is passed along to the new URL. This basically tells the search engines that the content has permanently moved to a new location, and that they should reference the new location.

OK, so the 302 – or the “Microsoft Developers’ Favourite” as I call it ;) – basically means that the webpage has been moved temporarily to a new location. Since the redirect might be altered on occasion, the client should continue to use the old URL. This way, the search engines cache the old URL and not the new one, which keeps any current rankings that the page may have until you decide where you want to permanently redirect it. Ultimately, work carefully with these and treat a 302 as strictly temporary: it is great that the search engine keeps the old URL in the index, but no link weight is passed. I have seen 302s on homepages, left in place for three years – do your QA regularly or suffer the consequences!
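On Apache, the difference between the two is a single digit in the config – which is exactly why the wrong one slips through QA so easily (example.com and the page names below are placeholders):

```apache
# Permanent move: search engines index the new URL and link weight follows it.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary move: search engines keep the old URL indexed; no weight passes.
Redirect 302 /offer.html http://www.example.com/christmas-offer.html
```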
It should therefore be noted that any canonicalisation issues should be handled with a 301 response code rather than a 302.
This leads us quite nicely into canonicalisation and domains. So, a quick bit on this: many websites register multiple domain names when they start out. These are then generally all pointed at the same website, creating multiple versions of the same site, which can have a significant impact on the visibility and performance of that website.
For this reason, the appropriate redirects should be put in place to ensure that search engines only index one version of the website. Generally this is done by implementing permanent (301) redirects to the primary domain. So, to www. or not to www. – that is the question, or so it seems for a lot of sites out there! Get in server side and make sure that both instances of the domain resolve, with one 301-redirecting to the other. I will say: choose your hosting wisely, as not all hosting companies support 301s in the control panel, and it can be a pain to say the least!
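Where you do have server-side access on Apache, the www canonicalisation is a short mod_rewrite rule. This sketch assumes you have settled on the www. version as your primary domain (example.com is a placeholder):

```apache
# 301 any request for example.com across to the canonical www.example.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Swap the condition and target around if you prefer the non-www version – the important thing is that only one version ever answers with a 200.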
It is possible to instruct the good old “Big G” robot to exclude pages from a website where you don’t want them in the index. Using robots.txt correctly on your site server is so important; an incorrectly configured robots file can have disastrous effects on search engine behaviour on your site. The thing I like to do is use Google Sitemaps to check that Google has correctly read your robots.txt. Use the two in conjunction with each other and make them work together so Google maximises the visibility of the correct pages.
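For reference, a minimal robots.txt looks like this – the disallowed directories are hypothetical, and the Sitemap line is what ties the file back to the sitemap you verify with Google:

```
# robots.txt - must sit in the root of the domain.
User-agent: *
Disallow: /admin/
Disallow: /search-results/

Sitemap: http://www.example.com/sitemap.xml
```

Remember that Disallow only blocks crawling, not indexing of URLs the engines already know about – so check the result in your Google account rather than assuming.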
Information and Site architecture
This is such a big thing to me, and so important to get right. As ever, when I want to know what something means, I use the good old Wikipedia. It defines information architecture as the “practice of structuring information for a purpose”.
Ok, so for me site architecture is about taking a step back and thinking about how best I can present the overall site content and make it easy for a user to navigate their way through. The same process applies to the search engine bot. I like to split this down into three important areas of effective information architecture:
- Understand how your users think about the topic area of your site; this is where keyword mapping exercises come into play. From an SEO perspective, it means understanding what users are searching for and tailoring your site accordingly. Developing a cohesive page mapping structure helps with content-based SEO, and also provides a solid foundation for subsequent link development.
- A well thought out navigation scheme always goes a long way. There is no point developing a great page mapping structure if search engines can’t find the pages. The menus on your site need to be consistent, and the internal linking needs to be well positioned and directional across the site. Think “breadcrumb” to help users remember where they are and how they got there.
- Use common user interface (UI) practices. This is no time to reinvent how people like things. Users have been conditioned by other websites to look for things in certain places on a site; take advantage of this and make life easier for them. This applies just as much to your on-page design as it does to your SEO techniques.
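A breadcrumb trail, for instance, is nothing more exotic than a row of links showing the path back up to the homepage (the section names here are made up):

```html
<!-- Breadcrumb: shows the user where they are, and links each step back up. -->
<p class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/services/">Services</a> &gt;
  Technical SEO
</p>
```

As a bonus, those links give the spiders a consistent crawl path and keyword-relevant internal anchor text on every page.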
Well, there ends my quick (well, slightly long) rant about technical optimisation. As I said, it is not the sexiest part of SEO, but it is a very important foundation for success in the SEO world of today.
Discount it at your peril! So embrace it!