How to improve your website's Google rank


Search Engine Optimisation (SEO) is becoming increasingly important for every website. With more than 113 billion searches conducted in July 2009 alone, the volumes speak for themselves.

Some of the techniques used have given SEO a bad name, and plenty of them are less than honest, but there are several simple, legitimate ways to ensure your site benefits from natural search results.

Search results can be classified as either organic (natural) or paid. Organic results are those that appear in search engine results pages (SERPs) on merit alone, and ranking well depends on both the technical construction of your site and the content within it.

Paid results, often referred to as Pay Per Click or PPC, are the results site owners pay for and which usually surround the organic listings.

Research shows web users prefer organic listings to paid listings, considering them more relevant and trustworthy.

The goal of SEO, then, is to improve your organic listings performance, which in turn should boost traffic to your site.

Organic results

Search engines index the web using programs known as bots (or spiders), running on large clusters of computers, which crawl the web by following links found on web pages. The URLs they discover are added to the search engine's index, and it's this index that's queried every time a user performs a search.
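The crawl-and-index loop can be sketched in a few lines of Python. This is a toy illustration only, nothing like a real bot (which adds politeness rules, robots.txt handling, deduplication and distributed queues); the URLs and link graph are made up:

```python
# Minimal sketch of how a crawler follows links to build an index.
from collections import deque

def crawl(start_url, get_links, max_pages=100):
    """Breadth-first crawl: visit a page, queue every link found on it."""
    index = []                       # stands in for the search engine's index
    seen = {start_url}
    queue = deque([start_url])
    while queue and len(index) < max_pages:
        url = queue.popleft()
        index.append(url)            # URL is added to the index
        for link in get_links(url):  # links discovered on the page
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

# Toy link graph standing in for real pages (hypothetical URLs):
site = {
    "/": ["/products", "/about"],
    "/products": ["/products/widgets", "/"],
    "/products/widgets": [],
    "/about": [],
}
print(crawl("/", lambda url: site.get(url, [])))
```

Note that `/products/widgets` is only reachable via `/products` — a page no other page links to would never enter the index at all, which is why internal linking matters.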

Search engines employ complex mathematical equations, known as ranking algorithms, to order search results. Google's algorithm alone relies on more than 200 individual factors to decide which result, in which order, to return to its web searchers.

Organic SEO can be further split into two categories:

1. On-page: the code and content you use to manage and deliver your web pages.

2. Off-page: external factors affecting SEO, primarily link building – getting other websites to link to your content.

Here we'll focus on on-page optimisation methods, which are all under your control.

The most important thing is to maximise accessibility to ensure search engines can find all your content. There are two ways to get discovered by search engines.

One is to submit your site directly to their index (Google; Yahoo; Bing). The other is to wait for them to find it through links to you from other sites during their crawling process. For more information on Google's crawling process, see this page.

Google URL submission

To make sure your website is accessible to search engine spiders, follow these simple steps:

1. Ensure you're not preventing search engines from indexing your site via the Robots Exclusion Protocol. Its robots.txt file is used to give instructions to search engine bots. More information on this can be found here.
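As a quick sanity check, the robots.txt served from your site root should not contain a blanket Disallow. A typical file that permits full crawling while keeping one path private looks like this (the /private/ path is illustrative):

```
User-agent: *
Disallow: /private/
```

By contrast, `Disallow: /` under `User-agent: *` tells all compliant bots to skip the entire site – a surprisingly common cause of accidental de-indexing.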

2. Ensure your content is machine-readable. Avoid housing your content exclusively in Flash, video or imagery. Remember, search spiders cannot see images or video: they can only read the written text on a web page.

3. Ensure you have a clear internal linking architecture. Promote important content to the homepage and link to key site sections via dedicated navigation. Group content into clear site sections reflected in your site navigation to aid both users and search engines. For example:
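A hypothetical structure for a small retail site might look like this (all section names are illustrative), with each level reachable from the navigation of the level above it:

```
example.com/
  /products/
    /products/widgets/
    /products/gadgets/
  /about/
  /contact/
```

Grouping pages this way means every page is only a few clicks (and a few link hops, for a spider) from the homepage.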

4. Eliminate duplicate content. This could be caused by the way your server is set up or how your CMS serves up content. Either way, this needs to be addressed. We'll cover how to fix the most common duplicate content issues later on.
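One widely supported fix, where the same content is reachable at several URLs (say, with and without a trailing slash or tracking parameters), is the canonical link element: a tag in the page's head that tells search engines which URL is the definitive version. The URL below is a placeholder:

```html
<link rel="canonical" href="http://www.example.com/products/widgets/" />
```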

5. Ensure you're targeting the appropriate keywords for your business objectives. Just as successful advertising campaigns contain content that appeals to a target demographic, successful websites need to focus on keywords that have the highest relevance to their target audience.
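A rough way to see which terms a page currently emphasises is to count word frequency in its visible text. This is a sketch only – real keyword research tools go far beyond raw counts (search volume, competition, searcher intent) – and the sample text and stopword list are made up:

```python
# Count the most frequent words in a page's visible text,
# skipping a few common stopwords.
import re
from collections import Counter

def top_keywords(text, n=3, stopwords=("the", "and", "for", "your")):
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in stopwords)
    return counts.most_common(n)

page_text = (
    "Handmade oak furniture. Our oak tables and oak chairs are "
    "handmade to order. Browse the furniture range."
)
print(top_keywords(page_text))
```

If the top terms don't match the keywords your audience actually searches for, the copy needs rework.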