Seo-mama’s Blog

How On-Page Search Engine Optimisation (SEO) Varies For Different Coded Websites

Posted in Uncategorized by Rajib Roy on November 15, 2008

SEO is a multi-faceted and holistic discipline; a crossover where keyword, key phrase, audience and semantic space research and definition, SEO copywriting, link building and code optimisation all merge in harmony (hopefully) in an attempt to address a variety of search engine ranking factors and deliver defined business objectives. Programming the website in a way that allows engine spiders to access, categorise and index the content, so that your other SEO tactics fulfil their potential, is also an important aspect of SEO best practice. Put simply, organic search engine optimisation boils down to two things:

1. Getting other quality sites to link to yours; and

2. Optimising your website’s code and content.

You will want to ensure your code is as error-free as possible, from a W3C validation standpoint and that you follow guidelines for semantically correct mark-up. Testing shows that good, clean, semantically correct code allows for faster indexing by the search engines.

Code is often the most neglected aspect of search engine optimisation campaigns. In a recent online poll only 2% of respondents rated coding as the most important factor in SEO campaigns. Well-written, standards-compliant code makes your site load fast, and opens it up to search engines. With Google now giving pages a score for download speeds, the less unnecessary code on a page the quicker it downloads and the better you score.

The big question when it comes to coding a website is whether to opt for pure HTML coding, in particular Semantic XHTML Mark-up, or to go with Cascading Style Sheets (CSS). Each option has SEO implications. At SEO Consult our programming team have experience and expertise in both, specifying the approach on a project-by-project basis, depending on a range of factors.

It is a universally agreed principle that the whole point of SEO is to optimise a page so that the engines can better understand what it is about. Some argue that the simpler the page, meaning a CSS solution (with less HTML and relatively more content), the easier it is for an engine to understand. Such a page will also load faster and be less prone to spider-blocking code errors or other programming glitches that might prove detrimental to the SEO campaign.

Another particularly influential argument for CSS solutions is that search engines tend to weight content near the top of an HTML document most heavily, and they spider the content that comes first in your source code. CSS can easily be structured so that the SEO-relevant content takes priority. In a table-based HTML page you cannot easily control which parts of the page come first in the source, so when a search engine visits your site it will usually see the navigation links first, and perhaps some text you wanted in the header; if you use a template, this will be the same on almost every page. With CSS, however, you can place the main body, with its H1 tag and paragraph text, first in the source for the search engines, even though on the rendered page it appears to come last.
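As a rough sketch of that content-first approach (the IDs and layout values here are illustrative, not a prescription), the navigation can come last in the source yet still render at the top of the page:

```html
<!-- Hypothetical layout: main content first in the source, navigation
     positioned above it visually via CSS. -->
<style>
  #nav  { position: absolute; top: 0; left: 0; width: 100%; }
  #main { margin-top: 4em; } /* leave room for the absolutely positioned nav */
</style>
<div id="main">
  <h1>Primary keyword-rich heading</h1>
  <p>The copy a spider should encounter first in the source.</p>
</div>
<div id="nav">
  <a href="/">Home</a> <a href="/about/">About</a>
</div>
```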

Other people argue that search engines don’t even read CSS and that although using CSS is a better way to develop websites from a content management, bandwidth and code volume perspective, it doesn’t matter to the search engine whether you use CSS or not. They still view the page exactly the same, as it is the content that is important. Yes, they may read the CSS file, but they are looking for links not styles. They advocate Semantic XHTML Mark-up. Semantic coding can be regarded as the art of programming your website so that the code used is descriptive and representative of the information it contains, more meaningful to search engine bots and more productive SEO-wise.

Some SEO experts are of the opinion that, for the purposes of search engine optimisation, semantic HTML coding is the best way forward, improving how easily search engine crawlers can discern the meaning of your web page. Then again, one of the highest-ranking sites for the term ‘SEO’, probably the most competitive keyword of all, actually uses tables in its pages.

When a search engine spider like Googlebot visits your web page to index it, it generally extracts the text from the code so it knows which parts of your web page are readable to humans. Googlebot isn’t interested in indexing or displaying any code or text from your website that isn’t visible to humans – it doesn’t record how many div tags you used. What it does do, and what many other search engines are starting to do, is attempt to apply more weight or importance to certain text on your web page.

Here are some great semantic coding guidelines, courtesy of Barry Wise:

  • <h1> tags should only be used once on a page, to define the title and/or purpose of the page. It should be very close in meaning to the <title> tag.
  • <h2> to <h6> header tags should be used for subheadings, in order of descending importance. Try not to skip levels.
  • Don’t use <br> tags to separate list items. Instead use the <ol> tag with <li> elements for ordered lists, and the <ul> tag with <li> elements for unordered lists.
  • For bold or emphasised text, use <strong> or <em>, instead of the less descriptive <b> and <i> tags.
  • Wrap paragraphs in <p> tags, and never use <br> tags just for spacing. Use the margin and/or padding properties of the <p> tag in your CSS code to add visual spacing.
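A minimal sketch of a page following these guidelines (the headings and copy are placeholders):

```html
<!-- Hypothetical page skeleton applying the semantic guidelines above -->
<h1>Widget Cleaning Guide</h1>       <!-- one h1, close in meaning to the <title> -->
<p>How to keep your widgets in working order.</p>

<h2>What you will need</h2>          <!-- subheading: h2, no skipped levels -->
<ul>                                 <!-- unordered list, not text plus <br> -->
  <li>A soft cloth</li>
  <li>Mild detergent</li>
</ul>

<h2>The steps</h2>
<ol>                                 <!-- ordered list for sequential steps -->
  <li>Unplug the widget.</li>
  <li>Wipe it down. <strong>Never</strong> submerge it.</li>
</ol>
```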

Either way it’s wise to check W3C compliance because, although it’s unlikely that validity has any direct influence on search ranking, it is incredibly important for error checking, browser compatibility and overall site usability. Who knows: in the future Google may just include W3C validity in their algorithm.

An increasing number of people use PDAs and mobile handsets, with the majority of social media users these days accessing the web via Safari or Firefox. You’ll do well to nip any browser compatibility issues in the bud or risk missing out on a growing audience.


W3C and CSS Validation

Posted in Uncategorized by Rajib Roy on November 15, 2008

The World Wide Web Consortium (W3C) was established in 1994 by Tim Berners-Lee, the man credited with inventing the World Wide Web. The main international standards organisation for the World Wide Web (abbreviated WWW or W3), its proclaimed mission is, “To lead the World Wide Web to its full potential by developing protocols and guidelines that ensure long-term growth for the web”.

W3C has multiple headquarters at Massachusetts Institute of Technology Laboratory for Computer Science (MIT/LCS) in the USA, ERCIM in France, Keio University in Japan and many other offices around the world. It is set up as a consortium, ambitiously pursuing its mission through the creation of web standards and guidelines. W3C has published more than 110 standards, called W3C Recommendations. W3C also engages in education and outreach, develops software, and serves as an open forum for discussion about the web. In order for the web to reach its full potential, the most fundamental web technologies must be compatible with one another and allow any hardware and software used to access the web to work together. Publishing open (non-proprietary) standards for web languages and protocols, W3C seeks to avoid market fragmentation and thus web fragmentation.

In addition, W3C’s global initiatives also include nurturing relationships with national, regional and international organizations. These contacts help W3C maintain a culture of international participation in the development of the World Wide Web. W3C co-ordinates particularly closely with other organizations that are developing standards for the web or Internet in order to enable clear progress. This is a fundamental part of W3’s firm belief that web technologies must be compatible with one another, regardless of manufacturer or programmer. W3C refers to this goal as ‘web interoperability’.

Among the many technological specifications maintained and validated by the World Wide Web Consortium (W3C) are those for Cascading Style Sheets (CSS).

Cascading Style Sheets were developed as a means for creating a consistent approach to providing style information for web documents and came to prominence as purely HTML-based sites became more complex to write and maintain. Browser incompatibilities made consistent site appearance difficult, and users had less control over how web content was displayed.

To add an element of conformity and to improve the capabilities of web presentation, the W3C added work on CSS to the deliverables of the HTML editorial review board (ERB). The CSS level 1 Recommendation was published in December 1996. The CSS Working Group then began tackling issues that had not been addressed in CSS level 1, resulting in the creation of CSS level 2 in November 1997, published as a W3C Recommendation in May 1998. CSS level 3, which was started in 1998, is still under development as of 2008.

In 2005 the CSS Working Group decided to enforce the requirements for standards more strictly. This meant that already-published standards like CSS 2.1, CSS 3 Selectors and CSS 3 Text were pulled back from ‘Candidate Recommendation’ to ‘Working Draft’ status.

There are numerous reasons why organisations including SEO Consult choose to develop using CSS, and why validation is an important aspect of that development.

1. Separate content from presentation. An external style sheet can contain all the styles for your web site. If you want to change the site’s appearance you only have to edit one style sheet. This is particularly useful for sites containing hundreds or thousands of pages and sites built on content management systems.

2. SEO benefits. Search Engines tend to weight content towards the top of HTML documents and search engines spider the content that comes first in your source code. CSS can be easily structured so that the SEO relevant content takes priority.

3. Pages load faster. Tables slow down page load times. If tables are nested pages will load even slower. CSS-based web pages are far quicker because the styles are all contained in one style sheet.

4. Small file size. CSS reduces html document file size, also helping to reduce load times.

5. Good housekeeping. CSS eliminates rogue code, making site code neater and cleaner. Editing that code is also easier.

6. Accessibility. CSS2’s aural properties provide information to non-sighted users and voice-browser users. The CSS2 ‘media types’ allow authors and users to design style sheets that will cause documents to render more appropriately for certain devices such as Braille, speech synthesizers, or text telephone (tty) devices.

7. Cost Savings through re-use. CSS can shorten the project development timescales through style sheet re-use using only minor modifications.

8. Flexibility of design. You can use pixel precision in your website designs. By adjusting the margins and padding of the CSS you can easily adjust the position of your content. You can also create very modern designs that can’t be duplicated with tables. For example, you can use a background image for a header then place your content over it using the H1 tag for better page optimization.

9. Accessibility. The Disability Discrimination Act makes it unlawful for a service provider to discriminate against a disabled person by refusing to provide them any service that it provides to members of the public. The web accessibility features of CSS not only allow the integration of accessible design but future-proof development against additional legislation.

10. Print friendly. When a user chooses to print a web page an alternative CSS document can be called up. This document can specify that formatting, images and navigation disappear and just the page content appears on the printed version.
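The print-friendly behaviour in point 10 can be sketched with a media-specific rule block; the selector names here are assumptions for illustration:

```css
/* Illustrative print rules: hide navigation and decoration, keep the content.
   These can live in the main style sheet, or in a separate print.css
   referenced with <link rel="stylesheet" media="print" href="print.css">. */
@media print {
  #nav, #sidebar, .banner { display: none; }
  body { color: #000; background: #fff; }
}
```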

It’s the CSS which gives this website its look. All colours, fonts and layouts are controlled by CSS, and that CSS needs to be written properly, in accordance with the W3C’s recommended CSS specifications. Validation ensures this and indicates whether there are any errors in the code.

A CSS validated website will appear visually more consistent in modern browsers. With tens of millions of mobile devices likely to be sold worldwide this year, we are extremely keen to make content easily accessible to in-car browsers, WebTV, Lynx browsers, Screenreaders and PDAs. If there are errors in the CSS then the website may not display properly. This will make it difficult to read or even access.

The Negatives & No’s When It Comes To On-Page Search Engine Optimisation (SEO)

Posted in Uncategorized by Rajib Roy on November 15, 2008

It’s very important to do your homework before hiring an SEO company or consultant. There are essentially three types of company you will come across when researching for an SEO partner.

The Rip-off Merchants: In all walks of life there are always plenty of people out there happy to take your money in exchange for empty promises and a few regrets. SEO is no exception. Guaranteed search engine rankings, massive ROIs, the moon on a stick: they’ll promise whatever they feel you want to hear to get you to part with your cash. There are companies who cold-call prospective clients, informing them that their site is on a list of sites to be banned for link farming or spamming and that unless the prospect uses their SEO services, Google will drop them. These companies basically know just enough SEO terms and expressions to convince lay people to think that they know what they are talking about. When it comes to ripping people off, they’re expert. When it comes to SEO? Just forget them.

Sites offering ‘Website Advertising Services’, ‘Search Engine Submissions’, ‘Free Search Engine Submissions’, ‘Website Traffic’ are invariably a waste of time and, more importantly for your business, money. Submission services are also pointless. Search engines will discover your site by crawling links from other sites that point to yours. You don’t need to submit. This is a service you will most likely be offered from inexperienced SEOs. Talking of inexperienced SEOs leads to the next type of prospective SEO partner – the ‘Don’t know what they’re doing’ brigade.

Probably inexperienced or stuck in some sort of Internet time warp, they’ll apply the most inappropriate and ineffective SEO techniques. Employing unqualified or misguided SEO can risk damage to your site and reputation. A bad SEO company or SEO consultant – as well as wasting your time and money – may also do irreparable harm to an online business, even contributing to the failure of a new or unstable operation.

The final type of partner is the one you’ll hope to work with: the consummate professional, an experienced and expert SEO company or consultant applying the state-of-the-art SEO techniques that will deliver your business objectives. A company like SEO Consult, for example.

Identifying the sharks is relatively easy. Differentiating between the ‘Don’t know what they’re doing’ brigade and the real professionals is more difficult, especially for someone with limited SEO experience themselves.

Of course you will take word of mouth referrals seriously, run searches on a company, ask for references, check their Internet presence and show the due diligence necessary when researching a prospective partner to your business. In addition, and perhaps the best thing to do, is to educate yourself in some of the dos and don’ts of SEO so that you can judge the quality of the advice you are being given.

Here are some of the classic negatives & no’s when it comes to search engine optimisation (SEO). If you’re presented with any of these 7 deadly sins, start asking why they have to be resorted to. If you don’t receive satisfactory answers then simply walk away – you’ll have exposed a company who won’t really be able to offer you the quality of SEO you need.

1. Keyword stuffing – Yes, target keywords, keyword phrases and closely-related terms should appear in the page title, description meta tag, and page copy. Stuffing and shoehorning keywords into copy is so passé. Not only that but it can seriously damage your Internet presence; at best burying your site and at worst leading to it being delisted. Matt Cutts of Google says on his blog, “As always, webmasters are free to do what they want on their own sites, but Google reserves the right to do what we think is best to maintain the relevance of our search results, and that includes taking action on keyword stuffing”.

2. Meta Tags – More old school SEO, meta tags do nothing to improve your rankings other than possibly marginally on Yahoo!. The one that really counts is the description tag. It may not help with your ranking, but it may be used as the description of your site in the Google results. Avoid companies obsessing over meta tags and their use as a selling point.
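For reference, the description tag in question is a single element in the page head; the wording below is purely illustrative:

```html
<!-- May be shown as your snippet in the Google results -->
<meta name="description"
      content="Hand-made widgets delivered across the UK. Browse over 200 designs and order online.">
```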

3. Anchor-text links at the bottom of the page – a technique used to encourage spiders to follow deep links. Overdo it and it becomes excessive over-linking.

4. A high volume of inbound links to a single page looks like link farming to the search engines. A professional SEO company will apply techniques such as link bait, articles, press releases and online public relations to generate quality inbound links, not only to your home page but also as deep links to pages within the site. Link quality is much more important than link quantity, so look for companies that focus on building good links for you.

5. Hidden Keywords – If you use hidden text such as white on white text or hidden with a div tag you run the risk of being penalised. It’s cheap and it’s shabby. Avoid this and all other tricks.

6. Don’t over-elaborate your pages with h1, h2, h3, h4, h5, h6, bold or italic SEO techniques. There are advantages to properly organising your on-page formatting, but when it turns into a technique for trying to outsmart the system it becomes counterproductive.

7. If you submit your site to 5,000 directories and 3,500 search engines you can forget about appearing in the top 10. The crawler-based search engines will find your site quickly enough once you get a link from another web site that is already being crawled. Search engine submission died a few years ago. If SEO companies start talking big numbers and automation when it comes to site submission, show them the door.

None of these techniques reflect favourably on an SEO company. You can’t help but ask yourself why a company would have to resort to trying to fix the system when there are numerous techniques that, if applied professionally, can achieve enormously effective results.

Copywriting and Keywords For Search Engine Optimisation (SEO)

Posted in Uncategorized by Rajib Roy on November 15, 2008

As search engines compile their indexes, they calculate a number, defined (in Google’s case) as ‘PageRank’, for each page they find. PageRank measures how many times other sites link to a given page. More popular sites, especially those with links from sites that have high PageRanks themselves, are considered likely to be of higher quality. They are deemed to have authority.

PageRank is only one of the 200 or so types of information, or what Google calls ‘signals’, that are evaluated to assess ranking. Some signals are on-page: words, links, images and so on. Some are drawn from the history of how pages have changed over time, while others are data patterns uncovered in the trillions of searches that the engines have handled over the years.

Increasingly, the search engines and in particular Google are using signals that come from users’ personal search histories in order to offer results that best reflect an individual’s interests.

Once the multiple signals have been collected they are fed into formulas called classifiers that try to infer useful information about the type of search. Classifiers can tell, for example, whether someone is searching for information about a person, to purchase a product, or details on a company or place. Another classifier identifies brand names. Signals and classifiers calculate several key measures of a page’s relevance, including ‘topicality’ – a measure of how the topic of a page relates to the broad category of the user’s query. Once the query has been properly analysed the engine can send the user to the most helpful and relevant pages.

Sites with the 10 highest scores win the sought-after first page positions, though the engines now tend to include a final check for sufficient ‘diversity’ in the results. Matt Cutts of Google says, “If you have a lot of different perspectives on one page, often that is more helpful than if the page is dominated by one perspective”. He goes on to add, “If someone types a product, for example, maybe you want a blog review of it, a manufacturer’s page, a place to buy it or a comparison shopping site.”

So how do you get onto that all-important first search page? Relevance. How do you develop that relevance? Simple – high quality content and copy that inspires authoritative inbound links and drives reputation.

Not all web traffic is created equal, and it is the synergy between the searcher and the searched that determines whether your business objectives succeed or fail. By developing topical relevance within your site with fresh content, and by cross-linking it with other useful and authoritative pages, you start to close the gap between what people want and what you can offer. Writing themed, relevant and regularly updated content essentially keeps the search engine spiders coming back. As each of your pages establishes a foothold in the search engines, relevance cycles back to the homepage, which then redistributes even more of it throughout the rest of the site. The more pages that appear for a given topic, the stronger your website’s relevance score for that topic.

Here’s the process:

1) Use goal analysis and audience analysis to identify suitable keywords or key phrases.
2) Develop related content and copy.
3) Link it (to and from) other related content with deep links.
4) Invite the spiders to feed on your content.
5) Measure and adjust accordingly; remember, the best SEO strategies mature over time.

Use your keywords and phrases to theme your site and apply them consistently. There is no magic keyword density formula, and there are those who would argue that SEO has moved on from focusing largely on keyword density, its importance having been diminished by other, more interrelated and sophisticated factors. For the record, the latest analysis suggests that Google has the strictest requirements, allowing no more than 2% of each page’s words as targeted keywords. A keyword density checker return of more than 2% might be considered spamming. Yahoo! and MSN Search have much higher keyword density tolerances, of up to 5% of total words.

Proximity of keyword placement and reinforcement is the new emphasis. If a page starts with a particular topic or keyword that is also consistently mentioned and reinforced through the middle and end of the page, this indicates a subject focus. Search engines like focus.

It’s important to realise the importance now of reputation built on referrals from other sites, RSS subscribers, editorial links from trusted sources or word of mouth that will establish authority. Consistency is vital in sending a clear, unambiguous identity to both users and the search engines. With your selection of keywords and phrases you’ve chosen the ground on which to fight. Now claim it with conviction.

Staying on topic and maintaining a regular posting schedule (for your blog) or by adding fresh site content will generate increased spidering and indexing; a fast-track to the ultimate SEO goals of website authority and high rankings.

Does your headline make you want to click and read more? To increase click-through rates in search engines, use an appropriate headline. Sell your content and inspire readers to click through.

Deep-link relevant content using your main keywords. This sends a clear signal that these keywords are an important part of your optimisation strategy.

In conclusion, it’s vital to realise that if your content is poorly focused, uncommitted and lacklustre then you’ll be consigned to the also-rans. Even if visitors do somehow manage to find their way to your site they’ll quickly bounce.

You are how you read, with your site’s reputation based firmly on the content it contains. As search engine algorithms move further and further away from old school relevance measurements and increasingly assign authority to sites that command social media tagging and blog-driven links through the quality of their content, so crafting that content has become mission-critical. Expert copywriters armed with the ability and empathy to inspire inbound links and create sticky, rewarding visitor experiences are finding themselves essential and highly valued members of modern search engine marketing teams.

At SEO Consult we offer passionate, professional and hugely experienced writers with a range of specialities and interests. Both creative and qualified, our writers are committed to delivering the very best copy solutions to your campaigns.


Targeting Search Engine Optimisation (SEO Keywords)

Posted in Uncategorized by Rajib Roy on November 15, 2008

When Mr Singhal says, “Search over the last few years has moved from ‘Give me what I typed’ to ‘Give me what I want'”, he should know. Amit Singhal, after all, is a Google Fellow. Working with the company since 2000, the 39-year-old elite engineer is in charge of Google’s ranking algorithm – the formulas that determine which web pages best answer each user’s question.

What Amit Singhal is clarifying here is the fundamental evolution of how the search engines function and how they are approached by users, website owners and search engine optimisation (SEO) specialists.

Despite the fact that Google sits on the top spot in a list of global brands (new research estimates its value at $86bn, a 30% year-on-year increase) and serves hundreds of millions of searches every day, users still click away from Google millions of times a day, disappointed that they couldn’t find the restaurant, the phone number or the background of that new band. Google is good at finding what users want, but it doesn’t always succeed.

It’s the elusive quest to close the gap between often and always that drives Amit Singhal and hundreds of other Google engineers to constantly tweak, trim, caress and cajole the company’s search algorithms towards relevance nirvana. The same applies across all the other search engine companies and if you’re serious about your Internet presence and search engine listings it’s something you’ll give thought to as well. Consummate professionals that we are, we at SEO Consult are obsessed with closing that gap as well. The closer we make it, the better it is for our customers and the better it is for us.

The days when algorithms relied on webmasters providing on-page data, such as the keyword meta tags or index files in engines like ALIWEB, are long gone. Meta tags provided an overview of each page’s content, but solely relying on meta data as the means of indexing site information long ago proved an unreliable method of identifying quality search returns. Keywords in the meta tag were often not truly relevant to the site’s actual content. Inaccurate, incomplete, inconsistent and manipulated data in meta tags and other attributes within the HTML source of a page generated irrelevant searches. It’s not only the black hat SEO techniques used to generate top rankings that have inspired the search engines to apply ever more sophisticated algorithms. It’s the fact that, as time goes on, all manner of spurious and unusual interpretations, spellings, logic and intentions come to the fore.

It’s by being able to identify that when someone types in Orange they are probably referring to the mobile phone operator and not the fruit that can lead the engine to more accurate returns. Geography and theming can also have a bearing on search returns, as can timescales, topicality and ‘freshness’. There are in fact more than 200 types of information, or what Google calls ‘signals’, that go to determine relevance. Google is increasingly using signals based on the history of what individual users have searched for in the past, providing results that reflect an individual’s interests. For example, a search for ‘fender’ will return different results for a user who is a guitarist than for a user who collects spare parts for old American cars.

The reason for identifying the increasing complexity, breadth and depth of the search engine algorithms is to highlight how absolutely essential it is to apply basic principles from the very outset when undertaking SEO.

Carefully defining business objectives is the starting point from which to establish your Semantic Space. It’s vital to establish a vivid picture of the language, the words and the phrases that will entice appropriate visitors to your site. Also, it defines what you want those visitors to do – the call to action. Your optimum audience online will define your organization in their own terms. By clearly defining your message you can establish an overlap of terms that the audience uses and the ones you want them to use. This overlap is your semantic space and acts as the foundation for your entire campaign.

The key is finding the right terms and phrases, thus defining your semantic space correctly. Use the wrong ones, and you can waste a lot of time and money pursuing the wrong market, or even a non-existent market. At SEO Consult we employ a sophisticated process that chooses keywords by balancing search frequency, competition and relevance.

We compile a list of preferred terms then expand that list using data mining techniques to 20,000 or more semantically similar terms and phrases. The expanded list includes synonyms, common misspellings and alternative spellings, such as ‘optimization’ as opposed to ‘optimisation’, and related terms. We then run a range of separate search databases to establish the terms that really do draw interest. This enables us to calculate the Relative Position Index (RPI) and Click Through Potential (CTP) for each term.

Once you have the appropriate terms and phrases then it’s a case of applying them throughout the rest of the SEO process.

When it comes to the world-class copy that each page should contain, there is no magic keyword density formula. Google has the strictest requirements, allowing no more than 2% of each page’s words as targeted keywords; a keyword density checker return of more than 2% is considered spamming. Yahoo! and MSN Search have much higher keyword density tolerances, of up to 5% of total words. Smart SEO practice recognises that a trade-off is necessary, making sure that copy is optimised but still legible, quality English.
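As a rough illustration of what a keyword density checker measures (the thresholds are the figures quoted above; this sketch ignores punctuation and multi-word phrases for brevity):

```javascript
// Hypothetical keyword density check: occurrences of a keyword
// divided by the total number of words on the page.
function keywordDensity(text, keyword) {
  const words = text.toLowerCase().split(/\s+/).filter(Boolean);
  const hits = words.filter((w) => w === keyword.toLowerCase()).length;
  return words.length === 0 ? 0 : hits / words.length;
}
```

A page of 500 words would therefore stay under Google’s suggested 2% ceiling with no more than 10 uses of the targeted keyword.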

Architecturally, an SEO keyword should be used in the title tag (ideally at the beginning) at least once, or even twice as a variation; once in the H1 header tag of the page; and at least three times in the body copy (sometimes a few more if there’s a lot of text content). A keyword should be included at least once in bold, and ideally should also be used at least once in the alt attribute of an image on the page. An SEO keyword should be used once in the URL and at least once (sometimes twice when it makes sense) in the meta description tag.
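Applied to a hypothetical page targeting the made-up phrase ‘blue widgets’, those placement rules might look like this:

```html
<title>Blue Widgets | Hand-Made Blue Widget Designs</title>  <!-- keyword at the start, repeated as a variation -->
<meta name="description" content="Buy hand-made blue widgets online.">  <!-- once in the description -->
<!-- ... -->
<h1>Blue Widgets</h1>                                        <!-- once in the H1 -->
<p>Our <strong>blue widgets</strong> are cut and finished by hand.</p>  <!-- bolded once in the copy -->
<img src="/images/blue-widgets.jpg" alt="A selection of blue widgets">  <!-- alt attribute; keyword in the URL -->
```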

Links to targeted pages should contain appropriate anchor text, and support the main keyword. Supporting pages should be part of the overall theme, and provide additional keywords. Of particular importance is the need to have as many theme related pages as possible, especially in highly competitive situations.

Make it easy for the search engines. More importantly, make it easy for them on your terms: in your semantic space, using the keywords and key phrases that you have researched and selected.


What is On-Page Search Engine Optimisation (SEO)?

Posted in Uncategorized by Rajib Roy on November 15, 2008

On-page search engine optimisation (SEO), alongside the development of quality inbound links, is a vital aspect in gaining good search engine rankings and cultivating a healthy online presence. What you’re doing essentially with on-page SEO is defining and honing aspects of your web presence directly under your control and making sure that your website is as search engine friendly as possible. Without adequate TLC sites stand far less chance of getting good results in the search engines – neglect leading to neglect. It’s a process that surprisingly few organisations complete as thoroughly as they might. The name of the game is to create a synergy with your site content; on-page SEO should offer the holistic application of semantic space as defined during the requirements analysis and keyword definition stages.

Internal linking, content, conventions and off-page SEO need to work in harmony to create the right synergy. A topicality that defines site and subject identity helps the search engines acknowledge the site as relevant when addressing user queries.

Perhaps the most important aspect of good on-page SEO is generating quality content. Without useful and informative copy there’s little reason for the page to exist in the first place.

It’s becoming increasingly clear that one of the most powerful SEO considerations when generating content is its ability to focus the topical theme of the site to target a specific niche.

Themed, relevant content not only stimulates human readers but also excites the search engine spiders, which constantly scour the web for related material to share. You’ll want to integrate the relevant keywords into the text to attract the search engines, but it’s not something to get hung up on at the expense of a good read. The last thing people need to see when they visit a site is keywords and phrases shoehorned into an article. A 2–8% keyword density is recommended.

If your content is exceptional and links to and from relevant sources, then it has the raw potential to be incorporated into a stellar solution, and the site is quickly elevated to an authority on its subject. It’s that authority that optimisation is trying to nurture, and it delivers strong search engine returns and subsequent traffic.

It’s worth bearing in mind that each page on your site can rank independently, with its own search position. If you group enough pages together around a topic or theme, their relevance can rub off on each other. By focusing on specific subject matter and range of content and by introducing new content regularly you can also increase the crawl rate, get indexed more regularly and be listed on a wider range of terms. Content is well and truly king and the search engines love it.

Search engines and spiders also love light, tight and clean code that lets them rip through a site, pulling out all the relevance and indexing it ready for retrieval. It’s vital that the site can be easily crawled and indexed: from a search engine perspective, the best site in the world is unlikely to rank if the bot can’t extract any content from it. Make sure that your code is valid; bad code can prevent search engines from properly reading a page. Use the W3C validator to check your mark-up.

Focus on internal linking. Link to your own pages from other pages in your site that have tremendous link weight. Internal link juice is critically important. The nature and quality of internal links to a page says a lot about the page, with contributory factors including the number of inbound links, placement of the inbound links within the linking pages, anchor text patterns, and the content of linking pages. This sends a clear signal to search engines that a phrase or a group of semantically related material is a predominant theme on your site. Make sure that the links within your site are complete, there are no spider blocks in the system and that the links also have a semantic relevance.

Other important on-page SEO considerations include:

  • Keyword in title – this was cited as the number one on-page factor when a team of SEO experts evaluated the ten factors that have the most effect on Google’s ranking algorithm. It’s critical that your keyword or phrase is contained in the page title: it’s your best chance to achieve a high ranking, as well as your best chance to convert a searcher into a site visitor.
  • Use H1 and H2 tags properly – your H1 tag should be near the top of your web page and include your most important keywords. Your second most important level of keywords should be listed in your H2 tag. If you have a third level, place it in an H3 tag.
  • Keyword in Description – write a concise, informative description of your website, using your selected keywords. This description will appear in search engine results pages, so it should appeal to potential visitors.
  • Domain Name – a domain name that features your keyword(s) or at least closely related words will also help boost your rankings. Similarly, sub-domain names and other inner page URLs should also use words that are relevant to your keywords.
  • Meta Tags – a little old school but still worth applying. Google is not the only search engine, and meta tags can be more important factors for other engines. Insert your keywords in all meta tags, but don’t overdo it and risk penalisation.
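The checklist above lends itself to automation. Below is a minimal sketch, using only Python’s standard-library HTML parser, that reports whether a keyword appears in a page’s title, H1 and meta description. The class and function names and the sample page are hypothetical illustrations, not a standard tool:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the <title> text, <h1> text and meta description of a page."""
    VOID = {"meta", "link", "br", "img", "input", "hr"}  # tags with no closing tag

    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self.description = ""
        self._stack = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "") or ""
        if tag not in self.VOID:
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if "title" in self._stack:
            self.title += data
        if "h1" in self._stack:
            self.h1 += data

def check_keyword(html: str, keyword: str) -> dict:
    """Report, per on-page element, whether the keyword is present."""
    parser = OnPageChecker()
    parser.feed(html)
    k = keyword.lower()
    return {field: k in value.lower()
            for field, value in [("title", parser.title),
                                 ("h1", parser.h1),
                                 ("description", parser.description)]}

page = """<html><head><title>Organic SEO Services</title>
<meta name="description" content="Ethical SEO campaigns that last."></head>
<body><h1>SEO that works</h1><p>Copy goes here.</p></body></html>"""
print(check_keyword(page, "SEO"))  # {'title': True, 'h1': True, 'description': True}
```

A False in any field flags a page that is missing the keyword in one of the elements the checklist calls out.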

It’s vital to stress again the importance of consistent semantic and topical relevance. The tighter the semantic phrases are, the more concentrated the effect for that particular phrase; particularly if you deploy it in your title, your H1 tags and the anchor text on the page, and reference it in the first 25 words of the document, in the middle and at the end of the page.

The search engines are evolving all the time and smart SEO must follow suit. If your main concern when thinking about effective on-page SEO is meta tags and keywords, instead of usability, quality semantically themed content development, flawless information architecture and creating robust, dynamic websites that support their own rankings (authority sites), then it’s time to speak to SEO Consult.


White Hat and Black Hat Search Engine Optimisation Techniques Explained

Posted in Uncategorized by Rajib Roy on November 15, 2008

As long as search engines have existed, the debate has raged over white hat and black hat search engine optimisation (SEO) techniques. Some website owners and SEO companies are willing to go as far as is necessary in the pursuit of high search engine placements, while others take a more long-term ‘ethical’ approach to representing their websites and their clients.

If you search online you will soon notice that as well as the ‘should you/shouldn’t you?’ or ‘dare you/daren’t you?’ arguments and discussions, much of the web chatter revolves around what actually constitutes White Hat and Black Hat SEO in the first place. Discussing ‘shades of grey’ in this context is a strange debate in many ways, as the guidelines and regulations are all clearly defined by the various search engines. In reality there’s very little need for debate: adherence to the guidelines will result in White Hat SEO, and deviation from or ignoring them will lead to Black Hat SEO. Violate any of the guidelines and you risk having the site removed from the index. That’s pretty black and white.

At SEO Consult we see it clearly. If a website is optimised for humans it is sure to get high rankings. Good for business. If it is designed to trick search engines into believing it has more search value than it really does it may also get high rankings, but the difference is that it will be found out and will be punished. Very bad for business.

Some people may be happy to ‘churn and burn’. We don’t do that.

Here are some black hat techniques to be avoided:

  • Buying Links – bought links give extra weight to the promoted page in the search engine algorithms, while adding no extra value for visitors.
  • Cyber Hoaxing – a technique used with affiliate programs. A fake news site hosts a hard-to-prove-or-disprove sensational fake news story, submitted across a range of social media sites such as Digg, StumbleUpon, etc. The basic idea is to generate a buzz and attract links to the fake news story, even capitalising on the outrage when the setup is exposed.
  • Keyword Stuffing/Hidden Text – defined as stuffing a page with keywords and keyword phrases that can be read by search engine spiders, but not by human visitors. They can be located in a hidden div tag, coloured so that they blend into the background, or even placed within HTML comment tags. Old school and increasingly ineffective.
  • Doorway pages – these are web pages with the sole aim of being spidered by search engines and included in the search engine results pages (SERPs). Usually named after the primary keyword being targeted, they are often stuffed with keywords and created in bulk. They will likely have a form of meta refresh tag or JavaScript redirection, sending visitors to the ‘money site’.
  • Web Page Cloaking – cloaking is a technique that shows a doorway page to search engine spiders but the ‘money page’ to human visitors. Both pages are accessed using the same URL with software used to identify the search engine spiders and serve the doorway page to them. Competitors are kept from scraping the content of the optimised doorways, and human visitors are kept from seeing the ugly doorway pages.
  • 302 Redirect Hijacking – the creation of a web page on a high-PageRank domain with a 302 redirect to the page being hijacked. Spiders follow the redirect to the second page and index it, but on the SERP the URL shown for the indexed page is that of the page with the redirect.
  • Scraping and Spinning – software that grabs content, paraphrases it, randomises it and generates ‘new’ content, often containing links to the sites being promoted. Spinning content into text that avoids the duplicate-content penalty is considered the holy grail of black hat techniques.
  • Splogs – close cousins to scraping and spinning, splogs are simply meaningless, worthless blogs with automatically generated content. Many splogs read RSS feeds and create a blog automatically. Splogs can be used to get other sites indexed or their PageRank increased by including links to them. It is estimated that over 20% of online blogs are actually splogs.

Last and not least:

  • Link Spamming/spamdexing – this is a way of getting links through the use of automated software, which accesses unprotected blogs through anonymous web proxies and leaves links in their comments.

To an extent the search engines are stuck between a rock and a hard place. On one hand they try to be as transparent as possible to help site owners, but not so transparent that they provide so much information that those so inclined can game the system.

It’s essentially a business decision as to whether an SEO company or webmaster chooses to adhere to the stated guidelines or to work beyond them. Techniques that violate the guidelines aren’t White Hat. They may be effective, commonplace, non-deceptive or justified in the short term, but that doesn’t make them White Hat. If your business objectives, timescales and reasoning can support Black Hat SEO then go for it, but remember that Black Hat SEO puts the site at risk of being removed from the search index. Not only that, but Black Hat SEO begs the questions ‘what is SEO?’ and ‘what are you trying to achieve?’ If you view SEO as a long-term project, then it’s about creating quality content that’s the most relevant result for a desired query, and supporting that content in a topical environment with quality, relevant inbound links. If making sure that the site can be easily crawled and indexed by search engines with light, tight code matters to you, then perhaps Black Hat SEO isn’t for you.

People can get very judgmental when it comes to SEO, referring to it as ethical or non-ethical. At the end of the day SEO is about results: it’s about delivering business objectives to customers, professionally and over the long term. At SEO Consult we’re as creative as we have to be to give our clients a competitive edge. However, we won’t jeopardise their Internet presence for the sake of a ‘quick fix’ solution.


How do Google, Yahoo! and MSN differ in ranking results for Natural Listings?

Posted in Uncategorized by Rajib Roy on November 15, 2008

There are approximately 1.5 billion Internet users worldwide, of whom 50% use search engines every day. Google, Yahoo! and MSN currently account for over 90% of all these searches, being by far and away the three most popular search engines in the English speaking world, (the Chinese language search engine Baidu actually ranks third in terms of numbers of users and searches overall).

Of these big hitters Google is the massively dominant force, the biggest operation in the market by a long way. According to the latest search metrics from comScore, 63% of searches conducted in the USA during August were made using Google, up from 61.9% in July and 56.5% in August 2007.

Despite Yahoo! being the second most visited website in the U.S., and the most visited website in the world, its share of the search market slipped to 19.6% from 20.5% in July and from 23.3% a year earlier. Microsoft’s share fell to 8.3%, down from 8.9% in July and 11.3% last year.

Though Google, Yahoo! and MSN are all essentially seeking to achieve the same thing – to provide maximally relevant search returns to their users – using similar concepts and technology, they each apply their own unique techniques and philosophies. Clearly Google’s approach has struck a chord with web users. Not only did Google achieve critical acclaim when it first appeared on the search scene, for its clean layout and the efficacy of its search algorithms; it also achieved a critical user mass. Mass adoption continued under its own momentum through word-of-mouth recommendation and referral. What started off as a college project a decade ago has, almost unbelievably, resulted in the creation of what is now one of the world’s most successful companies. Google’s acquisitions of YouTube, Blogspot/Blogger and FeedBurner have expanded its already extensive user reach and fan base.

WPP-owned research company Millward Brown reports that a combination of brand recognition and financial performance gave Google the top spot in a list of global brands. New research estimates its value to be $86bn (£43bn), a 30% year-on-year increase.

A dilemma for search engine optimisation (SEO) professionals is reconciling the optimisation necessary for a site or web page to appear towards the top of the search results on different search engines against the fact that each engine uses its own unique algorithms and definitions of relevance. It’s a global balancing act, and inevitably there is a trade-off between currying favour with the biggest player in the market (Google) and serving other, smaller but no less relevant search engines.

There are those of the view that if you target all three you’ll end up failing with all three. Others are of the opinion that if you target only one (Google, MSN or Yahoo!), you can apply a tight focus to generate predictable and successful rankings. People sometimes put the choice down to domain age, with a 2-year-old domain focusing on Google and a 1-year-old or younger domain best suited to MSN and Yahoo!. The following ten factors are reported as having the most effect on Google’s ranking algorithm:

  • Keyword use in title tag
  • Anchor text of inbound link
  • Global link popularity of site
  • Age of site
  • Link popularity within the site’s internal link structure
  • Topical relevance of inbound links to site
  • Link popularity of site in topical community
  • Keyword use in body text
  • Global link popularity of linking site
  • Topical relationship of linking page

Whilst Google’s market dominance might make it tempting to take these factors on board wholesale, it would be wise also to consider and compare some of the main optimisation issues of the other major engines.

Whilst the algorithms are jealously guarded and highly secret, there are some trends and general patterns that can shed at least a little light on how Google, Yahoo! and MSN differ in ranking results for natural listings:

Yahoo!

  • Text processing query matching.
  • Generally considered better than MSN but inferior to Google in determining if a link is natural or not.
  • Pretty good at crawling sites deeply so long as they have sufficient link popularity to get all their pages indexed.
  • Has huge amounts of internal content and a paid inclusion program, both of which give them incentive to bias search results toward commercial results. In this context paid inclusion might be legitimately regarded as a component of organic optimisation drawing a bias.
  • On the flip side of that is that if you are trying to rank for highly spammed keyword phrases the top 5 or so results may be editorially selected.
  • Off topic reciprocal links may still yield dividends

MSN Search

  • Relatively new to search.
  • Weak at determining the natural or artificial nature of a link.
  • Place a great deal of weight on the page content.
  • Their poor relevancy algorithms are claimed to cause a heavy bias toward commercial results.
  • Likes bursty recent links.
  • MSN is considered inferior to Yahoo! or Google at crawling deeply through large sites.
  • Will generally rank new sites faster than other engines, which regard them as un-trusted.
  • Off-topic reciprocal links can yield results in MSN Search.

Google

  • Concept processing query matching.
  • Best at determining link authenticity.
  • Looks for natural link growth over time.
  • Search results biased toward informational resources.
  • May trust old sites too much.
  • Has aggressive duplicate content filters.
  • If a page is obviously focused on a term Google may filter the document out for that term.
  • On-page variation and link anchor text variation are important.
  • Crawl depth determined not only by link quantity, but also link quality.
  • Off-topic reciprocal links are generally ineffective and may even result in penalisation.

At SEO Consult, our experience and expertise means that we understand the subtleties and nuance of different engines’ behaviour. Depending on the specific business objectives of individual client campaigns, we can advise and act accordingly.


What Does the Future Hold for Search Engine Optimisation (SEO)?

Posted in Uncategorized by Rajib Roy on November 15, 2008

Organic search engine optimisation (SEO) plays a key role as a component of any organisation’s overall Internet marketing strategy. How best to conceive and deploy SEO as part of a successful Internet presence in the future is a very important marketing question deserving of close attention.

SEO has been beset by image problems and a reputation tarnished by the ‘all things to all people’ snake-oil salesmen, as well as self-inflicted wounds attributable to the ‘don’t know what they’re doing’ brigade. Over the last couple of years it has slowly emerged from the shadow of keyword-spammers and black hats, increasingly evolving into a highly specialised and professional field.

Trying to trick the search engines these days is pointless; the potential penalties and risk of delisting far outweigh any potential short-term gains. Good, ethical, forward-thinking SEO is all about compliance and quality, not quantity. There are no spells, no fairy dust and no magic that can generate perfect SEO at the press of a button. As with too-good-to-be-true ‘get rich quick’ schemes, the only thing that you can guarantee happening with any speed is that you’ll lose your money (and often an amount of professional integrity into the bargain). If a website is compliant with search engine guidelines, contains high quality, regularly updated content and is linked to authoritative sites with topical relevance, then Google, Yahoo!, MSN and the other search engines will evaluate and accredit the website as being fit for their indexes. If your SEO is applied professionally and with insight and experience, it will most likely rank highly on those engines. A happy search engine is a happy searcher, is a happy SEO company, is a (most importantly) happy client. Happiness abounds.

Internet users are now much more demanding, less forgiving and more sophisticated than in the past. They often know what they want and they want it now. Many can sense spam a mile away and will vote with their mice if they feel they’re being played or tricked. Who wants to waste their time, energy and money dealing with companies trying to get one over on you from the outset? Ethical SEO is appreciated by a far more technologically advanced audience, which responds positively to quality SEO. That positive response doesn’t look likely to change any time soon.

Internet users want topicality and they want relevance. Astute search engines bend over backwards to deliver algorithms that cut out the rubbish and deliver relevant results. Astute SEO companies are wise to acknowledge those efforts and work to creatively comply with the engines’ rules and regulations, delivering ethical SEO as part of a total web marketing package.

Considered and professionally applied SEO is and will remain a potent weapon in the acquisition of targeted traffic: target audience/semantic space overlap, smart clean programming, strategic keyword deployment, quality content, inbound links and compelling calls to action all contribute to relevance.

The major search engines are working hard to evolve their personalisation technology – personalisation in the context of specialist marketing niches, particularly local, media, links, social and video. As media delivery becomes richer and more diverse, so the SEO skills of web marketing will need to be applied through a wider range of channels. Matt Cutts of Google says, “Personalization is one of those things where, if you look down the road in a few years, having a search engine that is willing to give you better results because it can know a little bit more about what your interests are is a good thing”. Signals generated by a history of what individual users have searched for in the past are increasingly used by Google to offer results that reflect and are relevant to each person’s interests.

It seems clear that ‘new’ SEO – social media optimisation, link bait, and the techniques used to naturally create a buzz, attract word-of-mouth visitors who return repeatedly, and draw back links and discussion – is representative of the personalisation concept. Localisation (offering different results for different countries) is also a component of personalisation, and for a company planning to market internationally there will be SEO implications, both cultural and linguistic.

Other future developments are likely to be based on search and content delivery, with big opportunities for SEO consultancies to evolve into single ‘all Internet services’ providers. A company which can provide a comprehensive range of Internet services and niche specialisation even beyond traditional webdev/SEO/hosting, and in addition offer a range of niche tech requirements such as telecoms and IT services, will benefit enormously.

Emergent Web 2.0 and Mobile Web technologies offer opportunities for instant, on-demand search results and information retrieval. Technologies that will shape development standards and offer further website marketing require fresh ideas and approaches. The massive adoption of PDAs and handheld Internet devices will impact on the accessibility and compatibility of SEO solutions. With tens of millions of mobile devices likely to be sold worldwide every year, it will become vital to make content easily accessible to in-car browsers, WebTV, Lynx browsers, screen readers and PDAs. Cascading Style Sheets (CSS) based sites with strong ratios of content to code enable screen readers and mobile browsers to work more effectively through the HTML. Search engines also like high-content, clean, CSS-based sites.
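The content-to-code ratio mentioned above can be estimated crudely by comparing a page’s visible text against its total markup. Below is a minimal sketch using Python’s standard-library HTML parser; the sample markup and the idea of reading the ratio as a single number are illustrative assumptions, not any engine’s published metric:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def content_to_code_ratio(html: str) -> float:
    """Visible text length as a fraction of total page size."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    return len("".join(parser.chunks).strip()) / len(html)

lean = "<p>Search engines favour lean, content-rich markup.</p>"
bloated = "<div><div>" + "<b></b>" * 50 + "Thin content.</div></div>"
print(round(content_to_code_ratio(lean), 2))     # high: mostly text
print(round(content_to_code_ratio(bloated), 2))  # low: mostly markup
```

A low ratio is a hint that a page carries more scaffolding than content, which is exactly what moving presentation into external CSS is meant to cure.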

XML feeds and other technologies like RSS (Really Simple Syndication / Rich Site Summary) syndicate summarised website content for third party aggregators and automatically flag fresh content for viewing. Web users, businesses and marketers are all drawn to such dynamic, proactive technologies and initiatives.
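As a concrete illustration of the syndication point above, an RSS 2.0 feed is just XML and can be consumed with a few lines of standard-library Python. The feed content and URL below are invented for the example:

```python
import xml.etree.ElementTree as ET

rss = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Seo-mama's Blog</title>
  <item>
    <title>What is On-Page SEO?</title>
    <link>http://example.com/on-page-seo</link>
    <description>Defining the aspects of your web presence you control.</description>
  </item>
</channel></rss>"""

root = ET.fromstring(rss)
print(root.findtext("channel/title"))
for item in root.iter("item"):  # an aggregator would loop over every item
    print(item.findtext("title"), "->", item.findtext("link"))
```

Generating a feed is the mirror image: build the same element tree and serialise it with `ET.tostring`.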

Irrespective of platform or medium, SEO will inevitably evolve and adapt in response to the dynamics of the Internet, all the time aspiring to deliver ever more reliable and relevant search results. As the search engines refine their processes, the opportunities to elicit competitive advantage through the traditional exploitation of engine algorithms will diminish, replaced by the far more exciting opportunities afforded by generating websites and content of exceptional worth delivered via new and innovative channels.


How Search Engine Optimisation (SEO) Started and About Search Engine Optimisation (SEO)

Posted in Uncategorized by Rajib Roy on November 15, 2008

The story of the search engine begins with ‘Archie’, created in 1990 by Alan Emtage, a student at McGill University in Montreal. At the time the World Wide Web and its protocols did not yet exist. However the Internet did, with many files scattered over a vast network of computers.

The main way people shared data was via File Transfer Protocol (FTP). If you had a file you wanted to share you would set up an FTP server. If someone was interested in retrieving the data, then they could access it using an FTP client. Even with archive sites, many important files were still scattered on small FTP servers. This information could only be located by the Internet equivalent of word of mouth – with somebody posting an email to a message list or a discussion forum announcing the availability of a file.

Archie changed all that. Archie’s gatherer scoured FTP sites across the Internet and indexed all of the files it found. Its regular expression matcher provided users with access to its database. And there it was – the world’s first search engine.

The first website was put online on August 6th 1991. It provided an explanation of what the World Wide Web was, how one could obtain a browser and how to set up a web server. It was also the world’s first web directory, since its creator, Tim Berners-Lee, maintained a list of other websites.

By the end of 1994 the web had 10,000 servers, of which 2,000 were commercial, and 10 million users. Traffic was equivalent to shipping the entire collected works of Shakespeare every second – minuscule by today’s standards.

Primitive web protocols were established and the technology evolved. As the web grew, it became more and more difficult to sort through all of the new pages added each day. Web robots were devised that crawled the net, following links from site to site, capturing and indexing website URLs in giant search databases. Search engine brands we are familiar with today, such as Excite, Lycos, Infoseek and Yahoo!, started to appear around the mid-1990s. Returns varied enormously from engine to engine depending on the technology, and so MetaCrawler was developed, reformatting the output from the various engines it queried onto one concise page.

It wasn’t long after the advent of search engines before advertisers noticed the massive popularity of the search engines compared to other types of site. Receiving daily hits in the millions, the search engines had stumbled across a search-driven advertising gold mine. The rewards for websites placed on the search engine’s first page through high search ranking started to grow as visitors clicked through to the site and followed the call to action.

Clicks turned into cash as the Internet became financially viable through advertising revenues, e-commerce and other commercial opportunities. Webmasters sought ever more inventive ways to get their sites to the top of search returns. In so doing they created what has since become the multi-million-dollar Search Engine Optimisation (SEO) industry.

Over the last decade the SEO techniques used to secure top positions in search results have changed repeatedly, as the search engines battle to retain their integrity and keep search relevance the number one priority when generating returns. All search engines apply highly complex and top secret algorithmic formulas by which they assess queries and match them to (hopefully) the most relevant returns. These formulas are the core differentiators between search companies – their currency – and the means by which they claim their competitive advantage over each other. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be falsified would drive users to other search sources.

The optimisation/algorithm dance is a game of cat and mouse, with the search companies working hard to keep one step ahead of webmasters and SEO marketers. In the early days of search engine optimisation, getting listed was straightforward. Using descriptive file names, page titles and meta descriptions with keywords in sufficient density would normally do the trick. Often returns weren’t particularly relevant, but people’s expectations weren’t that high. The ‘add URL’ function was king, and software that made automated submissions was a large part of SEO strategy.

By the late 1990s users had begun to get frustrated with slow page-load times and irrelevant search results. This proved an ideal launch pad for a slick, fast and powerful new search engine: Google. Web users across the world started to recommend it, and Google gained largely deserved critical acclaim and, more importantly, critical mass.

Webmasters and SEO professionals began to realise that the engines, and especially Google, liked flat HTML sites bursting with relevant text (they still do), not the old-fashioned, slow, image-laden sites where graphics were dominant. Ethical SEO was coming of age.

This new era brought inevitable problems, with legitimate SEO techniques copied and abused. Spam and spamdexing also started to be an issue with email addresses and micro sites of successful, high-ranking sites targeted.

As search algorithms advanced, automated submission started to become frowned upon and other factors such as link popularity began to have a large effect.

The leading engines were wising up with sophisticated spider technology at the forefront of advancements. Google began to get tough with its linking algorithm, penalising websites that were found to be linking with each other from the same server or IP address and applying filters, identifying unscrupulous techniques and adjusting algorithms accordingly. These days the search engines use sophisticated and complicated algorithms incorporating over 200 criteria to determine site ranking.

To try and trick the search engines these days is futile. What you might get away with today will most likely cause you to suffer tomorrow. Good SEO is now all about compliance. There’s no magic, no silver bullets. If a website is compliant with search engine guidelines, contains high quality, regularly updated content and is linked to authoritative sites with a topical relevance, then Google, Yahoo!, MSN and the other engines will evaluate and accredit the website as being fit for its index. Additionally, if your SEO is applied professionally and with insight and experience, you will most likely rank highly on those sites.

