Friday, May 6, 2011

[SEO] All-in-one top 3 free SEO tools

There are a variety of free and paid SEO tools on the internet; they have different primary functions and can help different kinds of websites in different ways. I am going to tell you what my 3 best free SEO tools are, so that you can use them to your advantage like I have done.

Google Webmaster Tools

A must-have for anyone with a website, whether it's a forum, a blog, a static website or anything else, is Google Webmaster Tools.

To get started with Google Webmaster Tools, all you need is a Google account, which most people already have from their Google AdSense or Google AdWords accounts. If you do not have a Google account, you can sign up for one here: www.google.com/accounts/. After you have signed up, go to Google Webmaster Tools and log in.

Next you can add your website to the Webmaster Tools account. You will need to verify your website in one of the two ways they give you to choose from: one is to add a meta tag to the head section of your website, and the second is to upload a text file. I advise anyone to verify their website through the meta tag option; it's far simpler.
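
For reference, the meta tag option boils down to pasting a single line into your page's head section. The sketch below is illustrative; the content value is a placeholder, since Google generates a unique token for each site:

    <head>
      <!-- placeholder token; Google generates a unique value for your site -->
      <meta name="google-site-verification" content="your-unique-token" />
    </head>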

Google Webmaster Tools lets you do loads. The tool I use most is "Top Search Queries", which displays the top 20 search queries that you usually rank for, while the right column displays the top 20 search queries that most often get clicked. However, do not take every rank as an accurate reading; it is only an average and is usually a week or two behind.

The second tool I use most is the links report, mainly the external links. This shows you the number of external links Google has found to your website; again, don't take it as an exact figure. You can also see the number of internal links to every single page on your website, allowing you to see which page is the most commonly linked and which pages you may be missing out on.

The third tool I use most is the sitemap tool, which allows you to submit an XML sitemap through Google Webmaster Tools. This will help your website get indexed faster and enable your site to be crawled more quickly. You can read more here about sitemaps and why they are essential.
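
If you have never seen one, a minimal XML sitemap follows the sitemaps.org protocol and looks like the sketch below; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- one <url> entry per page you want crawled -->
        <loc>http://www.example.com/</loc>
        <lastmod>2011-05-06</lastmod>
      </url>
    </urlset>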

Google Webmaster Tools is definitely a must-have for anyone in the website business; whether it's just a personal blog or one of many sites in your portfolio, it's a key tool which can easily be used to your advantage.

Rank Checker

Rank Checker is a Firefox extension which allows you to easily check where you rank in Google, Yahoo and MSN. You need Mozilla Firefox to use Rank Checker, and you can download Firefox here.

Rank Checker is a great tool which allows you to check your search engine rankings in multiple search engines at the same time. You have the option to check your ranking for one keyword or for multiple keywords. Not only this, but you can also save your keywords to Rank Checker, so in the future you can return and check your rankings again without having to type them all in again.

The tool is really easy to install and appears as a little icon in the bottom right of your Firefox browser window. There is so much you can do with Rank Checker; why not check it out here. It's free.

You can find my last recommendation already on the website here: SEOquake.

Enjoyed this? Why not download the 5-part series as a 29-page e-book; I am sure you will find it helpful. Subscribe and download right here.


Source : http://back-links.org/all-in-one-top-3-free-seo-tools/

[SEO] Rank Checker SEO tool review

Rank Checker is a great tool which I use daily. It enables you to check the rankings of your websites for certain keywords you may be targeting. It is a Firefox extension available to download from SEO Book.

Installing the SEO tool is easy and takes about a minute. After you have installed it, Firefox will restart, and in the bottom right-hand corner a brand new icon will appear, which looks like a smaller version of the image above.

SEO Book not only provides the SEO tool and a download link, but they also provide a great step-by-step guide to installing the tool and putting it to use. They explain what you can do with the tool and how to use it effectively; this is something that no SEO should work without.

Download here


Source: http://back-links.org/rank-checker-seo-tool-review/

[SEO] SEO Tools To Increase Your Website's Popularity

Your website has been up and running for a while now and you are stuck wondering why the search engines aren't showing you any love in the results pages. No matter what you type in, you just can't find your site(s). What could be the problem?

Typically, where search engines are concerned, if your site isn't showing up at all, the problem is most likely one of search engine optimization, or SEO for short.

What is this search engine stuff you're talking about, you ask? Keep reading and I'll explain.

Search engine optimization is the art of making your site search engine friendly. What this means is that you make your site easy to access by the search engines by taking the guesswork out of what your site is about.

Search engines look for the following things (a non-exhaustive list) when they spider or crawl (look at) a website:

1.) Title of the page

A search engine looks into the code of your web page to read what the subject of your page is. What it reads is called the title tag, and this is what shows up at the top of your browser's window when you visit a site.
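
For example, a title tag sits in the head section of a page's HTML (the wording is illustrative):

    <head>
      <title>Irish Setter Dogs As Pets</title>
    </head>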

2.) Description of the page

Search engines also read the description you give for your page in the description meta tag of your site. This is what you read as a description when you do a search and view the results pages. If you don't supply a description meta tag, a search engine will usually use the first paragraph of your page to describe it. This is something you don't want, since you want to provide a succinct but curiosity-building description of your page that also shows relevance to the searcher, to get them to visit your site.
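
A description meta tag sits alongside the title tag in the head section; the wording below is just an illustration:

    <meta name="description" content="How to choose, train and care for an Irish Setter as a family pet." />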

3.) Keywords of the page

Lastly, as far as the code of your page goes, search engines pay particular attention to the keywords meta tag of your web page. Similar to the description tag in format, the keywords meta tag tells the search engine what search terms your page is about, such as pets, dogs, Irish setters; this way the search engine can rank your page in terms of importance or relevance when someone searches for "Irish Setter Dogs As Pets". When defining the keywords for your page, try to target a maximum of 3 per page, and make sure that the keywords you are using actually appear in your web page in the opening paragraph, the middle of the body and the closing paragraph.
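
Using the example keywords above, the tag looks like this:

    <meta name="keywords" content="pets, dogs, Irish setters" />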

4.) Emphasis on the page

One of the major on-page factors (not code) that search engines also use to rank your page is anchor text (links), along with text descriptors like bolding and paragraph headers. Recently, the names you give your page's images have become important as well, as have any alt and title tags you use to describe them, and links on the page.

An example of each:

    <img src="irish-setters.jpg" alt="Irish Setters" title="Irish Setters" />

    <a href="http://www.example.com/irish-setters/">Irish Setters</a>

    <h2>Irish Setters: A Paragraph Header</h2>

5.) Relevance of the page

Search engines are designed to rank a web page by the factors above, but they also rank your page according to off-page factors, one of the most important being backlinks. Search engines consider your web pages to be more relevant if other related sites point to your site, and especially so if the anchor text used happens to be one of the keywords defined in your page's keywords meta tag.

Many marketers attempt to build off-page relevance by using methods such as article marketing or parasite hosting marketing. Article marketing is by far the most proven and used method, and parasite hosting, on sites like Blogger, Squidoo and HubPages, is a close runner-up in proven effectiveness.

Even though these methods work, your site will get more “link” juice from high page ranking sites that are in the same niche as yours and that also use the appropriate anchor text to point to your site. Other websites will typically only point a link at your site or pages if the content you are offering is useful, unique and solves a problem or educates the web site owner’s readers.

So, now that you know how to begin optimizing your site for the search engines to make it more popular, your next step is to make sure that your web pages have the right "in page" text, such as the title, keywords and description tags, as well as "on page" text like bolded keywords and link anchor text.

After this, you should begin creating links to your pages using article marketing and the other methods discussed in this article.

Finally, try to build relationships with other webmasters to make them aware of your site so they will possibly link to it. You can also begin this process yourself by using methods such as blog commenting and forum posting.

In closing, I have two quotes for you that I keep right above my workspace at home. Take heed to them because they are full of truth.

“Success is to be measured not so much by the position that one has reached in life as by the obstacles which he has overcome while trying to succeed. ”
Booker T. Washington

“Nothing in this world can take the place of persistence. Talent will not; nothing is more common than unsuccessful people with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent. The slogan ‘Press On’ has solved and always will solve the problems of the human race. ”
Calvin Coolidge

If you've picked up any pointers from this article that you can put into action, then by all means, do so. You won't really be able to gain any benefits from your new knowledge if you don't use it.

Now, I’ll leave you with a saying of my own.

Being the best means being the best that YOU can be. Always strive to be a better you and you will always be the best.


Source : http://back-links.org/seo-tools-to-increase-your-websites-popularity/

[SEO] How To Build Traffic To Your Blog

With the growing interest in blogging as a means of online promotion and branding, a lot of marketers are starting blogs to promote their opinions, products, books and services.

But a blog is like a website. "Write and they will come" isn't exactly a magic formula to bring in traffic by the boatload.

If you need to promote your website in order to build traffic to it, you need to promote your blog as well.

Here are some ways you can become a well-read and influential blogger.

1. Write Posts That People Will Want To Read

This should be common sense, but many marketers tend to forget that their readers are real people and that they need to use the principles of online copywriting to make their headlines and copy interesting to those readers.

If you write posts that people enjoy reading, they will reward you by returning to your blog regularly.

Make your posts conversational, pithy and topical. Keep them short and stick to one topic per post.

Write often and regularly so that both readers and search engines visit your blog more often.

2. Optimize Your Posts for Search Engines

I cover this topic in detail in my article on "Search Engine Optimization For Blogs".

But here are the most important rules to follow to get your posts listed for keywords of your choice.

  • Make sure your blog URL contains the primary keyword you want to optimize for
  • Use your primary keywords in the title of your post
  • Use your secondary keywords in the body of your post
  • Use your keywords in the anchor text of links in the body of your posts

3. Submit Your Blog and RSS Feed To Directories

If you publish a blog you should submit your blog and RSS feed to big directories like Yahoo and Dmoz, as well as the numerous blog directories and search engines.

Here is the best list I've found of places to submit your feed or blog, compiled by Luigi Canali De Rossi, who writes under the pseudonym Robin Good.

Best Blog Directory And RSS Submission Sites

Another list of sites to submit your Blog

4. Ping The Blog Services

There are a number of services designed specifically for tracking and connecting blogs. By sending a small ping to each service you let them know you've updated your blog so they can come check you out.

Bookmark the Ping-O-Matic ping results page so you can visit it and quickly ping a number of services with a single click.
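
Under the hood, these pings are tiny XML-RPC calls. A typical weblogUpdates.ping request body, POSTed to a service's endpoint (Ping-O-Matic's is rpc.pingomatic.com), looks like the sketch below; the blog name and URL are placeholders:

    <?xml version="1.0"?>
    <methodCall>
      <methodName>weblogUpdates.ping</methodName>
      <params>
        <!-- first param: your blog's name; second: its URL -->
        <param><value>My Blog</value></param>
        <param><value>http://www.example.com/blog/</value></param>
      </params>
    </methodCall>

Blogging software such as WordPress typically sends these automatically whenever you publish, so manual pinging is mainly a supplement.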

5. Build Links To Your Blog

I recommend the methods here as the best ways to get links pointing to your blog.

  • Link to your blog from each page on your main website
  • Trackback to other blogs in your posts
  • Post legitimate comments on other blogs with related topics
  • Offer to exchange links with other similarly themed blogs and websites

6. Edit Your Blog Posts Into Articles

One of the best methods for promoting your website is to write articles and submit them to article directories.

The suggestion for extending this to edit your blog posts into articles and submit them to directories came from the coach at "Explode Blog Traffic" who also has other noteworthy suggestions at his blog.

You'll find an extensive list of article directories here.

7. Create Buzz About Your Blog

Creating a buzz about your blog posts and topic in the local and online media will give your marketing a viral component.

  • Create a controversy around your blog or its topic.
  • Distribute bumper stickers or other merchandise with your blog's URL and tagline.
  • Write a press release about something newsworthy and tie it in with your blog topic.

8. Capture Subscribers By Email

It may seem strange for a blogger to send out updates by email, but email is still the #1 choice of most people who want to receive news and information.

Using a free service like Feedburner to manage your subscriptions is easy and it allows your subscribers to manage all their subscriptions from one interface.

However, if you want more control over your list and don't mind mailing out the updates yourself, you can use an autoresponder system to capture and follow-up with subscribers.

These tips should give you a good start to building your blog traffic.


Source : http://www.blog-maniac.com/build-blog-traffic.htm

[SEO] Search Engine Optimization for Blogs

Blogging software is really a simple Content Management System (CMS) that easily adds new pages and integrates them into your site's navigational structure and linkage.

Blogs and blog posts are naturally search engine friendly because they are text-rich, link-rich, frequently-updated webpages that use stylesheets or CSS, and have very little extraneous HTML.

Optimizing a blog is very similar to optimizing a website, and optimizing a blog post is similar to optimizing a web page.

But depending on the blogging service or software you use, the results may look somewhat different.

If you follow some simple rules for search engine optimization, your blog can rank much higher than static website pages in the search engine results pages.

Here are the most important rules to follow to get your posts listed for keywords of your choice.

1. Use your primary keyword in your blog domain

Whether you purchase a separate domain (recommended) for your blog, or host it on a blogging service or a subdomain of your own site, try to ensure that your URL contains the primary keyword you want to optimize for.

For example, if you want your blog to get found for the keyword "rss", get a domain containing the keyword "rss", or use the keyword in a subdomain.

Getting a domain name with your own name might make for good branding, especially if yours is a personal blog.

But if you're doing it for business and want the targeted traffic to flow your way, keywords in the domain or subdomain are a move in the right direction.

2. Use your primary key phrase in your blog header tags and the title of your posts

If your primary key phrase is "business blogging", make sure that the word business, or blogging, or both, appear in your blog headers (the H1 or H2 tags) as well as in the title of each of your posts.
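
In HTML terms, that means the key phrase shows up in places like these (titles illustrative):

    <title>Business Blogging: Search Engine Optimization For Blogs</title>

    <h1>Business Blogging</h1>
    <h2>Why Business Blogging Beats A Static Site</h2>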

Most blogging software will take the keywords in your post title and put them into the file name of the permalink posts it creates.

For example, if you have a blog on Blogger and title your post "Search Engine Optimization For Blogs", Blogger will automatically create a page with your post and name the file "search-engine-optimization-for-blogs.html" or something similar.

With other server-side software like WordPress and Movable Type, you may need Apache's mod_rewrite module to serve the titles of your entries as permalinks.
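
As a sketch, the standard rewrite rules WordPress places in its .htaccess file for "pretty" permalinks look like the following; treat them as illustrative, since WordPress can generate them for you from its permalink settings:

    # Standard WordPress rewrite rules (illustrative)
    RewriteEngine On
    RewriteBase /
    # Leave the front controller itself alone
    RewriteRule ^index\.php$ - [L]
    # Let real files and directories through untouched
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    # Route everything else to WordPress, which resolves the permalink
    RewriteRule . /index.php [L]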

3. Use your secondary keywords in the body of your post

If you want to get listed for secondary keywords, use them sparingly in the body of your post and pepper your blog titles or links with them where appropriate.

Don't overdo this or your posts will end up sounding unnatural and spammy to readers.

4. Use your keywords in the anchor text of links

Keywords in links carry more weight than plain text.

Use your primary and secondary keywords in the anchor text of links when linking to other blog posts or to other pages on your main site.

Link keywords where they naturally appear in the body text, but again, don't overdo it, or you'll end up with spammy looking pages.
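
For instance, a keyword-rich link beats a generic one (URLs illustrative):

    <a href="http://www.example.com/business-blogging/">business blogging tips</a>

rather than:

    <a href="http://www.example.com/business-blogging/">click here</a>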

5. Make sure search engines can spider your blog easily

Set up your blog so that the side navigation bar is present on all pages.

Make sure your archives and previous posts are accessible from all pages of your blog so they get spidered easily.

6. Get backlinks from other blogs or websites

Links pointing to your blog or posts are essential to build PageRank and make your blog rank higher in the search engine listings.

I would recommend that you focus your linking efforts on the methods below.

Submitting to Blog Search Engines and Directories:

Submitting your blog and RSS feed to blog search engines and directories is essential for getting high-quality links back to your blog.

Here is the best list I've found of places to submit your feed or blog.

Best Blog Directory And RSS Submission Sites

Link Exchanges:

Many similarly-themed blogs are often willing to exchange links with other blogs and form richly interlinked networks or communities. Link exchanges with other blogs are easy to implement with most blogging software.

Trackbacks:

You can also get links back to your blog using trackbacks. One of the disadvantages of using Blogger is that it does not automatically create trackback URLs that others can use to link back to your posts.

But if trackbacks are an important component of your linking strategy, I would advise using software or a system that adds this feature automatically.

Comments:

You can also get backlinks to your blog by posting legitimate comments in response to posts on other blogs.

7. Update frequently

There's no better food for search engine spiders than fresh content.

Post and update your blog frequently using all the rules outlined above and there's no reason why your blog will not get you top rankings in a short period of time.

8. Stay put

Once you create your blog, try to stick to the same domain and blog host or system for as long as you continue to publish.

You could end up losing a lot of your traffic, your readers and all your search engine listings if you decide to move.


Source : http://www.blog-maniac.com/blog-seo.htm

[SEO] Introduction to Search Engine Optimization

Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search[1], news search and industry-specific vertical search engines. This gives a website web presence.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The initialism "SEO" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing and article spinning that degrade both the relevance of search results and the quality of user experience with search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.


History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[2] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[3] The first documented use of the term Search Engine Optimization was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997 on the Internet Wayback Machine (Document Number 19970801004204).[4] The first registered USA copyright of a website containing that phrase is by Bruce Clay, effective March 1997 (Document Registration Number TX0005001745, US Library of Congress Copyright Office).[5]

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[6] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[7]

By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.

Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[8] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.
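
In one commonly cited form of the formula, with d the damping factor (typically 0.85), N the total number of pages, M(p) the set of pages linking to p, and L(q) the number of outbound links on page q, the PageRank of a page p is:

    PR(p) = \frac{1 - d}{N} + d \sum_{q \in M(p)} \frac{PR(q)}{L(q)}

The (1 - d)/N term models the random surfer jumping to an arbitrary page, and the sum distributes each linking page's own PageRank across its outbound links, which is why a link from a high-PageRank page is "stronger".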

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[9] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[10]

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[11] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Notable SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[12][13] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[14]

In 2005 Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[15] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[16]

In 2007 Google announced a campaign against paid links that transfer PageRank.[17] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[18] As a result of this change, the use of nofollow leads to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash and JavaScript.[19]
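
For context, a nofollowed link is just an ordinary anchor carrying a rel attribute (URL illustrative):

    <a href="http://www.example.com/" rel="nofollow">example</a>

Before the 2009 change, sculptors nofollowed links to unimportant pages so that more link weight flowed through the remaining links; after it, the nofollowed share is simply discarded rather than redistributed.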

In December 2009 Google announced it would be using the web search history of all its users in order to populate search results.[20]

Real-time search was introduced in late 2009 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.


Relationship with search engines

By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[22]

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[23] was created to discuss and minimize the damaging effects of aggressive web content providers.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[24] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[25] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[26]

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization.[27][28][29] Google has a Sitemaps program[30] to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Google's guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information.[31] Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing users to determine the crawl rate, and how many pages have been indexed by their search engine.

Methods

Getting indexed

The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[32] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[33] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[34] Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links.[35]

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[36] Additionally, search engines sometimes have problems with crawling sites with certain kinds of graphic content, flash files, portable document format files, and dynamic content.[37]

Preventing crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[38]
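
As an illustration, a robots.txt that keeps all spiders out of a shopping cart and internal search results might look like this (paths are placeholders):

    User-agent: *
    Disallow: /cart/
    Disallow: /search/

The per-page alternative mentioned above is a robots meta tag in the page's head:

    <meta name="robots" content="noindex, nofollow" />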

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to the most important pages may improve its visibility.[39] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[39] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's meta data, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" tag[40] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
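
The canonical tag mentioned above is a link element in the page head; a duplicate URL, such as a tracking variant of a product page, declares its preferred version like this (URL illustrative):

    <link rel="canonical" href="http://www.example.com/product/" />

A 301 redirect achieves the same consolidation at the HTTP level, by sending visitors and crawlers permanently to the preferred URL.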

White hat versus black hat

SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[41] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.[42]

An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[27][28][29][43] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.

White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility,[44] although the two are not identical.

White hat SEO is merely effective marketing: making efforts to deliver quality content to an audience that has requested it. Traditional marketing means have allowed this through transparency and exposure. A search engine's algorithm takes this into account, as with Google's PageRank.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[45] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[46]

As a marketing strategy

SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals.[47] This includes paid search advertising, which has its own version of SEO called ATO (Ad Text Optimization). A successful Internet marketing campaign may drive organic traffic, achieved through optimization techniques and not paid advertising, to web pages, but it also may involve the use of paid advertising on search engines and other pages, building high-quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate.[48] SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. (Some trading sites, such as eBay, can be a special case: eBay announces how and when its ranking algorithm will change a few months before changing it.) Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[49] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[50] The top-ranked SEO blog SEOmoz.org[51] has suggested, "Search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.[52]

International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[53] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[54] As of 2006, Google had an 85-90% market share in Germany.[55] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[55] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[56] That market share is achieved in a number of countries.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[55]


Source : http://en.wikipedia.org/wiki/Search_engine_optimization