Sunday, September 20, 2009

SEO Tips



Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Typically, the earlier a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, and industry-specific vertical search engines.



INTRODUCTION


Search Engines - How they Work
Search engines have a short list of critical operations that allow them to provide relevant web results when searchers use their system to find information.

Surfing the Web
Search engines run automated programs, called "bots" or "spiders", that use the hyperlink structure of the web to "crawl" the pages and documents that make up the World Wide Web. Estimates are that of the approximately 20 billion existing pages, search engines have crawled between 8 and 10 billion.
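To make the crawling step concrete, here is a minimal sketch of the idea in Python: start from a seed URL, fetch the page, pull out its hyperlinks, and queue them up to be fetched in turn. It is only an illustration built on the standard library; real spiders add politeness rules, robots.txt handling, duplicate detection, and massive distributed queues.

```python
# A minimal, illustrative crawler: fetch pages, extract links, follow them.
# Real search-engine spiders are vastly more sophisticated than this sketch.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    seen, queue, pages = set(), deque([seed_url]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue  # skip pages that fail to load
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links
    return pages
```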

Indexing Documents
Once a page has been crawled, its contents can be "indexed" - stored in a giant database of documents that makes up a search engine's "index". This index needs to be tightly managed so that requests which must search and sort billions of documents can be completed in fractions of a second.
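Conceptually, the index behaves like an inverted index: a lookup table from each term to the documents (and word positions) where it occurs, which is what makes sub-second retrieval possible. The toy sketch below illustrates the idea only; it says nothing about how any particular engine actually stores its data.

```python
# Toy inverted index: maps each term to the documents and positions where it appears.
from collections import defaultdict


def build_index(documents):
    """documents: dict of doc_id -> text."""
    index = defaultdict(lambda: defaultdict(list))  # term -> doc_id -> [positions]
    for doc_id, text in documents.items():
        for position, term in enumerate(text.lower().split()):
            index[term][doc_id].append(position)
    return index


docs = {
    "doc1": "car and driver magazine reviews the new car",
    "doc2": "the magazine for every driver",
}
index = build_index(docs)
print(dict(index["driver"]))  # {'doc1': [2], 'doc2': [4]}
```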

Processing Queries
When a request for information comes into the search engine (hundreds of millions do each day), the engine retrieves from its index all the documents that match the query. A match is determined if the term or phrase is found on the page in the manner specified by the user. For example, a search for car and driver magazine at Google returns 8.25 million results, but a search for the same phrase in quotes ("car and driver magazine") returns only 166 thousand results. In the first case, commonly called "Findall" mode, Google returned all documents containing the terms "car", "driver", and "magazine" (it ignores the term "and" because it's not useful in narrowing the results), while in the second search, only those pages with the exact phrase "car and driver magazine" were returned. Other advanced operators (Google has a list of 11) can change which results a search engine will consider a match for a given query.
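The difference between the two modes can be illustrated with a small sketch. The toy documents and helper functions below are assumptions made for the example; a real engine would resolve these queries against its index rather than raw text.

```python
# Illustration of "Findall" matching (all terms, any order) versus exact
# phrase matching, over two tiny example documents.
docs = {
    "doc1": "car and driver magazine reviews",
    "doc2": "a magazine for the car driver",
}

def positions(text, term):
    return [i for i, word in enumerate(text.lower().split()) if word == term]

def findall_match(query_terms, docs):
    """Documents containing every query term, in any order."""
    return [d for d, text in docs.items()
            if all(positions(text, t) for t in query_terms)]

def phrase_match(query_terms, docs):
    """Documents containing the terms consecutively, in order."""
    matches = []
    for d, text in docs.items():
        starts = positions(text, query_terms[0])
        if any(all(i + offset in positions(text, term)
                   for offset, term in enumerate(query_terms))
               for i in starts):
            matches.append(d)
    return matches

terms = ["car", "driver", "magazine"]           # stopword "and" dropped
phrase = ["car", "and", "driver", "magazine"]   # the exact quoted phrase
print(findall_match(terms, docs))   # ['doc1', 'doc2'] - any order, any position
print(phrase_match(phrase, docs))   # ['doc1'] - only doc1 contains the exact phrase
```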

Ranking Results
Once the search engine has determined which results are a match for the query, the engine's ranking algorithm (the mathematical process used for sorting) runs calculations on each of the results to determine which is most relevant to the given query. The results are sorted on the results pages in order from most relevant to least so that users can make a choice about which to select.
Although a search engine's operations are not particularly lengthy, systems like Google, Yahoo!, AskJeeves, and MSN are among the most complex, processing-intensive computer systems in the world, managing millions of calculations each second and fielding demands for information from an enormous group of users.
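As a simplified picture of the sorting step, the sketch below scores each document by how often the query terms appear, weighted by how rare each term is across the collection (a basic TF-IDF idea). This is only an illustration; production ranking algorithms blend hundreds of signals, link analysis among them.

```python
# Toy relevance scoring: a basic TF-IDF sum over the query terms.
import math

docs = {
    "doc1": "car and driver magazine reviews the new car",
    "doc2": "a magazine for the everyday driver",
    "doc3": "driver updates for your graphics card",
}

def score(query_terms, doc_text, docs):
    words = doc_text.lower().split()
    total = 0.0
    for term in query_terms:
        tf = words.count(term)                                   # term frequency in this document
        df = sum(1 for text in docs.values() if term in text.lower().split())
        if tf and df:
            idf = math.log(len(docs) / df)                        # rarer terms weigh more
            total += tf * idf
    return total

query = ["car", "driver", "magazine"]
ranked = sorted(docs, key=lambda d: score(query, docs[d], docs), reverse=True)
print(ranked)  # ['doc1', 'doc2', 'doc3'] - most to least relevant
```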

Keywords and Queries
Search engines rely on the terms queried by users to determine which results to put through their algorithms, order, and return to the user. But, rather than simply recognizing and retrieving exact matches for query terms, search engines use their knowledge of semantics (the study of meaning in language) to construct intelligent matching for queries. An example might be a search for loan providers that also returned results that did not contain that specific phrase, but instead had the term lenders.

The engines collect data based on the frequency of use of terms and the co-occurrence of words and phrases throughout the web. If certain terms or phrases are often found together on pages or sites, search engines can construct intelligent theories about their relationships. Mining semantic data from the incredible corpus that is the Internet has given search engines some of the most accurate data about word ontologies and the connections between words ever assembled artificially. This immense knowledge of language and its usage gives them the ability to determine which pages in a site are topically related, what the topic of a page or site is, how the link structure of the web divides into topical communities, and much, much more.
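A crude way to picture this co-occurrence mining is to count how often pairs of terms show up in the same documents, so that pairs like "loan" and "lenders" surface as related. The sketch below is a toy version of that idea, nothing like the scale or sophistication of what the engines actually run.

```python
# Toy co-occurrence counting: terms that frequently appear in the same
# documents are treated as related.
from collections import Counter
from itertools import combinations

docs = [
    "compare loan providers and mortgage lenders",
    "the best lenders offer a low interest loan",
    "our loan calculator helps you pick a lender",
]

cooccurrence = Counter()
for text in docs:
    terms = set(text.lower().split())
    for a, b in combinations(sorted(terms), 2):
        cooccurrence[(a, b)] += 1

# Pairs seen together in the most documents bubble to the top.
for pair, count in cooccurrence.most_common(3):
    print(pair, count)
```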

Search engines' growing artificial intelligence on the subject of language means that queries will increasingly return more intelligent, evolved results. This heavy investment in the field of natural language processing (NLP) will help to achieve greater understanding of the meaning and intent behind their users' queries. Over the long term, users can expect the results of this work to produce increased relevancy in the SERPs (Search Engine Results Pages) and more accurate guesses from the engines as to the intent of a user's queries.

Choosing the best
In the classic world of Information Retrieval, when no commercial interests existed in the databases, very simplistic algorithms could be used to return high quality results. On the world wide web, however, the opposite is true. Commercial interests in the SERPs are a constant issue for modern search engines. With every new focus on quality control and growth in relevance metrics, there are thousands of individuals (many in the field of SEO) dedicated to manipulating these metrics in order to control the SERPs, typically by aiming to list their sites/pages first.

The worst kind of results are what the industry refers to as "search spam" - pages and sites with little real value that contain primarily re-directs to other pages, lists of links, scraped (copied) content, etc. These pages are so irrelevant and useless that search engines are highly focused on removing them from the index. Naturally, the monetary incentives are similar to email spam - although few visit and fewer click on the links (which are what provide the spam publisher with revenue), the sheer quantity is the decisive factor in producing income.

Other "spam" results range from sites that are of low quality or affiliate status that search engines would prefer not to list, to high quality sites and businesses that are using the link structure of the web to manipulate the results in their favor. Search engines are focused on clearing out all types of manipulation and hope to eventually achieve fully relevant and organic algorithms to determine ranking order. So-called "search engine spammers" engage in a constant battle against these tactics, seeking new loopholes and methods for manipulation, resulting in a never-ending struggle.

This guide is NOT about how to manipulate the search engines to achieve rankings, but rather how to create a website that search engines and users will be happy to have ranking permanently in the top positions, thanks to its relevance, quality, and user friendliness.

Targeting the Right Terms
Targeting the best possible terms is of critical importance. This encompasses more than merely measuring traffic levels and choosing the highest trafficked terms. An intelligent process for keyword selection will measure each of the following:

Conversion Rate - the percent of users searching with the term/phrase who convert (click an ad, buy a product, complete a transaction, etc.)
Predicted Traffic - An estimate of how many users will be searching for the given term/phrase each month
Value per Customer - An average amount of revenue earned per customer using the term or phrase to search - comparing big-ticket search terms vs. smaller ones.
Keyword Competition - A rough measurement of the competitive environment and the level of difficulty for the given term/phrase. This is typically measured by metrics that include the number of competitors, the strength of those competitors' links, and the financial motivation to be in the sector. SEOmoz's Keyword Difficulty Tool can assist in this process.
Once you've analyzed each of these elements, you can make effective decisions about the terms and phrases to target. When starting a new site, it's highly recommended to target only one or possibly two unique phrases on a single page. Although it is possible to optimize for more phrases and terms, it's generally best to keep separate terms on separate pages, as you can provide individualized information for each in this manner. As websites grow and mature, gaining links and legitimacy with the engines, targeting multiple terms per page becomes more feasible.
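One hypothetical way to roll the four measurements above into a single prioritization score is sketched below. The weighting (estimated monthly value discounted by difficulty) and every number in it are illustrative assumptions, not a formula from SEOmoz or any engine.

```python
# Hypothetical keyword prioritization: estimated monthly value of ranking for a
# term, discounted by how competitive it is. All figures are made-up examples.
keywords = [
    # term, predicted monthly searches, conversion rate, value per customer, difficulty 0-1
    ("car insurance",                 50000, 0.02, 120.0, 0.9),
    ("classic car insurance quotes",   1200, 0.05, 150.0, 0.4),
]

def keyword_score(searches, conversion_rate, value_per_customer, difficulty):
    estimated_value = searches * conversion_rate * value_per_customer
    return estimated_value * (1.0 - difficulty)  # heavily discount crowded terms

for term, searches, conv, value, difficulty in keywords:
    print(f"{term}: {keyword_score(searches, conv, value, difficulty):,.0f}")
```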

Building a Traffic-Worthy Site


One of the most important (and often overlooked) subjects in SEO is building a site deserving of top rankings at the search engines. A site that ranks #1 for a set of terms in a competitive industry or market segment must be able to justify its value or risk losing out to competitors who offer more. Search engines' goals are to rank the best, most usable, functional, and informative sites first. By intertwining your site's content and performance with these goals, you can help to ensure its long-term prospects in the search engine rankings.

Usability
Usability represents the ease-of-use inherent in your site's design, navigation, architecture, and functionality. The idea behind the practice is to make your site intuitive so that visitors will have the best possible experience on the site. A whole host of features figure into usability, including:

Design
The graphical elements and layout of a website have a strong influence on how easy the site is to use. Standards like blue, underlined links, top and side menu bars, and logos in the top, left-hand corner may seem like rules that can be bent, but adherence to these elements (with which web users are already familiar) will help to make a site usable. Design also encompasses important topics like visibility & contrast, affecting how easy it is for users to interpret the text and image elements of the site. Separation of unique sections like navigation, advertising, content, search bars, etc. is also critical, as users follow design cues to help them understand a page's content. Finally, critical elements in a site's design (like menus, logos, colors, and layout) should be used consistently throughout the site.

Information Architecture
The organizational hierarchy of a site can also strongly affect usability. Topics and categorization impact the ease with which a user can find the information they need on your site. While an intuitive, intelligently designed structure will seamlessly guide the user to their goals, a complex, obfuscated hierarchy can make finding information on a site disturbingly frustrating.

Navigation
A navigation system that guides users easily through both top-level and deep pages and makes a high percentage of the site easily accessible is critical to good usability. Since navigation is one of a website's primary functions, provide users with obvious navigation systems: breadcrumbs, alt tags for image links, and well-written anchor text that clearly describes what the user will get if he or she clicks a link. Navigation standards like these can drastically improve usability performance.

Functionality
To create compelling usability, ensure that tools, scripts, images, links, etc. all function as they are intended and don't provide errors to non-standard browsers, alternative operating systems, or uninformed users (who often don't know what/where to click).

Accessibility
Accessibility refers primarily to the technical ability of users to access and move through your site, as well as the ability of the site to serve disabled or impaired users. For SEO purposes, the most important aspects are keeping code errors to a minimum, fixing broken links, and making sure that content is accessible and visible in all browsers without requiring special actions.
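For the broken-link portion of accessibility, a small script can periodically test your site's links and flag those that fail. The sketch below leans on Python's standard library and hypothetical example URLs; a production check would also throttle requests and respect robots.txt.

```python
# Minimal broken-link check over a list of URLs gathered from a site.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def broken_links(urls):
    """Return the URLs that fail to load or answer with an HTTP error."""
    broken = []
    for url in urls:
        try:
            response = urlopen(url, timeout=10)
            if response.status >= 400:
                broken.append((url, response.status))
        except (HTTPError, URLError) as err:
            broken.append((url, err))
    return broken

# Example with hypothetical URLs:
# print(broken_links(["http://www.example.com/", "http://www.example.com/old-page"]))
```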
Content
The usability of content itself is often overlooked, but its importance cannot be overstated. The descriptive nature of headlines, the accuracy of information and the quality of content all factor highly into a site's likelihood to retain visitors and gain links.
Overall, usability is about gearing a site towards the potential users. Success in this arena garners increased conversion rates, a higher chance that other sites will link to yours, and a better relationship with your users (fewer complaints, lower instance of problems, etc.). For improving your knowledge of usability and the best practices, I recommend Steve Krug's exceptionally impressive book, "Don't Make Me Think"; possibly the best $30 you can spend to improve your website.

High Quality Content



Why Should a Search Engine Rank Your Site Above All the Others in its Field?

If you cannot answer this question clearly and precisely, the task of ranking higher will be exponentially more difficult. Search engines attempt to rank the very best sites with the most relevant content first in their results, and until your site's content is the best in its field, you will always struggle against the engines rather than bringing them to your doorstep.

It is in content quality that a site's true potential shows through, and although search engines cannot measure the likelihood that users will enjoy a site, the vote via links system operates as a proxy for identifying the best content in a market. With great content, therefore, come great links and, ultimately, high rankings. Deliver the content that users need, and the search engines will reward your site.

Content quality, however, like professional design, is not always dictated by strict rules and guidelines. What passes for "best of class" in one sector may be below average in another market. The competitiveness and interests of your peers and competitors in a space often determine what kind of content is necessary to rank. Despite these variances, however, several guidelines can be almost universally applied to produce content that is worthy of attention:

Research Your Field
Get out into the forums, blogs, and communities where folks in your industry spend their online discussion time. Note the most frequently asked questions, the most up-to-date topics, and the posts or headlines that generate the most interest. Apply this knowledge when you create high-quality content and directly address your market's needs. If 10,000 people in the botany field are seeking articles that contain more illustrated diagrams instead of just photos, delivering that piece can set your content (and your site) apart from the competition.

Consult and Publish in Partnership with Industry Experts
In any industry, there will be high-level, publicly prominent experts as well as a second tier of "well-known in web circles" folks. Targeting either of these groups for collaborative efforts in publishing articles, reviewing your work, or contributing (even via a few small quotes) can be immensely valuable. In this manner, you can be assured that your content is both link- and visitor-worthy. In addition, when partnering with "experts", exposure methods are built in, creating natural promotion angles.

Create Documents that Can Serve as One-Stop Resources
If you can provide a single article or resource that provides every aspect of what a potential visitor or searcher might be seeking, your chances for success in SEO go up. An "all-in-one" resource can provide more opportunities than a single subject resource in many cases. Don't be too broad as you attempt to execute this kind of content creation - it's still important to keep a narrow focus when you create your piece. The best balance can be found by putting yourself in the potential users' shoes - if your piece fits their needs and covers every side of their possible interests while remaining "on-message," you're ready to proceed.

Provide Unique Information
Make sure that when you design your content outline, you include data and information that can be found nowhere else. While collecting and amalgamating information across the web can create good content, it is the unique elements in your work that will be noticed and recommended.

Serve Important Content in a Non-Commercial Format
Creating a document format that is non-commercial is of exceptional importance for attracting links and attention. The communities of web and content builders are particularly attuned to the commercialization of the web and will consciously and sub-consciously link to and recommend resources that don't serve prominent or interfering advertising. If you must post ads, do so as subtly and unobtrusively as possible.

One Great Page is Worth a Thousand Good Pages
While dozens or even hundreds of on-topic pages that cover sections of an industry are valuable to a website's growth, it is actually far better to invest a significant amount of time and energy producing a few articles/resources of truly exceptional quality. To create documents that become the "industry standard" on the web and are pointed to time after time as the "source" for further investigations, claims, documents, etc. is to truly succeed in the rankings battle. The value of "owning" this traffic and link source far outweighs a myriad of articles that are rarely read or linked to.

Usage monitoring for websites



One of the most valuable sources for data, analysis, and refinement in an SEO campaign is in the statistics available via website tracking and measuring programs. A good analytics program can provide an incredible amount of data that can be used to track your visitors and make decisions about who to target in the future and how to do it.

Below is a short list of the most valuable elements in visitor tracking:

Campaign Tracking - The ability to put specific URLs or referrer strings onto ads, emails, or links and track their success.

Action Tracking - Adding the ability to track certain actions on a site like form submission, newsletter signups, add to cart buttons, and checkout or transaction completions and tying them together with campaigns and keyword tracking so you know what ads, links, terms, and campaigns are bringing you the best visitors.

Search Engine Referral Tracking - Seeing which search engines sent which visitors over time and tracking the terms and phrases they used to reach your site. Combined with action tracking, this can help you determine which terms are most valuable to target (a small referrer-parsing sketch follows this list).

Referring URLs & Domain Tracking - This allows you to see what URLs and domains are responsible for sending you traffic. By tracking these individually, you can see where your most valuable links are coming from.

First-Time vs. Return Visitors - Find out what percentage of your visitors are coming back each day/week/month. This can help you to figure out how "sticky" and consistently interesting your site is.

Entry Pages - Which pages are attracting the most visitors and which are converting them. You can also see pages that have a very high bounce rate - those pages which don't do a good job of pulling people into the site.

Visitor Demographics - Where are your visitors coming from, what browsers are they using, what time do they visit? All these questions and many more can be answered with demographics.

Click Path Analysis - What paths do your visitors follow when they get to your site? This data can help you make more logical streams of pages for visitors to use as they navigate your site, attempt to find information, or complete a task.

Popular Pages - Which pages get the most visitors and which are neglected? Use this data to help improve low popularity pages and emulate highly trafficked ones.

Page Views per Session - This data can tell you how many pages each visitor to your site is viewing - another metric used to measure "stickiness."
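As promised above under Search Engine Referral Tracking, here is a small sketch of how an analytics package might pull the engine and query phrase out of a referrer URL. The query-string keys it looks for (q, p) are common ones and are assumptions for the example.

```python
# Pull the search engine and query phrase out of a referrer URL, as a web
# analytics package would when processing server logs.
from urllib.parse import urlparse, parse_qs

QUERY_PARAMS = {"q", "p", "query"}  # assumed common query-string keys

def search_referral(referrer):
    parsed = urlparse(referrer)
    params = parse_qs(parsed.query)
    for key in QUERY_PARAMS:
        if key in params:
            return parsed.hostname, params[key][0]
    return None  # not a recognizable search referral

print(search_referral("http://www.google.com/search?q=car+and+driver+magazine"))
# ('www.google.com', 'car and driver magazine')
```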

Applying the information you learn from your visitor tracking is a science unto itself. Experience and common sense should help you discover which terms, visitors, referrers, and demographics are most valuable to your site, enabling you to make the best possible decisions about how and where to target.

Good luck with your new site. Enjoy!!!!




