Live out the scenario that ALLAH SWT has ordained; never, even once, regret what ALLAH has given you. Always keep your eyes on the mirror of your future.

Saturday, May 19, 2012

Search Engine Optimization

Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines' "natural," or un-paid ("organic" or "algorithmic"), search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[2] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, then extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, along with all the links the page contains, which are placed into a scheduler for crawling at a later date.
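To make that crawl-and-index loop concrete, here is a minimal sketch in Python. The seed URL, the in-memory index, and the page limit are illustrative assumptions, not any engine's actual implementation.

```python
# Minimal crawl-and-index sketch (illustrative only; real engines are far more complex).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects the text and the outgoing links found on one HTML page."""
    def __init__(self):
        super().__init__()
        self.words, self.links = [], []
    def handle_data(self, data):
        self.words.extend(data.lower().split())
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def crawl(seed, max_pages=5):
    index = {}                          # word -> set of URLs containing it (the "indexer" output)
    queue, seen = deque([seed]), {seed}
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            page = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
        except Exception:
            continue                    # skip pages that fail to download
        fetched += 1
        parser = PageParser()
        parser.feed(page)
        for word in parser.words:       # indexer: record which pages contain each word
            index.setdefault(word, set()).add(url)
        for link in parser.links:       # scheduler: queue newly discovered links for later crawling
            absolute = urljoin(url, link)
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index

if __name__ == "__main__":
    idx = crawl("https://example.com/")   # hypothetical seed URL
    print(sorted(idx)[:20])               # a few of the indexed words
```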
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[3] The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997.[4]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[5][unreliable source?] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[6]
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[7] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random surfer.
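For reference, one common normalized form of the published PageRank calculation makes this precise. The symbols here are mine: PR(p) is the PageRank of page p, B(p) is the set of pages linking to p, L(q) is the number of outbound links on page q, N is the total number of pages, and d is the damping factor (conventionally about 0.85) that models the random surfer occasionally jumping to an arbitrary page:

$$ PR(p) = \frac{1 - d}{N} + d \sum_{q \in B(p)} \frac{PR(q)}{L(q)} $$

A page's score therefore grows with both the number of pages linking to it and the scores of those linking pages, which is exactly the "quantity and strength of inbound links" described above.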
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[8] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[9]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[10] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[11][12] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[13]
In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[15]
In 2007, Google announced a campaign against paid links that transfer PageRank.[16] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[17] As a result of this change, the use of nofollow leads to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Several additional solutions have been suggested, including the use of iframes, Flash, and JavaScript.[18]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[19]
Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[20]
In February 2011, Google announced the "Panda" update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings from this practice; however, Google implemented a new system that punishes sites whose content is not unique.[21]

Relationship with search engines

(Image: Yahoo and Google offices)
By 1997, search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[22]
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[23] was created to discuss and minimize the damaging effects of aggressive web content providers.
Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[24] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[25] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[26]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. Major search engines provide information and guidelines to help with site optimization.[27][28] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website.[29] Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing users to determine the crawl rate and how many pages have been indexed by their search engine.

Methods

Suppose each circle is a website, and an arrow is a link from one website to another, such that a user can click on a link within, say, website F to go to website B, but not vice versa. Search engines begin by assuming that each website has an equal chance of being chosen by a user. Next, crawlers examine which websites link to which other websites and guess that websites with more incoming links contain valuable information that users want.
Search engines use complex mathematical algorithms to guess which websites a user seeks, based in part on an examination of how websites link to each other. Since website B is the recipient of numerous inbound links, it ranks highly in a web search and will come up early in the results. Further, since B is popular and has an outbound link to C, C ranks highly too.
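A rough sketch of that intuition, assuming a hypothetical six-site link graph of the kind described above (the specific links below are invented for illustration), is a few rounds of the power iteration used for PageRank-style scoring:

```python
# Hypothetical link graph: A, D, E and F each link to B; B and C link to each other.
# The graph, damping factor, and iteration count are illustrative assumptions.
links = {
    "A": ["B"], "B": ["C"], "C": ["B"],
    "D": ["B"], "E": ["B"], "F": ["B"],
}
sites = list(links)
N, d = len(sites), 0.85
rank = {s: 1.0 / N for s in sites}           # start with an equal chance per site

for _ in range(50):                          # power iteration
    new = {s: (1 - d) / N for s in sites}
    for s, outs in links.items():
        if outs:
            share = d * rank[s] / len(outs)  # split this site's rank over its outgoing links
            for t in outs:
                new[t] += share
        else:                                # dangling node: spread its rank evenly
            for t in sites:
                new[t] += d * rank[s] / N
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))
```

In this toy graph, B ends up with the top score and C close behind, matching the behavior described above.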

Getting indexed

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[30] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[31] Two major directories, the Yahoo Directory and the Open Directory Project, both require manual submission and human editorial review.[32] Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.[33]
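As a rough illustration, an XML Sitemap of the kind such tools accept can be generated with a few lines of Python. The page URLs and the output filename below are placeholders; the element layout follows the public sitemaps.org protocol.

```python
# Minimal XML Sitemap generator (sketch). URLs and file name are invented placeholders.
from xml.sax.saxutils import escape

pages = [
    "https://www.example.com/",
    "https://www.example.com/about.html",
    "https://www.example.com/products/widgets.html",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in pages:
    lines.append("  <url><loc>%s</loc></url>" % escape(url))  # one <url> entry per page
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as fh:
    fh.write("\n".join(lines))
```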
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[34]

Preventing crawling

To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[35]
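Python's standard library includes a robots.txt parser that behaves the way a polite crawler does. In this hedged sketch, the site, the paths, and the user-agent string are invented placeholders:

```python
# Check whether a given crawler is allowed to fetch a URL, per the site's robots.txt.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()                                    # download and parse robots.txt

for path in ("/", "/search?q=widgets", "/cart"):
    url = "https://www.example.com" + path
    allowed = rp.can_fetch("ExampleBot", url)
    print(path, "->", "crawl allowed" if allowed else "disallowed")
```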

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to the most important pages may improve its visibility.[36] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[36] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the canonical link element[37] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
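As a small illustration of inspecting those on-page elements programmatically, the sketch below pulls the title tag and meta description out of a page. The sample HTML is invented, and the length printout is only a convenience:

```python
# Extract the title tag and meta description, two on-page elements discussed above.
from html.parser import HTMLParser

class OnPageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""
    def handle_starttag(self, tag, attrs):
        a = {k.lower(): (v or "") for k, v in attrs}
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.meta_description = a.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

html = """<html><head>
<title>Pet Supplies - Dog Beds and Collars</title>
<meta name="description" content="Dog beds, collars and other pet supplies.">
</head><body>...</body></html>"""

p = OnPageParser()
p.feed(html)
print("Title:", p.title, "| length:", len(p.title))
print("Description:", p.meta_description, "| length:", len(p.meta_description))
```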

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and those techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[38] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[39]
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[27][28][40] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to trick the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[41] although the two are not identical.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[42] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[43]

As a marketing strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, depending on the site operator's goals.[44] A successful Internet marketing campaign may also depend upon building high quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[45]
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[46] Search engines can change their algorithms, impacting a website's placement and possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes - almost 1.5 per day.[47] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[48] Seomoz.org has suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.[49]

International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[50] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[51] As of 2006, Google had an 85-90% market share in Germany.[52] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[52] As of June 2008, the market share of Google in the UK was close to 90%, according to Hitwise.[53] That market share is achieved in a number of countries.
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[52]

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[54][55]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. Kinderstart's website was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[56][57]

Google Ranking Factor Checklist


  1. Positive ON-Page SEO Factors

  2. Negative ON-Page SEO Factors

  3. Positive OFF-Page SEO Factors

  4. Negative OFF-Page SEO Factors

  5. Notes for the Above Factors
There are "over 200 SEO factors" that Google uses to rank pages in the Google search results (SERPs). What are the search engine optimization rules?
Here is the speculation - educated guesses by SEO webmasters on top webmaster forums. Should you wish to achieve a high ranking, the various confirmed and suspected Google Search Engine Optimization (SEO) Rules are listed below.

The SEO Rules listed below are NOT listed by weight, and not by any presumed relevance - THAT exercise is left up to the reader!


1. Alleged POSITIVE ON-Page SEO Google Ranking Factors (38)
(Keeping in mind the converse, of course: when violated, some of these factors immediately jump into the NEGATIVE on-page ranking factors domain.)

The term "Keyword" below refers to the "keyword phrase," which can be one word or more. Each entry gives the factor number, any note (HOT, SITE, or the claim numbers from the Google patent of Aug. 10, 2006, which confirms those factors), the factor, and a brief note.

KEYWORDS (patent claim 50) - Google patent: topic extraction. For keyword selection, try Google AdWords and Google Trends.

1. (HOT) Keyword in URL - First word is best, second is second best, etc.
2. (HOT) Keyword in domain name - Same as in page-name-with-hyphens.

Keywords - Header

3. (HOT) Keyword in title tag - Close to the beginning. Title tag 10-60 characters, no special characters.
4. Keyword in description meta tag - Shows theme; less than 200 characters. Google no longer "relies" upon this tag, but will often use it.
5. Keyword in keywords meta tag - Shows theme; less than 10 words. Every word in this tag MUST appear somewhere in the body text; if not, it can be penalized for irrelevance. No single word should appear more than twice; if one does, it may be considered spam. Google purportedly no longer uses this tag, but others do.

Keywords - Body

6. Keyword density in body text - 5-20% (all keywords / total words; see the density sketch after this list). Some report topic sensitivity: the keyword-spamming threshold percentage varies with the topic.
7. Individual keyword density - 1-6% (each keyword / total words).
8. (HOT) Keyword in H1, H2 and H3 - Use Hx heading tags appropriately.
9. Keyword font size - "Strong is treated the same as bold, italic is treated the same as emphasis" (Matt Cutts, July 2006).
10. Keyword proximity (for 2+ keywords) - Directly adjacent is best.
11. Keyword phrase order - Does word order on the page match word order in the query? Try to anticipate the query and match its word order.
12. Keyword prominence (how early in the page or tag) - Can be important at the top of the page, in bold, in a large font.

Keywords - Other

13. Keyword in alt text - Should describe the graphic; do NOT fill it with spam. (Was part of the Google Florida OOP - tripped a threshold - and may still act as a red flag to some degree when summed with all other on-page optimization: the total page optimization score, TPOS.)
14. Keyword in links to site pages (anchor text) - Does outgoing anchor text use the keyword?

NAVIGATION - INTERNAL LINKS

15. (SITE) Links to internal pages: keywords? - Links should contain keywords, and the filename linked to should contain the keywords. Use hyphenated filenames, but not long ones: two or three hyphens only.
16. (SITE) All internal links valid? - Validate all links to all pages on the site; use a free link checker.
17. (SITE) Efficient, tree-like structure - Try for two clicks to any page; no page deeper than 4 clicks.
18. (SITE) Intra-site linking - Appropriate links between lower-level pages.

NAVIGATION - OUTGOING LINKS (patent claim 54)

19. (patent claim 55) Links to external pages: keywords? - Google patent. Link only to good sites; do not link to link farms. CAREFUL: links can and do go bad, resulting in site demotion. Unfortunately, you must devote the time necessary to police your outgoing links - they are your responsibility.
20. (patent claim 56) Outgoing link anchor text - Google patent. Should be on topic and descriptive.
21. (patent claims 61, 62) Link stability over time - Google patent. Avoid "link churn."
22. All external links valid? - Validate all links periodically.
23. Fewer than 100 links out total - Google says to limit it to 100, but readily accepts 2-3 times that number. (ref 2k)
121. (added) Linking to authority - Some say this gives a boost; others say that is absurd. However, it is certainly the opposite of linking to trash, which WILL hurt you.

OTHER ON-PAGE FACTORS

24. Domain name extension (top-level domain, TLD) - .gov sites seem to have the highest status; .edu sites seem to be given a high status; .org sites seem to be given a high status; .com sites excel in encompassing all the spam/crud sites, resulting in the need for the highest scrutiny and action by Google. Perhaps one would do well with the new .info domain class? Nope - spammers jumped all over it, so no safe haven there. Not so much now, though: .info sites can rank highly.
25. File size - Try not to exceed a 100K page size (however, some subject matter, such as this page, requires larger file sizes). Smaller files (<40K) are preferred - lots of them.
26. Hyphens in URL - The preferred method for indicating a space where there can be no actual space. One or two: excellent for separating keywords (e.g., pet-smart, pets-mart). Four or more: BAD, starts to look spammy. Ten: spammer for sure, demotion probable?
27. (patent claims 6, 7, 12, 13) Freshness of pages - Google patent: changes over time. The newer the better, if news, retail, or auction! Google likes fresh pages. So do I.
28. (patent claims 8, 9) Freshness: amount of content change - New pages; ratio of old pages to new pages.
29. (patent claim 27) Freshness of links - Google patent: may be good or bad. Excellent for high-trust sites; may not be so good for newer, low-trust sites.
30. Frequency of updates - Frequent updates = frequent spidering = newer cache.
31. Page theming - Does the page exhibit a theme? General consistency?
32. Stemming - Stem, stems, stemmed, stemmer, stemming, stemmist, stemification.
33. Synonyms - CIRCA white paper.
34. LSI (Latent Semantic Indexing) - Speculation, no proof.
35. URL length - Keep it minimized; use somewhat less than the 2,000 characters allowed by IE. Less than 100 is good, and less is even better.

OTHER ON-SITE FACTORS

36. (patent claim 5) Site size - Google likes big sites. Larger sites are presumed to be better funded, better organized, better constructed, and therefore better sites. Google likes LARGE sites for various reasons, not all positive. This has resulted in the advent of machine-generated 10,000-page spam sites: size for the sake of size. Google has caught on and dumped millions of pages, or made them supplemental.
37. (patent claim 4) Site age - Google patent. Old is best; old is golden.
38. (patent claim 3) Age of page vs. age of site - Age of the page vs. the age of other pages on the site. Newer pages on an older site will get faster recognition.

Note: For ALL the positive on-page factors listed above, PageRank can OVERRIDE them all. So can Google-bombing.
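Since factors 6 and 7 above are simple ratios, here is a minimal sketch of how one might measure them. The body text and keyword phrases are invented placeholders, and the code carries no endorsement of the percentage thresholds claimed above:

```python
# Rough keyword-density calculator for factors 6 and 7 (sketch only).
import re

def densities(body_text, keywords):
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    total = len(words) or 1
    per_keyword = {}
    for phrase in keywords:
        parts = phrase.lower().split()
        n = len(parts)
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == parts)
        per_keyword[phrase] = 100.0 * hits * n / total   # factor 7: each keyword / total words
    overall = sum(per_keyword.values())                  # factor 6: all keywords / total words
    return overall, per_keyword

text = "Discount dog beds and dog collars. Our dog beds ship free."
overall, each = densities(text, ["dog beds", "dog collars"])
print(round(overall, 1), {k: round(v, 1) for k, v in each.items()})
```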



2. Alleged Negative ON-Page SEO Google Ranking Factors (24)
39. (BAD) Text presented in graphics form only - No actual body text on the page. Text represented graphically is invisible to search engines.
40. (BAD) Affiliate site? - The Florida update went after affiliates with a vengeance: flower and travel affiliates were hit hard - cookie-cutter sites with massive inter-linking but little unique content. Subsequent updates have also targeted affiliates.
41. (BAD) Over-optimization penalty (OOP) - Penalty for over-compliance with well-established, accepted web optimization practices. Too-high keyword repetition (keyword stuffing) may get you the OOP. Overuse of H1 tags has been mentioned, as has meta-tag stuffing.
42. (BAD) Link to a bad neighborhood - Don't link to link farms or FFAs (free-for-alls). Also, don't forget to periodically check the Google status of EVERYONE you link to. A site may go "bad," and you can end up being penalized even though you did nothing. For instance, some failed real estate sites have been switched to p0rn by unscrupulous webmasters, for the traffic. This is not good for you if you are linking to the originally legitimate URL.
43. (BAD) Redirect through refresh meta tags - Don't use meta refresh to immediately send your visitor to a page other than the one he or she clicked on (see the detection sketch after this list).
44. (BAD) Vile language, ethnic slurs - Including the George Carlin seven words you can't say on TV, plus the 150 or so that followed. Don't shoot yourself right straight in the foot. Also, avoid combinations of normal words which, when used together, become something else entirely - such as the word juice and the word l0ve. See why I wrote that zero? I don't even want to get a proximity penalty. Paranoia, or caution? You decide. I always want to try to put my "best foot forward."
45. (BAD) Poison words - The word "links" in a title tag has been suggested to be a bad idea. Here is my list of poison words for AdSense. This penalty has been loosened - many of these words now appear in normal context with no problems. But watch your step.
46. (BAD) Excessive cross-linking within the same C block (IP = xxx.xxx.CCC.xxx) - If you have many sites (more than 10, this author's guess) with the same web host, prolific cross-linking can indicate more of a single entity and less of democratic web voting. Easy to spot, easy to penalize. "This does not apply to a small number of sites" (this author guesses the number is 10) "hosted on a local server" (Matt Cutts, July 2006).
47. (BAD) Stealing images or text blocks from another domain - Copyright violation; Google responds strongly if you are reported (ref egol). File a Google DMCA complaint.
48. (BAD) Keyword stuffing threshold - In body, meta tags, alt text, etc. = demotion.
49. (??) Keyword dilution - Targeting too many unrelated keywords on a page, which detracts from theming and reduces the importance of your REALLY important keywords.
50. (??) Page edits can reduce consistency - Google patent. Google is now switching between a "newer" cache and several "older" caches, frequently drawing from BOTH at the same time. This was possibly implemented to frustrate SERP manipulators. Did your last edit substantially alter your keywords or theme? Expect noticeable SERP bouncing.
51. (patent claims 6, 7) Frequency of content change - Google patent: too frequent = bad.
52. (patent claims 32, 33) Freshness of anchor text - Google patent: too frequent = bad.
53. (??) Dynamic pages - Problematic; know the pitfalls. Shorten URLs, reduce variables ("no more than 2 or 3," Matt Cutts, July 2006), and lose the session IDs.
54. (??) Excessive JavaScript - Don't use it for redirects or for hiding links.
55. (??) Flash page - NOT. Most (all?) search engine spiders can't read Flash content. Provide an HTML alternative, or experience lower SERP positioning.
56. (??) Use of frames - Spidering problems with frames - STILL.
57. Robot exclusion "noindex" tag - Intentional self-exclusion.
58. Single-pixel links - A red flag; there is only one reason for them: a sneaky link.
59. Invisible text - OK, no penalty, though Google advises against it. It is all over the place, but nothing is ever done. (The text is the same color as the background, and hence cannot be seen by the viewer, but can be visible to the search engine spiders.) I believe Google does penalize hidden text, since it is an attempt to manipulate rank, although they don't catch everyone.
60. Gateway (doorway) pages - OK, no penalty, though Google advises against them. Google used to reward these pages. Multiple entrance pages in the top ten SERPs - I see it daily. There they are at #2, with their twin at #5, for 6 months now. Reported numerous times. (I see changes here: not only does the doorway page disappear, but the main page gets pushed down as well - this is a welcome fix.)
61. Duplicate content (YOURS; for duplicate content that is THEIRS, i.e. a hijack, see below) - OK, no penalty, though Google advises against it. Google picks one version (usually the oldest), shoves it to the top, and pushes the second choice down. This has been a big issue with stolen content: the thief usurps your former position with YOUR OWN content.
62. HTML code violations - Doesn't matter, though Google advises against them - unless, of course, the page is totally FUBAR. (The big G does not even use the DOCTYPE declarations required for W3C validation.) Simple HTML verification is NOT required, but it is advised, since it could contribute to your page quality factor (PQF).

Since the above four items are so controversial, I would like to add this comment: there are many things that Google would LIKE webmasters to do, but that they simply cannot control, due to logistical considerations. Their only alternative is to foment fear and doubt by implying that any violation of their "suggestions" will result in swift and fierce demotion. (This is somewhat dated - G is fixing these things.) In general this works pretty well to keep webmasters in line. The fallacy is that attentive webmasters can readily observe continuing, blatant exceptions to these official pronouncements.

There are many anecdotes about Google "taking care" of a problem. Google states that they do not provide hand-tweaked "boosts," but they are silent about hand-tweaked demotions. Those occur, for sure; to believe otherwise is naive. Wouldn't YOU swat the most obnoxious flies? I would.

It is becoming easier to determine the best thing to do: try to avoid any Google penalties or demotions.

119. (added) Phrase-based ranking, filters, penalties - Feb. 2007: Google patent granted. Do not use phrases that have been associated and correlated with known spamming techniques, or you will be penalized. What phrases? Ah, you tell me.
122. (added) Poor spelling and grammar - Pages that are higher quality and more reputable (i.e., higher PageRank) tend to use better spelling and grammar. Demotion for bad spelling is highly logical.
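Two of the signals above, the robots "noindex" tag (factor 57) and the meta-refresh redirect (factor 43), are plainly visible in a page's HTML. Here is a small sketch that flags them; the sample markup is an invented placeholder:

```python
# Flag a robots "noindex" meta tag and a meta-refresh redirect in a page's HTML.
from html.parser import HTMLParser

class SignalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.meta_refresh = False
    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = {k.lower(): (v or "") for k, v in attrs}
        if a.get("name", "").lower() == "robots" and "noindex" in a.get("content", "").lower():
            self.noindex = True          # factor 57: intentional self-exclusion
        if a.get("http-equiv", "").lower() == "refresh":
            self.meta_refresh = True     # factor 43: refresh-based redirect

html = """<head>
<meta name="robots" content="noindex, nofollow">
<meta http-equiv="refresh" content="0; url=https://www.example.com/other-page">
</head>"""

p = SignalParser()
p.feed(html)
print("noindex tag present:", p.noindex)
print("meta refresh present:", p.meta_refresh)
```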



3. Alleged POSITIVE OFF-Page SEO Google Ranking Factors (43)
INCOMING LINKS

63. (HOT) PageRank - Based on the number and quality of links to you. Google link reporting continues to display just a SMALL fraction of your actual backlinks, and they are NOT just those greater than PR4 - they are mixed.
64. Total incoming links ("backlinks") - Historically, FAST counted best (www.alltheweb.com); no more - Yahoo (its parent) broke it. In Yahoo search, type linksite:www.domain-name.com or linkdomain:www.domainname.com. Try MSN (http://beta.search.msn.com) with link:www.domainname.com. Current TYPICAL backlink reporting ratios: Google, 30 links; MSN, 1,000 links; Yahoo, 3,000 links.
65. Incoming links from high-ranking pages - In 2004, Google used to count (report) the links from all PR4+ pages that linked to you. In 2005-2006, Google reported only a small fraction of the links, in what seemed like an almost random manner. In Feb. 2007, Google markedly increased the number of links that it reports.
66. Acceleration of link popularity (". . . used to be a good thing" ... Martha) - Google patent. Link-acquisition speed boost: speculative. Too fast = artificial? Cause of the -30 penalty? Sandbox penalty imposed if it is a new site?

FOR EACH INCOMING LINK

67. PageRank of the referring page - Based on the quality of links to you.
68. (HOT) Anchor text of the inbound link to you - Does it contain the keyword or key phrase? The #1 result in the SERP does NOT EVEN need to have the keyword(s) on the page, ANYWHERE!!! What does that tell you? (It enables Google-bombing - search for "miserable failure.")
69. Age of link - Google patent. Old = good.
70. Frequency of change of anchor text - Google patent. Not good; why would you do that?
71. Popularity of the referring page - Popularity = desirability, respect.
72. Number of outgoing links on the referring page - Fewer is better; it makes yours more important.
73. Position of the link on the referring page - Early in the HTML is best.
74. Keyword density on the referring page - For the search keyword(s).
75. HTML title of the referring page - Same subject or theme?
76. (patent claim 28) Link from an "expert" site? - Google patent. Big-time boost (Hilltop algorithm); recently reported to give a big boost!
77. Referring page: same theme - From the same or a related theme? BETTER.
78. Referring page: different theme - From a different or unrelated theme? WORSE.
79. Image map link? - Problematic?
80. JavaScript link? - Problematic; an attempt to hide the link?

DIRECTORIES

81. Site listed in the DMOZ directory? - The "secret hand" DMOZ issues: (1) legitimate sites CAN'T GET IN; (2) no accountability; (3) corrupt editors; (4) competitive sites barred; (5) dirty tricks employed; (6) rude DMOZ editors. A flawed concept - communism doesn't work. Free editing? Nothing is free. (See the "DMOZ Sucks" and "DMOZ Problems" discussions.) The Google Directory is produced by an unknown, ungoverned, unpoliced, ill-intentioned, retaliatory, monopoly enterprise consisting of profiteering, power-ego editors feathering their own nests - the ODP. AOL is making millions and needs to police its run-amok entity. Enough already! This is a tough one: Google's directory comes STRAIGHT from the DMOZ directory, so you should try to get into DMOZ. But you can't. Be careful whom you approach with the old spondulix (formal DMOZ bribe instructions). It is almost impossible to get into DMOZ. This site cannot get in after waiting over 2 YEARS (33 months), not even in the lowest, most insignificant category, "Personal Pages." I guess I just don't "measure up" to the other 20,000+ sites in the personal category. I'm not the suck-up type - I kissed them off long ago. What a waste of time! UPDATE: This page (not the site) finally got indexed in June 2007, thanks to a legitimate editor; no money was paid. Google needs to DO SOMETHING about populating its own directory with the skewed, incomplete, poorly determined results from the dysfunctional Open Directory Project - the ODP! Absolute power corrupts absolutely.
82. DMOZ category? - Does the theme fit the category? A general or geographic category? Both are possible, and acceptable.
83. (HOT) Site listed in the Yahoo Directory? - Big boost. You can get in by paying $299 each year; many swear it is worth it, and many swear it isn't.
84. Site listed in the LookSmart directory? - Boost? Another great vote for your site.
85. Site listed in Inktomi? - Inktomi has been absorbed internally by Yahoo.
86. Site listed in other directories (About, BOTW, etc.)? - Directory listing boost. (If other RESPECTED directories link to you, this must be positive.)
87. Expert site? (Hilltop or condensed Hilltop) - Large site, quality incoming links.
88. (HOT) Site age: old shows stability - Google patent. Boost for long-established sites; new pages are indexed easily. The opposite of the sandbox.
89. Site age: very-new boost - Temporary boost for very new sites; I estimate that this boost lasts from 1 to 3 weeks. Yahoo does it too.
90. Site directory: tree structure - Influences SERPs; logical, consistent, conventional.
91. Site map (and more site map) - Complete, with keywords in the anchor text.
92. Site size - Previously, many pages were preferred; they conferred authority upon the site, and thus the page (bigger sites = better SERPs). Now fewer pages are preferred, due to the proliferation of computer-generated pages; Google has been dropping pages like crazy.
93. Site theming - Does the site exhibit a theme? Does it use many related terms? Have you used a keyword suggestion tool? A thesaurus?

PAGE METRICS - USER BEHAVIOR (currently implemented through the Google toolbar?)

94. (patent claims 34, 35) Number of visitors, and the trend - Google patent.
95. (patent claims 15, 16, 21) Page selection rate (CTR) - Google patent. How often is a page clicked on?
96. (patent claims 36, 37) Time spent on the page - Google patent. A relatively long time indicates a relevant hit.
97. (patent claims 45, 46) Did the user bookmark the page? - Google patent. Bookmark = good.
98. (patent claim 47) Bookmark add/removal frequency - Google patent. Recent = good?
99. How they left, and where they went - Back button, link clicked, etc.

SITE METRICS - USER BEHAVIOR (currently implemented through the Google toolbar?)

100. (patent claims 34, 35) Number of visitors - Google patent. An increasing trend = good.
101. Referrer - Authoritative referrer?
102. Keyword - Keyword searches used to find you.
103. Time spent on the domain - A relatively long time indicates a relevant hit; add brownie points.

DOMAIN OWNER BEHAVIOR (patent claim 38)

104. (patent claim 40) Domain registration time - Google patent: domain expiration date. Register for 5 years and Google knows you are serious; register for 1 year and it may look like a throw-away domain.
105. (patent claim 39) Are associated sites legitimate? - Google patent. No spam, ownership, etc.



4. Alleged NEGATIVE OFF-Page SEO Google Ranking Factors (13)
120. (added) Traffic buying - Have you paid a company for web traffic? It is probably low-quality traffic with a zero conversion rate. Some providers of traffic for traffic's sake may be considered "bad neighborhoods." Can Google discount your traffic (for true popularity) because they know it's mostly phony? Have you read about Traffic Power?
106. (patent claims 22-29) Temporal link analysis - In a nutshell, old links are valued, new links are not. This is intended to thwart rapid incoming-link accumulation accomplished through the tactic of link buying. Just one of the sandbox factors.
107. (patent claim 18) Change of meanings - Query meaning changes over time, due to current events.
108. (BAD) Zero links to you - You MUST have at least one incoming link (backlink), from some website somewhere that Google is aware of, to REMAIN in the index.
109. (BAD) Link buying (very good IF you don't get caught, but don't do it - when caught, the penalty isn't worth it) - Google patent. Google hates link buying, because it corrupts their PageRank model in the worst way possible. (1) Does your page have links it really doesn't merit? (2) Did you get tons of links in a short time period? (3) Do you have links from high-PR, unrelated sites?
110. (patent claims 41, 42) Prior site ranking - Google patent. High = good.
111. (BAD) Cloaking - Google promises to ban! (Presenting one web page to the search engine spider and another web page to everybody else.)
112. (??) Links from bad neighborhoods or affiliates - Google says that incoming links from bad sites can't hurt you, because you can't control them. Ideally, this would be true. However, some speculate otherwise, especially when other associated factors are thrown into the mix, such as web rings.
113. (BAD) Penalties resulting from domain hijacking (work with Google to fix) - Should result in IMPRISONMENT, forthwith! Grand theft, mandatory minimum sentence. The criminal COPIES your entire website and HOSTS it elsewhere, with . . . a few changes.
114. Penalty: Google TOS violation - WMG is the worst offender; it gobbles up tons of Google server time on behalf of nervous-Nellie webmasters, and Google even mentions them by name. I think Google will spank you when you cross a threshold of, say, 100 queries per day for the same term from the same IP. Google can block your IP. Get a Google API key.
115. (??) Server reliability (should be >99.9%) - What is your uptime? Ever notice a daily time when your server is unavailable, like about 1:30 AM? How diligent must Googlebot be? This is the worst reason to get dropped - you just aren't there! An ISP maintenance interruption can cause delisting.
116. No more room: pages being dropped from large sites - The 2^32 problem: Google has supposedly hit the 4.3-billion-page address-space wall. Bull! Google now has well over that many pages indexed. Thousands of pages are disappearing from various huge websites, but I think it is G just cleaning house by dumping computer-generated pages.
117. Rank manipulation by competitor attack - Examples: the site-wide link attack, the 302 redirect attack, and the hijacker attack. (Also content theft causing you to get a duplicate-content penalty even though your content is the original; Google has problems tracking original authorship. People are still stealing my content, but nobody trumps me in Google with my own content - hats off to Google.) Impossible by Google's definition (except for a few nasty tricks, like making your competition appear to be link spammers). Ideally, there SHOULD be nothing that your competition can do to directly hurt your rankings. However, an astute observer noticed that Google changed their website: old verbiage = "There is nothing a competitor can do to harm your ranking ..."; new verbiage = "There is ALMOST nothing a competitor can do ..." An obvious concession that Google thinks that at least some dirty tricks work! Of course, there will always be new ones!
118. Bouncing-ball algorithm - At least 2, and often 3, identifiable Google search algorithms are currently in use, alternating pseudo-randomly through the data centers. G has moved to a daily dance: multiple changing factors are applied daily. GOOD LUCK NOW on trying to figure things out! In addition, some of the above factors are being "tweaked" daily; not only are the "weights" of the factors changed, but the formula itself changes. Change is the only constant. An algorithm change can boost or demote your site. I put this in the negative-factors section because your position is never secure, unless of course you are huge (PR 7 or greater). If you simply cannot achieve top position, your only alternative for first-page SERP exposure may be Google AdWords (you pay for exposure). Today I searched for an extremely competitive two-word term, and I found that NOT ONE of the top ten Google SERPs had even one of the words on the page. YOWSA! Today's theory: when it doesn't matter, anybody can get #1 in a second, if they know the on-page rules. BUT after a certain "commercial competitive level," the "semantic analysis" algorithm kicks in, and less becomes more. The keyword-density rules are flipped upon their noggins. I think that we are witnessing the evolution of search-engine anti-SEO sophistication, right before our very eyes. Fun stuff.




Notes to the Above 122 Google Ranking Factors
1. I have tried to summarize the best opinions of many webmaster forum posters.
2. There are no published rules - this is my continuously changing compilation of SEO chatter. This is my semi-annual, one-way, technical Google ranking blog, if you will.
3. If your keywords are rare and unique, then PageRank doesn't matter.
4. If your keywords are very competitive, then PageRank becomes very important.
5. The fewer incoming links you have, the more important on-page factors are, for noncompetitive terms.
6. There are a million ifs, ors, and buts; I am attempting a concise summary. Exceptions to EACH of the positive on-page factors are frequent and many. However, I feel that it is important to score highly on as many factors as possible, since factor weight and even factor consideration are changing constantly - CYA. Not to mention the other search engines.
7. A few words about the LANGUAGE used on the Google site - in a phrase, "soft spoken." We see it everywhere these days. I am referring to understatement, sometimes even to the point of confusion: "significant," "may," etc. For example, when Google states that maybe it might not be a good idea to do a particular thing, what they SOMETIMES really mean is, "If you do it, you are history." Some Google suggestions are actually commands (STRONG HINTS) in disguise. At some point, you begin to realize this. Google just can't tell us everything, literally. Sooooooo, take the hints.
8. Become religious. Seek the light. It's there, but you gotta look. LISTEN UP! Read the rules. Read between the lines. Carefully. Differentiate. Project. Carefully analyze your own situation.
Webmaster Guidelines
http://www.google.com/support/webmasters/bin/answer.py?answer=35769

How does Google rank pages?
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=34432

Google Facts and Fiction
http://www.google.com/search?hl=en&q=%22google+facts+and+fiction%22

Search Engine Optimizers
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35291
