ساحل تنهایی
Apr 2, 2011, 1:15:10 PM
to SEO and Site Optimization Journal
Search engine optimization
Search engine optimization (SEO) is the process of improving the
visibility of a website or a web page in search engines via the
“natural” or un-paid (“organic” or “algorithmic”) search results.
Other forms of search engine marketing (SEM) target paid listings. In
general, the earlier (or higher on the page), and more frequently a
site appears in the search results list, the more visitors it will
receive from the search engine’s users. SEO may target different kinds
of search, including image search, local search, video search, news
search and industry-specific vertical search engines. This gives a
website web presence.

As an Internet marketing strategy, SEO considers how search engines
work, what people search for, the actual search terms typed into
search engines and which search engines are preferred by their
targeted audience. Optimizing a website may involve editing its
content and HTML and associated coding to both increase its relevance
to specific keywords and to remove barriers to the indexing activities
of search engines. Promoting a site to increase the number of
backlinks, or inbound links, is another SEO tactic.

The initialism “SEO” can refer to “search engine optimizers,” a term
adopted by an industry of consultants who carry out optimization
projects on behalf of clients, and by employees who perform SEO
services in-house. Search engine optimizers may offer SEO as a
stand-alone service or as a part of a broader marketing campaign.
Because effective SEO may require changes to the HTML source code of a
site and site content, SEO tactics may be incorporated into website
development and design. The term “search engine friendly” may be used
to describe website designs, menus, content management systems,
images, videos, shopping carts, and other elements that have been
optimized for the purpose of search engine exposure.

Another class of techniques, known as black hat SEO or spamdexing,
uses methods such as link farms, keyword stuffing and article spinning
that degrade both the relevance of search results and the quality of
user experience with search engines. Search engines look for sites
that employ these techniques in order to remove them from their
indices.
History
Webmasters and content providers began optimizing sites for search
engines in the mid-1990s, as the first search engines were cataloging
the early Web. Initially, all that webmasters needed to do was submit
the address of a page, or URL, to the various engines, which would
send a "spider" to "crawl" that page, extract links to other pages
from it, and return information found on the page to be indexed. The
process involves a search engine spider downloading a page and storing it on
the search engine’s own server, where a second program, known as an
indexer, extracts various information about the page, such as the
words it contains and where these are located, as well as any weight
for specific words, and all links the page contains, which are then
placed into a scheduler for crawling at a later date.
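To make the crawl-and-index cycle above concrete, here is a minimal
Python sketch using only the standard library; the seed URL, the
word/position index structure, and the page limit are illustrative
choices for the example, not how any real engine is built.

# A minimal sketch of the crawl-and-index cycle described above: a
# "spider" downloads a page, an "indexer" records which words appear
# and where, and extracted links go into a scheduler for a later crawl.
# The seed URL, data structures, and limits are illustrative only.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageIndexer(HTMLParser):
    """Collects outgoing links and visible words from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())


def crawl(seed_url, max_pages=3):
    scheduler = deque([seed_url])   # links queued for crawling at a later date
    index = {}                      # word -> list of (url, position) entries
    seen = set()

    while scheduler and len(seen) < max_pages:
        url = scheduler.popleft()
        if url in seen:
            continue
        seen.add(url)

        # The "spider" downloads the page.
        html = urlopen(url).read().decode("utf-8", errors="ignore")

        # The "indexer" extracts words, their positions, and links.
        parser = PageIndexer()
        parser.feed(html)
        for position, word in enumerate(parser.words):
            index.setdefault(word, []).append((url, position))
        for link in parser.links:
            scheduler.append(urljoin(url, link))

    return index


if __name__ == "__main__":
    # example.com is a placeholder seed; any reachable page would do.
    print(sorted(crawl("https://example.com/", max_pages=1))[:10])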
Site owners started to recognize the value of having their sites
highly ranked and visible in search engine results, creating an
opportunity for both white hat and black hat SEO practitioners.
According to industry analyst Danny Sullivan, the phrase “search
engine optimization” probably came into use in 1997. The first
documented use of the term was by John Audette and his company
Multimedia Marketing Group, as documented by a web page from the MMG
site from August 1997 on the Internet Wayback Machine (Document Number
19970801004204). The first registered US copyright of a website
containing that phrase is by Bruce Clay, effective March 1997
(Document Registration Number TX0005001745, US Library of Congress
Copyright Office).
Early versions of search algorithms relied on webmaster-provided
information such as the keyword meta tag, or index files in engines
like ALIWEB. Meta tags provide a guide to each page’s content. Using
meta data to index pages was found to be less than reliable, however,
because the webmaster’s choice of keywords in the meta tag could
potentially be an inaccurate representation of the site’s actual
content. Inaccurate, incomplete, and inconsistent data in meta tags
could and did cause pages to rank for irrelevant searches. Web content
providers also manipulated a number of attributes within the HTML
source of a page in an attempt to rank well in search engines.
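As an illustration of why webmaster-declared keywords were easy to
abuse, here is a small Python sketch that reads the keywords meta tag
at face value, the way an early keyword-based engine might; the sample
page and the parser class are made up for the example.

# Sketch of taking the keywords meta tag at face value: whatever the
# webmaster declares gets indexed, whether or not the page's actual
# content matches. Illustrative only.
from html.parser import HTMLParser


class MetaKeywordReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_startendtag(self, tag, attrs):
        self.handle_starttag(tag, attrs)

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            self.keywords = [k.strip() for k in a.get("content", "").split(",")]


page = """<html><head>
<meta name="keywords" content="cheap flights, hotels, casino">
</head><body>A page that is actually about something else entirely.</body></html>"""

reader = MetaKeywordReader()
reader.feed(page)
print(reader.keywords)   # ['cheap flights', 'hotels', 'casino']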
By relying so much on factors such as keyword density, which were
exclusively within a webmaster’s control, early search engines
suffered from abuse and ranking manipulation. To provide better
results to their users, search engines had to adapt to ensure their
results pages showed the most relevant search results, rather than
unrelated pages stuffed with numerous keywords by unscrupulous
webmasters. Since the success and popularity of a search engine is
determined by its ability to produce the most relevant results for any
given search, poor-quality or irrelevant results could drive users to
other search sources. Search engines responded by developing more
complex ranking algorithms, taking into account additional factors
that were more difficult for webmasters to manipulate.
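For a sense of what a purely webmaster-controlled factor looks like,
here is a short Python sketch of keyword density in its common
textbook form (keyword occurrences divided by total words); the sample
sentences are invented, and the formula is not any engine’s actual
ranking signal.

# Keyword density is commonly computed as the share of a page's words
# that match a target keyword. Because the page author controls the
# text entirely, the number is trivial to inflate ("keyword stuffing").
import re


def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)


honest = "We sell handmade oak tables and chairs in Portland."
stuffed = "cheap tables cheap tables cheap tables buy cheap tables now"

print(f"{keyword_density(honest, 'tables'):.2%}")   # ~11%
print(f"{keyword_density(stuffed, 'tables'):.2%}")  # ~40%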
Graduate students at Stanford University, Larry Page and Sergey Brin,
developed “Backrub,” a search engine that relied on a mathematical
algorithm to rate the prominence of web pages. The number calculated
by the algorithm, PageRank, is a function of the quantity and strength
of inbound links. PageRank estimates the likelihood that a given page
will be reached by a web user who randomly surfs the web, and follows
links from one page to another. In effect, this means that some links
are stronger than others, as a higher PageRank page is more likely to
be reached by the random surfer.
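The random-surfer idea can be sketched with a small power-iteration
loop. The Python toy below uses the widely cited 0.85 damping factor
and an invented four-page link graph; it illustrates the model, not
Google’s implementation.

# Toy power-iteration sketch of the random-surfer model behind
# PageRank: a page's score is the probability that a surfer who follows
# links at random (and occasionally jumps to a random page) lands on it.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                   # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share  # each outlink passes an equal share
        rank = new_rank
    return rank


# The heavily linked-to page ("a") ends up with the highest score.
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))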
Page and Brin founded Google in 1998. Google attracted a loyal
following among the growing number of Internet users, who liked its
simple design. Off-page factors (such as PageRank and hyperlink
analysis) were considered as well as on-page factors (such as keyword
frequency, meta tags, headings, links and site structure) to enable
Google to avoid the kind of manipulation seen in search engines that
only considered on-page factors for their rankings. Although PageRank
was more difficult to game, webmasters had already developed link
building tools and schemes to influence the Inktomi search engine, and
these methods proved similarly applicable to gaming PageRank. Many
sites focused on exchanging, buying, and selling links, often on a
massive scale. Some of these schemes, or link farms, involved the
creation of thousands of sites for the sole purpose of link spamming.
By 2004, search engines had incorporated a wide range of undisclosed
factors in their ranking algorithms to reduce the impact of link
manipulation. Google says it ranks sites using more than 200 different
signals. The leading search engines, Google, Bing, and Yahoo, do not
disclose the algorithms they use to rank pages. Notable SEO service
providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill
Whalen, have studied different approaches to search engine
optimization, and have published their opinions in online forums and
blogs. SEO practitioners may also study patents held by various search
engines to gain insight into the algorithms.
In 2005 Google began personalizing search results for each user.
Depending on their history of previous searches, Google crafted
results for logged-in users. In 2008, Bruce Clay said that “ranking is
dead” because of personalized search. It would become meaningless to
discuss how a website ranked, because its rank would potentially be
different for each user and each search.
In 2007 Google announced a campaign against paid links that transfer
PageRank. On June 15, 2009, Google disclosed that they had taken
measures to mitigate the effects of PageRank sculpting by use of the
nofollow attribute on links. Matt Cutts, a well-known software
engineer at Google, announced that Googlebot would no longer treat
nofollowed links in the same way, in order to prevent SEO service
providers from using nofollow for PageRank sculpting. As a result of
this change, the use of nofollow leads to evaporation of PageRank. In
order to avoid this, SEO engineers developed alternative techniques
that replace nofollowed tags with obfuscated JavaScript and thus
permit PageRank sculpting. Additionally, several solutions have been
suggested that include the use of iframes, Flash, and JavaScript.
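The effect of that change can be illustrated with simple arithmetic,
under the common simplification that a page divides its passable
PageRank equally among its outlinks; the numbers below are invented
for the example.

# Arithmetic sketch of the nofollow change described above, under the
# simplification that passable PageRank is split equally among links.
page_rank_to_pass = 1.0
followed_links = 4
nofollowed_links = 6

# Old behaviour: nofollowed links were ignored, so the followed links
# split all of the passable PageRank ("sculpting" worked).
old_share = page_rank_to_pass / followed_links

# Post-2009 behaviour as described by Matt Cutts: PageRank is divided
# across all links, and the share assigned to nofollowed links simply
# evaporates rather than being redistributed.
new_share = page_rank_to_pass / (followed_links + nofollowed_links)
evaporated = new_share * nofollowed_links

print(f"old share per followed link:  {old_share:.2f}")   # 0.25
print(f"new share per followed link:  {new_share:.2f}")   # 0.10
print(f"PageRank that evaporates:     {evaporated:.2f}")  # 0.60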
In December 2009 Google announced it would be using the web search
history of all its users in order to populate search results.
Real-time-search was introduced in late 2009 in an attempt to make
search results more timely and relevant. Historically, site
administrators have spent months or even years optimizing a website to
increase search rankings. With the growth in popularity of social
media sites and blogs the leading engines made changes to their
algorithms to allow fresh content to rank quickly within the search
results.