SEO (Search Engine Optimization) Copywriting


Sachin Jadhav

Apr 6, 2009, 3:13:21 PM
to bim...@googlegroups.com

Hi friends,

Here is some interesting information about SEO copywriting.

What exactly is SEO copywriting?
It may be described as the skillful art of writing content for your website so that it appeals to your targeted users while also conforming to the search engines' requirements for greater visibility on the internet.

For example, your text should contain commonly searched keywords so that your site is picked up by the search engines. At the same time, however, you have to be careful not to stuff your text with keywords without regard for readability. That could get your content labeled as spam and your site banned by the search engines.

The solution lies in writing unique and interesting content with just the right number of targeted keywords inserted, so that both purposes are served: rich, compelling content that also appeals to the search engines.

How do you write the most effective SEO copywriting?
A helpful suggestion for effective SEO copywriting is to structure your text as an upside-down triangle. The base of the triangle should contain your most vital text with keywords, while the tip should contain information that is less important or relevant. The base (the first 200 words of the text) is very important.

It should contain your target keywords as well as an overview of the content you are writing. The user usually decides to read the rest of the content only after he feels that the first 200 words are worthwhile.
Now we come to the art of writing the actual text itself. There is no magic formula, but SEO experts advise that the ideal length per page is around 250-500 words, with around 1-4 keywords placed at strategic points within the content. Avoid stuffing keywords, as this makes the language forced and unnatural.
It is also advisable to keep sentences short and direct and to avoid verbose language. At the same time, you cannot compromise on the content. Remember that readers usually scan the content for keywords and read on only if the accompanying text strikes them as important. Therefore it is necessary to strike a balance between rich content and keywords.
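The 250-500 word and 1-4 keyword guideline above can be sanity-checked with a quick script. This is only an illustrative sketch (the `keyword_density` helper and the sample copy are invented for the example, not a standard SEO tool):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

copy = ("Cheap cell phones are easy to find online. "
        "Compare phones by price and features before you buy.")

# 2 occurrences of "phones" out of 17 words -> about 11.8%
print(round(keyword_density(copy, "phones"), 1))
```

On a tiny snippet like this the density looks high; on a full 250-500 word page, 1-4 occurrences of a keyword works out to well under 2%, which reads naturally.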
Writing shorter paragraphs, and utilizing a lot of white space, enhances the readability of your text.

Also, using bullet points and emphasizing important points in bold or italics will make your content reader-friendly.

The skillful use of keywords is important. Research your target keywords before writing your text rather than inserting them later into pre-written content, which can make the text sound awkward.

Since search engines consider the first 200 words more important, the keywords should be used near the top of the page and then in proportion further down. As headlines are considered by search engines in their rankings of a website, it is extremely important to use keywords in the headline.

Last but not least, remember that both your readers and the search engines are looking for quality content that is frequently updated. Fresh, relevant updates will draw readers back to your site and make your site more visible on the internet, as it will be regularly scanned by search engines.

regards,
Sachin Jadhav
25021
BIM25

ernest kirubakaran

Apr 6, 2009, 3:19:16 PM
to bim...@googlegroups.com
Hi all,

Apart from keyword use, a professional SEO uses different optimization techniques on your webpages, including image optimization, text optimization, Flash optimization, etc. This is normally called on-site optimization.

Geographic optimization is another SEO technique that a professional implements on demand. It aims to include webpages in search listings for internet users who search for services in a particular geographical region, county, or state.

Blogging and creating content-rich blogs on WordPress, Blogspot, Blogflux, and HubPages are among the latest trends, and they have proved effective in improving search engine rankings.

Social bookmarking has been in trend for a pretty long time now. There are hundreds of social media sites that allow dofollow backlinks, improving your link popularity and bringing some quality traffic.

Regards,
S Ernest

Ashwini

Apr 6, 2009, 3:27:44 PM
to BIM 25
Webmasters and content providers began optimizing sites for search
engines in the mid-1990s, as the first search engines were cataloging
the early Web. Initially, all a webmaster needed to do was submit a
page, or URL, to the various engines which would send a spider to
"crawl" that page, extract links to other pages from it, and return
information found on the page to be indexed. The process involves a
search engine spider downloading a page and storing it on the search
engine's own server, where a second program, known as an indexer,
extracts various information about the page, such as the words it
contains and where these are located, as well as any weight for
specific words, as well as any and all links the page contains, which
are then placed into a scheduler for crawling at a later date.

Site owners started to recognize the value of having their sites
highly ranked and visible in search engine results, creating an
opportunity for both white hat and black hat SEO practitioners.
According to industry analyst Danny Sullivan, the phrase search engine
optimization probably came into use in 1997.

Early versions of search algorithms relied on webmaster-provided
information such as the keyword meta tag, or index files in engines
like ALIWEB. Meta tags provide a guide to each page's content. But
using meta data to index pages was found to be less than reliable
because the webmaster's choice of keywords in the meta tag could
potentially be an inaccurate representation of the site's actual
content. Inaccurate, incomplete, and inconsistent data in meta tags
could and did cause pages to rank for irrelevant searches. Web content
providers also manipulated a number of attributes within the HTML
source of a page in an attempt to rank well in search engines.

By relying so much on factors exclusively within a webmaster's
control, early search engines suffered from abuse and ranking
manipulation. To provide better results to their users, search engines
had to adapt to ensure their results pages showed the most relevant
search results, rather than unrelated pages stuffed with numerous
keywords by unscrupulous webmasters. Since the success and popularity
of a search engine is determined by its ability to produce the most
relevant results to any given search, allowing those results to be
false would turn users to find other search sources. Search engines
responded by developing more complex ranking algorithms, taking into
account additional factors that were more difficult for webmasters to
manipulate.

While graduate students at Stanford University, Larry Page and Sergey
Brin developed "Backrub," a search engine that relied on a
mathematical algorithm to rate the prominence of web pages. The number
calculated by the algorithm, PageRank, is a function of the quantity
and strength of inbound links.[5] PageRank estimates the likelihood
that a given page will be reached by a web user who randomly surfs the
web, and follows links from one page to another. In effect, this means
that some links are stronger than others, as a higher PageRank page is
more likely to be reached by the random surfer.
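The random-surfer idea described above can be demonstrated with a small power-iteration sketch on a made-up three-page web. The toy link graph and the damping factor of 0.85 (the value conventionally cited for PageRank) are assumptions for illustration only:

```python
# Toy web: page "a" links to "b" and "c", "b" links to "c", "c" links to "a".
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
d = 0.85  # damping: probability the surfer follows a link vs. jumping at random

# Start with equal rank on every page, then iterate to the steady state.
rank = {p: 1.0 / len(links) for p in links}
for _ in range(100):
    new = {p: (1 - d) / len(links) for p in links}
    for page, outs in links.items():
        share = rank[page] / len(outs)  # rank flows out along each link equally
        for target in outs:
            new[target] += d * share
    rank = new

# "c" is linked from both "a" and "b", so the random surfer lands there most often.
print(max(rank, key=rank.get))
```

The ranks always sum to 1, since they approximate a probability distribution over the pages the surfer might be visiting.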
Google attracted a loyal following among the growing number of
Internet users, who liked its simple design. Off-page factors (such as
PageRank and hyperlink analysis) were considered as well as on-page
factors (such as keyword frequency, meta tags, headings, links and
site structure) to enable Google to avoid the kind of manipulation
seen in search engines that only considered on-page factors for their
rankings. Although PageRank was more difficult to game, webmasters had
already developed link building tools and schemes to influence the
Inktomi search engine, and these methods proved similarly applicable
to gaming PageRank. Many sites focused on exchanging, buying, and
selling links, often on a massive scale. Some of these schemes, or
link farms, involved the creation of thousands of sites for the sole
purpose of link spamming. In recent years major search engines have
begun to rely more heavily on off-web factors such as the location,
and search history of people conducting searches in order to further
refine results.

By 2007, search engines had incorporated a wide range of undisclosed
factors in their ranking algorithms to reduce the impact of link
manipulation. Google says it ranks sites using more than 200 different
signals. The three leading search engines, Google, Yahoo and
Microsoft's Live Search, do not disclose the algorithms they use to
rank pages. Notable SEOs, such as Rand Fishkin, Barry Schwartz, Aaron
Wall and Jill Whalen, have studied different approaches to search
engine optimization, and have published their opinions in online
forums and blogs. SEO practitioners may also study patents held by
various search engines to gain insight into the algorithms.

ernest kirubakaran

Apr 6, 2009, 4:54:38 PM
to bim...@googlegroups.com
Hi all,

This is an article on SEO analytics by Maxwell Payne.

SEO Analytics-

This buzzword term is often overlooked or ignored altogether when launching a promotional blitz of a business online. But SEO Analytics (Search Engine Optimization Analytics) is a critical part of promoting your website, because successful SEO will bring your site traffic from relevant searches on all the search engines. Say your business sells cell phones and you want to promote your online cell phone store. Search engines use crawlers and spiders, programs that browse the internet looking for keywords that match a search. If someone types the words 'new cell phones for low prices' into Google, Google's system browses all the known pages in its database and returns the results that best match the search. Sites with high traffic and the most relevant keywords (such as 'cell phone' in this case) will rank higher in the searches, and ranking higher in turn brings more visitors to your site.

Don't be fooled by sites that claim to prepare your site for a fee; there are plenty of free resources available to help tweak your site to maximize its rankings and placement in all the top search engines. Be sure each page of your website includes valuable keywords relevant to your business within the text, links, and meta tags. Meta tags are hidden HTML that search engine crawlers use to find sites and file them under relevant topics. You can search for 'meta tags' on any search engine and find easy step-by-step instructions for preparing these tags in your site's coding.
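As a concrete illustration of what a crawler reads from those hidden tags, here is a short sketch that pulls the keywords out of a page's meta tag using Python's standard `html.parser` module; the sample page is invented for the example:

```python
from html.parser import HTMLParser

class MetaKeywords(HTMLParser):
    """Collect the comma-separated values of <meta name="keywords"> tags."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attrs arrive as (name, value) pairs, names lowercased
        if tag == "meta" and a.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in a.get("content", "").split(",")]

page = ('<html><head>'
        '<meta name="keywords" content="cell phones, low prices, online store">'
        '</head><body>...</body></html>')

parser = MetaKeywords()
parser.feed(page)
print(parser.keywords)
```

As the history earlier in this thread notes, engines eventually stopped trusting this tag on its own, precisely because webmasters could put anything into it.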

You can also find free sites that will analyze your site and help you maximize its position and rankings in search engines; this is where the term SEO Analytics comes from. These analyses can show you everything from which search phrases often end in hits to your site to where you can add and improve keywords.

Once you've made sure your site has plenty of relevant keywords throughout (sometimes you put in keywords without even knowing it), you'll want to make sure search engines can find your site; if they don't, you can submit your URL to them directly. Search for free website submission services online and you'll come across a slew of sites that allow you to enter your URL and a brief description of your site. That information is then submitted to usually 20-40 search engine sites. Those search engines will then crawl your site, find keywords, and place your site in ever-changing rankings in their search results, making it more likely that searches for what you're selling will end with customers at your site.

Regards,
S Ernest