Wednesday, August 27, 2008

Submitting Your Web Site in Directories

Once you have optimized your site for search engines, your next step is to submit it to the search engines and web directories, so that your site gets indexed by search engines and visited by many. Are you confused about the difference between search engines and directories? You are not alone; most people are. Let's see what we can do to clear up the confusion ...

Search Engines and Directories - How Do They Differ?

A search engine is a system that enables users to search and retrieve information from the web. Usually, search engines use software programs, generally referred to as crawlers, to index web pages. Various search engines use complex algorithms to index and categorize web pages. On the other hand, web directories contain a collection of web pages organized into categories and sub-categories. The main difference between directories and search engines is that search engines use software to automatically index web pages, whereas in a directory, web sites are organized into categories by people.

Why submit in directories?

If you have submitted to at least one directory and have ever checked the traffic to your site from web directories, you would have noticed that it is negligible compared to the traffic coming through search engines. This fact prompts you to ask, "Is it worth submitting my site to directories?". Before answering that, let's analyze the benefits of being listed in directories...

First, directories bring targeted traffic. Though the traffic generated through directories is very limited, it comes from people who, by their own choice, reach your site by browsing through categories in the directories. These people are more likely to stay on your site longer and more likely to be your prospects. Another important benefit is that these directories are considered "expert" sites by search engines. So, submitting a site to directories increases its chance of getting a good rank in search results. Isn't it worth submitting your site to directories?

Where to Submit - Free and Paid Directories

Directories can broadly be classified into "Free Directories" and "Paid Directories". There are many free directories available on the web, such as DMOZ, Yahoo!, World Wide Index, AbiLogic, Gimpsy, JoeAnt, etc. Free directories usually take a long time to list a submitted web site, so be patient and check these directories regularly for your site. Paid directories usually list a submitted site within a day! If you go for the paid option, you can consider the following directories: Arielis, BOTW, BlueFind, Microsoft bCentral, and GoGuides. Keep the following guidelines in mind when submitting:

• Make sure your site has no broken links, no broken images, no missing pages, and no typos. Review your site thoroughly.
• In the description field of the submission form, provide a readable sentence. Do not stuff this field with keywords, because the editor will notice and reject your site. Similarly, avoid marketing language such as "the best", "most powerful", etc., because the editor will edit the sentence, which may result in the removal of your genuine keywords.
• Be careful to select a suitable category in the directory. If you do not find a fitting category for your site, check the category where your competitors' sites are listed and submit your site there.
• Check regularly whether your site is listed in the directory. Free directories can take a while to list a site.

Conclusion

There is no doubt that search engines consider directories reliable sources of reference. So, the link popularity of sites listed in directories increases manifold, making them appear in top results. After all, our main objective in web site optimization is increasing our rank in search engines, and submitting to directories is one of the good practices that help ensure a better position in search engines.

Call Gatesix today and find out how our search engine submission services can help your site remain highly visible.

Monday, June 30, 2008

SEARCH ENGINE OPTIMIZATION - An Introduction

You spent a good deal of money, worked hard for many days and nights, and built a great website, but alas, it doesn't get enough visitors. There are millions upon millions of pages of web content out there and your website is totally lost in the shuffle, like the proverbial needle in a haystack. When search engines ignore your site, your site becomes non-existent in the cyber world. The real problem with your website is that it failed to harness the most cost-effective and powerful Internet marketing strategy: Search Engine Optimization (SEO)!

Search Engine Optimization is the process of making your site appear at the top of search engine results for your domain-specific keywords and phrases. The higher your website ranks in the results of Google, Yahoo, MSN etc., the greater the chance that your site will be visited by many, which in turn would skyrocket your sales.

Simply put, Search Engine Optimization is about making your website visible on search engines. Its commercial purpose is to reach your customer base before your competitors do.

Why Search Engine Optimize?

Every day, millions of people search the web to find what they are looking for. If your website doesn't come up near the top of the results, you lose business to your competitors. SEO is important not only because it brings lots of visitors to your website, but also because, harnessed properly, it increases your return on investment. Say, for example, you have a website that sells mobile phones online. Optimizing the site for targeted keywords like 'low cost mobile phones' would bring it to the top results on popular search engines. This would in turn bring prospective customers to your site, which would result in higher sales.

However, Search Engine Optimization (SEO) is a process that requires patience, careful planning, and a long-term approach. Don't worry, here are 4 easy steps that will help you get started with optimizing your website.

4 Easy Steps to get started with Search Engine Optimization

a) Create traffic targeted content
Identify the keywords, i.e., the words or phrases your potential customers will use to search the web. For example, if you run a hair salon, they will probably search using the keywords "hair cut", "hairstyle", etc. After identifying and gathering a set of keywords, categorize them under various themes and write web content on those themes. That is, under the "hairstyle" theme, you can put keywords such as hairstyles, crew cut, long cut, short cut, curly cut, layered cut, etc. Using a keyword suggestion tool, you can see how many people search for a certain keyword. A keyword with fewer competing listings and more searches gives you a better chance of ranking high than one with many listings and few searches. Once you have the right keywords, use them appropriately in the page so that the page appeals to both visitors and search engines.
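
To make the "more searches, fewer listings" idea concrete, here is a minimal Python sketch that ranks candidate keywords by a naive opportunity score. The keywords and the search and listing counts below are purely hypothetical; in practice you would pull these numbers from your keyword suggestion tool.

# Naive keyword-opportunity heuristic: prefer keywords with many searches
# and few competing listings. All numbers are hypothetical examples.
candidates = {
    # keyword: (monthly searches, competing listings)
    "hairstyle":   (45000, 2000000),
    "crew cut":    (6000, 150000),
    "layered cut": (4000, 60000),
}

def opportunity(searches, listings):
    # Higher score = more searches per competing listing.
    return searches / float(listings)

ranked = sorted(candidates.items(),
                key=lambda item: opportunity(*item[1]),
                reverse=True)
for keyword, (searches, listings) in ranked:
    print(keyword, round(opportunity(searches, listings), 4))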

HTML tags such as the title tag and meta tags also play a role in ranking high on search engines. These tags must contain only relevant keywords. Neither fill these tags with keywords that don't appear in the content nor leave them empty.

b) Make your pages Search Engine Ready
Check whether your pages are search engine ready, i.e., whether they can be crawled by search engine spiders and whether anything in them will make a search engine neglect the page or give it a lower ranking.

To make your pages search engine ready:
  • Your website content must be readable. Do not make the content clumsy by repeatedly using the keywords just for the sake of appearing at the top of the search engines. Such clumsy pages may get a top rank, but they won't get readers.
  • Add only relevant keywords: Do not stuff the website content, title tag, and meta tags with too many keywords. The keywords provided in the title and meta tags must appear in the website content (a rough check for this is sketched after this list).
  • Do not add invisible text: Most search engines detect it and ban those sites.
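
As a rough illustration of the second point, here is a small Python sketch that checks whether every keyword declared in the meta keywords tag actually appears in the visible page text. It uses a simple regular expression rather than a full HTML parser, and the sample page is hypothetical.

import re

def keywords_missing_from_body(html):
    """Return meta keywords that do not appear anywhere in the page text."""
    match = re.search(r'<meta\s+name="keywords"\s+content="(.*?)"',
                      html, re.IGNORECASE | re.DOTALL)
    if not match:
        return []
    keywords = [k.strip().lower() for k in match.group(1).split(",") if k.strip()]
    # Strip tags to approximate the visible text.
    text = re.sub(r"<[^>]+>", " ", html).lower()
    return [k for k in keywords if k not in text]

# Hypothetical page:
page = """
<html><head>
<title>Hair Salon New York - Haircuts and Hairstyles</title>
<meta name="keywords" content="hairstyle, crew cut, wedding planner">
</head><body><h1>Hairstyles</h1><p>We offer crew cut and layered cut styles.</p></body></html>
"""
print(keywords_missing_from_body(page))   # -> ['wedding planner']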

c) Submit your site to Search Engines.
Now that you have made your site search engine ready, start submitting it to the popular search engines. That way, they know your site exists. Some search engines may require periodic resubmission. If you find the submission process tedious, use one of the site submission tools available on the web. (More on Submitting to Search Engines)

d) Track your performance
People often complain that their position in search engines has gone down. We must not forget that Search Engine Optimization is an ongoing process, because search engines change their algorithms frequently, new sites come up for the same keywords as yours, your watch sites (competitor sites) optimize themselves for search engines, and so on. Thus, it is important to keep track of where you rank for each of your keywords in the most important search engines and to regularly update the content so that search engines know that your page is active. This enhances your rank in the search results.
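
As a minimal sketch of this kind of rank tracking, the function below reports the position of your domain in an ordered list of result URLs for a keyword. It assumes you have already collected that list (for example, by exporting it from a rank-checking tool); the URLs shown are hypothetical.

from urllib.parse import urlparse

def rank_of_domain(result_urls, domain):
    """Return the 1-based position of `domain` in an ordered list of
    search-result URLs, or None if it does not appear."""
    for position, url in enumerate(result_urls, start=1):
        if urlparse(url).netloc.endswith(domain):
            return position
    return None

# Hypothetical results collected for the keyword "low cost mobile phones":
results = [
    "http://www.example-competitor.com/phones",
    "http://www.example.com/mobile-phones",
    "http://www.another-site.com/cheap-phones",
]
print(rank_of_domain(results, "example.com"))   # -> 2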

Conclusion
Although Search Engine Optimization seems to be so complicated at the outset, it really is a simple and interesting process that you will enjoy doing yourself. Search Engine Optimization is the most cost-effective, easy to implement Internet marketing strategy that can get you more traffic and in effect more revenue. So go ahead, start optimizing your site. Happy Search Engine Optimizing...

Friday, June 27, 2008

9 Simple Steps toward a Search Engine Optimized Website

Search Engine Optimization - An Introduction

Search Engine Optimization is probably the cheapest and most cost-effective form of Internet marketing. In fact, some studies rank Search Engine Optimization as the second most effective marketing strategy. The increasing popularity of search engines, and the fact that good quality, cost-effective traffic can be brought through them to websites, has led to the development of a whole industry that revolves around making web pages and websites more search engine friendly or, in other words, better optimized. The rise of the Search Engine Optimization industry has left webmasters and novices feeling that SEO is like rocket science and has to be handled by professionals only. Thankfully, the truth is that anyone can search engine optimize a website. All it takes to optimize a web site and get better ranking and traffic are 9 simple steps.

The 9 Simple Steps

1. Select Right Keywords

This is THE most important step and can easily be the reason for your ranking ahead of or below your watch sites (competition). Identify the words or phrases people search (or might search) with to find your Web page on the Internet. If you sell dog food, your keywords must have something to do with "dog food". Do not use irrelevant keywords, even if they get you more traffic. If you are not sure which keywords to use, use any of the following tools to find good target phrases:

  • Overture is a great tool for assessing popularity of target phrases.
  • Google is great for brainstorming target phrases.
  • Word Tracker can help you assess popularity and compare how the competitors use the target phrase.

2. Analyze Competition

Determine who your competitors are. It is quite simple; search for the keyword on popular search engines. The sites that show up above your page are your competitors. Analyze those sites and find out how effectively they have used the targeted keywords. Remember that the more popular a target phrase is, the more competition there is likely to be. Sometimes, it makes sense to target a less popular phrase where you can corner the market rather than aiming for the highest popularity phrase.

Check your own and your competitors' link popularity. In many engines, you can type link:http://domain-name and get a link count for that particular site. The higher the link count, the better.

3. Page Creation and Optimization

After identifying the keywords, create Web pages targeting one word or phrase per page. One common mistake novice SEOs make is dumping many keywords into a single page. Not only does it make it difficult for you to rank high for each of those keywords, it also makes the page less readable.

i) Make sure your keywords are present in the following places:

  • Title tag
  • META tags: Description tag and Keywords tag
  • Body text: Heading tags, comment tags, alt tags, and prominent places in the page content

ii) Conduct HTML validation for your Web pages. HTML validation helps you find errors in the HTML code, which may prevent search engines from indexing your site.

iii) Then check and ensure that your pages are spam-free. When your page elements, such as the Title tag, META tags, and body text, are stuffed with repeated keywords, search engines may consider them spam and ban your site.

4. Visual Review of Page

  • Check whether your site has usable navigation.
  • Ensure you have informative and readable content. Good content will ensure that your page appeals to human visitors as well as spiders.

A badly written page may get a good ranking on search engines, but visitors will leave your site as quickly as they came. Although the search engine spiders that grade your site will not look for visual appeal, directory editors and human visitors will!

5. Link Building

Get inbound links from quality sites. Quality sites are those that rank high on search engines and/or have a good Google PageRank. If the sites are in some way connected with the theme of your web page, that will help increase your rank even more. DO NOT turn to link farms for link building; this can get your site banned from search engines.

6. Submitting to Search Engines

If you have followed steps one to five, you are ready to submit your web pages to the search engines. Chances are the search engines have already found your web pages through the links that you built, but if they somehow missed them, don't worry: you can always let them know that your pages exist through search engine submission.

You can refer to the links below for details on submitting to each search engine:

AltaVista
http://addurl.altavista.com/sites/addurl/newurl

All The Web
http://www.alltheweb.com/add_url.php

Google
http://www.google.com/addurl.html

MSN
http://submitit.bcentral.com/msnsubmit.htm

Yahoo
http://docs.yahoo.com/info/suggest/

7. Submitting to Directories

Search engines consider Web directories as expert documents. A presence in popular directories can help you get a better ranking on search engines. You can submit your site to popular directories, such as DMOZ, Yahoo!, World Wide Index, and Microsoft bCentral. Guidelines for directory submission are covered in more detail in a separate post on this blog.

8. Maintain

Search Engine Optimization is a continuous process. Popular search engines quite often make changes to their algorithms, i.e., the way they rank web pages. Thus it is imperative that you continually optimize your pages based on the current algorithms to achieve high ranks.
It's important to measure your rankings at least monthly. Re-optimize any pages that drop in rank and then re-submit, or wait for the search engine to revisit the page.

9. Tracking

Ultimately, it is not top rankings you are really after, but more traffic and sales. High traffic is not something that automatically follows top rankings. It is something you get by ranking high for "good keywords". Thus it is important that you track your website usage with a good log file analysis program and find out which keywords and which search engines bring in the most visitors to your pages. Use this information to optimize your pages further for those terms and search engines.

Conclusion

At first glance, search engine optimization may look like magic, but actually all it takes is 9 simple steps. Search Engine Optimization is a simple, continuous process that helps search engines do their job more efficiently. It does take a lot of time and patience, but stay the course; search engine optimization pays for itself in increased revenue. It is worth the time and trouble.

If you follow all the above-listed steps, you will definitely see an improvement in the search engine rankings for your keywords. The best part is that you do not have to spend any money on expensive search engine optimizers!

Sunday, June 22, 2008

An Introduction to Title Tag Optimization

Title tags are the words that appear at the very top of your web browser, and they tell the search engine what the page is about. For example, see the top band of this browser. You will see "Title Tag Optimization - Search Engine Optimization Tips ". This is the title of this page.

Before discussing how to optimize the Title tag, let's see how exactly it looks in your website's HTML code.

<head>
<title>Title of Your Web Page Here</title>
<meta name="description" content="Brief description of the contents of the page">
<meta name="keywords" content="keyword phrases that describe your web page">
</head>

Title Tags - The Myth
"The Title tag doesn't really do much".

Is that a myth or the truth? Let's see! Title tags function much like the title of a book. Say, for example, you need a book on "Search Engine Optimization". You walk into a library, scan the titles, see a book titled "Search Engine Optimization Tips", and take that book. Interestingly, there were other books in the library that had more information on Search Engine Optimization than the book you took, but their titles didn't convey that. Search engines do the same thing: they look at the title of a web page and decide what it is about. There is no doubt among SEOs about the importance of Title tags in Search Engine Optimization. Optimizing the Title tags is one of the key steps to increase a website's rank.

How to Optimize the Title Tag?

Of all the tags, the Title tag is definitely the most important when used correctly. When calculating your web page's relevance to a search, most search engines consider the content of the Title tag as one of the parameters and display that content in search engine results pages (SERPs). The Title tag therefore needs to be carefully constructed so that it increases your website's position in the SERPs and is attractive enough to encourage a searcher to click on your link.

As with your site content, write your Title tag for your audience first and the search engines second:

  • Have your keywords in the Title tag: Including the keywords in the Title tag increases the relevance of your web page, when someone searches the web with that keyword.
  • Keep the Title tag short and readable: Search engines don't prefer long Title tags. In fact, Google prefers short Title tags. Because some search engines display Title tags in the search engine result pages, make them informative.
  • Use different Title tags for different web pages in your site: Never give the same Title tag for all the web pages. The Title tag of a web page must be relevant for that page.
  • Don't include your company name in the Title tag unless you think it will attract more users. Instead of your company name, you can consider a suitable keyword.
  • Never keep the Title tag empty and never use irrelevant words in the Title tag.

Pay attention to writing your Title tags. Don't ignore them; they are a powerful tool and must be used to their fullest advantage. The Title tag helps the search engines decide the theme of the web page being crawled for indexing. When a search for keywords is conducted, the Title tag is given heavy consideration by all search engine algorithms. Also remember, each page in your website is unique and needs a different Title tag. Place the most important keyword phrase for that specific page in the Title tag, and the page will get a certain boost in the search engines. Yahoo and MSN Search are especially influenced by keyword-rich Title tags. Look after your Title tags and they will look after your site traffic.
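
As a rough illustration of these guidelines, the following Python sketch audits a set of pages' Title tags for emptiness, excessive length, and duplication. The page-to-title mapping and the 80-character limit are assumptions for the example; you would normally extract the titles from your own HTML.

# Hypothetical page -> <title> text mapping, e.g. extracted from your HTML files.
titles = {
    "/index.html":    "Hair Salon New York - Haircuts and Hairstyles",
    "/crew-cut.html": "Crew Cut Styles - Hair Salon New York",
    "/contact.html":  "",   # empty title: should be flagged
    "/about.html":    "Hair Salon New York - Haircuts and Hairstyles",  # duplicate
}

MAX_LENGTH = 80   # assumed limit; keep titles short and readable

seen = {}
for page, title in titles.items():
    if not title.strip():
        print(page + ": Title tag is empty")
    elif len(title) > MAX_LENGTH:
        print(page + ": Title tag is " + str(len(title)) + " characters, consider shortening")
    if title and title in seen:
        print(page + ": same Title tag as " + seen[title])
    seen.setdefault(title, page)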

Thursday, June 19, 2008

Optimizing for MSN

The Advent of MSN as a Search Engine

Microsoft had been napping for a long time and ignored the advancements in the fields of search engines and content-targeted advertising. Although currently dependent on Yahoo's Inktomi for their search results, Microsoft has made it very clear that they will compete with Yahoo and Google for their share of the search engine market. Given Microsoft's aggressive nature in fighting competition, it would be a grave mistake to underestimate them.

The recently re-launched MSN Search and the future integration of MSN Search with upcoming versions of Windows are about to make MSN one of the biggest and most important players in the world of search. Thus, it is imperative to get a good ranking in MSN if you want the share of traffic it can send to your Web page. Although the MSN search spider does a fairly good job of crawling Web pages, you may benefit from submitting your website at http://search.msn.com/docs/submit.aspx .

Optimizing for MSN Search

With Microsoft sharing Yahoo's Inktomi search index to provide their search results, optimizing for Yahoo meant optimizing for MSN. But things are changing at a rapid pace, and with Microsoft getting active on the patent front, it is evident that they are working on their own search algorithm.

Luckily for us, the rules of Web page optimization thought to please the MSN search algorithm aren't very different from those already followed for other search engines.

What Do They Lay Emphasis On?

As with most other search engines, MSN Search places heavy emphasis on content. They even allow a higher keyword density than Google does. For MSN Search, it is best to keep your pages at least 200 words long and to use phrases that searchers commonly use. Other than that, they place importance on the following, in the order listed.

• As the MSN team declares on their blog, they attach a lot of importance to the number and quality of sites that link to your pages.

• Clean coding is necessary with MSN Search. They even go to the extent of asking Webmasters to ensure that their pages are HTML validated. MSN's spider has a strong preference for well-written code. If a Website's coding is poorly written, it appears that MSN Search downgrades the site's search rankings heavily.

• A well-designed site map with good link text will help the MSN spider to crawl the site and ensure that all pages are indexed.

• The Title tag should be less than 80 characters long and should be attractive enough to make a searcher click on the link.

• MSN Search doesn't rank based on Meta Keywords and Description, but it seems to place some importance on meta tags. So adding appropriate meta tags for each page might be beneficial as well.

• MSN Search recommends that an HTML page with no pictures should be under 150 KB. Therefore, keep your Web pages within a reasonable size (a simple size-and-title check is sketched after this list).
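
Here is a minimal sketch of such a check for locally saved copies of your pages. The 80-character and 150 KB limits come from the guidelines above; the file names and the use of a simple regular expression to pull out the title are assumptions for the example.

import os
import re

MAX_TITLE_CHARS = 80          # per the title guideline above
MAX_PAGE_BYTES = 150 * 1024   # 150 KB for a page with no pictures

def check_page(path):
    size = os.path.getsize(path)
    with open(path, encoding="utf-8", errors="ignore") as f:
        html = f.read()
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if size > MAX_PAGE_BYTES:
        print(path + ": " + str(size) + " bytes, over the 150 KB guideline")
    if not title:
        print(path + ": missing Title tag")
    elif len(title) > MAX_TITLE_CHARS:
        print(path + ": Title tag is " + str(len(title)) + " characters long")

# Hypothetical local copies of your pages:
for page in ["index.html", "products.html"]:
    if os.path.exists(page):
        check_page(page)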

What Doesn't MSN Like?

MSN search lists the following as being search engine unfriendly due to the difficulty search engine robots have with this type of content:
• Frames
• Flash
• JavaScript navigation
• HTML Image Maps
• Dynamic URLs

Techniques not liked by MSN Search

MSN considers the following to be unscrupulous SEO practices:
• Loading pages with irrelevant words in an attempt to increase a page's keyword density; this includes stuffing ALT tags that users are unlikely to view.
• Using hidden text or links. You should use only text and links that are visible to users.
• Using techniques to artificially increase the number of links to your page, such as link farms.

As you can see, these "rules" are no different from those mentioned by the rest of the industry. So avoid the above-mentioned techniques, and the chances of your site getting banned by any search engine are remote. On a related note, this is what MSN Search's Program Manager, Eytan Seidman, has to say about spamming MSN.

"You crawled my site, so why can't I find it in your search index? This is one is a little bit easier. The reason that this is most likely happening is that we are detecting the page as spam when we analyze the page to build our index. How can you make sure that this does not happen? The best thing to do is to not spam us. On our site owners help, we talk about some of the things that we consider spam. In case you have not read it, here is a quick refresher: dirty javascript redirects, stuffing alt text, white on white links, off topic links etc. We take this stuff very seriously and we are continuously working to improve our spam detection."

Conclusion

With the increasing popularity of MSN Search and with Microsoft planning to make the search a part of their next windows release, your efforts to optimize your site for MSN are sure to pay off. For more details on optimization for MSN Search, read their help document and blog.

Hotel Internet Marketing by Gatesix

Optimization for Yahoo

Why Optimize for Yahoo?

According to a recent study, around 25% of all searches done through search engines are done through Yahoo. That means if your site is not coming up on Yahoo, you lose 25 percent of your potential visitors. Until February 2004, Yahoo used Google results, so optimizing for Google was enough to get a top rank on Yahoo. As of 17th February 2004, Yahoo dropped Google results and instead showed search results using the Inktomi algorithm. Yahoo's shift from Google to Inktomi made optimizing for Yahoo inevitable.

The New Yahoo Search Engine

Although the Inktomi/Yahoo search algorithm doesn't differ much from Google's, it is not an exact clone. Based on the search results on Yahoo, it seems Yahoo's new algorithm gives much importance to keyword density (in the body text, Title tag, and META tags) and to inbound links. Therefore, concentrating on these two areas will definitely increase your site's ranking on Yahoo.

Keyword Density

The new Yahoo search engine gives more importance to keyword density. A website with high keyword density may fare well in Yahoo. The average keyword densities that seem to work are as follows (a simple way to measure density is sketched after this list):

  • Title tag - 15% to 20%: Yahoo displays the Title tag content in its result page. Therefore, write the title as a readable sentence. A catchy title will attract the reader to come to your website.
  • Body text - 3%: Boldfacing the keywords sometimes boosts the page's ranking, but be careful not to make the text awkward for readers; too much boldfaced content irritates them.
  • META tags - 3%: In the META description and keywords tags, provide important keywords at the beginning. Do not repeat keywords in the keywords tag, because Yahoo may consider it spamming. Write the description tag as a readable sentence.
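
Here is a minimal sketch of how keyword density can be measured: occurrences of a phrase counted against the total number of words in a block of text. The sample title, body copy, and keyword are hypothetical, and the percentages above should be treated only as rough targets.

import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` (as a phrase) relative to total words, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    occurrences = len(re.findall(re.escape(keyword.lower()), text.lower()))
    # Each occurrence of an n-word phrase accounts for n of the words.
    phrase_len = len(keyword.split())
    return 100.0 * occurrences * phrase_len / len(words)

# Hypothetical title and body copy for the keyword "mobile phones":
title = "Low Cost Mobile Phones - Online Store for Budget Handsets and Accessories"
body = "We sell low cost mobile phones online. Browse our full catalogue of handsets, compare prices and order with free delivery."
print(round(keyword_density(title, "mobile phones"), 1))   # -> 18.2
print(round(keyword_density(body, "mobile phones"), 1))    # -> 10.0
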
Inbound Links/Back Links

Yahoo considers inbound links highly important. Inbound links are the links from other sites pointing to your site. Having considerable links, with appropriate link texts, from quality sites increases your site's ranking in Yahoo.

Static Pages versus Dynamic Pages

Like most other search engines, Yahoo prefers static pages to dynamic pages. Sometimes, Yahoo may fail to index dynamic pages. Therefore, consider the following tips to ensure that Yahoo indexes all your web pages:

  • Have static pages with keyword-rich content; it increases the rank of your site on Yahoo
  • If you have some important dynamic pages, prepare a site map or quick links section with links to all the Web pages. This will help the Yahoo spider crawl all your pages.

Frames

Most search engines, and Yahoo in particular, hate frames. Avoid using frames on your site, because the Yahoo spider finds them difficult to crawl.

The sure-shot solution to ranking high on Yahoo is simply getting plenty of back links from quality sites, and then having copious keywords in the body text, Title, META, and alt tags. As Yahoo holds a 25% share of Internet searches, it is prudent to have your Web pages optimized for it. When your Web page ranks high on Yahoo, you get additional traffic that will convert into increased sales.

Vincent S Brown - 1st September 2005

Internet Marketing by Gatesix

Tuesday, June 17, 2008

Podcasting and SEO

Podcasting and SEO: How to SEO your podcasts

There has been plenty of discussion in the blogosphere about blogs and search engine optimization (SEO). Google in particular seems to love blogs. Blogs are rich in content, heavily linked, with links that tend to be contextual, and without much in the way of code bloat or gratuitous flash animation. In short, blogs are search engine friendly out-of-the-box.

But what about SEO’ing a podcast, the blog’s newest cousin?

Podcasting (where anyone can become an Internet radio talk show host or DJ) presents unique opportunities to the marketer/content producer that blogging does not. I expound on this a bit more in my recent MarketingProfs article, but the benefits of podcasting from an SEO standpoint aren't as obvious. Podcasts are usually audio content, so you don't get all the rich textual content that the search engine spiders can snarf up. You also don't get the rich inter-linking that happens with blogs, because you can't embed clickable URLs throughout your MP3 files.

Nonetheless, I believe you can SEO your podcasts. Here’s how:

  • Come up with a name for your podcast show that is rich with relevant heavily searched-on keywords.
  • Make sure your MP3 files have really good ID3 tags, rich with relevant keywords. ID3v2 even supports comment and URL fields. The major search engines may not pick up the ID3 tags now, but they will! And besides, there are specialty engines and software tools that already do (a small tagging sketch follows this list).
  • Synopsize each podcast show in text and blog that. Put your most important keywords as high up in the blog post as possible but still keep it readable and interesting.
  • Encourage those who link directly to your MP3 file to also link to your blog post about the podcast.
  • Consider using a transcription service to transcribe your podcast or at least excerpts of it for use as search engine fodder. Break the transcript up into sections. Make sure each section is on a separate web page and each separate web page has a great keyword-rich title relating to that segment of the podcast. And, of course, link to the podcast MP3 from those web pages. There are many transcription services out there, where you can just email them the MP3 file or give them an URL and they send you back a Word document. Here’s a partial list of transcription services.
  • Submit your podcast site to podcast directories and search engines such as audio.weblogs.com.
  • Let people in your industry, such as bloggers and the media, know that you have a podcast because podcasting is quite new and novel. It will be more newsworthy and link worthy than just another blog in your industry.
  • Don’t just get up on your soapbox. Have conversations with others, in the form of recorded phone interviews, and podcast those as well. Pick people who have great reputations on the web and great Page Rank scores, and ask that they link to your site and to your podcast summary page.

This isn’t meant to be a comprehensive list of tactics. It is simply meant as a catalyst for creative thinking. SEO, in particular the link building aspect, isn’t about just following a set list of formulae. It is about creatively thinking outside the box and differentiating yourself in ways that make your site eminently more links worthy than your competitors.

Search Engine Optimization for Podcasts

By Grant Crowell | March 9, 2006

Podcasting is comparatively new, though there are already numerous podcast search engines, and it's important to optimize your audio files if you want listeners to find your spoken content.

A special report from the Search Engine Strategies conference, December 5-8, 2005, Chicago, IL.

Podcasting—recording an audio or video file and uploading it to the web so that users with iPods or other media players can download the content—is a hot subject. Panelists on this session focused on how best to prepare and optimize podcasts for search engines.

Podcasting and search

"Podcasting is an interesting challenge from a search standpoint," said Joe Hayashi, Senior Director of Product Management at Yahoo "It is not only audio, it's also video. It's also a subset of audio—it is meant to be consumed in a particular way. Podcasts are a subset of multimedia, and the techniques to really find a podcast need to scale across multiple domains."

In some ways, podcast search engines are similar to traditional search engines, except that podcast search engines crawl the Web constantly for rich media files. "If we come across things like podcasts or any other audio or video file," said Suranga Chandratillake, Co-Founder and CTO of Blinkx, "we ingest those into our index and allow people to search for that content on either our own site or through various syndication partners."

"Most people are still wondering what a podcast is and have trouble not only finding it," he said. "So we put a lot of energy into not only the search (finding) aspects but the consumption aspects as well. We have done a variety of things—search, editorial, a browsing system and a tagging system for podcasting."

"We're really leveraging the community out there to provide great content to people," Hayashi continued. "We provide a lot of community tools: a tagging system, a ratings and review system—this lets us discover high quality content. The tagging system also influences search results."

Metadata and Podcasts

In the past, many multimedia search engines relied heavily on metadata to determine relevancy. Now these search engines are able to utilize speech recognition to determine the content of an audio file.

"Podscope is the first podcast search engine that actually looks for and listens to every spoken word in a podcast," said David Ives, President and CEO of TVEyes. "We believe that speech recognition and actually cracking open the audio file is essential for finding relevant podcasters. We have a solution called 'pinpoint audio' which enables us to play an audio snippet to determine the relevancy of that term within a podcast."

"Metadata alone is not a sufficient indexing criteria to find relevant podcasts," he said.
"I also agree that just metadata is not enough, said Chandratillake.”The average podcast today, which is about 15-20 minutes long, only has 25-30 words describing it. There is no way that short description contains everything that is in the 'meat' of podcast. That is also why we use speech recognition to understand more completely what it’s about."

Podcast optimization tips and guidelines

Speakers offered the following tips and guidelines for optimizing podcasts:

  • Promote only one feed. "Many podcasters create a podcast, then move over to a different content management system, promote a new RSS feed, and wind up with all of these different feeds out there for every podcast," said Dick Costolo, CEO of Feedburner. "You want your content to be easily discovered. Promoting one feed makes it easy for search engines to know where your content is."
  • Optimize the audio file. A lot of people listen to a podcast on their computer as well as MP3 players.
  • Close the findability gap. "Optimize a landing page for each episode of your show, as well as your category page," said Amanda Watlington, owner of Searching for Profit. "Provide subscription information on the landing pages that's very visible."
  • Build correct and valid feeds. "Validate your feeds with feed validator tools," said Watlington. "Remember that iTunes does not redistribute. So you must build a separate feed for iTunes. I like to promote doing 3 separate feeds: a 2.0 feed, a media feed and an iTunes feed."
  • Include a transcript or summary. Whether or not it is a transcript or a summary will depend on the podcast's time span. "If you're giving just a little short tip, that's one thing," said Watlington. "Typically, a summary is all you need for your landing page, a nicely optimized page that covers the podcast's high points."

For marketers, more of your focus needs to be on the development and findability side, not gadget seduction. "Nobody is going to listen to the podcast no matter how elegant it may seem," Watlington concluded. "Focus on findability, focus on quality content and engaging the user. Focus on something people will want to listen to."

Page Rank Algorithm

Few Important Points to Note about PageRank

The first points we may notice from these mathematical explanations of PageRank are so important that we prefer to state them here.

The PageRank of a page B depends on only 3 factors:

• the number of pages Ak linking to B,
• the PageRank of each page Ak,
• the number of outward links of each page Ak.

So it does not depend on the following criteria:

• the traffic of the sites linking to B,
• the number of clicks on the links to B within the pages Ak,
• the number of clicks on the links to B within the results pages in Google.

These points having been mentioned, let's go on with a question that should interest a lot of webmasters.

The Page Rank Algorithm


The original Page Rank algorithm was described by Lawrence Page and Sergey Brin in several publications. It is given by

PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where

PR(A) is the Page Rank of page A,
PR(Ti) is the Page Rank of pages Ti which link to page A,
C(Ti) is the number of outbound links on page Ti and
d is a damping factor which can be set between 0 and 1.

So, first of all, we see that Page Rank does not rank web sites as a whole, but is determined for each page individually. Further, the Page Rank of page A is recursively defined by the Page Ranks of those pages which link to page A.

The Page Rank of pages Ti which link to page A does not influence the Page Rank of page A uniformly. Within the Page Rank algorithm, the Page Rank of a page T is always weighted by the number of outbound links C(T) on page T. This means that the more outbound links a page T has, the less will page A benefit from a link to it on page T.

The weighted Page Rank of pages Ti is then added up. The outcome of this is that an additional inbound link for page A will always increase page A's Page Rank.

Finally, the sum of the weighted Page Ranks of all pages Ti is multiplied by a damping factor d which can be set between 0 and 1. Thereby, the extent of the Page Rank benefit that a page receives from another page linking to it is reduced.

The Random Surfer Model

In their publications, Lawrence Page and Sergey Brin give a very simple intuitive justification for the Page Rank algorithm. They consider Page Rank as a model of user behaviour, where a surfer clicks on links at random with no regard towards content.

The random surfer visits a web page with a certain probability which derives from the page's Page Rank. The probability that the random surfer clicks on one particular link is solely given by the number of links on that page. This is why one page's Page Rank is not completely passed on to a page it links to, but is divided by the number of links on the page.

So, the probability for the random surfer reaching one page is the sum of probabilities for the random surfer following links to this page. Now, this probability is reduced by the damping factor d. The justification within the Random Surfer Model, therefore, is that the surfer does not click on an infinite number of links, but gets bored sometimes and jumps to another page at random.

The probability that the random surfer does not stop clicking on links is given by the damping factor d, which is set between 0 and 1. The higher d is, the more likely the random surfer is to keep clicking links. Since the surfer jumps to another page at random after he stops clicking links, this probability is implemented as a constant (1-d) in the algorithm. Regardless of inbound links, the probability of the random surfer jumping to a page is always (1-d), so a page always has a minimum Page Rank.

A Different Notation of the Page Rank Algorithm

Lawrence Page and Sergey Brin have published two different versions of their Page Rank algorithm in different papers. In the second version of the algorithm, the Page Rank of page A is given as

PR(A) = (1-d) / N + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))

where N is the total number of all pages on the web. The second version of the algorithm, indeed, does not differ fundamentally from the first one. Regarding the Random Surfer Model, the second version's Page Rank of a page is the actual probability for a surfer reaching that page after clicking on many links. The Page Ranks then form a probability distribution over web pages, so the sum of all pages' Page Ranks will be one.

In contrast, in the first version of the algorithm, the probability for the random surfer reaching a page is weighted by the total number of web pages. So, in this version, Page Rank is an expected value for the random surfer visiting a page when he restarts this procedure as often as the web has pages. If the web had 100 pages and a page had a Page Rank value of 2, the random surfer would reach that page on average twice if he restarted 100 times.

As mentioned above, the two versions of the algorithm do not differ fundamentally from each other. A Page Rank which has been calculated by using the second version of the algorithm has to be multiplied by the total number of web pages to get the corresponding Page Rank that would have been calculated by using the first version. Even Page and Brin mixed up the two algorithm versions in their most popular paper "The Anatomy of a Large-Scale Hypertextual Web Search Engine", where they claim the first version of the algorithm to form a probability distribution over web pages with the sum of all pages' Page Ranks being one.

In the following, we will use the first version of the algorithm. The reason is that Page Rank calculations by means of this algorithm are easier to compute, because we can disregard the total number of web pages.

The Characteristics of Page Rank

The characteristics of Page Rank shall be illustrated by a small example.

We regard a small web consisting of three pages A, B and C, whereby page A links to the pages B and C, page B links to page C and page C links to page A. According to Page and Brin, the damping factor d is usually set to 0.85, but to keep the calculation simple we set it to 0.5.

The exact value of the damping factor d admittedly has effects on PageRank, but it does not influence the fundamental principles of PageRank. So, we get the following equations for the PageRank calculation:

PR(A) = 0.5 + 0.5 PR(C)
PR(B) = 0.5 + 0.5 (PR(A) / 2)
PR(C) = 0.5 + 0.5 (PR(A) / 2 + PR(B))

These equations can easily be solved. We get the following PageRank values for the single pages:

PR(A) = 14/13 = 1.07692308
PR(B) = 10/13 = 0.76923077
PR(C) = 15/13 = 1.15384615

It is obvious that the sum of all pages' PageRanks is 3 and thus equals the total number of web pages. As shown above, this is not a result specific to our simple example.

For our simple three-page example it is easy to solve the according equation system to determine PageRank values. In practice, the web consists of billions of documents and it is not possible to find a solution by inspection.

The Iterative Computation of PageRank

Because of the size of the actual web, the Google search engine uses an approximate, iterative computation of PageRank values. This means that each page is assigned an initial starting value and the PageRanks of all pages are then calculated in several computation cycles based on the equations determined by the PageRank algorithm. The iterative calculation shall again be illustrated by our three-page example, whereby each page is assigned a starting PageRank value of 1.

Iteration PR(A) PR(B) PR(C)
0 1 1 1
1 1 0.75 1.125
2 1.0625 0.765625 1.1484375
3 1.07421875 0.76855469 1.15283203
4 1.07641602 0.76910400 1.15365601
5 1.07682800 0.76920700 1.15381050
6 1.07690525 0.76922631 1.15383947
7 1.07691973 0.76922993 1.15384490
8 1.07692245 0.76923061 1.15384592
9 1.07692296 0.76923074 1.15384611
10 1.07692305 0.76923076 1.15384615
11 1.07692307 0.76923077 1.15384615
12 1.07692308 0.76923077 1.15384615

We see that we get a good approximation of the real PageRank values after only a few iterations. According to publications of Lawrence Page and Sergey Brin, about 100 iterations are necessary to get a good approximation of the PageRank values of the whole web.
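
The iteration above can be reproduced with a few lines of Python. This sketch hard-codes the three-page example (A links to B and C, B links to C, C links to A) with d = 0.5; note that, to match the published figures, each page's value is updated in place within an iteration, in the order A, B, C.

# Iterative PageRank for the three-page example.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.5                                  # damping factor used in the example
pr = {"A": 1.0, "B": 1.0, "C": 1.0}      # starting value 1 for every page

print(0, pr["A"], pr["B"], pr["C"])
for iteration in range(1, 13):
    for page in ["A", "B", "C"]:         # update in place, in this order,
        pr[page] = (1 - d) + d * sum(    # as in the table above
            pr[src] / len(out) for src, out in links.items() if page in out
        )
    print(iteration, round(pr["A"], 8), round(pr["B"], 8), round(pr["C"], 8))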

Also, by means of the iterative calculation, the sum of all pages' PageRanks still converges to the total number of web pages. So the average PageRank of a web page is 1. The minimum PageRank of a page is given by (1-d). There is therefore a maximum PageRank for a page, given by dN+(1-d), where N is the total number of web pages. This maximum can theoretically occur only if all web pages solely link to one page, and this page also solely links to itself.


Wednesday, June 11, 2008

Google Trend Update

Google Trends is a service that can be used to see how popular certain search terms are across geographic regions, cities, and languages. Google has updated its Trends tool, which allows users to see the popularity of a term and compare the level of interest in favorite topics--or people, for those of you who like to Google your own name--to include numbers.

With the new Google Trends, you can now view the numbers on the graph and can also download them to a spreadsheet.

Google Trends analyzes a portion of the search engine giant's Web searches to compute how many searches have been done for the terms entered relative to the total number done on Google over time. Based on that information, the Search Volume Index graph charts the results. Users can search up to five terms.

Previously, the tool allowed people to view graphs showing trends, including how frequently particular key terms were searched for across geographic regions, by different age groups, or in different languages; however, no numerical data was available.

In the Google Blog, Google has explained the new updates to Google Trends in a very 'delicious' manner. The comparison in the example is between two prominent ice cream flavors, vanilla and chocolate. The aim is to see how many searches are made for each flavor.

A subset of the tool is Google Hot Trends, which shows what people are searching for on the day of their search. Instead of showing the most popular searches overall, which would always be generic terms, Hot Trends highlights searches that experience sudden surges in popularity and updates that information hourly. Google's algorithm analyzes millions of Web searches performed on the search engine and displays the results that deviate the most from their historic traffic pattern. The algorithm also filters out spam and removes inappropriate material.

Now the file can be downloaded allowing users to analyse it along with the numbers involved, although it will remain scaled with relative results rather than actual ones.

Google Trends can be used for fun as well as for a more practical purpose, although users must first sign into their Google account. The search engine company has been busily expanding its brand of late. Earlier in the year it unveiled plans for the creation of Google Health, an online database that gives internet users access to their own medical histories.

Currently, Google Trends is only available in English and in Chinese. Hot Trends is only available in English. The company said it hopes to roll out Google Trends in other regions and languages in the future.

Tuesday, June 3, 2008

Presentation of Google

With nearly 50% of the traffic generated by all search engines and directories (in France), Google must not be neglected. From a PhD research subject of two American academics (Larry Page and Sergey Brin), Google became a company on the international scene.

The success of this engine comes on the one hand from the algorithm worked out by the two founders, and on the other hand from the application of an elementary principle: the simplest things are sometimes the most effective. In this case, Google chose a very stripped-down interface, without advertising, concentrating its services on the search for Web pages and nothing else. The engine also enjoys very great speed in querying its database.

In addition to search results considered relevant by many users, Google has succeeded in indexing a very great number of pages: its "index" is now one of the largest in the world (if not the largest), with approximately 2 billion pages. Recently, new types of documents were indexed in addition to traditional HTML: Word, Excel, Acrobat, PowerPoint, WordPad, etc.

The algorithm is based on two systems:

1) a precise analysis of the contents of the indexed pages (keywords, occurrences, positions in the document, type of HTML tag, etc.)
2) a classification of the pages according to their popularity (PageRank), calculated from the topology of the Web (i.e. the whole structure of the documents and the links between them).

Indexing of Webpages by Google

Google set up crawler-type software named Googlebot. It is a robot that indexes Web pages (and now other types of documents). Its principle is simple (but not its implementation!): when it reads a page, it adds to its list of pages to visit all the pages linked from the one currently being processed.

Theoretically, it should thus be able to discover the majority of the pages of the Web, i.e. all those which are not orphans (a page is known as an orphan if no other page links to it). Because of the volume of data to be processed, this robot is a program distributed across hundreds of servers.
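
The crawling principle can be illustrated with a small Python sketch. It works on a hypothetical in-memory link graph rather than fetching real pages: it keeps a queue of pages to visit and adds every linked page it has not yet seen. The pages it never reaches are exactly the orphan pages described above.

from collections import deque

# Hypothetical link graph: page -> pages it links to.
web = {
    "home.html":     ["about.html", "products.html"],
    "about.html":    ["home.html"],
    "products.html": ["home.html", "contact.html"],
    "contact.html":  [],
    "orphan.html":   ["home.html"],   # nothing links TO this page
}

def crawl(start):
    visited = []
    queue = deque([start])
    seen = {start}
    while queue:
        page = queue.popleft()
        visited.append(page)
        for linked in web.get(page, []):
            if linked not in seen:     # add newly discovered pages to the list
                seen.add(linked)
                queue.append(linked)
    return visited

reached = crawl("home.html")
print(reached)                         # orphan.html is never reached
print(set(web) - set(reached))         # -> {'orphan.html'}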

In addition to knowing the greatest number of pages, Google also wants to index them regularly, because many pages are updated from time to time. Moreover, the frequency of Googlebot's visits to a Web page depends on its PageRank: the higher it is, the more often the page will be indexed. From one pass to the next, Googlebot can detect that a page no longer exists ("error 404").

This colossal mass of information is analyzed by Google in full detail. Each word or sentence is associated with a type, based on HTML tags. Thus a word contained in the title is considered more significant than one in the body text. These types may be classified according to their importance (title of the page, headings H1 to H6, bold, italic, etc.). This preprocessing, associated with other criteria including PageRank, makes it possible to present the most relevant results first.



Wednesday, May 28, 2008

Hospitality and Search Marketing

By Max Starkov and Jason Price

Search engine marketing is an essential component of the hotel direct online distribution strategy. According to Forrester research, about 80% of overall website visits begin in a search engine or a directory service. Many other surveys also show that up to 85% of Internet users rely on search engines (e.g. Google, Yahoo, MSN, etc.) to locate relevant information on the Web. Search marketing is an extremely dynamic field. Search algorithms change, new search techniques and formats are introduced, new search services are launched, and new challenges emerge on a daily basis that keep search marketers busy. The implications of all this for hospitality are enormous, and some are highlighted in this article.

Background

In 2005, online travel sales will account for an estimated 30 percent of total travel sales, up from 25 percent last year and 21 percent in 2003, according to a recent report by Merrill Lynch. By 2007, online travel sales will represent 39 percent of all travel revenue, with growth from direct suppliers outpacing that of online travel intermediaries.

In hospitality, this year over 25% of all revenues will be generated from the Internet (20% in 2004, 15% in 2003) (HeBS, PhoCusWright). Another 25% of hotel bookings will be influenced by the Internet but transacted offline through call centers, walk-ins, group bookings, and even email inquiries. Indeed, the explosive growth in online hotel reservations was best illustrated when, for the first time, in mid 2004 Internet bookings surpassed GDS bookings.

The same Merrill Lynch analysis concluded that search engines are driving much of the increase in online bookings. This report estimates that travel search technology accounted for $600 million in direct bookings last year. What’s more, it predicts that search-related bookings will double each year through 2007.

Search Engines & Search Behavior

Search engines and search marketing have received much global attention. Search engines are as pervasive as the Internet. Google is now a public company with a market cap of $80 billion; MSN launched a new search engine; AOL announced the creation of its own search engine; and traditional marketing budgets are being rewritten for search marketing and the web.

Top Search Engines Ranked by Search Share, July 2006 (Source: Nielsen//NetRatings, via SearchEngineWatch.com):

  • Google | 49.2%
  • Yahoo! Search | 23.8%
  • MSN Search | 9.6%
  • AOL Search | 6.3%
  • Ask Jeeves | 2.6%
  • Others | 8.5%
In recent years, research firms have begun studying the influence of search on consumer behavior and its impact on the travel industry. They have found that online users are only somewhat satisfied with search results and are willing to switch from one search engine to another, showing very little loyalty. Here are some more of their findings:

Search Behavior:

  • 1 in 2 Internet users will use one or more search engines in a search
  • 1 in 3 use a search engine tool bar installed on the web browser
  • 17% use particular search engines for specific purposes, e.g. Yahoo to search for music, Google to search for a song
  • Relevance is still the driver; sponsored results have to be relevant
  • 3 out of 4 will start at the search engine when going to a website
  • 22% are looking for a website they already have in mind

(Keynote Research)

Search engine loyalty is low:

  • Searches performed on one search engine are repeated on other search engines
  • 58% of searches conducted on Google are then repeated on Yahoo or MSN
  • Loyalty cannot be taken for granted
  • Loyalty is low because the switching cost is low

(Nielsen//NetRatings)

Search in Travel is Destination Focused

Unlike other e-commerce categories, Internet users search for travel and hospitality services and offerings within the context of the destination. Therefore the search engine strategy for travel and hotel websites calls for a different methodology than what generalist SEO (Search Engine Optimization) companies offer. Marketing a bank, eyeglass store, or dental office does not have to factor in the characteristics or intensity of the destination, nor do generalists differentiate travel search behaviors from those of general online consumers.

A destination-focused search engine strategy requires in-depth knowledge of the travel and hospitality industry, extensive destination research, destination target keyword analysis, and destination search behavior. Only a destination-focused search engine strategy can help the travel and hotel website leverage the popularity of the destination to its benefit.

Search in Travel (includes hospitality):

  • 73% use search to find travel; 27% go directly to a travel site
  • Several studies rank travel among the top four categories of search requests
  • Travel searches originating from a search engine tend to lead to a purchase about two weeks out
(Performics)

With such vast numbers of visits originating from search engines, they are clearly an essential component of the hotel's direct online distribution strategy. Ranking high on the engines, staying there consistently, and matching the right budget to compete effectively are all major competitive issues for online advertisers.

Why Search Engine Ranking is Important - “The Golden Triangle”

The order in which the hotel appears on a search engine is of absolute importance. As far back as 2002, the Bear Stearns industry report Web Storm Rising stated, “Our research uncovered that being listed in the top five assures the highest level of bookings, and that after the fifth slot, bookings drop dramatically. Approximately 50% of people on the first page will go to the second page and so on.”

Over the last year or so, a new term, the “Golden Triangle”, has entered the search marketing vernacular. Eye-tracking research, which uses beams of light bounced off the eyes of online test users to record where they look on the screen, captured distinct patterns of viewing behavior on search engine results pages. The highest concentration of fixations fell on the top three to four natural listings and the top one to two sponsored listings. A triangle takes shape in that top corner of the page because that is where most people look, and it is now referred to in search marketing as the “Golden Triangle.”

Here are some other findings:

  • Drop-off begins after the 3rd natural listing
  • The first position in the sponsored links drew 28% of views
  • Beyond the rank of 8 in the natural listings, there was a 50% drop-off
  • The Bear Stearns 50-50 rule no longer stands; it is now more like 80-20
  • People who search below the fold are “more deliberate” seekers (which may suggest they already have something specific in mind)

The conclusion is that competing on the search engines by appearing as early and as often as possible is increasingly important. Achieving a top position is not a matter of a simple adjustment to a web page after which the money starts to flow in; it is a concerted effort that requires time, expertise, and resources in website optimization and search marketing.

In hospitality, search marketing is part of your online distribution strategy. We have all become our own travel agents with our desktops, laptops, PDAs, and other electronic devices, and the strategy is to reach your specific customer segments when they are searching for you.

Lodging companies that do not have the marketing budget of the major intermediaries must rely even more on search engine referrals. Therefore good positioning of your hotel website on the major search engines is of critical importance and can directly affect your bottom line.

Note: The above article is based on a collection of various resources.

Tuesday, May 20, 2008

Concept of Database Marketing

There is a quote by Thomas Prendergast:
"The big guys know database and if you don't, you're lost!"

Database Marketing is a powerful and competitive weapon - especially on the Internet. The growth of database marketing is rooted in the small business philosophy of staying close to customers, understanding and meeting their needs, and treating them well after the sale.

Corporate marketing is tied to BIG, general marketing or advertising campaigns with a single untargeted message. This message may be based on the company's Unique Selling Proposition (USP). However, customers have different needs, and a single USP spelt out to the whole market is no longer enough.

Messages must be tailored to specific segments of the market and ultimately to the market segment of one, the individual customer. Computerizing the customer database makes it possible to address messages more specifically and market additional products to each customer.

The characteristics of fully fledged database marketing are ....
  1. Each customer and prospect is identified as a record on the marketing database; markets and market segments are groups of individual customers.
  2. Each customer and prospect record contains not only identification and access information but also a range of marketing information. It also includes information about past transactions and about campaign communications.
  3. This information is accessible before, during and after the process of each interaction with the customer/prospect, to enable "you" to decide how to respond to the customer/prospect's needs.
  4. The database is used to record customer/prospects responses to campaigns.
  5. The information is available to marketing policy makers to enable them to decide such things as which target markets/segments are appropriate for each product/service etc.
  6. When selling many products to each customer, the database is used to ensure that the approach to the customer is coordinated and consistent.
  7. The database eventually replaces market research: marketing campaigns are devised so that the customers' responses provide the information the company is looking for.
  8. Marketing management automation is developed to handle the vast amount of information generated by DBM. This identifies opportunities and threats more or less automatically; this is fully fledged marketing automation. Very few companies have succeeded in doing this, but many have it as their goal.

DBM presents many challenges to management. It requires careful maintenance of great volumes of detailed customer data. Accessing the data, interpreting it, and using it to drive or support the marketing function requires a long-term marketing systems development policy.
It also requires computing and marketing people to work together, often educating each other. And, it may well require most people in the company to forget their traditional way of doing business.

DBM will only work if dealing with customers is viewed as an on-going process (Customer Contact Process).

The ladder of loyalty is a key concept in DBM.
1) No awareness of business or product/service.
2) Awareness of business
3) Awareness of product/service
4) Positive perception
5) Recognition of personal benefit
6) Enquiry
7) Objections overcome
8) Sale of product or service
9) Entry into continued relationship.

DBM is used to move customers up the ladder.

The essence of database marketing is communicating directly with the customers and asking them to respond in a tangible way. It provides the means for the customer or prospect to respond and is set up to measure and fulfill the response.

It sets up or reinforces a relationship with the customer, which is "fulfilled" when we follow up on a customer's response to our communication. Fulfillment may take many forms: a personalized email, a telephone conversation, sending literature (a PDF), a sales visit, attendance at a web seminar, a visit to an exhibition or store, or sending products to the customer.

So DBM is a broad discipline, not a separate marketing communications medium, but a way of using any medium to elicit the desired response.

Source: Veretekk

Friday, May 9, 2008

Google Now Comes in Hindi

By MD Ansari
If you are more comfortable with Hindi than English and want to translate some text, a web page, or a query into Hindi, or have results translated from English, don't worry. The Internet's undisputed king of search engines, Google, is making this a reality by offering Hindi translation services. It is a double bonanza for Indian Internet users, especially those who do not know English but are keen to enjoy the benefits of the Internet.

It is a well-known fact that only about 30 per cent of the Indian population can read and write English; the rest are at sea when it comes to the language. This means that about 700 million people are not taking advantage of the Internet. Keeping this in mind, Google Inc has introduced automated translation between Hindi and English. This is an audacious step to ensure that Hindi-speaking people can also enjoy the Internet regardless of their unfamiliarity with English. There is no doubt that this ground-breaking step will increase the number of Internet users.

Google's translator support for the Hindi language is the most advanced in a series of updates aimed at multilingual users, not only in India but around the world. Some of the multilingual features offered by Google include data entry, blogs and search. Internet users can type in Hindi using a regular keyboard: type the word as it sounds, for example ab apna sa laga in regular English characters, hit the space bar, and Google converts it to the corresponding Hindi words.

This feature is also available on Orkut, one of the most widely used social networking websites. Its users can now send scraps not just in Hindi but also in other Indian languages such as Tamil, Telugu, Kannada and Malayalam.
There is no denying that Google has tried to attract Hindi-speaking people who want to use the Internet to meet their needs. Before the advent of the Google translation service, they were effectively kept away from the Internet, and consequently Google and other search engines could not reach the millions of Hindi-speaking people in India and around the world. Therefore, on behalf of all Hindi-speaking people, I congratulate Google from the bottom of my heart for making their dream come true.

According to Techtree Staff
The popular Google translation service now offers Hindi translation as well. So users can now get automated translation between Hindi and English. If you want to translate some text, a Web page, or issue a query in Hindi as well as have results translated from English, you can do it here. Of the total literate Indian population, only about 13 percent are English literate. This launch would ensure that Hindi speaking users can also enjoy the benefits of the Internet, regardless of their familiarity with the English language.

Google Translator supporting Hindi language is the latest in a series of updates focused at multi-lingual users in the country. Some of the multilingual features offered by Google include Data entry, blogs, and search. Users can type in Hindi using a regular keyboard -- type the word as it sounds. For example, type "mera" using regular English characters, hit the space bar, and Google converts it to the corresponding Hindi word. This feature is also available to Orkut users for sending scraps not just in Hindi but also in other Indian languages including Tamil, Telugu, Kannada, and Malayalam.

Wednesday, May 7, 2008

Google AdSense Slow To Report


Problem affecting AdSense publishers globally

Webmasters are wondering why their earnings reports from AdSense aren't working, and so is Google.

Something's amiss with AdSense, as publishers told Google they have seen problems with the usual reports from the service. Numerous reports at the AdSense help forum cited lack of updates or wrong earnings in the information being provided from the program.

Google responded from their Inside AdSense blog about the issues. "Our engineers are currently investigating the issue and working to resolve it as quickly as possible. Please be assured that your account data has still been tracked, so this issue will not affect your earnings or payments," they said.

To help webmasters keep up with Google progress on these problems, Google opened a new tracking page. The Known Issues page tracks AdSense issues, and Google's suggested workarounds when available.

No new updates about the reporting issue have made the page as yet. Complaints on the AdSense Help forum continue to roll in, as webmasters find discrepancies in their earnings as well as seeing clicks that apparently generated no page impressions or click-throughs.

Problems appeared to hit AdSense publishers in multiple regions. Complaints rolled in from the UK, Canada, India, and the Czech Republic as well as the US. Since Google has not updated the status of the investigation for over 12 hours, the issue could be much more serious than they revealed.

The two largest threads discussing the issues can be found at WebmasterWorld and Google Groups, but there are plenty of other threads throughout the forums with complaints.

Tuesday, May 6, 2008

Role of an SEO in Internet Marketing

Today, search engines play a great role in providing information about everything: the latest movies, songs, food, hotels, or any specific service, it is all available through search engines.

But one question arises: which information is helpful to the user and which is not? This question is common among users of Google, Yahoo, and MSN looking for services and other information, because in the corporate world everyone wants their services to reach users, competition is intense, and users end up bearing the cost of this corporate battle.

So what is the solution for providing the right information to users? The popular search engines follow certain guidelines, and the websites that follow them get listed in the SERPs. The guidelines are simple: a website should be content rich, and that content should be well presented. Search engines dislike flashy designs and heavy images.

Another question then arises: how are websites designed and developed with search engines in mind? To do so, someone has to research Google, Yahoo, and MSN in depth. But who will do this: a designer, a developer, or a search engine specialist? A specialist who knows all about Google, Yahoo, and MSN can optimize websites for them. This is where the role of the SEO (Search Engine Optimizer) comes in.

An SEO assists in designing and developing websites in line with Google, Yahoo, and MSN. But the work of an SEO does not end there: they are also responsible for promoting the website so that more people can reach it and use its services, products, and information. The SEO community is closely tied to the marketing industry, which is why it focuses on the end user; this marketing background helps SEOs find the targeted audience.

But how do you find an effective SEO expert who will put you in front of the corporate world? Everyone claims to be the best, but the best is the one who delivers results. GateSix Technologies is a specialist offering website SEO services that build your corporate identity, increase your product presence across the world, and keep you ahead of your competition. Whether you are aiming for leadership status or maintaining it, GateSix is the right choice as a Search Engine Optimization company. Gatesix has the expertise to strategically improve your search engine placement for the keywords you want, using innovative and widely accepted search engine optimization techniques, and to implement search engine marketing campaigns that put you at the top among your competitors.

So why stand at the back of the crowd waiting for your turn? Try the Internet Marketing Services of Gatesix.

Friday, April 11, 2008

SEO FAQ's

Q No 1 :- What is SEO?
SEO stands for Search Engine Optimization and is defined as (in my own words):

"The process of finding out the best keywords for a web site and by the use of optimizing the web site along with other off-page work making that web site attain a higher position in the search engine result pages (SERPs) for those selected words."

Although the exact calculations used by the search engines are kept secret, there is a lot of knowledge and observation in this field from thousands of webmasters worldwide.

It could be said to be a branch of online marketing. In general terms you can say that it means to make a web site more visible and make it look important in the eyes of search engines.

The difference between not being familiar with SEO or not applying it, and actually doing the right things, can be huge in terms of visitors to your web site.

Q No 2 :- How do I find out the best keywords to target?
The "best" keyword depends on the following main factors:

The Amount of Traffic it will generate
Often people choose keywords based on how popular they think they may be. Mostly this is based on "real world" impressions rather than on facts that are readily available. For instance, I recently saw someone who proposed going after the term "nursing homes" because of the aging population.
Although this area may be growing quickly, those interested in finding out more often do not use a computer or, if they do, would not be researching it online. They would still get visitors, but they would find there are much better keywords to target with profit in mind. When it comes to traffic, the best measurement is actual searches. This tells you how many people search for that term in a day or month, and it can be a great indicator. By far the most popular tool for finding this out is the Keyword Suggestion Tool, which combines the two most popular ways of judging popularity, Wordtracker and Overture. In addition, it suggests related keywords and lists their traffic. Always remember, however, that this is total searches; these search numbers will always be divided among the SERPs.

The Difficulty of Attaining a Top Ranking
If you simply chose the keywords with the highest amount of traffic, you could still lose money, because these keywords typically take a lot more work to rank for. A perfect keyword is one that has a lot of searches but little SEO competition and is moderately easy to rank for. The best tool I have found for this is the Keyword Difficulty Tool created by Rand Fishkin. It will give you an indication of the amount of SEO work required, which you can balance against the number of searches.

The Profitability of that Keyword
There are also keywords where you may get thousands of visitors with only one conversion, while for others you can achieve one conversion for every 100 visitors. This should be factored in because, unless you make your money per impression, you want the highest number of conversions per visitor. The best way I know to evaluate this is to run an AdWords account. The amount of data you receive by starting a campaign can be very useful in establishing the conversion rate. I believe it is always better to spend $10 to find out a keyword isn't profitable than to spend 6 months getting it to number one, only to then find out it's a dog.

The moral of the story is that although there is no such thing as a "perfect" keyword, you can find the best ones for you by using a combination of the factors above.

Q No 3 :- What is KEI and how do I use it?
KEI stands for Keyword Effectiveness Index. KEI is a ranking system based on how popular a key word is and how much competition it has on the Internet. The higher the KEI number, the more popular your keywords are and the less competition they have. It also means that you'll have a much better chance getting ranked high on a search engine. A low KEI score means not many people are searching for that keyword and it has too much competition. Hence, eliminate all KEI scores with a low number and choose those with a high KEI score. The higher the score, the more profitable your keywords will be to your web site.
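The exact formula differs between tools, so treat the numbers here as illustrative only, but a commonly cited version divides the square of the keyword's daily search count by the number of competing pages: KEI = searches squared / competing pages. For example, a keyword searched 400 times a day against 200,000 competing pages scores 400 x 400 / 200,000 = 0.8, while the same 400 searches against only 20,000 competing pages scores 8, making the second keyword the far more attractive target.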

Q No 4 :- What are the most important things for on-page optimization?
On-page optimization is the part of SEO where you deal with the pages themselves, as opposed to off-page optimization and keyword analysis.

Here is my opinion on the most important elements in on-page optimization and some brief information about them.

Content
It has been said over and over for years that a successful way of establishing a web site is to add good, unique content to it on a regular basis. It cannot be stressed enough, yet people still don't do it and instead wait for some "magic" to happen.

Make sure that you have your targeted keywords included in the content on your web pages in a natural way. My rule of thumb is to write without thinking about it; when you finish, you can look over the text and add the keyword in one or two extra places where it fits in. If it looks spammy or excessive in any way, reduce it. You are writing for the visitor and not for the search engines - never forget that.

Also make sure that the text is unique; if it is the same as on other web pages, it can raise a red flag with the major search engines, and duplicate pages, and even whole domains, can get erased from the index.

Weight factors
By placing your targeted keyword in places such as the title, H1 and H2 headings, strong tags and emphasis tags, you put more weight on those words, and the page becomes more relevant for them in the eyes of the search engines.
But don't overdo it: if you place the same word or phrase in every weighted tag on the same page, you can get hit by the over-optimization penalty, which basically means the search engines figured you tried to cheat them and push you down the rankings.

Navigation structure
Make sure that the search engine spiders can follow the internal links on your site. If your site is driven from a database, it is recommended that you use mod_rewrite to get the best benefit (see the sketch below).
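For example, here is a minimal .htaccess sketch of such a rewrite; the file name, URL pattern and parameter name are purely illustrative, so adapt them to your own script:

RewriteEngine On
# Map a crawlable, keyword-friendly URL to the underlying database-driven script
RewriteRule ^products/([a-z0-9-]+)\.html$ product.php?slug=$1 [L,QSA]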

If you have a lot of pages on your site and they are buried three or more clicks down in the navigation tree, then I recommend using a sitemap and linking to it from each page of your site. A sitemap makes it easy for search engines to find all your pages, and it can also be a great resource for your visitors to find a specific page quickly.

If one page of your site is more important than the others, such as the home page, then get more links to it from the other pages of the site. A good example is to have a "home" link on each page back to your home page. That is also useful for your visitors, and it gives more power to the home page in the search engine rankings.

Q No 5 :- Is the use of meta tags dead?
Yes; in fact, they have not been relied upon for many years. An article by Danny Sullivan stating exactly this was released in October 2002.

There are still some minor "stone age" search engines around that use them.
The main reasons they stopped being relied upon are the following:
Most webmasters tried to fool the search engines with meta tags unrelated to their content and services.

With improved FTS (full-text search) toolkits from Verity and many other companies, search engines can index your web pages and determine their theme. With such advanced APIs, search engines like Google can easily decide what your website is about and what it offers.

Some of the basic capabilities of an FTS API are that it can extract the text of your web page and gather statistics such as:

1) How many times a word is repeated
2) How far apart the repeated words are from each other
3) How many times a particular word is repeated in a particular sentence
4) How far a word like 'Online' appears from words like 'Party' and 'invitations', to check whether the sentence makes sense
5) Whether you are doing keyword dumping

So with such APIs available, webmasters should concentrate on the content and layout and not treat the meta tags as a main concern.

However, testing has shown that the meta keyword tag still has a minor influence on the rankings, and the meta description tag should be used, as it is sometimes shown in the SERPs (search engine result pages).

Q No 6 :- Where in my code should I put the keywords?
We all know it is not enough to have your keyword in the Meta keyword tag.
Here is a list of places to put it in the source code, ordered by estimated weight (markup examples follow the list):

1) The title tag.
2) H1 and H2 headings.
3) In paragraphs and general text on the site.
4) In STRONG tags.
5) In the file name of the web document, e.g. www.domain.com/keyword.html
6) In ALT attributes on image tags.
7) In TITLE attributes on anchor tags.
8) In SUMMARY attributes on tables.
9) In the file names of images.
10) The meta description tag.
11) The meta keyword tag.
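To make the list concrete, here are illustrative snippets for the markup-based placements above; the keyword "hollywood hotel" and all file names are purely hypothetical:

<title>Hollywood Hotel - Book a Hollywood Hotel Online</title>
<h1>Hollywood Hotel Deals</h1>
<p>Our <strong>Hollywood hotel</strong> is two minutes from the Walk of Fame.</p>
<img src="hollywood-hotel.jpg" alt="Hollywood hotel lobby" />
<a href="hollywood-hotel.html" title="Hollywood hotel rates">Room rates</a>
<table summary="Hollywood hotel room rates by season"> ... </table>
<meta name="description" content="Book a Hollywood hotel online with instant confirmation." />
<meta name="keywords" content="hollywood hotel, hollywood hotels" />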

Q No 7 :- What is the best way to write the title?
The title is probably the single most important place to put your keyword.
Put the keyword at the beginning of the title and, if possible, again near the end, varying its form.
If you want to include your company name for branding, keep it at the end.
Follow this while still making the title look natural and appealing to visitors. Remember that the title is what is most visible in the SERPs.

Q No 8 :- What is the best way to write the URL's?

Regarding Google, two of their staff involved in the SEO community (GoogleGuy and Matt Cutts) have stated that dashes (-) are better than underscores (_) when writing URLs. This has also been confirmed by my own tests on the matter.

Regarding Yahoo, MSN and other search engines, I don't know for certain, but I think it varies.

And speaking of putting a dash in URLs, hyphens are often better than underscores [Ed. Note: bolded by Matt :)]. hollywood-hotel.html is seen as two words: “Hollywood” and “Hotel”. hollywood_hotel is seen as one word: hollywood_hotel. It’s doubtful many people will be searching for that.

Q No 9 :- Which factors are considered unethical or black hat SEO?

Page cloaking is a BIG one. It basically consists of using server-side scripts to determine whether the visitor is a search engine or a human and serving up different pages accordingly: the script serves a keyword-rich, totally souped-up page to search robots while giving humans an entirely different page.

Link farms are considered black hat. They are basically sites consisting of masses of links, built for the purpose of gaining search engine rankings and turning a profit (usually from affiliate program advertisements on the website).

Duplicate content can keep a page from getting indexed, and some people have even reported trouble with entire websites being duplicated by unethical webmasters, causing problems with their rankings.

Spamming keywords, in meta tags, title tags, or in your page's content is a black hat SEO trick that used to work well back in the 90's. It's long since been basically exterminated as a useful trick.

Linking to "bad neighborhoods", or sites that have been banned from search engines for using the above tricks, while not particularly black hat is definitely unhealthy for your own sites rankings.


Q No 10 :- What should good navigation look like?

Good Navigation can be broken down to one word - "Breadcrumbs"

If you remember the fairy tale "Hansel and Gretel", you will recall that when the children were kidnapped, they dropped breadcrumbs so that their rescuers would be able to find them. In our situation, imagine it is your site that has been kidnapped and the Search Engines are your "rescuers".

An example of Breadcrumbs would be the following

*Home-->Sublevel1-->Sublevel2

This type of navigation allows the Search Engines to find all of your pages in the most efficient and thorough way. It also aids "deep crawls", which are crucial for dynamic sites that may have 100,000 or more pages. Not only should you use this style of navigation behind the scenes, but displaying the breadcrumbs somewhere on the site will help the Search Engines and visitors alike. A perfect example of this is www.dmoz.org.
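A minimal sketch of how visible breadcrumbs might be marked up (the category names and class name are hypothetical):

<p class="breadcrumbs">
<a href="/">Home</a> &raquo;
<a href="/travel/">Travel</a> &raquo;
<a href="/travel/hotels/">Hotels</a>
</p>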

The second part of navigation is the "site map". This is a page which contains a link to every one of your pages. To be fully optimized, the links should have descriptive anchor text to further help the pages you are linking to. In addition, your site map should always be within one click of the index page, as this will help the Search Engines find it quickly.

Using these two methods of navigation will ensure your site gets fully indexed and will add to your users' experience.

Types of Links
The major type of navigation to avoid, or at least to compensate for, is JavaScript pulldown menus. Because Search Engine bots will not follow these, it is important to compensate by having text links somewhere else on the page. These can be in the footer or worked in elsewhere in the content. In fact, JavaScript navigation in general has been shown to hinder indexing. There are a few alternative ways to code your JavaScript; however, if you always code a backup plan, you will enjoy easy indexing without the worry.

A very important point to remember: always link to your main pages from your home page and, if possible, from other pages as well.
If you have a link from your home page to another page of yours, you are telling the search engine that this page is also important and needs to be indexed soon. The same goes for pages linked from many pages within the website. A good practice is to add a link to the sitemap from your home page; the sitemap contains links to all the other pages and helps with faster indexing.
Additionally, all your pages should have a link back to the home page so people can return to it if they get lost.
This also adds weight to your home page from a search engine's point of view.
The text in the link should describe the page as closely as possible, and if you use image links you must add ALT text to the image.
If you really need JavaScript navigation, you should also add text links at the bottom of the page.

Q No 11 :- What should my domain name look like?

This is a dilemma that webmasters face more and more as there are fewer prime domains to choose from. There is a difference between the optimal domain name and the one you have to settle for. In general, if you are picking a domain for SEO purposes, it should contain the keywords you are targeting. Many will say this is because there is an extra bonus in the SERPs for it (file that under theory, assumption, or speculation), but the true power is in the SEO value that your natural links will garner. When people link to you on their own, they may use the name of your website, and most of the time they will take it from your URL. If your keywords are there, they will then be in the anchor text used.

As for the other aspects of the domain: unless you are after a local market (e.g. the UK, and therefore .co.uk), always go for the .com. It is generally trusted by newbies, who may hesitate at a .net (hard to believe, but true).

The last important factor is the use of dashes. Avoid them if possible, and if you have to use them, do so conservatively; more than two may be viewed as spam, so don't cross that threshold.

If you aren't choosing your domain for SEO reasons, pick something unique. After all, who had heard of www.google.com seven years ago?

Q No 12 :- What is PageRank and how does it work?
One big way in which search engines grant importance to a web page is based on how many other sites link to it.

PageRank is a system implemented by Google that measures a web page's importance by taking into account only links and link-related factors. The other big search engines (Yahoo, MSN) probably have similar systems, but without an official measurement.

The basic idea is that each web page gives votes to the pages it links to, and the power of each vote is determined by the links pointing to the page that casts it.

The votes are divided among the links on the page, so the fewer links there are, the larger the share of the vote given to each link.

Internal links are given a higher share than external.

The visible scale runs from 0 to 10, but the underlying value has many decimals which we cannot see.
0-3 is the most common range, 4 is obtained by accumulating a fair number of links, and 5 takes some work to reach. Only a short list of sites on the Internet has a PageRank of 10.
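To make the voting idea concrete, here is a toy PHP sketch of the published PageRank iteration (my own simplified illustration: the 0.85 damping factor is the commonly cited value, dangling pages simply lose their vote, and the example graph is hypothetical):

<?php
// Toy PageRank: $links maps each page to the pages it links out to.
function pagerank($links, $d = 0.85, $iterations = 20)
{
    $pages = array_keys($links);
    $n     = count($pages);
    $rank  = array_fill_keys($pages, 1.0 / $n);

    for ($i = 0; $i < $iterations; $i++) {
        $next = array_fill_keys($pages, (1 - $d) / $n);
        foreach ($links as $page => $outlinks) {
            if (count($outlinks) === 0) {
                continue; // dangling page: its vote is dropped in this sketch
            }
            $share = $rank[$page] / count($outlinks); // the vote is divided among the links
            foreach ($outlinks as $target) {
                if (isset($next[$target])) {
                    $next[$target] += $d * $share;
                }
            }
        }
        $rank = $next;
    }
    return $rank;
}

print_r(pagerank(array(
    'home.html'    => array('about.html', 'contact.html'),
    'about.html'   => array('home.html'),
    'contact.html' => array('home.html'),
)));
?>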

Q No 13 :- What are some recommended ways to get more links to my site?
The best way to get more links to your site is through hard work.

To find sites that are relevant to yours and that allow you to post a link, you can perform the following searches:

Replace the keyword with the phrase you are targeting.

keyword "add url"
keyword "add site"
keyword "submit site"
keyword "submit url"

If you perform these searches for various keywords, you will find plenty of related sites that will allow you to post your link.

You may also submit to directories. This is a long and tedious process, but it is a free way to get some incoming links to your site.

Another way that I just discovered is the following query on Google:

intitle:add+url OR intitle:submit+your+site OR intitle:add+your+site "your keyword"

That will list sites where you can probably add a link to a site related to yours.

Another proven way is to write an article about a subject you know. When you finish the article, write a short bio about yourself at the end with a link to your site, and then submit it to article submission sites. You can find lists of such sites in various places, with up to 100 places to submit your article. If you are lucky, many will accept your article and publish it on their site, and each one will carry a link back to you.

If you provide a high-quality article and are lucky, it is likely that your article will get published on authority pages and that the pages carrying your article will have some PageRank as well.

The trick here is to write something good and useful and also to submit it to many places.

It is an excellent form of marketing, not only because of the backlinks you get but also because of the readers who read your name and bio.

Q No 14 :- Can incoming links hurt me?
There has been some recent speculation that links to your site that look spammy may hurt your rankings.

So what do spammy links look like?

1) Thousands from the same IP.
2) All the same anchor text.
3) Only up temporarily and have no real age.
4) Coming from spammy/questionable/not-so-good sites.
5) Has a set pattern.
6) Unnatural growth, like suddenly thousands overnight.

Q No 15 :- What is a Link Farm?
A linkfarm is any type of website in which there is no real content, service, or purpose, but rather just a load of non-related reciprocal links to other places. Generally linkfarms are built to increase search engine rankings and turn a profit, which means they're also generally littered with advertisements from affiliate programs the site owner has partnered with.

Linkfarms are not to be confused with Linkdumps, which are simply places people dump all kinds of links to content on various websites.

Q No 16:- Can I get my site accepted faster into DMOZ?
DMOZ (www.dmoz.org) has the ability to frustrate webmasters with its opaque site selection process. Though the guidelines are clear and fair, the actual implementation is not.

The only 3 things you can do to possibly get into DMOZ faster are:

1) Follow all the editorial guidelines
2) Choose the right subcategory
3) Submit a meaningful site.

Despite following all 3 of these instructions, you might be in a position where your site is not added for years, if at all. This has more to do with the availability of editors for the site, and their choice.

There are stories about people managing to "bribe" editors into getting their sites into DMOZ faster. These are unverified, and I would not recommend this route.

Despite all its faults, the popularity of the DMOZ data is so high that DMOZ submission is well worth the 15 minutes it takes. Good Luck.
Trust level: Proven and confirmed

There is a 4th way to get your site listed in DMOZ, become an editor.

There is a proven way of getting your site listed faster if the site has some kind of connection to a local area (county/town): submit it to that local area section and then, after it is listed, use the change form to request a change into the main directory.


Q No 17 :- Does the web host have anything to do with the ranking?
As such, a web host cannot help improve your rankings. Search engines will not rank you higher because your site is hosted with a particular web host (buying hosting from Yahoo does not mean your site will rank well in Yahoo), but if you choose a web host with a lot of downtime, that can certainly have a negative effect on your rankings.

So keep in mind that while a web host will not help improve your search engine rankings, a bad host can certainly hurt them.

There is an additional factor to this.

Google, and probably the other major search engines, check the IP addresses behind links, and it is well known that linking to a "bad neighborhood" can trigger filters that push your site down in the rankings.

That is the case when you link to a bad neighborhood; now imagine actually being in a bad neighborhood. That is, your host also hosts bad web sites, and they share the same IP, or the same class of IP, as your innocent site. Guess what Google will think? Right.

Q No 18 :- What is relative vs absolute links?
When designing a site, you will always face the question of whether you should use relative or absolute links. Later in this answer I will explain the benefits of each but first here is a definition:

Relative: Relative links usually look something like index.html or /folder/page.html. The way the page knows where to go is all relative to the page the link is placed on. A link to index.html for example, will only work if the file is found in the current folder.

Absolute: Absolute links usually look something like http://www.example.com/page.html. This is a full url and the linked to page will be found regardless of where that link is located on the site.
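A small markup illustration of the two forms (www.example.com is a placeholder domain):

<!-- Relative: resolved against the location of the page the link sits on -->
<a href="folder/page.html">Relative link</a>
<!-- Absolute: the full URL works no matter where the link is placed -->
<a href="http://www.example.com/folder/page.html">Absolute link</a>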

Which you use depends on the following:

Speed: When your browser goes to find a page with a relative url it looks within the existing site. When it uses an absolute url it leaves the site for an instant and "refinds" the page. This means when it comes to speed, relative is the way to go.

Ease of Design: When you are designing a site using notepad, the danger with relative urls is that if you move a folder, it can break your entire site. As each page depends on another, if you are not a find and replace whiz, absolute may be your best bet.

SEO: This is the one area I would place firmly under "theory or assumption". The truth is that we know broken links can hurt you, so the most important thing is to choose the linking technique that works best for your site.

Q No 19 :- How do I get my web site to show up in the search engines?
As you might have noticed the submit URL function on the major search engines does not really help that much.

The best way to get your web site listed faster in the search engines is to get links to it from authority web sites (those with high PageRank).

Getting a great link from, for example, the home page of an established PageRank 7 web site can get your new web site and its subpages indexed in a matter of hours.

If this is not an option for you, other good ways are to put a link to it in your signature on big forums, in directories, on friends' sites, etc.

If you wonder why only the home page is listed and not the rest, it is because the search engine spiders first visit once and then come back for a "deep crawl". To speed up this process, the solution is to get more links to your site.

Q No 20 :- What are poison words?
Poison words are words for which search engines will decrease your ranking if they are found in your URL, title, or description. They can be a disaster when it comes to ranking, and therefore you should be very careful to avoid them.

Poison words can be divided into three categories:

Ranking Killers (Adult Words and Obscenities)
Use of these words will cause the major Search Engines to classify the website as "Adult". For a mainstream website this can mean your visitors never find your site. If one of these words is necessary, it is always best to star out one of the letters. The most popular forum software and CMS systems will allow you to do this through their admin panels.

Ranking Decreasers
These are words the major Search Engines associate with a lower-quality site. They include, but are not limited to, Links, Search Engine, Bookmarks, Resources, Directory, BBS, Paid-to-Surf and Forum. Pages containing these words will still be indexed; however, they will find it very difficult to rank for their main keywords.

Speculative
These words are similar to Ranking Decreasers, except they are merely theory at this point. They include words like "Free" or "Offer". Many of the pages using these words seemed to take a hit in the Bourbon Update.

The moral of the story is that you should be very careful with your Keyword choices, especially when writing your title and description. If you have a forum or a CMS system, use the profanity filters whenever possible

We can also make some assumptions about Google in particular by referencing the stop words used by their AdSense PPC program. These can be found on Vaughn's Summaries, where you will see a comprehensive list of keywords that stop AdSense ads from showing. Although this isn't a direct correlation, it does give some insight into the way Google views keywords.


Q No 21 :- What are stop words?
Stop words are the small, common words that the search engines ignore.

They should be avoided as much as possible in places like the title, headings and perhaps some other prominent spots.

Q No 22 :- How do I create a sitemap?
A sitemap should contain links to all your pages. A good rule of thumb is to limit the number of links to under 100 per page, so you may consider splitting sitemaps up by topic, alphabetically, by brand or other logical grouping.

A useful tool for creating a basic sitemap is Xenu's Link Sleuth, a free site crawler that produces an HTML sitemap you can edit to fit your site's design.
If your content is dynamically driven from a database, you should generate your sitemap(s) from the same data (see the sketch below).
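As a very small sketch of that idea, the list below is generated in PHP from an array of pages; in practice the array would be filled from the same database query that drives the site, and all URLs and titles here are placeholders:

<?php
// Minimal HTML sitemap generator: one descriptive link per page.
$pages = array(
    '/index.html'    => 'Home',
    '/services.html' => 'Our Services',
    '/contact.html'  => 'Contact Us',
);

echo "<ul>\n";
foreach ($pages as $url => $title) {
    // Descriptive anchor text helps both visitors and search engines.
    echo '<li><a href="' . htmlspecialchars($url) . '">' . htmlspecialchars($title) . "</a></li>\n";
}
echo "</ul>\n";
?>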

Q No 23 :- What is the best way to redirect a site?
The best method is the so-called 301 redirect, as this also tells the search engines that the page in question has been moved permanently (a 302 is temporary and is very risky for SEO purposes).

Here are the ways to do it:

Examples using .htaccess

Redirect 301 /oldfolder http://www.toanewdomain.com
Redirect 301 /oldurl.html http://www.yourdomain.com/newurl.html

If your site runs on a Windows (IIS) server, the permanent redirect for a moved file or folder is normally configured in IIS itself.

You can also use PHP or ASP for this; just add these lines at the top of the file being moved:

PHP

header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.newdomain.com/newdir/newpage.htm");
exit();

ASP 301
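For classic ASP, a minimal sketch of the equivalent redirect looks like this (the target URL is a placeholder):

<%
' Send a permanent redirect header and stop further processing
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.newdomain.com/newdir/newpage.htm"
Response.End
%>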

Redirecting using a meta refresh or JavaScript is not recommended and may even be harmful, as these methods have been used heavily by spammers and will most likely cause the search engines to penalize your site.


Q No 24 :- What is TrustRank?
TrustRank, as defined in a whitepaper by Jan Pedersen (Yahoo!), Hector Garcia-Molina (Stanford), and Zoltan Gyongyi (Stanford), located at http://www.vldb.org/conf/2004/RS15P3.PDF, is a system of techniques for determining whether a page is reputable or spam. The system is not totally automated, as it needs some human intervention.

TrustRank is designed to help identify pages and sites that are likely to be spam or those that are likely to be reputable. The algorithm first selects a small seed set of pages which will be manually evaluated by humans. To select the seed sites, they use a form of Inverse PageRank, choosing sites that link out to many sites. Of those, many sites were removed, such as DMOZ clones, and sites that were not listed in major directories. The final set was culled down to include only selected sites with a strong authority (such as a governmental or educational institution or company) that controlled the contents of the site. Once the seed set is determined, a human examines each seed page, and rates it as either spam or reputable. The algorithm can now take this reviewed set of seed pages and rate other pages based on their connectivity with the trusted seed pages.

The authors of the TrustRank method assume that spam pages are built to fool search engines, rather than provide useful information. The authors also assume that trusted pages rarely point to spam pages, except in cases where they are tricked into it (such as users posting spam urls in a forum post).

The farther away a page is from a trusted page (via link structure), the less certain is the likelihood that the page is also trusted, with two or three steps away being the maximum. In other words, trust is reduced as the algorithm moves further and further away from the good seed pages. Several formulas are used to determine the amount of trust dampening or splitting to be assigned to each new page. Using these formulas, some portion of the trust level of a page is passed along to other pages to which it links.

TrustRank can be used alone to filter the index, or in combination with PageRank to determine search engine rankings.
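As a rough illustration of the propagation step only, here is a toy PHP sketch of biased PageRank over a hand-picked seed set. This is my own simplification of the paper's method: the inverse-PageRank seed selection, normalisation details and dangling-node handling are all ignored.

<?php
// Toy trust propagation: trust flows from manually approved seed pages
// along outgoing links, split among them and dampened at each step.
function trust_scores($links, $goodSeeds, $alpha = 0.85, $iterations = 20)
{
    $pages = array_keys($links);
    // d: the seed distribution - good seeds share the initial trust equally.
    $d = array_fill_keys($pages, 0.0);
    foreach ($goodSeeds as $seed) {
        $d[$seed] = 1.0 / count($goodSeeds);
    }

    $trust = $d;
    for ($i = 0; $i < $iterations; $i++) {
        $next = array();
        foreach ($pages as $page) {
            $next[$page] = (1 - $alpha) * $d[$page]; // seeds keep part of their assigned trust
        }
        foreach ($links as $page => $outlinks) {
            if (count($outlinks) === 0) {
                continue;
            }
            // Trust is split among the outgoing links and dampened by alpha.
            $share = $alpha * $trust[$page] / count($outlinks);
            foreach ($outlinks as $target) {
                if (isset($next[$target])) {
                    $next[$target] += $share;
                }
            }
        }
        $trust = $next;
    }
    return $trust; // higher scores mean a closer connection to trusted seeds
}
?>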

Q No 25 :- What is the sandbox?
The sandbox is a kind of filter implemented by Google in March 2004 that applies to maybe 99% of all new sites.

Its function is to push down new web sites in the SERPs.

My theory is that it was Google's solution to stop new spam sites created purely to rank high in the SERPs. It probably took some time before the Google spiders could detect such a site and ban or penalize it, and by that time the creator had probably made several new ones.

When this phenomenon was first noticed in March 2004, it appeared to take about two months before a new site was "released" and could rank normally again. Now, as of October 2005, half a year or more is normal, and waits of more than a year have been reported.

From my own observations of Google, new sites can rank unusually high in the SERPs for some weeks before the sandbox filter kicks in.

Q No 26 :- How do I get out of the sandbox faster?
One theory is that it has to do with link aging.

This means that as soon as you put your site live, you should start getting quality links to it and try to keep them there forever.

The sandbox is a collective filter that still has a lot of confusion and speculation surrounding it. One of the biggest questions is how one can "escape" it, or at least make the stay there shorter. The theories (and I stress they are theories) break down into the following areas.

Link Speed
Some SEOs claim the sandbox actually has nothing to do with time and is more a function of linking. There is a double-edged sword here: to rank high you need lots of links, but to avoid the sandbox you can't get too many too quickly. As a compromise, it is proposed that you build your links up gradually, which basically means if you get 10 the first week you would get 15 the next week, 20 the following, and so on. It is believed that this slow, steady and consistent increase in your number of incoming links causes Google to see the site as legitimate and emerging. So in the first month you may have accumulated 300 links, but by spreading them out over the month you have avoided the sandbox filter. Alternatively, you can buy your domain before you even start designing your site and slowly accumulate links. That way, by the time you are completely up and ready, you will not be subject to the sandbox.

Link Quality and Relevance
The majority of SEOs and webmasters tend to go after reciprocal and directory links when starting their first campaign. After all, they most likely have a PR0 site no one has ever heard of, so they go with the one technique where these factors have little importance. What results is hundreds of unrelated and unimportant links. Google has been devaluing the importance of directories over the last year, and with hundreds of new directories appearing every day, this trend seems here to stay. Many theories hold that if webmasters get the majority of their links from "authority" sites, they will be viewed as important and therefore not subject to the filters referred to as the sandbox.

Purely Age
There are still those who feel that age is the most important factor. If you are registering a new domain name, there is no way around this. However, many SEOs buy old, existing domain names and add their own content. If you are able to get a DMOZ-listed domain, you may help your chances even more, as it will have an existing link structure. There are many services on the web that offer lists of upcoming expirations, so you may be able to grab one at a bargain price. Be aware, though, that a number of SEOs believe these domains are "reset" when ownership changes, so you may be subject to the sandbox anyway.

Q No 27 :- How do I write the robots.txt?
As you may know, robots.txt is a file the search engines check to see which pages they should not index. It is useful if you have sensitive information or other pages you don't want indexed; that is all it does.

It is simply a text file named robots.txt located in the site root.

It contains a line for user-agent and then what to disallow or allow.

Example:

User-agent: *
Disallow: /password.html
Disallow: /temp/


Because the * is a wildcard, the above means that all user-agents may index all pages except the file password.html and the directory temp and its contents.

Here is another example, disallowing a bad spider from the entire site.

User-agent: badspider
Disallow: /

Q No 28 :- Does a site keep its PR when ownership changes?
The good thing about PageRank is that it is blind. Unless something inappropriate is going on with an individual, Google does not care about the humans who own/buy/sell websites. They focus entirely on the domain name and the website behind it. Changes in ownership are inconsequential to Google.

PageRank rises and falls entirely on the merits of the website and Google's algorithms. The risk of PageRank falling or rising is purely technical.

If anything, Google is very forgiving of websites that go inactive. There is a huge aftermarket of entrepreneurs seeking inactive domains with PageRank attached for their resale value. Google is oftentimes slow to identify inactive domains, mostly because even when a PageRanked website goes inactive, its links can remain active for years.

Last year I purchased an inactive domain name with a PageRank of 6 and several hundred links showing up in Google/Yahoo/MSN. Six months later, its PageRank dropped to a 4. In all honesty, it really should have been a 4; I just got to enjoy a PageRank of 6 for a few months. What this tells us is that sooner or later Google is going to correct a website's PageRank. But Google isn't the bad guy here; they are just applying the rules as strictly as they can with what they know. Google doesn't really worry about buying and selling because they know that the technical factors will keep everything even in the long run.

Purchasing a website is in itself no more risky than having owned it for years. A potential buyer needs to understand what the website's SEO strategy has been and how it gets its traffic. If the former owner spent a fortune purchasing traffic and the new buyer isn't planning to pursue the same kind of traffic, it isn't unreasonable to expect the performance of the website to degrade. If the former owner engaged in "black hat" or other inappropriate activities to create traffic and search engine visibility, and Google catches on, then the website will suffer along with whoever owns it. When purchasing anything of value (including websites), the buyer must be diligent in protecting themselves.

Q No 29 :- How can I tell my search engine rankings?
There are several ways of checking where your pages are in each Search Engine's Results Pages (a.k.a. SERPs) for a given search word or phrase.


Manually - Go to the Search Engine (SE) of interest and search like a punter would, then scroll down every page to see where you are.

Automated DIY - With most prominent SEs, you can sign up to use the Application Programming Interface (API) to query their database without using their front end. This way you can simulate normal searches or make use of the more advanced API calls and retrieve ranking positions directly.

Automated 3rd Party - There are several third-party tools on the web made by people who have implemented an API querying interface for you. All you do is (pay and) sign up. Most are web based; some are downloads. A search in your favorite SE for phrases like "Keyword Tracker" should turn up some of these.

Automated the wrong way - beware of tools which find your ranking by 'scraping' ordinary SERPs instead of utilizing the purpose built API calls. SE's aren't very happy about these practices for obvious reasons.

RSS Feeds - some SE's offer RSS feeds of SERPs which you could interpret with a script of yours and find where you are. Please refer to the program's Terms of Service for details on allowed usage.

Note: Search Engines often vary SERPs based on user location, so what you see in your browser is not necessarily what others see in theirs. Also, most SEs use a range of datacenters, all of which could potentially have different datasets or ranking algorithms. What you retrieve with an API call might well differ from what you find manually.

Some SE's now also store your search history to try and understand patterns applicable to your searching behavior. This may also cause discrepancies in your ranking position findings.

Q No 30 :- What are all the Google Operators and what do they do?
The advanced operators, as far as SEO is concerned can be divided into 2 categories: Alternate Query Types and Query Modifiers.

Alternate Query Types

cache: If you include other words in the query, Google will highlight those words within the cached document. For instance, [cache:www.google.com web] will show the cached content with the word "web" highlighted.

link: The query [link:] will list webpages that have links to the specified webpage. For instance, [link:www.google.com] will list webpages that have links pointing to the Google homepage.

Note there can be no space between the "link:" and the web page url.

related: The query [related:] will list web pages that are "similar" to a specified web page. For instance, [related:www.google.com] will list web pages that are similar to the Google homepage.
Note there can be no space between the "related:" and the web page url.

info: The query [info:] will present some information that Google has about that web page. For instance, [info:www.google.com] will show information about the Google homepage.
Note there can be no space between the "info:" and the web page url.

This functionality is also accessible by typing the web page url directly into a Google search box.

Query Modifiers

site: If you include [site:] in your query, Google will restrict the results to those websites in the given domain. For instance, [help site:www.google.com] will find pages about help within www.google.com. [help site:com] will find pages about help within .com urls. Note there can be no space between the "site:" and the domain.

allintitle: If you start a query with [allintitle:], Google will restrict the results to those with all of the query words in the title. For instance, [allintitle: google search] will return only documents that have both "google" and "search" in the title.

intitle: If you include [intitle:] in your query, Google will restrict the results to documents containing that word in the title. For instance, [intitle:google search] will return documents that mention the word "google" in their title, and mention the word "search" anywhere in the document (title or no). Note there can be no space between the "intitle:" and the following word.

Putting [intitle:] in front of every word in your query is equivalent to putting [allintitle:] at the front of your query: [intitle:google intitle:search] is the same as [allintitle: google search].

allinurl: If you start a query with [allinurl:], Google will restrict the results to those with all of the query words in the url. For instance, [allinurl: google search] will return only documents that have both "google" and "search" in the url.

Note that [allinurl:] works on words, not URL components. In particular, it ignores punctuation. Thus, [allinurl: foo/bar] will restrict the results to pages with the words "foo" and "bar" in the URL, but won't require that they be separated by a slash within that URL, that they be adjacent, or that they be in that particular word order. There is currently no way to enforce these constraints.

inurl: If you include [inurl:] in your query, Google will restrict the results to documents containing that word in the url. For instance, [inurl:google search] will return documents that mention the word "google" in their url, and mention the word "search" anywhere in the document (url or no).
Note there can be no space between the "inurl:" and the following word.

Putting "inurl:" in front of every word in your query is equivalent to putting "allinurl:" at the front of your query: [inurl:google inurl:search] is the same as [allinurl: google search].

Q No 31 :-I lost my rankings, what could have happened?
It can be a frightening experience. You check your rankings and suddenly they have either plummeted or are non-existent. You swallow hard and feel sick to your stomach. It has happened to all of us, and here are a few possible explanations.

Fluctuation
This is the most common explanation and happens on a daily basis. Search Engine Result Pages (SERPs) are in a constant state of flux, with Google leading the pack. Gone are the days of the "Google Dance". This has also meant, unfortunately, that you can be Top 10 one day and disappear the next. The only thing you can do in this case is be patient and give it a few days to see if your rankings reappear.

Stop or Poison Words
Have you added new content lately? It is possible you have added what Google calls "Stop" or "Poison" words. These may include adult words or common words like Links, Search Engine, Bookmarks, Resources, Directory, BBS, Paid-to-Surf and Forum, which, although they won't knock you out of the SERPs, can drop you a few spots.

Broken Links
The major Search Engines view a site with dead-end links as "broken". In some cases this can cause a major hit to your rankings. It is always advisable to run a check using a broken link checker and fix what it finds; I have seen sites rebound time after time within days of fixing this issue.
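
As a rough illustration (not a replacement for a proper broken link checker), a minimal Python sketch like the following can test the links on a single page. The URL below is a placeholder, the href extraction is deliberately crude, and a real checker would also crawl internal pages and respect robots.txt.

import re
import urllib.error
import urllib.parse
import urllib.request

PAGE_URL = "http://www.example.com/"  # placeholder

def check_links(page_url):
    html = urllib.request.urlopen(page_url).read().decode("utf-8", "ignore")
    # Crude href extraction; a real checker would use an HTML parser.
    for href in re.findall(r'href="([^"#]+)"', html):
        link = urllib.parse.urljoin(page_url, href)
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, etc.
        try:
            status = urllib.request.urlopen(link, timeout=10).getcode()
        except urllib.error.URLError:
            status = None
        if status != 200:
            print("Broken or unreachable:", link, status)

check_links(PAGE_URL)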

Links to Bad Neighborhoods
A single link to a banned site or a bad neighborhood can torpedo your rankings. A banned site can usually be recognized as one that had previously been indexed but is no longer in Google. A grey toolbar can indicate either a new site or one that has been banned. A bad neighborhood is one that employs black hat tactics; this may include a link farm. Link farms are one of the most frequently asked-about things, and the general rule of thumb is "If it looks like one, it is one". Avoid linking to sites with hundreds of links on their pages and little to no content.

Use of Black Hat Methods
This will usually result in an all-out ban and includes things like hidden text, cloaking, etc.
To get back into the index usually requires you to remove the offending method and email Google (or whichever SE banned you) asking to be reindexed. In most cases you will be successful, although be prepared to wait several weeks if not months.

These are just a few of the reasons for a sudden loss in rankings, though they are the most common. The best rule of thumb is to always build and optimize for visitors first and Search Engines second. If you stick to this, you should enjoy more stable rankings without the fear of your site disappearing overnight.

A few more reasons other than those mentioned are as follows:

An algorithm update.
An algo update is different from a "regular database update". In the case of a database update, you will possibly regain your positions in a few days without doing anything. However, after an algo update, you will have to study the new ranking criteria based on the search results. This can be very difficult if you are unfamiliar with SEO practices, which is why it is said that SEO is an ongoing process and not a one-time thing.

Site update at your end.
You may have added new content to a web page that is already in the search engine results. Generally a change of about 5-10% is considered fine and you will not see a fall in your ranks. A much larger change, however, can indicate that a large amount of data has been added or replaced and needs to be reindexed and analyzed. During this reindexing, the search engine may not keep your page in the position it held earlier; this is possibly viewed as a "site redoing" by search engines, and to maintain the quality of the SERPs they may drop your page from its position for a short time. Whether the position is retained depends on factors like the relevance of the content to the keywords, the richness of the content (popularly known as content value), its uniqueness (no duplicate content), and whether black hat SEO techniques are used on the pages.

The percentage of change is calculated across the HTML code, JavaScript, content, and other elements. An example worth mentioning is suddenly adding many pages to a web site: the listings are generally dropped, and the positions are retained only if the content is found to be of good quality. If it is found to be automatically generated content, the listing may be dropped for a long time. Search engines list pages, not web sites, in SERPs, but they do consider the whole web site when listing a single page. If adding poor content elsewhere on a website can drop a quality page from that site, then adding poor content to the page itself will certainly drop it in the listings.
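
The 5-10% figure above is a rule of thumb rather than anything published by the search engines, but if you want a rough way to estimate how much a page has changed before you publish an update, a small Python sketch along these lines can help (the file names are placeholders):

import difflib

# Placeholder file names for the saved old and new versions of the page.
old_html = open("page_old.html", encoding="utf-8").read()
new_html = open("page_new.html", encoding="utf-8").read()

# SequenceMatcher.ratio() returns a similarity in [0, 1]; invert it for "change".
similarity = difflib.SequenceMatcher(None, old_html, new_html).ratio()
print("Approximate change: %.1f%%" % ((1 - similarity) * 100))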

It is best to consult your SEO before making changes to your pages. If you have not used any black hat techniques, a recent site update like this could well be the reason for a drop in listings.

Q No 32 :-What is ModRewrite and when should I use it?
Mod_Rewrite is an Apache module, usually configured through .htaccess files, which allows for all sorts of nifty tricks with URLs, error pages, etc.

Mod_Rewrite specifically is used mostly for changing dynamic links, such as:

http://www.example.com/shop.php?cat=4&id=123

into search engine friendly static URLs, such as:

http://www.example.com/shop/4/123.html

While the dynamic link won't get indexed easily, and likely not at all without external sites linking directly to that URL, the static link will get indexed with simple internal linking.
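
As a minimal sketch (assuming the example URLs above, a .htaccess file in the site root, and mod_rewrite enabled on the server; adapt the pattern to your own URL scheme), the rule might look something like this:

RewriteEngine On
# Internally map /shop/4/123.html to /shop.php?cat=4&id=123
RewriteRule ^shop/([0-9]+)/([0-9]+)\.html$ shop.php?cat=$1&id=$2 [L]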

Mod_Rewrite is a very powerful tool for creating static links for content management systems, forums, and the like. From personal experience, after converting dynamic links on a games website I run to static links using .htaccess, my search engine results have multiplied many times over.

Q No 33 :-In the long run, how do I get enough or more visitors?
1) Good content, and lots of it, updated regularly over a period of months/years

2) Onpage optimization (static URLs, proper title and meta tags, use of H1, H2 tags, etc)

3) Offpage optimization (link popularity, building up inbound links)

4) Community building tools (forums, article comments, whatever applies)

5) Viral marketing (release something people want for free with some tag to your site on it)

6) Promotion and Advertising (promote your site wherever you go however you can)

Q No 34 :-What are "Supplemental Results" in Google?

Google's own explanation of supplemental results is available in its webmaster FAQ: http://www.google.com/webmasters/faq.html

Q No 35 :- Why is my site labeled "Supplemental"?

According to Google's webmaster FAQ: Supplemental sites are part of Google's auxiliary index. We're able to place fewer restraints on sites that we crawl for this supplemental index than we do on sites that are crawled for our main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in our main index; however, it could still be crawled and added to our supplemental index.

The index in which a site is included is completely automated; there's no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank.

Q No 36 :-How can we attract GoogleBot?
First of all, the GoogleBot must find your page through links from other pages that are already indexed by Google.

Then to get GoogleBot to visit again and again you should add fresh content frequently - for example by using a blog.

Q No 37 :-What is the Future of SEO?
Trust level: Theory, assumption or speculation

The future of SEO is undoubtedly one where:

1) one-way text links from relevant pages continue to be the most valuable links

2) reciprocal linking continues to decline

3) the 'shotgun' approach to link buying declines

4) mass email link requests decline

5) free directory submission declines

6) niche directory submission increases

7) article PR (article submission) increases

8) article submission sites (e.g. EzineArticles, GoArticles, and ArticleBlast) play a much bigger and more important role in helping online publishers locate quality articles (due to the increasing article volume)

9) user popularity is just as important as link popularity, which means:

10) the quality of article PR improves in order to increase site traffic, credibility, and loyalty

11) the quality of website content improves in order to convert traffic and encourage repeat visits

12) Clearly, the choices for SEOs will be pretty much limited to paying for links at niche sites and/or engaging in article PR. Being an SEO copywriter, I may be a little biased, but for me, article PR is the hands-down winner in this comparison:

13) It satisfies Google's criteria for relevance and importance. Linking site owners include your article and link because, in doing so, their site becomes more useful to visitors, and their business gains credibility and authority.

14) It generates hundreds of free links quickly enough to make it worth your while, but not so quickly as to raise red flags at Google (in the form of link dampening).

15) Links are permanent and you don't have to pay to keep them there.

16) You get a lot of qualified referral traffic from readers who already trust you and your expertise. This satisfies Google's visitor popularity criteria, while at the same time bringing you a lot of extra customers.

Q No 38 :-What is Google Analytics?
Google Analytics is a free web-stats solution which not only reports all the regular site stats, but also integrates directly with Google AdWords giving webmasters an insight into the ROI of their pay-per-click ads. According to Google, "Google Analytics tells you everything you want to know about how your visitors found you and how they interact with your site."

Why is this such a landmark move? Because for the first time ever, Google will have access to your real web stats. And these stats will be far more accurate than those provided by Alexa. Furthermore, Google's privacy statement says: "We may also use personal information for auditing, research and analysis to operate and improve Google technologies and services." Now let's put two and two together:

Google is 'giving' every webmaster in the world free access to quality web-stats.

Millions of webmasters will accept this 'gift', if only because it integrates directly with their Google AdWords campaigns.

Google will then have full access to the actual web stats of millions of commercial websites.

Google will have the right to use these stats to develop new technologies.


Q No 39 :-What is the difference between IP delivery and cloaking?
Here is a comment from Matt Cutts of Google dated April 18, 2006 on his blog:

IP delivery: delivering results to users based on IP address. Cloaking: showing different pages to users than to search engines.

IP delivery includes things like "users from Britain get sent to the co.uk, users from France get sent to the .fr". This is fine; even Google does this.

It's when you do something *special* or out-of-the-ordinary for GoogleBot that you start to get in trouble, because that's cloaking. In the example above, cloaking would be "if a user is from Googlelandia, they get sent to our Google-only optimized text pages."

So IP delivery is fine, but don't do anything special for GoogleBot. Just treat it like a typical user visiting the site.
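
To make the distinction concrete, here is a conceptual Python sketch (the geo lookup and page-serving helpers are stand-ins, not any real library). The first branch is ordinary IP delivery; the second is the kind of crawler special-casing that counts as cloaking.

def country_from_ip(ip):
    # Stand-in for a real GeoIP lookup.
    return "FR" if ip.startswith("90.") else "US"

def serve_page(name):
    # Stand-in for actually rendering and serving a page.
    return "Serving " + name

def handle_request(client_ip, user_agent):
    # IP delivery: vary content by the visitor's location. This is fine,
    # provided GoogleBot is treated like any other visitor from that location.
    if country_from_ip(client_ip) == "FR":
        return serve_page("index.fr.html")

    # Cloaking: special-casing the search engine's crawler. Do NOT do this.
    if "Googlebot" in user_agent:
        return serve_page("bot-only-optimized.html")

    return serve_page("index.html")

print(handle_request("90.0.0.1", "Mozilla/5.0"))
print(handle_request("8.8.8.8", "Googlebot/2.1 (+http://www.google.com/bot.html)"))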

So it all comes down to intent.