Open federation for Google Talk

By Mike Jazayeri, Product Manager, Google Talk

We've just announced open federation for the Google Talk service. What does that mean, you might be wondering. No, it has nothing to do with Star Trek. "Open federation" is technical jargon for when people on different services can talk to each other. For example, email is a federated system. You might have a .edu address and I have a Gmail address, but you and I can still exchange email. The same for the phone: there's nothing that prevents Cingular users from talking to Sprint users.

Unfortunately, this is not the case with many IM and Internet voice calling services today. You can only talk to people on the particular service you have an account on (so you need an account on every service to talk to everybody, which is pretty cumbersome). With open federation, you get to choose your service provider and you can talk to people on any other federated service (and vice versa).

In addition to the Google Talk service, many other companies, universities, and corporations support open federation today. This means you can now talk to millions of users around the world all with a single account on the service provider of your choice.

We think this is pretty exciting -- though perhaps not as exciting as something Star Trek-related -- and we hope it will bring us one step closer to making IM and Internet voice calling as ubiquitous as email.

Google Teams with Motorola for Mobile Search

Source: PromotionWorld.com

Motorola, Inc. and Google Inc. announced a global alliance to give users easy access to Google on Motorola handsets.

Motorola will integrate a Google icon onto select devices so that users can connect directly to Google anytime, anywhere at the click of a button. These mass-market, Internet-optimized handsets will be distributed from early 2006 to select Motorola customers worldwide.

"Many of our customers have been asking for mobile devices integrated with their consumers' favorite online search services. By featuring Google on Motorola handsets for those customers, we are making it easier for consumers to connect to the information they need when they need it," said Scott Durchslag, Corporate Vice President and General Manager of Global xProducts for Motorola's Mobile Device business.

"Access to information is imperative for people on-the-go. Whether checking the local weather or locating the restaurant of their choice, consumers today require personalized search services that are tailored to their needs," said Nikesh Arora, vice president, European Operations, Google Inc.

Motorola handsets with the Google icon are expected to be available to consumers starting in Q1 of 2006.

Motorola is a Fortune 100 global communications leader that provides seamless mobility products and solutions across broadband, embedded systems and wireless networks. In your home, auto, workplace and all spaces in between, seamless mobility means you can reach the people, things and information you need on the go.

Seamless mobility harnesses the power of technology convergence and enables smarter, faster, cost-effective and flexible communication. Motorola had sales of US $31.3 billion in 2004.

Google's innovative search technologies connect millions of people around the world with information every day.

Spam, What Is It Good For? Absolutely Nothing!

by Jay B Stockman

Over time, unless the growth of spam is stopped, it will destroy the usefulness and effectiveness of email as a communication tool.

Unsolicited Commercial Email (UCE), or spam, has reached epidemic proportions and continues to grow. According to America Online, roughly 30% of the estimated 30 million email messages it handles each day are unsolicited commercial email. Because of its very low marginal costs, spam has become extremely prolific: regardless of how many emails are sent out, the spammers' costs remain low and essentially constant. With numbers like these, a tremendous burden is shifted onto the Internet Service Provider (ISP), which must process and store all of that data. Huge volumes of this junk undoubtedly contribute to many of the access, speed, and reliability problems suffered by ISPs, and many large ISPs have experienced major system outages as the result of massive junk email campaigns. Spam is an issue of consent, not content. Whether the UCE message is an advertisement, porn, or a winning lottery notice is irrelevant: if the message was sent unsolicited and in bulk, then the message is spam.

This junk e-mail is more than just annoying; it costs Internet users and Internet-based businesses millions, even billions, per year. When a spammer sends an email message to a million people, it is carried by numerous electronic systems en route to its destination. The systems in between bear the burden of carrying advertisements and other unsolicited junk for the spammer. The number of spams sent out each day is truly overwhelming, and each one must be handled efficiently and expeditiously by many systems. There is no justification for forcing third parties to bear the load of unsolicited advertising. Ultimately, these costs are passed on to YOU, the consumer.

Spam originates in one of two ways: it is sent directly by the spammer from machines under their control, or via illegal third-party exploitation such as open proxies or open relays. Spammers get your address in a variety of ways. If you sign up for a site and provide your email address, these seemingly friendly sites can turn around and sell your address to advertisers. Additionally, if your email address appears on a Web page, it is easy for unscrupulous advertisers to "harvest" it and add you to their lists.

There are ways to reduce the number of spam messages, but it is presently impossible to stop them all. Spam filters are software applications that redirect emails based on the presence of certain common phrases or words. These automated measures are prone to being defeated by clever spammers, and there is also a risk of important emails being deleted as spam. In 2003, Congress passed a sweeping law, the CAN-SPAM Act of 2003, which prohibits the use of deceptive subject lines and false headers in all emails. Additionally, the FTC is authorized (but not required) to establish a do-not-email registry. The CAN-SPAM Act took effect on January 1, 2004.

Spam is based on theft of service; it wastes time, money, and other resources. Spam can and will overwhelm your electronic mailbox if it isn't fought. Over time, unless the growth of spam is stopped, it will destroy the usefulness and effectiveness of email as a communication tool.

Google Explains Why 'Failure' Search Leads To Bush



As the Justice Department and Google continue to butt heads, the search engine is going to great lengths to show it's not politically biased in any way. But not everyone agrees with Google.

When you go to google.com, do a search for “miserable failure.” The top page that comes up is the official White House biography of President George W. Bush. That result has prompted complaints against Google by conservative users. They assume it reflects a political bias by the search engine.

Truth is, though, that it’s a result of something called “Googlebombing.” Here’s how it works: Google’s search results are generated by computer programs that rank Web pages in large part by examining the number and relative popularity of the sites that link to them.

In this case, a number of webmasters are using the phrase “miserable failure” to describe and then link to President Bush’s Web site. Thus, the leader of the free world has been “Googlebombed.”

It’s by no means the first time this has happened in some form or fashion, either as a political statement or just a prank. But in this case, the search engine has taken the unusual step of responding to a Googlebomb.

In a statement, they say, “We don't condone the practice of googlebombing, but we're also reluctant to alter our results by hand in order to prevent such items from showing up. Pranks like this may be distracting to some, but they don't affect the overall quality of our search service, whose objectivity, as always, remains the core of our mission.”

Given that, it probably won’t be long before conservative webmasters and bloggers respond with a Googlebomb of their own.

In case you’re wondering, the same thing happens when you search for “miserable failure” on Yahoo! Some are also now referring to Googlebombing as cyber-graffiti.

Mobile Term - Glossary

API

Application Programming Interface
An interface between a piece of software and an application that allows the application to use some of the software's functionality. Used by programmers who write applications that need to interact with other applications.

ASCII

American Standard Code for Information Interchange
This standard was established to achieve compatibility between various types of data processing equipment. ASCII is the common code for microcomputer equipment, used to represent numbers, letters, and the most common special characters. Languages such as Chinese, Japanese and Korean, however, also use many other characters; to represent those scripts you need Unicode.

CAPI

Common Application Programming Interface
CAPI is an application programming interface standard that is used to access ISDN (see ISDN) equipment. When an application wants to communicate with an ISDN card it sends a standard series of commands to the card. These commands form the CAPI standard and give developers and users a chance to use a well-defined mechanism for communications over ISDN lines without being forced to adjust to hardware idiosyncrasies.

CIMD

Computer Interface to Message Distribution
A special protocol for sending and receiving SMS messages using the large customer accounts of the network operators.

COM

COMmunication ports
Communication Port is the name of a serial communications port in DOS systems. DOS supports four serial ports: COM1, COM2, COM3 and COM4.

DDE

Dynamic Data Exchange
DDE is an interprocess communication system built into the Macintosh, Windows and OS/2 operating systems. DDE enables two running applications to share the same data: if you have a file that is connected to another document over DDE and you change that file, all documents or files communicating with the altered file will be updated accordingly.

Dual Band

Dual Band cell phones can work on networks that operate on different frequency bands. This is especially useful if you travel between areas covered by different networks, for example GSM 900 and GSM 1800.

EDGE

Enhanced Data Rates for Global Evolution
EDGE is a technology that gives GSM and TDMA networks similar capacity to handle services for the third generation of mobile telephony. EDGE was developed to enable the transmission of large amounts of data at a high speed of 384 kilobits per second. EDGE is also referred to as E-GPRS (an abbreviation for Enhanced GPRS).

EMS

Enhanced Messaging Service
EMS is based on SMS. With an EMS-capable mobile phone you can send and receive messages that include pixel pictures and animations, ring tones, sound effects and formatted text. Phones that are not EMS-capable will only display the unformatted text.

EPOC

EPOC is an operating system for mobile communication devices such as PDAs. It was developed by Symbian, a joint venture of Ericsson, Matsushita, Motorola, Nokia and Psion.

ERMES

European Radio Messaging System
ERMES is a pan-European wide area paging network working in Europe, the Middle East and Asia.

ETSI

European Telecommunications Standards Institute
The European standardization body for telecommunications. ETSI is a non-profit making organisation whose mission is to produce the telecommunications standards that will be used for decades to come throughout Europe and beyond. ETSI was founded in 1988 by the members of the CEPT. ETSI promotes the world-wide standardization process whenever possible. Its Work Programme is based on, and co-ordinated with, the activities of international standardization bodies, mainly the ITU-T and the ITU-R.

GPRS

General Packet Radio Service
GPRS is a standard for wireless communications which runs at speeds up to 150 kilobits per second; current GSM (see GSM) systems reach only 9.6 kilobits per second. GPRS, which supports a wide range of bandwidths, makes efficient use of limited bandwidth and is particularly suited for sending and receiving small bursts of data, such as e-mail and Web browsing, as well as large volumes of data.

GSM

Global System for Mobile Communication
Originally developed as a pan-European standard for digital mobile telephony in 1991, GSM is today one of the leading digital cellular systems, available in more than 100 countries. GSM uses narrowband TDMA, which allows eight simultaneous calls on the same radio frequency. GSM reaches a speed of 9,600 bit/s and thus cannot be used for bandwidth-intensive applications.

GSM 1800
Also known as DCS 1800 or PCN, GSM 1800 is a digital network working on a frequency of 1800 MHz. It is used in Europe, Asia-Pacific and Australia.

GSM 1900
Also known as PCS 1900, GSM 1900 is a digital network working on a frequency of 1900 MHz. It is used in the US and Canada and is scheduled for parts of Latin America and Africa.

GSM 900
GSM 900, or just GSM, is the world’s most widely used digital network (see general explanation for GSM)

GUI

Graphical User Interface
A GUI allows users to navigate and interact with information on their computer screen by using a mouse to point, click, and drag icons and other data around on the screen, instead of typing in words and phrases.

HSCSD

High Speed Circuit Switched Data
HSCSD is a circuit-linked technology for higher transmission speeds, up to 57 kilobits per second, primarily in GSM systems.

HTTP

Hypertext Transfer Protocol
A protocol used on the Internet by web browsers to transport text and graphics. It focuses on fetching a page at a time rather than setting up a session.

Intranet

An intranet is a network based on TCP/IP protocols (an internet) belonging to an organization, usually a corporation, and accessible only by the organization's members, employees, or others with authorization. An intranet's Web sites look and act just like any other Web sites, but the firewall surrounding an intranet fends off unauthorized access.

IP

Internet Protocol
IP belongs to the TCP/IP protocol family, an acknowledged industrial standard for communication between open systems. The Internet Protocol defines how information travels between systems across the Internet.

IP address

The IP address is the identifier for a computer or device on a TCP/IP network. Networks using the TCP/IP protocol route messages based on the IP address of the destination. The format of an IP address is a 32-bit numeric address written as four numbers separated by periods. Each number can be zero to 255. For example, 1.160.10.240 could be an IP address. Within an isolated network, you can assign IP addresses at random as long as each one is unique. However, connecting a private network to the Internet requires using registered IP addresses (called Internet addresses) to avoid duplicates.
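To make the 32-bit structure concrete, here is a minimal Python sketch (using the example address from the definition above) showing how the four dotted numbers pack into a single 32-bit value:

    # Minimal sketch: pack a dotted-quad IPv4 address into its 32-bit value.
    import socket
    import struct

    def ip_to_int(address):
        packed = socket.inet_aton(address)     # rejects octets outside 0-255
        return struct.unpack("!I", packed)[0]  # big-endian unsigned 32-bit int

    print(ip_to_int("1.160.10.240"))  # 27265776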

IPX/SPX

Internetwork Packet Exchange/Sequenced Packet Exchange
IPX and SPX are transfer protocols in a Novell network. IPX is a datagram protocol used for connectionless communication. SPX is a transport layer protocol (layer 4 of the OSI (Open Systems Interconnection) model). The SPX layer sits on top of the IPX layer (layer 3) and provides connection-oriented services between two nodes on the network. SPX is used primarily by client/server applications. Whereas the IPX protocol is similar to IP, SPX is similar to TCP. Together, therefore, IPX/SPX provides connection services similar to TCP/IP.

ISDN

Integrated Services Digital Network
ISDN is a public, digital telecommunications network, that offers high speed transmission of voice, data and video through existing fixed line infrastructure. Most ISDN lines offered by telephone companies give you two lines at once, called B channels. You can use one line for voice and the other for data, or you can use both lines for data to give you data rates of 128 Kbps, three times the data rate provided by today's fastest modems.

JMS

Java Message Service
An API that supports messaging between computers in a network. JMS is a specification that defines the Java language interface to a messaging service and a means for exchanging XML-based transactions.

LAN

Local Area Network
LAN is a small data network covering a limited area, such as within a building or group of buildings. Upon connecting one LAN to other LANs over any distance via telephone lines and radio waves a system of LANs called WAN can be established.

MMS

Multimedia Messaging Services
MMS is a further development of the EMS standard. Besides graphics and pictures, an MMS message can also include audio and video clips. MMS messages can be sent between mobile devices as well as from mobile devices to e-mail recipients. With animations and videos included, such a message will be much longer than the currently common 160 characters per text message.

MO

Mobile originated
MO denotes the capability of the GSM system to send a message from a mobile phone via a Service Center and to provide information to the mobile phone about the delivery or failure of that message.

Modem

Abbreviation of Modulator/Demodulator
A modem converts digital computer signals into analog form for transmission over analog telephone systems.

Mobile Phone Network

A mobile phone network or system consists of a network of cells. Each cell is served by a radio base station from where calls are forwarded to and received from mobile phones by wireless radio signals.

MT

Mobile Terminated
MT marks the capability of the GSM system to send a message from the Service Center to a mobile device where the message is either received, or, if the recipient device is unavailable, stored for later delivery.

NOVELL

US manufacturer of the common network system Novell NetWare. NetWare has been a corporate standard for building LANs for more than a decade.

ODBC

Open Database Connectivity
ODBC is a standard database access method developed by Microsoft. ODBC allows access to any data from any application, regardless of which database management system is handling the data.

Operator

An operator is a company that operates a telephone network, for example D1, Vodafone or AT&T.

OTA

Over the Air
The method used to remotely manage applications on a subscriber handset.

Paging

Paging is a single direction radio service for alerting subscribers and leaving messages.

PCMCIA

Personal Computer Memory Card International Association
PCMCIA is an international organization consisting of some 500 companies that has developed a standard for small, credit-card-sized devices called PC Cards. There are three different types of PCMCIA cards, each with the same rectangular size (85.6 by 54 millimeters) but different thicknesses.

PCS

Personal Communication System
PCS is the collective term for American mobile telephone services in the 1900 MHz frequency band.

PDA

Personal Digital Assistant
A hand-held computer such as a PocketPC.

PIN

Personal Identification Number
The PIN is a code used on all GSM-based phones to establish authorized access to certain functions or information. In general the PIN code is delivered together with your subscription.

PING

Packet Internet Groper
PING is a utility to determine whether a specific IP address is accessible. It works by sending a packet to the specified address and waiting for a reply. PING is used primarily to troubleshoot Internet connections. There are many freeware and shareware PING utilities available for personal computers.

POP3

Post Office Protocol (version 3)
POP3 is a protocol used to retrieve e-mail from a mail server. For sending electronic messages, SMTP is used.

PSTN

Public Switched Telephone Network
The public, analogue telephone (dialing) network.

ROAMING

International roaming means that you can use networks other than your own when traveling abroad. To allow international roaming, there has to be a corresponding agreement between the network operators in question.

SDK

Software Developer’s Kit
Application software tools and specifications provided by a software manufacturer. An SDK enables computer application developers to create programs, plugins, or add-on enhancements that will work with a certain piece of software.

Service Provider

A service provider is a company that provides services and subscriptions to telephone, mobile phone and Internet users.

SIM

Subscriber Identity Module
A SIM card is a small printed circuit board that has to be inserted in any GSM-based cell phone when subscribing. It includes subscriber details, security information and the personal directory of numbers.

Smart Messaging

Smart Messaging is a concept developed by Nokia for sending and receiving ring tones, picture messages, operator logos, business cards, calendar requests, and Internet settings over the Short Message Service (SMS). Using special characters at the beginning of the SMS, the receiving mobile phone identifies the respective type of message.

SMPP

Short Message Peer to Peer
SMPP is a communication protocol for sending and receiving short text messages, mainly used for corporate and high volume SMSC links.

SMS

Short Message Service
SMS is a currently very popular data service within GSM-networks. With SMS it is possible to send and receive short messages of up to 160 characters to and from mobile phones via the network operators' message center (SMSC).

SMSC

Short Message Service Center
An SMSC manages the message transfer to and from mobile phones in GSM networks. This includes not only short text messages but also fax, voice and e-mail messages. The SMSC delivers the messages, temporarily stores them in case the respective recipient is currently unavailable, and takes care of charging. There is at least one SMSC per network.

SMTP

Simple Mail Transfer Protocol
SMTP is a standard protocol for handling the transfer of e-mails across the Internet.

SQL

Structured Query Language
The international standard database language used in updating, querying and managing relational databases. It can be used to sort, retrieve and filter data extracted from a database.

TAP

Telecator Alphanumeric Protocol
TAP is currently the most common communication protocol for short messages besides UCP. It is mainly used with public modem and ISDN access (e.g. D1 or E-Plus).

TAPI

Telephony Application Programming Interface
TAPI is an application programming interface (API) for connecting a PC running under Windows to telephone services. TAPI was introduced in 1993 as the result of joint development by Microsoft and Intel. The standard supports connections by individual computers as well as LAN connections serving many computers. Within each connection type, TAPI defines standards for simple call control and for manipulating call content.

TCP/IP

Transmission Control Protocol over Internet Protocol
TCP/IP is a suite of network protocols used to connect hosts on the Internet. TCP/IP uses several protocols, the two main ones being TCP and IP.

UCP

Universal Computer Protocol
UCP is a communication protocol for SMS messages that are sent via modem or ISDN access or via so-called large customer accounts.

UMTS

Universal Mobile Telecommunications System
UMTS is the name of the third-generation mobile telephone standard in Europe, standardized by ETSI (see ETSI). UMTS will deliver broadband information at speeds up to 2 Mbit/s. Besides voice and data, UMTS will deliver audio and video to wireless devices anywhere in the world through fixed, wireless and satellite systems.

Unicode

Unicode is an international standard used to encode text for computer processing. It is a subset of UCS. Unicode's design is based on the simplicity and consistency of ASCII, but goes far beyond ASCII's limited ability to encode only the Latin alphabet. Unicode provides the capacity to encode all of the characters used for the major written languages of the world. To accommodate the many thousands of characters used in international text, Unicode uses a 16-bit code set that provides codes for more than 65,000 characters. To keep character coding simple and efficient, Unicode assigns each character a unique 16-bit value, and does not use complex modes or escape codes.

Unix

UNIX is a computer operating system that is designed to be used by many people at the same time and that has TCP/IP built-in. It is a quite common operating system for servers on the Internet.

VAS

Value-added services
Value-added services provide additional "value" to users of mobile GSM infrastructure, e.g. in the form of mobile games, logos and ringtones (mobile entertainment).

Visual Basic

Visual Basic is a programming language and environment developed by Microsoft and based on the BASIC language. Visual Basic was one of the first products to provide a graphical programming environment and a paint metaphor for developing user interfaces.

VBA

Visual Basic for Applications
By integrating VBA into one's own software application, it is possible to allow third parties to extend the program with a number of additional features.

VSMSC

Virtual SMSC
An extended version of a direct link to the SMSC (see SMSC) of a network operator, where the user receives his own number for an SMS message center. The VSMSC allows the establishment and management of closed user groups and the use of reverse billing (the user pays for incoming as well as outgoing messages over his VSMSC).

WAN

Wide Area Network
A computer network that spans a relatively large geographical area. Typically, a WAN consists of two or more LANs. Computers connected to a wide-area network are often connected through public networks, such as the telephone system. They can also be connected through leased lines or satellites. The largest WAN in existence is the Internet.

WIN32 API

Windows 32 Application Programming Interface
WIN32 API is a 32-bit programming interface under Windows.

WAP

Wireless Application Protocol
WAP is a free, unlicensed protocol for wireless communications that makes it possible to create advanced telecommunications services and to access Internet pages from a mobile phone. WAP is supported by all major mobile operating systems.

X.25

X.25 is a synchronous protocol used by computers to talk to the worldwide network of data communication computers. X.25 also refers to the global network of such computers: Datapac in Canada, Tymnet and Telenet in the USA, Transpac in France, etc. In most countries you can tap into this network with a local phone call. It is a separate, more regulated network than the Internet.

XML

Extensible Markup Language
XML was developed to improve the functionality of the Web and to compensate for the limitations of HTML by providing more adaptable and flexible information identification. It is not a fixed format like HTML, hence the term "extensible". XML is a "meta-language" that lets you design your own markup language, no matter how many different types of documents you have. It is not just for Web pages: XML can also be used to store any kind of structured information, and much more. Because of its rapid adoption and the increasing support of the software industry, it is a safe investment.

Keyword Advice - on keyword research and identification, keyword placement

By Thomson Chemmanoor Website: http://www.website-promotion-expert.com


Keyword identification is one of the most important parts of the
SEO process. Keywords are the doors and windows for search engine
traffic. Surfers and information seekers throughout the world use
the Internet via keywords to find information on products and
services, and these keyword phrases are vital for your search
engine placement.

Keyword identification is the first step in creating the content
for your site. Identifying keywords related to your niche topics
should be the first priority when creating web content, because
the quality of these keywords is critical for a good ranking in
search engines. Researching the right keywords for your website
is key, so that when people perform a search for a particular
keyword phrase they will be able to find your site. So how do you
do that? That is the question on everybody's mind. I have
organized this article into phases: keyword research (or keyword
identification), keyword selection, keyword placement, keyword
density, keyword ranking and keyword marketing.

Before going further, I would like to point out that while
building your website for your business you should also focus on
certain other SEO strategies, like clean navigation and a search
engine friendly website architecture, so that search engine
spiders or robots can easily gather information from your
website.

Once your site structure is done, it is time to create the
content for the site. This is the crucial part, and it is where
keyword research and keyword density come into play. If you
follow the steps outlined below, your website can enjoy a higher
ranking in search engines.


Step 1
Keyword identification or keyword research: This should be the
first step in building your content, because your content will be
built around the keywords and phrases identified at this stage.
This step is critical, and if it is not done correctly you won't
achieve the desired results. There are different ways to identify
how people search on the web.
1) Think about which keywords you would choose when searching for
certain products or services. Think from the customer's point of
view, not from a search engine or SEO point of view. Collect
these keywords and write them down. I suggest using a
spreadsheet, since it will help you in the later stages as well.
2) Ask your own customers or friends how they search the net to
find a particular product or service. People think differently
and might not use the same keyword phrase to find the same
product or service. Collect their phrases and store them.
3) Use keyword suggestion tools. There are hundreds of tools on
the web; some are free and others charge a fee. I suggest the
Overture keyword tool and the Google keyword tool (free) and
Wordtracker (paid). Type your keywords and phrases into these
tools and you will find hundreds of keywords for a particular
product or service that are searched on the web every month,
along with how many times each keyword is searched. Tools like
Wordtracker will also show you how much competition there is for
each keyword phrase. If you focus on heavily searched keywords,
it will be hard for your website to rank well because of the
competition. So always choose keywords that are less competitive,
so that your website will rank well for those keyword phrases in
the search engine results.
4) If you already have a website online, check its web logs and
gather the keywords that help people find your site. This will
help an existing website recreate its content.

After these steps are done, use your spreadsheet to evaluate the
keywords. It is ideal to have more keywords rather than focusing
on a few words. In the spreadsheet, compare the keywords in terms
of how many times each is searched and how much competition it
faces on the web. As I said before, always choose less
competitive keywords. Once this stage is over, start constructing
your website content with the identified keywords.
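As a rough illustration of the spreadsheet step above, here is a
small Python sketch. The keyword phrases, search counts and
competition figures are invented for the example:

    # Hypothetical data: (phrase, monthly searches, competing pages).
    candidates = [
        ("dog supplies",        90000, 5000000),
        ("dog supplies online", 12000,  400000),
        ("dog supply store",     8000,  250000),
    ]

    # Favor phrases with healthy demand but fewer competing pages.
    def opportunity(entry):
        phrase, searches, competition = entry
        return searches / competition

    for phrase, searches, competition in sorted(candidates,
                                                key=opportunity,
                                                reverse=True):
        print(phrase, "-", searches, "searches,",
              competition, "competing pages")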

Step 2
Keyword placement: Placing the keywords is very crucial, because
if the keywords are not placed properly the search engine spiders
will have a difficult time identifying and storing information
from your web pages. Your most important keyword should appear in
the title of the page at least once, since all search engines
give the title relevancy. Next, the same or a related keyword
should appear in the heading of the page. Also use related
keywords when starting paragraphs, and spread them evenly
throughout the web page. The density factor is discussed below.

Step 3
Keyword density: Search engine robots are very intelligent
spiders; they can tell when someone tries to cheat them. Some
webmasters and SEO experts use keyword spamming or keyword
dumping, which is unethical for good search engine optimization.
When search engine spiders index a web page, they determine how
many times each keyword has been used on that page, so keyword
density is important. An ideal density would be anything between
10-15% on the page. Also use keywords which are related to the
phrase so that the page gains relevancy. For example, when I
searched for the keyword "dog supplies" I found hundreds of
related terms such as "dog supplies online" and "dog supply
store", so think about and identify keywords that are closely
related to your topic. Also sprinkle the keyword phrases evenly
across the web page rather than concentrating them in one place.
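To make the density calculation concrete, here is a minimal
Python sketch of the measurement described above; the sample page
text is made up:

    import re

    def keyword_density(text, phrase):
        # Percentage of the page's words taken up by the phrase.
        words = re.findall(r"[a-z0-9']+", text.lower())
        target = phrase.lower().split()
        n = len(target)
        hits = sum(1 for i in range(len(words) - n + 1)
                   if words[i:i + n] == target)
        return 100.0 * hits * n / len(words) if words else 0.0

    page = "Dog supplies and dog supplies online: the best dog supply store."
    print(round(keyword_density(page, "dog supplies"), 1))  # 36.4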


At this stage your website content should be ready. Keep in mind
the three top priority factors above; they will determine your
website's success or failure in the search engines. The next
stage is keyword marketing.

Keyword marketing: Some of these techniques are off-site factors
rather than on-site factors, because they affect only the
marketing process. As you all know, search engines give links a
lot of relevancy in their algorithms. So after your keyword
identification, density and placement work is over, you should
focus on marketing your site: site submission, article
submission and link popularity are all techniques for increasing
inbound links to your site. You will be using anchor text when
creating a link pointing to your site, and this is where keyword
marketing comes into play. Use your key phrases in the anchor
text, and also mix in your related keywords. Keyword marketing is
also vital for pay-per-click marketing, where the keyword is
king. So gather keywords related to your business and use them
widely in your PPC campaign; the pay-per-click industry depends
entirely on keyword marketing. If you want the maximum ROI
(return on investment) from your website, you will have to apply
the maximum number of keywords while bidding; your website's
visibility will increase if you choose the maximum keywords for
your pay-per-click marketing.

The bottom line: to gain a high ranking in the search engines,
use keywords related to your business in your on-site and
off-site website marketing. The keyword is king when it comes to
content, and content is king when it comes to search engine
marketing.

Google and Jagger's Aftermath

Starting somewhere between September 22 and November 17, 2005, Google launched a major update to their search algorithm which shook up the search engine optimization (SEO) community and millions of website rankings. The update has been named Jagger and is apparently finished.
The keywords that people used to find your site within Google may not be producing as many visits any more because the Jagger changes caused your rankings to plummet. Of course many people have seen their rankings stay the same or improve in Jagger's aftermath too.

If your site's rankings have decreased, what can be done to get back to where you were or better in the post-Jagger Google world?

There are still a lot of questions to be sure, but there are some good beginnings of answers as well. Since this update was rolled out over months and in three distinct phases, it has been much more difficult to determine what factors have been given more or less weight.

For instance, IBL (inbound links to your site) have always been important to achieve high rankings in Google. But there are many different kinds of IBLs. Link trades, where you put my link on your site and I put your link on my site, may be less valuable than a one-way link. This has been the case for a while, but has the importance of each changed since Jagger? Probably. I don't know all the answers, and I don't think anyone knows all the answers save the people at the 'plex (short for Googleplex).

What are some theories? Here are some of the top ones, but I am not saying they are necessarily true or false. And this is not a full list; there are most likely numerous other factors affecting Google rankings after Jagger that no one has recognized yet. The following list consists of ideas I have read online (something I spend hours doing each day) or that are the result of our own hard-earned observations, drawing on a large number of clients' websites in many different industries. Read the following with a grain of salt, which is always a good idea when reading any articles or forum posts about SEO or Jagger.

Things That Could Possibly HELP You More In Jagger's Aftermath


Aged Domains - Sites with domains that are older rank better today - the older the domain, the better its rankings with all other things being equal. (This is probably true to some degree).

Very Relevant Links - IBL (inbound links) and OBL (outbound links) relevancy is more important after Jagger. This means that if you point to related sites or you get links from other sites that are related to your website, you may rank better after Jagger with all other things being equal. (This is probably true to some degree as well).

Links From Trusted Sites Help - TrustRank (or a similar concept) is more important than ever after Jagger. TrustRank is a concept that says if you get a link pointing to your site that is highly trusted by Google (trusted either programmatically or by human editors), then you will rank better with all other things being equal. (See http://www.vldb.org/conf/2004/RS15P3.PDF.)

Variety of Links - Links from .edu and .org websites are good for increasing your rankings and are more important than ever. It's vital to get links from a wide variety of websites; just like investing, you need to diversify your IBLs. (This has probably been true since even before Jagger).

Aged Links - The older the link that points to your site, the more weight it's given. (This also has probably been true even before Jagger).

Embedded Links - Links that are embedded in sentences and paragraphs instead of stand-alone links are weighted more heavily. (This may be true soon if not already).

Article Links - Articles are what directories had been a year or two ago for link building. Links from the author by-line or within the article that point back to your site will positively affect your rankings. (And this is one reason I've chosen to write this article).

Fresh & Unique Content - Now, more than ever, regularly updated and newly added original content will help your rankings. (This is almost definitely true.)

Be a Big Guy - If you are a big behemoth site like Wikipedia, Yahoo, AOL, Ebay, Amazon, etc., you will rank better than you did before Jagger.

High Traffic & Stickiness - User popularity statistics now affect rankings, or soon will. In other words, user actions on your website, like how long visitors stay (stickiness), how many pages they visit, and even how many people visit your site in a given period, can all affect how Google ranks your site. (This may be true soon if not already).

Things That Could Possibly Not Help You Anymore, or May Even HURT You More In Jagger's Aftermath


Duplicate Content - Any kind of duplicate content can hurt your rankings. Some say this only refers to other sites having the same content as you while others say even duplicate content within your own site can be bad. I find the latter hard to believe since all sites have repeating slogans, phrases, checkout instructions, or any number of other duplicate sentences within the same site. (Use Copyscape.com to find people who are stealing your original written content and publishing it on their site).

Hidden Text - Hidden text within your HTML, in tags, CSS, or comments, can negatively affect your rankings. (This is something you should never do).

Footer Links - Some say links in the footer are currently disregarded. (This is one we have found no evidence for).

Directory Links - Links from directories are weighted less. (This is one we have found no evidence for, but is most likely true or will be soon)

Decreased Rate of Link Building - The speed and volume of inbound link creation to your site from other websites, if changed, can negatively affect your rankings more so now. (This one is most likely true too).

Reciprocal Links - Reciprocal link trades are worth less than they were before, or are worth nothing. (It's probably true that they are at least worth less today).

Linking to Bad Neighborhoods - Reciprocal link trades hurt your rankings when you link to sites that are considered 'bad neighborhoods' by Google, such as link farms or sites that are banned by Google. (This is most likely true and has been for a while).

Link Schemes - Participating in link schemes such as Co-ops or Link Vault can hurt your ranking more than help them. (I have not found any evidence of this so far for my clients' sites, but this could be true).
Again, I don't think anyone outside Google knows which of the above factors are true or false, and how each one affects a given keyword phrase's ranking. In fact, that's the idea. Google doesn't want people 'gaming' their system. There are so many variables that need to be considered that it is very difficult to figure out which ones affect what.

So, what do you do if your site's ranking has dropped since Jagger?

If your site was ranking well in the Google SERPs (search engine results pages) before Jagger, was nowhere to be found right after Jagger hit, and has still not bounced back at all, then you probably tripped a filter, got penalized or even banned. You may have duplicate content on another site, or someone copied a lot of your content, or you may have a canonical issue (where yoursite.com and www.yoursite.com are considered two different sites by Google, making them look like duplicate content). You may have hidden text, or keyword-stuffed your pages, or any number of other things. You're definitely going to need more knowledge than this article can give you to get your rankings back.

Some say that Google updates have happened before around the same time of year, and many sites that tanked came back after the first of the year. I don't know if this is true, we'll just have to wait and see. For those who have still not rebounded, this may be nice to know.

Interestingly, most of our clients' sites either stayed the same or improved after Jagger. Our own company site improved. But unfortunately, a few of our other clients saw some decreases in their rankings right after Jagger, and have since rebounded, but not at quite the same pre-Jagger levels. Here's what we did for them:


Scoured their site for bad outgoing links and made sure that each site they linked to was indexed by Google and was not trying to game Google. Any questionable links were deleted immediately. But we did not get rid of all our link partners, we just culled.

Determined the ratio of the different types of incoming links to learn where improvements were needed. In other words, we determined the percentage of links to their site that were link trades, one-way links from related sites, one-ways from unrelated sites, link advertisements, directory links, forum signature links and more. We then advised them to increase their one-way related inbound links that are embedded in sentences, and not concentrate so much on link trades and stop getting one-way unrelated link development altogether.

Cleaned up the HTML on every page, made sure all tags were closed and that there was no extraneous code on any page. And we put CSS and JavaScript in separate files.

Took out any inadvertent hidden text. One client had keywords in comment tags in their HTML that we deleted.

Decreased the file size of pages by taking out old links and superfluous verbiage, and by re-optimizing the GIFs and JPEGs.

Wrote much more succinct Meta descriptions and on-page verbiage.

Made sure that every title tag on every page within the site was different.

Coached them about the importance of continually developing good, quality, original content.

Brainstormed ways in which their sites could entice other webmasters to link to them because of what their site offers, such as good content, free Web tools, articles and many other things. This is called natural linking, and it is what Google regards as the only legitimate way to build links. Therefore, this is vital.
We tried to look at the overall link development strategy, the value of their site, and the quality of the site, both the content quality and the html quality. A clean, simple, fast-loading site with natural links pointing to it from a variety of other related websites, some .org's and .edu's, others from trusted authority sites, and many from small related websites, that adds fresh and unique content daily, will rank well in Google over time and won't be affected by any update, including Jagger.

The best way for you to learn what to do in Jagger's aftermath is to read articles like this, participate in forums that discuss these topics, and most importantly, by experimenting with your own sites to see what works. This takes time and patience. So does building quality sites that have things to offer and that subsequently get natural links. But it's all worth it.

Post published by Afzal Khan

Canonical URL

Q: What is a canonical url? Do you have to use such a weird word, anyway?
A: Sorry that it’s a strange word; that’s what we call it around Google. Canonicalization is the process of picking the best url when there are several choices, and it usually refers to home pages. For example, most people would consider these the same urls:
www.example.com
www.example.com/
example.com
example.com/
www.example.com/index.html
example.com/home.asp
But technically all of these urls are different. A web server could return completely different content for all the urls above. When Google “canonicalizes” a url, we try to pick the url that seems like the best representative from that set.
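As an illustration, a crude version of this normalization can be sketched in Python. The specific rules here (force the www hostname, lowercase the host, treat index.html and home.asp as default documents) are assumptions chosen to match the example urls, not Google's actual logic:

    from urllib.parse import urlsplit

    DEFAULT_DOCS = {"index.html", "index.htm", "default.asp", "home.asp"}

    def canonicalize(url):
        if "://" not in url:
            url = "http://" + url
        parts = urlsplit(url)
        host = parts.netloc.lower()
        if not host.startswith("www."):
            host = "www." + host
        path = parts.path or "/"
        if path.rsplit("/", 1)[-1] in DEFAULT_DOCS:
            path = path.rsplit("/", 1)[0] + "/"
        return "http://" + host + path

    for u in ["www.example.com", "example.com/",
              "www.example.com/index.html", "example.com/home.asp"]:
        print(canonicalize(u))  # each prints http://www.example.com/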

Q: So how do I make sure that Google picks the url that I want?
A: One thing that helps is to pick the url that you want and use that url consistently across your entire site. For example, don’t make half of your links go to http://example.com/ and the other half go to http://www.example.com/ . Instead, pick the url you prefer and always use that format for your internal links.

Q: Is there anything else I can do?
A: Yes. Suppose you want your default url to be http://www.example.com/ . You can configure your web server so that if someone requests http://example.com/, it does a 301 (permanent) redirect to http://www.example.com/ . That helps Google know which url you prefer to be canonical. Adding a 301 redirect can be an especially good idea if your site changes often (e.g. dynamic content, a blog, etc.).
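In practice this is usually a one-line rule in the web server configuration, but as a self-contained illustration here is a minimal Python WSGI sketch of the same 301 behavior (example.com is the placeholder domain from the answer above):

    from wsgiref.simple_server import make_server

    def site(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"canonical home page"]

    def force_www(app):
        # 301-redirect bare example.com requests to the www hostname.
        def middleware(environ, start_response):
            host = environ.get("HTTP_HOST", "").split(":")[0]
            if host == "example.com":
                location = "http://www.example.com" + environ.get("PATH_INFO", "/")
                start_response("301 Moved Permanently", [("Location", location)])
                return [b""]
            return app(environ, start_response)
        return middleware

    if __name__ == "__main__":
        make_server("", 8000, force_www(site)).serve_forever()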

Q: If I want to get rid of domain.com but keep www.domain.com, should I use the url removal tool to remove domain.com?
A: No, definitely don’t do this. If you remove one of the www vs. non-www hostnames, it can end up removing your whole domain for six months. Definitely don’t do this. If you did use the url removal tool to remove your entire domain when you actually only wanted to remove the www or non-www version of your domain, do a reinclusion request and mention that you removed your entire domain by accident using the url removal tool and that you’d like it reincluded.


Q: So when you say www vs. non-www, you’re talking about a type of canonicalization. Are there other ways that urls get canonicalized?
A: Yes, there can be a lot, but most people never notice (or need to notice) them. Search engines can do things like keeping or removing trailing slashes, trying to convert urls with upper case to lower case, or removing session IDs from bulletin board or other software (many bulletin board software packages will work fine if you omit the session ID).

Q: Let’s talk about the inurl: operator. Why does everyone think that if inurl:mydomain.com shows results that aren’t from mydomain.com, it must be hijacked?
A: Many months ago, if you saw someresult.com/search2.php?url=mydomain.com, that would sometimes have content from mydomain. That could happen when the someresult.com url was a 302 redirect to mydomain.com and we decided to show a result from someresult.com. Since then, we’ve changed our heuristics to make showing the source url for 302 redirects much more rare. We are moving to a framework for handling redirects in which we will almost always show the destination url. Yahoo handles 302 redirects by usually showing the destination url, and we are in the middle of transitioning to a similar set of heuristics. Note that Yahoo reserves the right to have exceptions on redirect handling, and Google does too. Based on our analysis, we will show the source url for a 302 redirect less than half a percent of the time (basically, when we have strong reason to think the source url is correct).

Q: Okay, how about supplemental results. Do supplemental results cause a penalty in Google?
A: Nope.

Q: I have some pages in the supplemental results that are old now. What should I do?
A: I wouldn’t spend much effort on them. If the pages have moved, I would make sure that there’s a 301 redirect to the new location of pages. If the pages are truly gone, I’d make sure that you serve a 404 on those pages. After that, I wouldn’t put any more effort in. When Google eventually recrawls those pages, it will pick up the changes, but because it can take longer for us to crawl supplemental results, you might not see that update for a while.

That’s about all I can think of for now. I’ll try to talk about some examples of 302’s and inurl: soon, to help make some of this more concrete.

Google's Video Store Premieres

The trend for selling video on the Internet "vending-
machine-style" got a huge boost with the announcement by
search giant Google that their online video store had opened
for business.

On the surface, it appears they'll just sell old episodes
of the "Brady Bunch," "Twilight Zone" and NBA games you
missed.

However, on closer inspection, Google's online Video Store
represents a whole-scale shift in communications power.

For those of you who might have missed it, let me quickly
catch you up.

Last year, Google started publishing TV news content on
their http://video.google.com site.

A short time later, they started accepting content from
anyone with a video camera and something to show.

In very little time, Google started developing a huge grab
bag of everything from community access TV clips to video
game instructions to yoga tips - all on video streaming
over the Internet.

In my opinion, this first stage served the purpose of
gauging market interest and whether enough people would
submit/watch video to make it worth taking the next step
(selling video online).

Obviously, Google thought enough people had enough interest
in consuming video online, because Friday, January 6, 2006
they announced the opening of their Video Store at
http://video.google.com

The store functions like a virtual vending machine,
allowing visitors to stream video right on their computer
screens.

If the copy protection is turned off, Google also enables
users to download some paid video to their iPods and Sony
PSPs to view on the go.

The individual publishers of the video content determine
whether the copy protection gets turned on or off.

Also, content publishers determine the prices for their
videos but, at the moment, most video still comes free of
charge.

I will say, however, that Google's video service isn't
perfect, but it works and, like everything else they do, it
will get better because they operate with enough cash to
make it work (if consumers want it).

With that said, what does all this mean to the individual
content provider?

What does this mean for distribution and consumption of
video content?

First, it opens up a distribution channel for small content
publishers (1-man shows) who could create excellent
content, but, until now, lacked the technical expertise or
server resources to deliver the video over the Web.

Second, it allows content providers to target micro-niche
audiences who cannot be reached profitably through
traditional advertising or distribution channels (Wal-Mart
doesn't carry "Chihuahua Training Tips" videos).

Third, it creates a unique outlet for individual creativity
like never before and will expose consumers to a whole new
world of thought and content.

Though the service has its detractors, the video isn't
high-definition, and the system has some kinks to work out,
Google Video's approach will win out in the end.

Google's model has always been to "keep it simple!"

By making it simple for consumers to find and view video,
as well as making it simple for content providers to upload
and distribute video, Google will find itself at the center
of an online video revolution comparable to the "golden
age" of television in the early 1950s.

How Crawler works

A search engine operates in the following order: 1) crawling; 2) deep crawling using depth-first search (DFS); 3) fresh crawling using breadth-first search (BFS); 4) indexing; 5) searching.

Web search engines work by storing information about a large number of web pages, which they retrieve from the WWW itself. These pages are retrieved by a web crawler (also known as a spider), an automated web browser which follows every link it sees; exclusions can be made by the use of robots.txt. The contents of each page are then analyzed to determine how it should be indexed. Data about web pages is stored in an index database for use in later queries. Some search engines, such as Google, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista, store every word of every page they find. The cached page always holds the actual search text, since it is the text that was actually indexed, so it can be very useful when the content of the current page has been updated and the search terms are no longer in it.
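The fetch/extract/queue loop is easy to see in miniature. Below is a toy breadth-first crawler in Python, a rough sketch of the process described above (no politeness delays or per-host robots handling, and the seed URL is a placeholder):

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen
    import urllib.robotparser

    class LinkParser(HTMLParser):
        # Collects the href of every <a> tag on a page.
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, limit=20):
        robots = urllib.robotparser.RobotFileParser(urljoin(seed, "/robots.txt"))
        robots.read()
        queue, seen, fetched = deque([seed]), {seed}, 0
        while queue and fetched < limit:
            url = queue.popleft()
            if not robots.can_fetch("*", url):
                continue  # honor robots.txt exclusions
            try:
                html = urlopen(url).read().decode("utf-8", "replace")
            except OSError:
                continue
            fetched += 1
            print("indexed:", url)  # a real engine would analyze and index here
            parser = LinkParser()
            parser.feed(html)
            for link in parser.links:
                absolute = urljoin(url, link)
                if absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)

    crawl("http://example.com/")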

This problem might be considered to be a mild form of linkrot, and Google's handling of it increases usability by satisfying user expectations that the search terms will be on the returned web page. This satisfies the principle of least astonishment since the user normally expects the search terms to be on the returned pages. Increased search relevance makes these cached pages very useful, even beyond the fact that they may contain data that may no longer be available elsewhere.

When a user comes to the search engine and makes a query, typically by giving keywords, the engine looks up the index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text. Most search engines support the use of the Boolean terms AND, OR and NOT to further specify the search query. An advanced feature is proximity search, which allows you to define the distance between keywords.

The usefulness of a search engine depends on the relevance of the results it gives back. While there may be millions of Web pages that include a particular word or phrase, some pages may be more relevant, popular, or authoritative than others. Most search engines employ methods to rank the results to provide the "best" results first. How a search engine decides which pages are the best matches, and what order the results should be shown in, varies widely from one engine to another. The methods also change over time as Internet usage changes and new techniques evolve.

Most web search engines are commercial ventures supported by advertising revenue and, as a result, some employ the controversial practice of allowing advertisers to pay money to have their listings ranked higher in search results.

The vast majority of search engines are run by private companies using proprietary algorithms and closed databases, the most popular currently being Google, MSN Search, and Yahoo! Search. However, open source search engine technology does exist, such as Dig, Nutch, Senas, Egothor, OpenFTS, DataparkSearch, and many others.

Google PageRank (PR)

What PageRank (PR) Is

Google's measure of the number and quality of inbound links to a web site. The PageRank (PR) of each page is one of about 100 criteria Google uses to rank web pages.

How Much PageRank Matters

It is only one of 100 criteria Google uses, but from experience I’m convinced that it weighs quite heavily in Google’s ranking algorithm. Pages with a high PR value usually outrank pages with a low PR value.

Some search engine experts feel that webmasters in general assign too much value to PR – and they are probably right. A high PR is only valuable if the page is properly optimized for the keywords it targets. The Google homepage has a perfect PR of 10, but it does not rank first for every keyword search.

How PageRank is calculated

Google measures the number and quality of links to a page – both links from outside the site and links from other pages in the same site.
The PR formula is:
PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

Don’t be discouraged. It’s not as difficult as it looks.

Before I explain how it works, I should mention that this is the original formula used by Larry Page and Sergey Brin when they developed the PageRank system. It is likely that the formula has been tweaked since then.

The Formula Made Easy
A = The page for which we want to calculate PR
t1 to tn = All the pages linking to page A
C = The number of outbound links each page has
d = A damping factor (set to 0.85)


Page A has inbound links from pages X, Y and Z.
Pages X and Y each have only one outbound link: the one to page A. Page Z has three outbound links, of which only one points to page A. Page X has a PR of 1, page Y has a PR of 2, and page Z has a PR of 3.

Here’s this example’s formula:

PR(A) = (0.15) + 0.85(1/1) + 0.85(2/1) + 0.85(3/3)

PR(A) = 0.15 + 0.85 + 1.7 + 0.85

PR(A) = 3.55

So page A has a PR of 3.55.
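
The worked example translates directly into a few lines of Python; the values below are the ones given above, and the script simply plugs them into the formula:

    # PR and outbound-link count for each page linking to A:
    # X: PR 1, 1 link; Y: PR 2, 1 link; Z: PR 3, 3 links.
    d = 0.85
    inbound = [(1.0, 1), (2.0, 1), (3.0, 3)]

    pr_a = (1 - d) + d * sum(pr / c for pr, c in inbound)
    print(round(pr_a, 2))  # prints 3.55

In practice, PageRank is computed iteratively over the whole web graph, feeding each round's results into the next until the values settle, but a single application of the formula is enough to follow the example.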

Top Search Engine Ranks - The Only Secret You Need - Explained
By Afzal Khan

The top three search pages: the only place you'll be noticed, they say. Unfortunately, they are correct. How can you possibly compete with millions of pages? Many represent thousands of dollars of professional search marketing and optimization per page, backed by budgets on which most of us could retire. How can the little guy have a chance, armed with an already out-of-date search engine marketing book and a ream of now dog-eared, printed web pages on the topic? Do you really think you have a chance to land those top spots? Yes, you do... the same chance the big guys have!
Do searches for Internet marketing or some derivative of that, and you'll be thrown into an expanding mass of theories, techniques, methodologies, scams, and some actually useful information. Sorting through the latest trends, applying them to your site, and, most importantly, understanding what you're doing can be confusing, to say the least. Before we discuss how to get on top, let's define some of the different players on the field.
If you've guessed it's not about which strategy to use, but rather how to use each strategy effectively, you're headed down the right path. I'm going to show you that all of these techniques share one common thread... you master that, and you're in the big leagues!
Here's the team line up in the current Internet marketing playing field.

RSS
RSS stands for Really Simple Syndication. It's an XML-based format that lets you share news headlines and stories through a type of Internet address called an "RSS feed." You can import these feeds into your site so you have the latest headlines from news around the world.
But the great part about them is that feeds exist for almost every subject matter, in every genre you can think of. Track a couple down that pump news about your business subject matter and you have juicy, up-to-date dynamic content on your web page to attract visitors! You can even take it one step further.
It's not that difficult to create your own RSS feed. In this case, you're actually writing the news about your business. People around the world can then display your information on their websites. If you make those news stories factual, appealing, attractive and easy to read, you're on your way to becoming a household name.
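
To show how little machinery a feed needs, here is a minimal Python sketch that pulls headlines out of an RSS feed using only the standard library; the feed URL is a placeholder:

    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "http://example.com/news.rss"  # placeholder feed address

    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    # RSS 2.0 nests each story as an <item> with <title> and <link>.
    for item in tree.getroot().iter("item"):
        print(item.findtext("title"), "-", item.findtext("link"))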

Link Popularity
Many would argue that this is the most crucial factor in search ranking today. You might have heard the "vote" analogy, where every site that links to yours is like a vote for your site. The more sites you have linking to you, the more votes you have... the more votes, the higher your ranking.
That's an extremely simplified definition. There's a big curve ball though. Votes are not equal. Search engines weigh many factors about your site, your link text, the site that links to you, and the number and type of sites that link to it, just to name a few variables.
Link popularity is definitely a big contender in achieving high rankings, but you don't want to link to just anyone. Stay far away from link farms and free-for-all link sites: they usually cover a full gamut of unrelated sites, they're scams, and many are already banned from the search engines, so you risk being penalized by association.
You want to stick with sites that are similar in theme and subject matter to yours. It's worth paying for some one-way links to your site from well-established, authoritative sources.

Articles
The concept of article marketing on the Internet is twofold. If RSS were like a news ticker, articles would be the equivalent of magazines. Here's how the process works. Write an article about one particular interesting facet of your business: an issue, current event, or strategy that would pique the interest of like-minded readers in your target audience. Next, post your article for free on sites like content-articles.com and ezinearticles.com that store giant collections of articles. Post your article on groups like yahoogroups.com that announce new articles by particular category.
Then, a person in your related field who is looking to fill their site with some well-written, interesting things that will keep people on their site selects your article to publish and includes it for all of their traffic to read.
How does this help your business? The key is in the very bottom of the article, in a few lines of type called the resource box. Here, you put your name, and a well written (and varying) text link to your website. (This is part of the agreement. If someone wants to reproduce your words on their site they may do so for free, but they must include the resource box with the active links.)
Now, a search spider comes along and indexes your article on their site, with a link pointing right back to your site! And remember that links back to your site, in a search engine's mind, mean that you must be important.
Secondly, people may want to learn more and follow the link right back to your site. Hint... post your articles on your own site too. If someone wants to read more of what you've written, they have immediate access.

Blogs
According to Princeton University, a blog is a "web log, a shared on-line journal where people can post diary entries about their personal experiences and hobbies..."
What makes blogs so exciting is that anyone can have a blog. It's a chance to establish yourself in your area of expertise and get your name around. People will keep coming back to your blog to hear what you have to say about a particular subject matter. While the term "diary" conveys a free writing style, the content of blogs can be quite educational and come from some powerful, high-ranking industry people in an easy-to-understand atmosphere, all professional titles aside. Hint: A blog can be an unprecedented educational tool.
Imagine that your passion is physics and you could read Einstein's own everyday thoughts on relativity, typed away at night on his laptop by the fireplace, relaxed, in his pajamas, outside of the office. You have access to some great minds and thinkers: learn from them, expand upon their ideas, and become a leading mind yourself.


Search Engine Optimization
This is the combination of art, creative writing, and statistical analysis that gives your site viewers rich content and presents it to the search engines in the most efficient way to earn you high ranks. Optimization includes researching keywords, analyzing your competition, and having just the right keyword in just the right place on the page, in just the right frequency, in just the right line of code, so the engine feels your page is relevant. It involves streamlining your code and placing things like CSS, JavaScript, and other scripts in external files. Search engines love text, not the framework that makes it look pretty on the page.
The Dilemma
Search engines are evolving at an exponential rate, forced to adapt and develop battle strategies to combat programs and technology developed to outwit them. When link popularity became important, software to create hundreds of doorway pages to sites was invented, to cite one example.
The consequences from these people trying to outmaneuver the search companies are chaotic and create virtual gridlock on the Internet. However, statistics overwhelmingly show 75-80% of site traffic comes from search engines. So search engine marketing must be an equally strong part of your web development thrust.
You think: OK, but if search engines are always evolving to stay a step ahead of the abusers, what if I manage to get my site high on the list, only to land at the bottom again because the search-ranking algorithm is modified? Suddenly my top-10 results plummet to position 880 overnight. That's pretty upsetting, not to mention a huge waste of money.
The Secret
Diversify
Don't rely only on keywords in your pages to get to the top. Just as you would diversify a stock portfolio, write some articles, make sure you get yourself listed on good, quality directories and sites, and optimize your code. Develop an Internet marketing plan that won't cause you to plunge in the search ranks because of an algorithm change. Insert the eggs-in-one-basket cliché here.
The Common Thread
RSS feeds, articles, link text, blogs... the combination of these and other methods will bring you to the top. But they have to be consistent in one thing: they must all have superior content written for the end user, not the search engines. Remember: always sell to your potential customers, not the engines. Here's why.
A search engine company's goal is to provide the end user with the best, most relevant matches to a search inquiry. If they fail, the user moves to a different search engine. They don't want to lose customers any more than you do. To keep a customer, a search must be matched to the best content. Period. No matter what defenses or blockades or changes they evolve with overnight. If they don't provide results a user can use, they lose that user. And they won't allow that.
Therefore, only legitimate, quality content will be matched to the end users request. No matter what modifications to algorithms or stops to doorway pages or link farms a search engine company integrates into their defenses, they won't deviate from their goal of being the best. They can only do that by providing the best results.
An assortment of carefully planned and executed Internet marketing strategies and techniques will get you to the top. It does take some time, but your efforts must start with a solid foundation. You must look at the big picture. It's not something that can be done in an evening, so don't try. Instead, focus on building a well-diversified skeleton for your Internet marketing.

The Basics of SEO - Some FAQs
By Afzal Khan

Search engine optimization (SEO) is a foreign field to a lot of people. Rarely does a day go by when I don't get asked a few questions on the subject. So I've decided to post this FAQ article in the hopes that it will help people understand the basics, and make them a little more comfortable with the whole domain.

Q: Why are Search Engines Important To Me?

A: 85% of all website traffic is driven by search engines. The only online activity more popular than search is email. 79.2% of US users don't go to page 2 of search results. 42% of users click on the no. 1 result. For the under-40 age group, the Internet will become the most-used medium in the next 2-3 years.

Q: How Do Search Engines Decide on Their Rankings?

A: IMPORTANT: You cannot pay a search engine in return for a high ranking in the natural results. You can only get a high ranking if your content is seen as relevant by the search engines. Search engines identify relevant content for their search results by sending out 'spiders' or 'robots' which 'crawl' (analyze) your site and 'index' (record) its details. Complex algorithms are then employed to determine whether your site is useful and should be included in the search engine's search results.

Q: Can't I Just Pay for a High Ranking?

A: No. The biggest concern for search engine companies like Google and Yahoo is finding content that will bring them more traffic (and thus more advertising revenue). In other words, their results must be relevant. Relevant results make for a good search engine; irrelevant results make for a short-lived search engine.

Most search engines these days return two types of results whenever you click Search:

Natural/Organic - The 'real' search results. The results that most users are looking for and which take up most of the window. For most searches, the search engine displays a long list of links to sites with content which is related to the word you searched for. These results are ranked according to how relevant and important they are.

Paid - Pure advertising. This is how the search engines make their money. Advertisers pay the search engines to display their ad whenever someone searches for a word which is related to their product or service. These ads look similar to the natural search results, but are normally labeled "Sponsored Links", and normally take up a smaller portion of the window.

Q: How Do I Get a High Ranking?

A: There are four main steps:
Step 1 - Use the right words on your website.
Step 2 - Get lots of relevant sites to link to yours.
Step 3 - Use the right words in those links.
Step 4 - Have lots of content on your site & add more regularly.

Q: What is Search Engine Optimization (SEO)?

A: Search Engine Optimization (SEO) is the art of creating a website which is search engine-friendly. This means:
Using the right words in your copy.
Using the right words in your HTML code.
Structuring your site properly.
Designing your site properly.

Many people use SEO to also describe the other ingredient in a high ranking, 'Link Popularity'.

Q: What is Link Popularity?

A: Think of the search engines as a big election. All the websites in the world are candidates. The links to your website are votes. The more votes (links) a candidate (website) has, the more important it is, and the higher its ranking. Link popularity is all about how many links you have, and how you can get more.

Links to your site tell the search engines how important your site is. They assume that if it's important enough for a lot of other sites to link to, it's important enough for them to display at the top of the rankings. Links are the single most important factor in ranking. Generally speaking, the more links you have to your site from other sites, the better your ranking.

Q: Are Some Links Better Than Others?

A: Yes! The ideal kind of links are those that:
come from relevant sites (sites which use the same keywords);
come from important sites (have a high ranking);
include your keyword as part of the visible link text;
include varying link text (not the same link text each time); and
come from a page that links to few other sites.
When a search engine sees a link which satisfies most or all of these conditions, it says, "Hey, this site must be credible and important, because others in the same industry are pointing to it."

Q: How Do I Get Lots of Links Back to My Site?

A: There are many possible ways to generate links. Some are dubious (like auto-generation software, and sites set up by webmasters simply to host links to their other sites) and I won't be discussing them here. Others, like those discussed below, are legitimate.
Add your site to DMOZ & Yahoo Directories (and other free directories)
Check where your competitors' links are coming from
Article PR - Write and submit articles for Internet publication
Swap links
Partner websites
Pay for links

Q: What Do You Think is the Best Way to Get Lots of Links?

A: Article PR. Write helpful articles and let other webmasters publish them for free in exchange for a link in the byline. With article PR, you don't have to pay for the link, you determine the content of the page containing the link, you determine the link text, and the link is more or less permanent. A single article can be reprinted hundreds of times, and each time is another link back to your site!


Q: How Long Does It Take to Get a High Search Engine Ranking?

A: A long time! It's impossible to say how much time you'll need to spend generating links, but you can be sure it'll be a while no matter which method of link generation you use. You just have to keep at it until you have achieved a high ranking. Even then, you'll still need to dedicate some ongoing time to the task, otherwise your ranking will drop.

Q: What is the Google Sandbox, and is It Real?

A: The Google Sandbox theory suggests that whenever Google detects a new website, it withholds its rightful ranking for a period while it determines whether your site is a genuine, credible, long-term site. It does this to discourage the creation of SPAM websites (sites which serve no useful purpose other than to boost the ranking of some other site).

There is a lot of anecdotal evidence supporting the theory, but there is also a lot discounting it. No one has categorically proven its existence.

Q: What is the Google Dampening Link Filter, and is It Real?

A: The Google Dampening Link Filter theory suggests that if Google detects a sudden increase (i.e. many hundreds or thousands) in the number of links back to your site, it may sandbox them for a period (or in fact penalize you by lowering your ranking or blacklisting your site altogether).

There is a lot of anecdotal evidence supporting the theory, but there is also a lot discounting it. No one has categorically proven its existence.

Q: Which SEO Companies Should I Be Wary Of?

A: Be wary of SEO companies that promise or guarantee results in a given timeframe, especially if they won't expand on their methods for generating links back to your site.

Q: What Tools Can You Recommend?

A: There are many very useful tools to help with your SEO. The following are just a selection. All tools are free unless otherwise indicated.

* Backlink checker
* Backlinks by IP address
* Link Popularity Tool
* Number of links required to rank
* Google Alert
* Google Sitemaps
* Google Sitemap Generator
* Google Toolbar
* Indexed Pages
* Keyword Analysis (Nichebot)
* Keyword Analysis (Overture)
* Keyword Analysis (WordTracker - Paid)
* Keyword Difficulty
* Keyword Identifier
* Keyword Density Measurement (Simple)
* Keyword Density Measurement (Complex)
* Plagiarized Copy Search
* Traffic Rank
* Search Engine Rank
* SEO Report
* Top 10 sites for a keyword by no. of backlinks and age
* Top 10 sites for a keyword
* Spider Simulator


Q: What is Keyword Analysis?

A: The first thing you need to do when you begin chasing a good search engine ranking is decide which words you want to rank well for. This is called performing a keyword analysis. Keyword analysis involves a bit of research and a good knowledge of your business and the benefits you offer your customers.


Q: Do I need to submit my site to the search engines?

A: Theoretically, no. But I wouldn't risk not doing it - especially as it's free. As soon as you register your domain name, submit it to Google! Even if you haven't built your site, or thought about your content, submit your domain name to Google. In fact, even if you haven't fully articulated your business plan and marketing plan, submit your domain name to Google.


Q: Should I submit my site to the search engines more than once?

A: No need. Although some of the search engines allow you to do this, there's really no need.

Q: What are directories and should I submit my site to them?

A: Directories are websites (or web pages) which simply list lots of websites and give a quick description of each. Some are free and some require you to pay for a listing. Free directories are useful because you get a free link. However, the links aren't worth that much. Paid directories can be good if they're relevant, but they can cost a lot in the long term, so choose wisely.

One essential directory for any website is the DMOZ Open Directory Project.

SEO Glossary

Algorithm
A complex mathematical formula used by search engines to assess the relevance and importance of websites and rank them accordingly in their search results. These algorithms are kept tightly under wraps as they are the key to the objectivity of search engines (i.e. the algorithm ensures relevant results, and relevant results bring more users, which in turn brings more advertising revenue).

Article PR
The submitting of free reprint articles to many article submission sites and article distribution lists in order to increase your website's search engine ranking and Google PageRank. (In this sense, the "PR" stands for PageRank.) Like traditional public relations, article PR also conveys a sense of authority because your articles are widely published. And because you're proving your expertise and freely dispensing knowledge, your readers will trust you and will be more likely to remain loyal to you. (In this sense, the "PR" stands for Public Relations.)

Article submission sites
Websites which act as repositories of free reprint articles. They are sites where authors can submit their articles free of charge, and where webmasters can find articles to use on their websites free of charge. Article submission sites generate revenue by selling advertising space on their websites. See also article PR.

Backlink
A text link to your website from another website. See also link.

Copy
The words used on your website.

Copywriter
A professional writer who specializes in the writing of advertising copy (compelling, engaging words promoting a particular product or service). See also SEO copywriter and web copywriter.

Crawl
Google finds pages on the World Wide Web and records their details in its index by sending out ‘spiders’ or ‘robots’. These spiders make their way from page to page and site to site by following text links. To a spider, a text link is like a door.

Domain name
The virtual address of your website (normally in the form www.yourbusinessname.com). This is what people will type when they want to visit your site. It is also what you will use as the address in any text links back to your site.

Ezine
An electronic magazine. Most publishers of ezines are desperate for content and gladly publish well written, helpful articles and give you full credit as author, including a link to your website.

Flash
A technology used to create animated web pages (and page elements).

Free reprint article
An article written by you and made freely available to other webmasters to publish on their websites. See also article PR.

Google
The search engine with the greatest coverage of the World Wide Web, and which is responsible for most search engine-referred traffic. Of approximately 11.5 billion pages on the World Wide Web, it is estimated that Google has indexed around 8.8 billion. This is one reason why it takes so long to increase your ranking!

Google PageRank
How Google scores a website’s importance. It gives all sites a mark out of 10. By downloading the Google Toolbar, you can view the PR of any site you visit.

Google Toolbar
A free tool you can download. It becomes part of your browser toolbar. Its most useful features are its PageRank display (which allows you to view the PR of any site you visit) and its AutoFill function (when you’re filling out an online form, you can click AutoFill and it enters all the standard information automatically, including Name, Address, Zip code/Postcode, Phone Number, Email Address, Business Name, and Credit Card Number (password protected)). Once you’ve downloaded and installed the toolbar, you may need to set up how you’d like it to look and work by clicking Options (setup is very easy). NOTE: Google does record some information (mostly regarding sites visited).

HTML
HTML (HyperText Markup Language) is the coding language used to create much of the information on the World Wide Web. Web browsers read the HTML code and display the page that code describes.

Internet
An interconnected network of computers around the world.

JavaScript
A programming language used to create dynamic website pages (e.g. interactivity).

Keyword
A word which your customers search for and which you use frequently on your site in order to be relevant to those searches. This is known as targeting a keyword. Most websites actually target ‘keyword phrases’ because single keywords are too generic and it is very difficult to rank highly for them.

Keyword density
A measure of the frequency of your keyword in relation to the total wordcount of the page. So if your page has 200 words, and your keyword phrase appears 10 times, its density is 5%.
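
The calculation is simple enough to express as a short Python function; this sketch reproduces the 200-word, 10-occurrence example (single keywords only, for brevity):

    def keyword_density(text, keyword):
        # percentage of the page's words that match the keyword
        words = text.lower().split()
        return 100 * words.count(keyword.lower()) / len(words)

    page = ("seo " * 10 + "filler " * 190).strip()  # 200 words, 10 of them "seo"
    print(keyword_density(page, "seo"))  # prints 5.0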

Keyword phrase
A phrase which your customers search for and which you use frequently on your site in order to be relevant to those searches.

Link
A word or image on a web page which the reader can click to visit another page. There are normally visual cues to indicate to the reader that the word or image is a link.

Link path
Using text links to connect a series of pages (i.e. page 1 connects to page 2, page 2 connects to page 3, page 3 connects to page 4, and so on). Search engine ‘spiders’ and ‘robots’ use text links to jump from page to page as they gather information, so it’s a good idea to allow them to traverse your entire site via text links.

Link partner
A webmaster who is willing to put a link to your website on their website. Quite often link partners engage in reciprocal linking.

Link popularity
The number of links to your website. Link popularity is the single most important factor in a high search engine ranking. Webmasters use a number of methods to increase their site's link popularity including article PR, link exchange (link partners / reciprocal linking), link buying, and link directories.

Link text
The part of a text link that is visible to the reader. When generating links to your own site, they are most effective (in terms of ranking) if they include your keyword.

Meta tag
A short note within the header of the HTML of your web page which describes some aspect of that page. These meta tags are read by the search engines and used to help assess the relevance of a site to a particular search.

Natural search results
The ‘real’ search results. The results that most users are looking for and which take up most of the window. For most searches, the search engine displays a long list of links to sites with content which is related to the word you searched for. These results are ranked according to how relevant and important they are.

Organic search results
See natural search results.

PPC (Pay-Per-Click advertising)
See Sponsored Links.

PageRank
See Google PageRank.

Rank
Your position in the search results that display when someone searches for a particular word at a search engine.

Reciprocal link
A mutual agreement between two webmasters to exchange links (i.e. they both add a link to the other’s website on their own website). Most search engines (certainly Google) are sophisticated enough to detect reciprocal linking and they don’t view it very favorably because it is clearly a manufactured method of generating links. Websites with reciprocal links risk being penalized.

Robot
See spider.

Robots.txt file
A file which is used to inform the search engine spider which pages on a site should not be indexed. This file sits in your site’s root directory on the web server. (Alternatively, you can do a similar thing by placing robots meta tags in the header section of your HTML for search engine robots/spiders to read.)
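
As an illustration, Python's standard library includes a parser for this file format; the two rules below are an invented example:

    from urllib.robotparser import RobotFileParser

    # An invented two-line robots.txt: exclude everything under /private/.
    rules = ["User-agent: *", "Disallow: /private/"]

    rp = RobotFileParser()
    rp.parse(rules)
    print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
    print(rp.can_fetch("*", "http://example.com/index.html"))         # True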

Sandbox
Many SEO experts believe that Google ‘sandboxes’ new websites. Whenever it detects a new website, it withholds its rightful ranking for a period while it determines whether your site is a genuine, credible, long term site. It does this to discourage the creation of SPAM websites (sites which serve no useful purpose other than to boost the ranking of some other site). Likewise, if Google detects a sudden increase (i.e. many hundreds or thousands) in the number of links back to your site, it may sandbox them for a period (or in fact penalize you by lowering your ranking or blacklisting your site altogether).

SEO
Search Engine Optimization. The art of making your website relevant and important so that it ranks high in the search results for a particular word.

SEO copywriter
A ‘copywriter’ who is not only proficient at web copy, but also experienced in writing copy which is optimized for search engines (and will therefore help you achieve a better search engine ranking for your website).

Search engine
A search engine is an online tool which allows you to search for websites which contain a particular word or phrase. The most well known search engines are Google, Yahoo, and MSN.

Site map
A single page which contains a list of text links to every page in the site (and every page contains a text link back to the site map). Think of your site map as being at the center of a spider-web.

SPAM
Generally refers to unwanted and unrequested email sent en-masse to private email addresses. Also used to refer to websites which appear high in search results without having any useful content. The creators of these sites set them up simply to cash in on their high ranking by selling advertising space, links to other sites, or by linking to other sites of their own and thereby increasing the ranking of those sites. The search engines are becoming increasingly sophisticated, and already have very efficient ways to detect SPAM websites and penalize them.

Spider
Google finds pages on the World Wide Web and records their details in its index by sending out ‘spiders’ or ‘robots’. These spiders make their way from page to page and site to site by following text links.

Sponsored Links
Paid advertising which displays next to the natural search results. Customers can click on the ad to visit the advertiser’s website. This is how the search engines make their money. Advertisers set their ads up to display whenever someone searches for a word which is related to their product or service. These ads look similar to the natural search results, but are normally labeled “Sponsored Links”, and normally take up a smaller portion of the window. These ads work on a Pay-Per-Click (PPC) basis (i.e. the advertiser only pays when someone clicks on their ad).

Submit
You can submit your domain name to the search engines so that their ‘spiders’ or ‘robots’ will crawl your site. You can also submit articles to ‘article submission sites’ in order to have them published on the Internet.

Text link
A word on a web page which the reader can click to visit another page. Text links are normally blue and underlined. Text links are what ‘spiders’ or ‘robots’ use to jump from page to page and website to website.

URL
Uniform Resource Locator. The address of a particular page published on the Internet. Normally in the form http://www.yourbusinessname.com/AWebPage.htm.
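
For illustration, Python's standard library can split a URL like the one above into its parts:

    from urllib.parse import urlparse

    parts = urlparse("http://www.yourbusinessname.com/AWebPage.htm")
    print(parts.scheme)  # http
    print(parts.netloc)  # www.yourbusinessname.com
    print(parts.path)    # /AWebPage.htm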

Web copy
See copy.

Web copywriter
A ‘copywriter’ who understands the unique requirements of writing for an online medium.

Webmaster
A person responsible for the management of a particular website.

Wordcount
The number of words on a particular web page.

World Wide Web (WWW)
The vast array of documents published on the Internet. It is estimated that the World Wide Web now consists of approximately 11.5 billion pages.

Future of SEO

Taking a look back at SEO in 2005...
2005 was a great year for Search.
But What Does That Mean For You? How does the progress made in 2005 affect YOUR business?
A quick recap of 2005
2005 could be called the Year of the "Black Hat"/"White Hat" Controversy. One thing is certain, though: no matter what color your hat is, ethical optimization tactics are what the long-term, big-picture thinkers should go for.
2005 also brought some outrage at Google, which had to deal with the backlash of angry searchers fearing that it was sacrificing the relevancy it is known and loved for in exchange for profits.
People rallied against the Black Hatters who infiltrated the search results. The Jagger update helped rid the SERPs of spam-filled, software-generated sites, so that the "White Hatters" didn't have to compete with the "Black Hatters" (at least for now).
Most importantly, what happened in 2005 was that public awareness of SEO increased. More people are optimizing their sites and realizing the immense potential of organic search.
As the Big 3 duked it out for an increase in market share, the winners were:
Searchers, looking to find information, resources and products to buy.
You.
Clearly, the exposure Search has received and the increased number of people using Search has benefited companies like yours, companies looking to get their information/product/service out there into the hungry consumers' hands.
Call us at 714-612-8066 to talk about the opportunity for your site.
Search has become such a giant that it even has mammoth retailer Wal-Mart scared!
"We watch Google very closely at Wal-Mart," said Jim Breyer, a member of Wal-Mart's board.
Their concern? Google makes so much information readily available that it might soon be able to tell Wal-Mart shoppers if there are better bargains online, or at other nearby retailers.
As Google continues to gain popularity, and is used as the starting point for finding information and buying products and services online, companies that previously didn't worry about Google, now view Google as a threat to their business or even a potential competitor.
Wouldn't you love your site to be one of the sites listed in Google that has big retail chains scared for their future?
Consider the opportunity that exists today, with the engines gaining popularity and more and more people getting online. The traffic your site can tap into is growing daily. Websites that use aggressive and ethical SEO have everything to gain.
Another exciting event in 2005: blogs made a grand entrance into the mainstream and changed the way people share information, photos and communications. Blogs opened up interactivity between website owners and online surfers (potential clients). Blogs created new opportunities for site owners to cast their nets wider and gather more potential visitors.
Looking forward to 2006, what can we expect?
Consensus at the Search Engine Strategies Conference (SES) that took place in Chicago in early December is that SEO is not a bubble that is about to burst, and we can expect the search engine industry to continue to grow and evolve. Plan on continuing to invest in your website's search engine rankings. Search will remain an important source of traffic for websites, and a crucial element in your marketing plan.
What is the best way to participate in the growth and evolution of Search and to ensure you are getting a piece of the action?
To answer that question, you have to consider the following:
The landscape of SEO is changing almost daily, which makes hiring an SEO professional more crucial than ever before. It is becoming even more obvious that a novice can't get the results they want and need in the SERPs. Danny Sullivan, respected SEO expert and keynote speaker at SES, along with many other experts in the field, notes that you need real expertise (or a whole lot of time) to get a high organic ranking in today's search world.
The Do's and Don'ts of SEO for 2006...
Don't:
Have hidden text on your web pages (especially text that is hidden in invisible CSS layers)
Have paid links or other links that are not within Google's quality guidelines
Overuse internal links or anchor text within your optimization
Steal other people's copy – the engines vow to determine the original source of content and not let stolen content benefit the thieves
Optimize your site yourself unless you are sure you are 100% clear on all steps and guidelines, and have a lot of time
Panic if you aren't in the engines yet; it's not too late to get in the game
Do:
Follow best practices – stay within the engines' guidelines
Avoid Overkill – keep the "nature theory" in mind
Add relevant keyword rich content. Remember the creation of original and quality content is where it's at
Hire a professional that is willing to share information with you, one that won't hide their techniques. SEO is a team effort -- you need to listen to your SEO Company and they need to understand your goals and objectives.
Take action now, and don't watch your competitors hoard all the traffic.
Future changes...
Keep your eye on the news: we can definitely expect to hear more about Google's patent for user-targeted, or attention-targeted, search results, which will change the ranking of Google's organic results for each individual user based upon that user's search behavior, location, sites visited, etc.
This means as the search engines do more to profile searchers, they will be able to manipulate or control results. This one will be interesting to watch play out, and will require feedback from searchers, companies using Search as a marketing method and SEO experts.
The future of links:
Links are still an important component of an SEO campaign. Links were devalued temporarily as the engines realized there was too much manipulation going on. However, with recent algorithm shifts, we now know that links are still important, as long as you follow the rules. Links must be relevant; there is nothing more important than this. Links must also be attained in reasonable quantities over reasonable timeframes. There is still controversy out there about one-way versus reciprocal links. Recent and surprising articles and studies have shown that reciprocal links may not be all bad, as long as they are extremely relevant and not part of a link farm or mass linking effort. The bottom line on links: as the Black Hats find ways to beat the system, the engines will change the criteria, and you can expect this to continue evolving throughout 2006 and beyond.
Plans for 2006
As organic search grows and the search industry continues to evolve, plan to devote a lot of time to staying current. Or, better yet, hire an expert who will do this for you. Free yourself up to do what you do best: run your business, and leave the links, algorithm shifts, and all of the other SEO techniques to the experts. Call us at 714-612-8066 to discuss how we can help you get the growth and exposure your site needs in 2006.
When talking to an SEO Firm, remember the SEO experts you hire need to be:
Accessible to you
Willing to show you samples of their work
Willing not only to allow but to encourage your involvement in the process, especially in keyword selection
Willing to share their techniques and not insist they have "proprietary info and methods"
Able to analyze your site and tell you realistically what results you should expect
"Clients will say, 'I want to be found for 'homes,' and they'll find an SEO who'll say 'Okay, we can do that,' even though there's no way they can do that." Says Jill Whalen, SES presenter and respected SEO expert. Jill went on to say, "It's unfortunate that many SEO companies aren't honest and say 'No, we can't do that — and it's not necessarily what you really want us to do."
SEO Challenge 2006:
Make 2006 the year you get serious about your business. Commit to a Search Engine Optimization Campaign and make this the year you overtake your competitors and grow your online business to unbelievable levels.