How to Download, Install, and Use Google Talk on Your PC
By using the Google Talk software on your PC, you can talk for free anywhere in the world with excellent voice quality. Truphone, a mobile VoIP company, has revealed that users of the Google Talk service can now connect with Truphone mobile handsets for free, anywhere in the world.
Truphone provides free software that allows users of Wi-Fi-equipped Nokia mobile phones to make Internet-rate phone calls over Wi-Fi connections. When the handset is not in Wi-Fi range, it reverts from a Truphone to a normal mobile phone, meaning users only need a single handset.
What these ads mean is that if, for example, a nonprofit that helps the elderly has a Web site and an advertiser dealing with the elderly has signed up for contextual ads, that advertiser's PPC advertisement may appear on the nonprofit's Web site. When researching PPC strategies for nonprofits, Google Grants need to be taken into consideration, because the program rules may change the direction of a PPC plan.
Google Grants can help nonprofits pay for Google AdWords advertising. The program is designed to help organizations extend their public service messages to a global audience, in an effort to make a greater impact on the world. Grant recipients receive at least three months of advertising through Google AdWords, with a per-month spending cap of $10,000.
Sheryl Sandberg, Google vice president of global online sales and operations, said, “When we thought about what we could give back, what we obviously do is search and advertising, and it would be a great opportunity for us to refer people interested in the topics that these non-profits work on.”
The format of a PPC ad using Google AdWords, for example, includes a headline that is limited to 25 characters, a second and third line that are limited to 35 characters each, and a fourth line that provides the URL, or Web address, that people will see in the advertisement.
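To make those limits concrete, here is a minimal Python sketch (check_ad is a hypothetical helper of my own, not part of any Google tool; the 35-character limit on the display URL is also an assumption for illustration):

# A minimal sketch: validate a text ad against the AdWords format
# described above (25-character headline, two 35-character lines).
# The 35-character display URL limit is an assumption.
def check_ad(headline, line2, line3, display_url):
    problems = []
    if len(headline) > 25:
        problems.append("Headline is %d chars (limit 25)" % len(headline))
    for label, text in (("Line 2", line2), ("Line 3", line3)):
        if len(text) > 35:
            problems.append("%s is %d chars (limit 35)" % (label, len(text)))
    if len(display_url) > 35:
        problems.append("Display URL is %d chars (limit 35)" % len(display_url))
    return problems

# Example: the headline below is 26 characters, one over the limit.
print(check_ad("Support Elderly Care Today",
               "Your donation funds home visits",
               "for isolated seniors. Give now.",
               "www.example.org"))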
Labels: Paid Search

A 2005 study found that 62% of people using the Internet do not understand the difference between organic and paid listings on a search engine results page. Pay-per-click advertising is an innovation in marketing. Small businesses that would not traditionally advertise on the Internet are now spending much more on Web advertising because it is so cost-effective.
Labels: Google advertising, Paid Search
They guard these proprietary algorithms because if people knew exactly how the algorithms worked, they would take advantage of that knowledge to ensure the site they were promoting ranked first. The little bits of algorithm that people figure out themselves often get so abused that the search engines eventually devalue them. Good HTML titles, good body copy, great content, ensuring that your site doesn’t have blocks to crawling—these have worked for nearly a decade. Grappone and Couzin noted that Sullivan did not mention anything about trying to figure out the elusive algorithm. Some of these abusive methods include keyword stuffing, in which one repeats the content of the meta tags over and over.
Keyword stuffing is designed to make the search engine crawler rank a Web site higher based on the specific meta tag keywords that are being repeated. Duplicate content is another spam technique designed to trick a search crawler into ranking a Web site higher than a more relevant Web site. Another spamdexing technique is the use of a link farm, a page of links that is created only to artificially boost a linking strategy so that a page ranks higher in search results. Penalties for such techniques differ among the search engines, but they can include delisting of a Web site from the search engines. If a Web site is delisted, an organization will have to explain to the search engines what happened, and it could take months to fix the problem. It has been noted that the purpose of a search engine is to find, index, and serve content to a user looking to find something. If an organization approaches the creation of a Web site in the same manner a search engine approaches serving content, then the goals will naturally align.
Accordingly, it is worth giving extra effort to link building.
Site map. A site map can help a Web site become more accurately indexed. This site map should not be confused with the site map used to help people navigate a Web site; rather, it is an XML-based document at the root of your site that contains information (URL, last updated, relevance to surrounding pages, and so on) about each of the pages within the site. Using this XML site map will help ensure that even the deep pages within your site are indexed by search engines.
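As a rough illustration, the following Python sketch writes a bare-bones site map of the kind described above (the URLs, dates, and priorities are placeholders; real site maps follow the sitemaps.org protocol shared by the major search engines):

# A minimal sketch that writes a bare-bones XML site map of the kind
# described above. URLs, dates, and priorities are placeholders.
pages = [
    ("http://www.example.org/", "2009-02-01", "1.0"),
    ("http://www.example.org/about.html", "2009-01-15", "0.8"),
    ("http://www.example.org/deep/page.html", "2008-12-30", "0.5"),
]

entries = "\n".join(
    "  <url>\n"
    "    <loc>%s</loc>\n"
    "    <lastmod>%s</lastmod>\n"
    "    <priority>%s</priority>\n"
    "  </url>" % page
    for page in pages
)

with open("sitemap.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + entries + "\n</urlset>\n")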
There are a number of additional factors that can influence Web site ranking. Google, for example, probably includes hundreds and possibly even thousands of factors in its algorithms. There is an inherent problem with SEO, which is that no one really knows the exact algorithm used to determine site ranking. Grappone and Couzin described this conundrum the following way: Here’s something that drives people crazy about SEO: You can’t ever be 100 percent sure that what you’re doing will be rewarded with the rank and the listing you want. This is because the search engines keep their internal ranking mechanism, even the criteria by which the ranking is determined, under wraps.
Inbound links to a Web site, meaning other Web sites have a link to that Web site within them, also are important to SEO. They are important because they can indicate a page’s quality, popularity, or status on the Web, yet site owners have little control over which sites link to them. When links became part of the criteria used by crawlers to rank Web sites, some unscrupulous SEO marketers developed “link farms” to artificially increase rankings.
Today, links usually must be related to the content of the page; if a Web site’s inbound or outbound links do not match the keywords the site is listing, they will be of little value to the site’s ranking. Google and Yahoo! have tools that allow a user to see all of the Web sites that link to a particular site. To view links to a Web site in Google, open the Google search engine and type “link:” before the Web address. An example of this would be link:http://www.smumn.edu. To view links in Yahoo!, open the search engine and type in the Web address, adding “linkdomain:” to the beginning and leaving off “http://”, like this: linkdomain:www.smumn.edu.
A nonprofit has a huge advantage with regard to site links: The culture of the Web generally adores noncommercial content—something your website should be chock full of. And, let’s face it, giving you a link doesn’t cost a thing. Any webmaster or blogger who supports your cause—or at least has no major problem with it—will see adding a link [from their site to yours] as a cheap and easy way to help out. With this advantage in mind, you will want to adjust your SEO plan accordingly.
After keywords are developed and added to the site tags, the Web site content needs to be adjusted to make sure the keywords are prominently featured throughout the Web site. It seems obvious, but you would be surprised at how many site owners miss this simple point: in order to rank well for a particular set of keywords, your site text should contain them.
Some SEO professionals recommend that a Web site should have 250 to 1,000 words per page and that 5% to 10% of those words should be the organization’s keywords. The number of keywords per page divided by the number of words per page is called keyword density.
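As a quick sketch of that arithmetic (a hypothetical helper of my own, not a standard SEO tool), keyword density can be computed like this:

# A minimal sketch of the keyword-density arithmetic described above:
# keyword occurrences divided by the total words on the page.
def keyword_density(text, keywords):
    words = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in keywords)
    return hits / float(len(words))

copy = "Our museum offers museum tours, museum memberships, and exhibits."
print("%.0f%%" % (100 * keyword_density(copy, {"museum", "exhibits"})))
# 4 keyword hits out of 9 words, about 44% -- far above the 5-10%
# guideline; real page copy would be much longer.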
Search engines also may begin to ignore a Web site if the content is old, and it has been pointed out that the content of a Web site should be regularly updated: How fresh is your content? How relevant is it? How often is it updated? And how much content is there? Although organic search is cheaper than the paid placement that is the focus of the next section of this research, updating content may be difficult for some nonprofits.
Cortes conducted a series of interviews in 2004 and 2005 to determine the technology adoption of San Francisco–based nonprofits, noting that some nonprofits relied on volunteers or students to create their organizations’ Web sites. The organizations had to count on those people to update their Web sites, but that presented a problem:
A number of organizations, when asked for examples of how they kept their Web sites current, stated that they returned to the volunteer who had originally built it. When that volunteer was no longer available, they struggled to find a replacement, sometimes with no success. One organization had its Web site developed by a student so that it could easily update the content on its own.
This example illustrates the challenges a nonprofit can face in updating the content on its Web site.
Web sites are text files that exist on Web servers. These text files are much like the files created and used by word-processing software. To enable Web browsers such as Microsoft’s Internet Explorer, Mozilla’s Firefox, or Opera to read these files, the files must be formatted according to a generally accepted standard. The standard used on the Web is HyperText Markup Language (HTML). HTML uses codes, or tags, that tell the Web browser software how to display the text contained in the text file.
For example, a Web browser reading the following line of text

<b>A Review of the Book <i>Wind Instruments</i></b>

recognizes the <b> and </b> tags as instructions to display the entire line of text in bold and the <i> and </i> tags as instructions to display the text enclosed by those tags in italics.
When assessing a Web site for search engine optimization, one should review a number of elements, including site and page tagging, page content, site links, and the site map. Each of these elements is crawled by search engines to determine Web site ranking. Each of these elements should be addressed separately.
Site and page tagging. The HTML page title is today’s hands-down leader, and an eternally important factor, in search engine ranking algorithms.
In addition to the bold and italics tags used to format text, there is another tag on Web sites called the meta tag. Meta tags are included in the coding of a Web site and are essential to having the site listed properly in a search engine.
A META tag is HTML code that a Web page creator places in the page header for the specific purpose of informing Web robots about the content of the page. META tags do not cause any text to appear on the page when a Web browser loads it; rather, they exist solely for the use of search engine robots.
It is pointed out that the most important meta tags to a search engine are the title tag and the description tag. Schneider and Evans also discussed the use of meta tags, including a keyword meta tag. They stated that many search engines’ crawlers, spiders, or robots do not search an entire Web site. Some search engine robots collect information only from a Web page’s title, description, keywords, or HTML tags; others read only a certain amount of the text in each Web page.
Schneider and Evans demonstrated the first few lines of HTML from a Web page that contains information about electronic commerce:

<title>Current Developments in Electronic Commerce</title>
<meta name="description" content="... electronic commerce developments.">
<meta name="keywords" content="... data interchange, value added reseller, EDI, VAR, secure socket layer, business on the Internet">
Developing the correct keywords to get a site ranked higher than others is a big part of organic search engine optimization. To develop the correct set of keywords, however, one must know which keywords searchers are using. There are many different ways to build a list of keywords, including brainstorming, looking at the competition, and using keyword-building tools. Wordtracker is the leading keyword research tool. Yahoo! and Google both have their own keyword-building tools.
Labels: SEO
The user types a word or phrase into a search box and clicks a button: “Wait a few seconds and references to thousands (or hundreds of thousands) of pages will appear. Then all you have to do is click through those pages to find what you want”. Some of the results are called organic results, or natural results. These are the most relevant matches for that search request among all of the indexed Web pages.
There also are paid placement listings, which can show up on the top or the side of the search engine results page. Paid placement is the technique by which a search engine devotes space on its results page to display links to a Web site’s page based on the highest bid for that space. Some search engines differentiate the two types of listings; others do not.
An organization’s Web site likely will rank within the first few thousand results in a search engine. However, as has been pointed out, “That’s just not good enough. Being ranked on the ninth or tenth page of search results is tantamount to being invisible. To be noticed, your site should be ranked much higher. . . . Most people won’t look past the third page of search results, if they get that far. The fact is, it’s the sites that fall on the first page of results that get the most traffic.” Every Web site is indexed by crawlers or spiders and exists among the millions of other sites on the Internet. To get a Web site noticed by the crawlers, certain elements must stand out. Making those elements stand out is the process of search engine optimization.
Google receives the lion’s share of searches, with Yahoo! and MSN following in second and third places, respectively.
The crawler is a specialized software program that hops from link to link on the World Wide Web, scarfing up the pages it finds and sending them back to be indexed. This is very similar to Matthew Gray’s earliest search engine, which searched and indexed entire files on the Internet, not just the titles. The crawler sends its data back to a massive database called the index. The runtime system or query processor is the user interface you see on a search engine’s Web site, where you type in your search words.
There are three critical pieces of search, and all three must scale to the size and continued growth of the Web: search engines must crawl, they must index, and they must serve results. This is no small task: by most accounts, Google alone has more than 175,000 computers dedicated to this job. That’s more than existed on Earth in the early 1970s!
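As a toy illustration of those three pieces (the “Web” here is simulated as an in-memory dictionary, so this sketches the architecture only, not real search engine code):

# A toy sketch of the three pieces described above: crawl, index, serve.
# The "Web" is simulated as a dictionary of page -> (text, outbound links).
from collections import defaultdict

WEB = {
    "a.html": ("search engines crawl the web", ["b.html", "c.html"]),
    "b.html": ("crawlers send pages back to the index", ["c.html"]),
    "c.html": ("the query processor serves results", []),
}

def crawl(seed):
    # Hop from link to link, collecting every reachable page.
    seen, frontier = set(), [seed]
    while frontier:
        url = frontier.pop()
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        frontier.extend(WEB[url][1])
    return seen

def build_index(urls):
    # Inverted index: word -> set of pages containing that word.
    index = defaultdict(set)
    for url in urls:
        for word in WEB[url][0].split():
            index[word].add(url)
    return index

def serve(index, query):
    # Query processor: return pages containing every query word.
    sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*sets) if sets else set()

index = build_index(crawl("a.html"))
print(serve(index, "the index"))   # -> {'b.html'}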
The Web is huge; it is so big that it is hard to get an accurate count of Web pages. In January of 2004, it was estimated that the Web contained over 10 billion pages; with a world population of about 6.4 billion, that is almost two pages per person. In 2003 Google reported that it served 250 million different searches per day.
Most people are using search engines in their daily lives to find information on the Web. The most recognizable part of a search engine is the query interface. This is the home page that is displayed when you visit a major search engine such as Google, Yahoo!, or MSN.
The query interface is the only part of a search engine that the user ever sees. Every other part of the search engine is behind the scenes, out of view of the people who use it every day. That doesn’t mean it’s not important, however. In fact, what’s in the back end is the most important part of the search engine.
Pay-per-click in its current form began when an entrepreneur named Bill Gross
developed an idea for the first PPC search engine, goto.com, which changed its name to
Overture in 2001 and was acquired by Yahoo! in 2003. People were initially skeptical of
PPC search engines, thinking that users would not want to use a search engine filled with
advertisements.
It was not until late 2000, when Google introduced its AdWords program, that the industry started to mature. From its inception as a business in the late 1990s to 2004, paid search as an industry grew from a base in the low millions to $4 billion in revenue, and it is estimated to hit $23 billion by 2010.
Google is the present king of search engines. What has made Google a household word is the accuracy of its search results. Brin and Page were able to achieve greater search accuracy by focusing not only on keywords but also on link popularity, which looks at how many other pages link to the pages containing the keywords. These are just a few of the hundreds of criteria that search engines use in ranking the relevancy of Web pages.
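As a rough illustration of the link-popularity idea, here is a simplified PageRank-style iteration in Python (an illustration only, not Google’s actual algorithm; the three-page link graph is made up):

# A simplified PageRank-style iteration illustrating link popularity:
# a page scores higher when many, or highly scored, pages link to it.
links = {                     # page -> pages it links to (made-up graph)
    "home": ["about", "news"],
    "about": ["home"],
    "news": ["home", "about"],
}

damping = 0.85
scores = dict((p, 1.0) for p in links)
for _ in range(20):           # iterate until the scores settle
    new = {}
    for page in links:
        incoming = sum(scores[src] / len(outs)
                       for src, outs in links.items() if page in outs)
        new[page] = (1 - damping) + damping * incoming
    scores = new

for page in sorted(scores, key=scores.get, reverse=True):
    print("%s: %.3f" % (page, scores[page]))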
It wasn’t long after the advent of search engines that advertisers noticed search engine sites were receiving orders of magnitude more hits than any other type of site on the Web. Receiving daily hits in the millions, search engines seemed like advertising gold mines.
Monier was trying to create a sales demonstration for the new Alpha processor, basically a public relations ploy to show how powerful Digital Equipment Corp.’s new processor was. Monier put the entire Web on one computer to showcase the speed of the new processor; this is how the search engine AltaVista was born.
With no marketing and no formal announcement, AltaVista generated nearly 300,000 visits on its first day alone. Within a year, the site had served more than 4 billion queries. Four billion—nearly as many queries as people on the Earth. AltaVista went online in 1995. It was the first search engine to allow natural-language searches and other advanced search techniques, and it also provided multimedia search capability for photos, music, and videos. By 1997 AltaVista was truly the king of search, serving more than 25 million queries a day and on track to make $50 million in sponsorship revenue. In 1999 AltaVista was the Google of its time and the most popular brand on the Web, in a three-way tie with AOL and Yahoo! as the most important destination on the Web.
These names are a bit misleading, as they give the impression that the software itself moves between sites like a virus; this is not the case. A robot simply visits sites by requesting documents from them. The concept of robots or spiders is very important to understanding the search results served by today’s search engines.
In 1994 Jerry Yang and David Filo created Yahoo!. It started out as a collection of their favorite Web sites, but Yahoo! also contained descriptions of what the user would find on the page. Later in 1994, WebCrawler was introduced, developed by University of Washington researcher Brian Pinkerton. WebCrawler was of major importance in that it did not simply search and index the Internet for document titles; it indexed the entire document and made the entire document available to search. AOL used WebCrawler, its full-text search, and a simple browser-based interface to make the Web fit for mainstream consumption beyond academics and tech geeks. The following quote reveals just how small the Internet was in its infancy: When the Internet was young, when the Web comprised less than 10 million pages, when Yahoo! was a funky set of links and “Google” was just a common misspelling for a very large number, Louis Monier put the entire Web on a single computer.
Archie’s results led not to the exact article but only to a computer that contained it, where the user would have to search for the actual article. Gopher was a database archive created in 1991 by Mark McCahill at the University of Minnesota. It was named after the school’s mascot. Where Archie had used FTP to create a searchable database of computer files, Gopher was able to index the titles of plain-text documents.
In 1993 students at the University of Nevada developed a search engine that not only would find an article on the Internet but also would take the user directly to it, rather than just to the computer where the document resided. This enhanced search engine worked much like Archie, but substituted Gopher for FTP. Gopher was a more popular and more fully featured Internet file-sharing standard than FTP.
These students, playing on the name Archie from the comic book, named their search engine Veronica (Very Easy Rodent-Oriented Net-wide Index to Computerized Archives). Still, the main limitation of both Archie and Veronica was that they would only search the title of a document, not the content. “From 1993 to 1996, the Web grew from 130 sites to more than 600,000”. Watching the growth of the Internet was Matthew Gray, a researcher at the Massachusetts Institute of Technology. He is considered a pioneer of the earliest Web-based search engine because he developed the WWW Wanderer. The Wanderer was different from any of its predecessors. Gray realized the Internet was growing faster than any human could track, so the Wanderer was a program called a robot that would scour the Web and create an index of everything it found. Gray then developed an interface that allowed searching the created index. “It wasn’t the greatest search engine that ever was, but it was the first search engine that ever was”.
Keyword: A word or phrase describing an organization’s product or service or other key content on its website. A word or phrase entered in a search engine.
META tag: An HTML element that a Web developer places in a Web page header to inform Web robots about the page’s content.
Nonprofit: A legally constituted organization whose primary objective is to support or to actively engage in activities of public or private interest without any commercial or monetary profit purposes.
Organic search: Results returned when a user types a particular keyword into a search engine. These are also called natural results and contrast with PPC advertising.
Pay-per-click advertising: A method of marketing where a business pays a certain amount of money each time someone clicks on a small ad on a search engine’s results page or home page and is then taken to the advertiser’s website.
Search engine: A software application that indexes and serves content to an Internet user who is looking for something specific.
Search engine marketing: The activities that improve search referrals to a Web site, using either organic or paid search. Also known as search marketing.
For-profit organizations have been using many techniques to increase the likelihood that they will show up very high in the results when someone tries to find them using a search engine. For example, when someone searches for a museum, the museum’s goal is for the search engine to return it within the first page of results, ideally as the number one ranking. But what if an organization does not appear in the search results when a Web searcher types in the organization’s city and the museum’s name? There are ways to optimize placement in the search engine rankings.
Research conducted in 2005 by search consulting firm oneupweb.com showed that top 10 placement in Google increased site traffic to five times its previous levels in the first month.
Marshall and Todd (2007) pointed out that every second of each day, 3,000 searches are performed on Google—180,000 searches per minute, all day, every day. Online advertising is becoming increasingly expensive. A major challenge for nonprofits is to spread the word about their organization despite a limited budget; nonprofits must balance the cost of advertising efforts that build brand awareness with those that generate donations. People are becoming increasingly reliant on the Internet for information, and online newspapers and phone books are ever present. A study completed by Mindshare Interactive Campaigns and Harris Interactive “found that nearly 40 percent of people who support nonprofit organizations either as a donor, volunteer, or advocate report that they consult online sources of charity information before making donations.”
When a potential constituent, customer, donor, volunteer, or board member is trying to find an organization’s information on the Web, research indicates that they are relying on a search engine.
Although marketing for a non-profit uses many of the same terms and even many of the same tools as a business, it is really quite different, because the non-profit is selling something intangible: something that you transform into value for the customer. More and more, nonprofit Web sites are used to recruit new members or donors and to provide a host of services to existing members, in addition to public education, community service, and many other important applications.
Access to your organization online doesn’t limit itself to just the people you serve. Go back to your list of markets. What about your funders? Your donors? Your volunteers? Your staff? Most if not all of them have the ability, even the desire to access [your Web site] . . . The people you serve are a vital market. But they are not your only market.
The “non-profit” institution supplies neither goods nor services. Its “product” is neither a pair of shoes nor an effective regulation. Its product is a changed human being. The non-profit institutions are human-change agents. Their “product” is a cured patient, a child that learns, a young man or woman grown into a self-respecting adult; a changed human life altogether.
Labels: About Google, Organisation Marketing

“Search engine marketing, or SEM, is a form of Internet marketing that seeks to promote websites by increasing their visibility in the search engine results pages”. Search engine optimization (SEO) is based on strategies to improve a search engine ranking. SEO usually means making changes to a Web site’s design elements and content, and in most cases it costs nothing at all. SEM is not just search engine optimization; SEM includes pay-per-click (PPC) strategies. SEM is doing what needs to be done to ensure that a Web site ranks as high as possible in search engine results and to bring more people to that Web site.
The Pew Internet & American Life Project (2007) conducted a demographic survey of Internet users between February 15, 2007, and March 7, 2007. The results of this survey showed that 71% of adults in the United States use the Internet. Table 1 shows the demographics of U.S. adults who use the Internet.
Oser (2006) wrote, “Once online, 80% of Internet traffic begins at a search engine, according to a Harris Interactive poll. Forty-one percent of Web users find brands through search rather than just typing a URL into their browser, a DoubleClick study reported.”

On January 13, 2009, Yahoo appointed Carol Bartz, former executive chairman of Autodesk, as its new chief executive officer and a member of the board of directors.
According to Web traffic analysis companies (including Compete.com, comScore, Alexa Internet, Netcraft, and Nielsen Ratings), the domain yahoo.com attracted at least 1.575 billion visitors annually by 2008. The global network of Yahoo! websites received 3.4 billion page views per day on average as of October 2007. It is the second most visited website in the U.S., and the most visited website in the world.
Courtesy: Wikipedia