Google amps up its search with ‘Caffeine’

(CNN) — If you searched Google on Tuesday, you may have noticed that the information you’re looking for is a bit "fresher" than it would have been on Monday.

That’s because the world’s most popular search engine has unveiled a new search method called "Caffeine," which Google says delivers results that are 50 percent fresher than those of its previous index.

"Caffeine provides 50 percent fresher results for Web searches than our last index, and it’s the largest collection of Web content we’ve offered," the company says in a news release on its official blog. "Whether it’s a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was possible ever before."

That doesn’t mean Google has changed its search formula entirely, or that search results will pop onto your screen faster than before. Essentially, it means that Google can find and index new content more quickly. So, for instance, a new Twitter update that in the past would have been missing from search results, because Google hadn’t found and indexed it yet, will now show up in Google’s search results sooner with Caffeine.

Here’s a promotional video from Google that explains how the search works.

The update — which has been anticipated by tech and search-engine blogs — comes as Google faces increasing competition from both traditional search engines and from online social networks with search-like functions.

Google is still the top search engine, with 64 percent of search queries, according to a report from the Web traffic monitor comScore, which cites April numbers. But that’s down 1 percent compared to the previous month. Meanwhile, Microsoft’s Bing search engine has been winning new fans, and, in general, increasing traffic. Microsoft and Yahoo! combined now make up about 30 percent of the search market, according to the report.

In terms of social networks, Twitter has become a go-to site for finding up-to-the-minute information. And Facebook in April debuted its "Like" functionality all over the Web. Some tech writers see that as a threat to Google, since Facebook is essentially trying to organize information on the internet according to the likes and dislikes of someone’s friends.

Mashable, a social media blog that has a content partnership with CNN, says Google Caffeine is effective at making the Mountain View, California, company’s search results more immediate: "This search is not only faster, but in some instances in our few tests, seems more capable of producing real-time results," Ben Parr wrote on the site last year when rumors of Caffeine were surfacing.

The Google search update is a sign that the company is feeling some heat from its competitor Bing, writes Charles Arthur at The Guardian.

"It’s interesting to see that Google is focusing again on the element of its offering where it does lead the pack: search," he writes. "That’s what made its [Google's] name, but it’s clear that even if Microsoft’s Bing hasn’t (yet?) won the market share, it has got Google thinking about how it can improve what it does."

The blog The Next Web says the move is significant for the development of the real-time internet, and that it also "could provide a tremendous boost to not only Google’s stock price, but also to its ad revenue."

To better understand how Caffeine works, it might help to think of Caffeine as a blog and the old Google as a newspaper. Where a newspaper collects content and then publishes it all at once, at the beginning of the day, a blog is constantly looking for new information and updating on the fly. This is sort of how Google Caffeine works. Rather than collecting big "batches" of Web pages to index for its search, Google is trying to publish more frequently as it goes.
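To make that analogy a bit more concrete, here is a minimal sketch in Python contrasting a batch indexer, which makes nothing searchable until a periodic rebuild, with an incremental indexer that updates the index as each page is processed. The class names and logic are invented for illustration; Google has not published Caffeine’s internals.

```python
# Hypothetical sketch: batch vs. incremental indexing.
# Class and method names are invented; this does not reflect
# Google's actual Caffeine implementation.

class BatchIndexer:
    """Newspaper-style: collect pages, then rebuild the index all at once."""
    def __init__(self):
        self.pending = []   # pages crawled but not yet searchable
        self.index = {}     # term -> set of URLs

    def crawl(self, url, text):
        self.pending.append((url, text))   # nothing is searchable yet

    def rebuild(self):
        # Runs periodically; only now do the new pages become findable.
        for url, text in self.pending:
            for term in text.lower().split():
                self.index.setdefault(term, set()).add(url)
        self.pending.clear()


class IncrementalIndexer:
    """Blog-style: each page becomes searchable as soon as it is processed."""
    def __init__(self):
        self.index = {}

    def crawl(self, url, text):
        for term in text.lower().split():
            self.index.setdefault(term, set()).add(url)   # searchable immediately


if __name__ == "__main__":
    old = BatchIndexer()
    old.crawl("http://example.com/post", "caffeine makes results fresher")
    print(old.index.get("fresher"))   # None: not searchable until the next rebuild

    new = IncrementalIndexer()
    new.crawl("http://example.com/post", "caffeine makes results fresher")
    print(new.index.get("fresher"))   # the new page is already findable
```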

"Every second Caffeine processes hundreds of thousands of pages in parallel. If this were a pile of paper it would grow three miles taller every second," Google says.

In a blog post, Google software engineer Carrie Grimes acknowledges that times are changing for search.

"Content on the Web is blossoming," she writes. "It’s growing not just in size and numbers but with the advent of video, images, news and real-time updates, the average Web page is richer and more complex.

"In addition, people’s expectations for search are higher than they used to be. Searchers want to find the latest relevant content and publishers expect to be found the instant they publish."

Story by John D. Sutter, CNN

Google Confirms “Mayday” Update Impacts Long Tail Traffic

Google made between 350 and 550 changes in its organic search algorithms in 2009. This is one of the reasons I recommend that site owners not get too fixated on specific ranking factors. If you tie construction of your site to any one perceived algorithm signal, you’re at the mercy of Google’s constant tweaks. These frequent changes are one reason Google itself downplays algorithm updates. Focus on what Google is trying to accomplish as it refines things (the most relevant, useful results possible for searchers) and you’ll generally avoid too much turbulence in your organic search traffic.

However, sometimes a Google algorithm change is substantial enough that even those who don’t spend a lot of time focusing on the algorithms notice it. That seems to be the case with what those discussing it at WebmasterWorld have named "Mayday." Last week at Google I/O, I was on a panel with Googler Matt Cutts, who said when asked during Q&A, "This is an algorithmic change in Google, looking for higher quality sites to surface for long tail queries. It went through vigorous testing and isn’t going to be rolled back."

I asked Google for more specifics and they told me that it was a rankings change, not a crawling or indexing change, which seems to imply that sites getting less traffic still have their pages indexed, but some of those pages are no longer ranking as highly as before. Based on Matt’s comment, this change impacts "long tail" traffic, which generally comes from longer queries that few people search for individually but that, in aggregate, can provide a large percentage of a site’s traffic.
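As a rough illustration of that aggregation effect (the figures below are entirely made up, not drawn from Google or from this article), a handful of head terms and thousands of tiny long-tail queries can each account for a comparable share of total visits:

```python
# Illustrative only: fabricated traffic figures showing how rare "long tail"
# queries can add up to a large share of total visits.
head_queries = {"shoes": 5000, "running shoes": 2000, "buy shoes online": 1000}
tail_queries = {f"query {i}": 2 for i in range(4000)}   # 4,000 queries, 2 visits each

head_visits = sum(head_queries.values())   # 8,000
tail_visits = sum(tail_queries.values())   # 8,000
total = head_visits + tail_visits

print(f"Head share: {head_visits / total:.0%}")   # 50%
print(f"Tail share: {tail_visits / total:.0%}")   # 50%
```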

This change seems to have primarily impacted very large sites with “item” pages that don’t have many individual links into them, might be several clicks from the home page, and may not have substantial unique and value-added content on them. For instance, ecommerce sites often have this structure. The individual product pages are unlikely to attract external links and the majority of the content may be imported from a manufacturer database. Of course, as with any change that results in a traffic hit for some sites, other sites experience the opposite. Based on Matt’s comment at Google I/O, the pages that are now ranking well for these long tail queries are from “higher quality” sites (or perhaps are “higher quality” pages).

My complete speculation is that perhaps the relevance algorithms have been tweaked a bit. Before, pages that didn’t have high quality signals might still rank well if they had high relevance signals. And perhaps now, those high relevance signals don’t have as much weight in ranking if the page doesn’t have the right quality signals.
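One way to picture that speculation, and it is only speculation rather than anything Google has published, is a combined score in which relevance signals are down-weighted when a page lacks quality signals. The signals and threshold below are invented:

```python
# Purely speculative scoring sketch based on the paragraph above; this is not
# Google's algorithm. "relevance" and "quality" are assumed 0-1 signals, and
# "quality_floor" is an invented threshold.
def ranking_score(relevance, quality, quality_floor=0.5):
    # When quality falls below the floor, relevance counts for less.
    relevance_weight = 1.0 if quality >= quality_floor else quality / quality_floor
    return relevance * relevance_weight + quality

thin_page = ranking_score(relevance=0.9, quality=0.2)    # high relevance, weak quality
strong_page = ranking_score(relevance=0.7, quality=0.8)  # moderate relevance, strong quality
print(thin_page, strong_page)   # the higher-quality page now outranks the thin one
```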

What’s a site owner to do? It can be difficult to create compelling content and attract links to these types of pages. My best suggestion for those who have been hit by this is to isolate a set of queries for which the site is now getting less traffic and check out the search results to see what pages are ranking instead. What qualities do they have that make them appear more valuable? For instance, I have no way of knowing how amazon.com has fared during this update, but they’ve done a fairly good job of making individual item pages built on duplicated content from manufacturers’ databases unique and compelling by adding content such as user reviews. They have set up a fairly robust internal linking (and anchor text) structure with things like recommended items and lists. And they attract external links with features such as the "my favorites" widget.
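Here is a minimal sketch of that diagnostic step, assuming you can export per-query visit counts for a period before and after the change; the CSV column names are assumptions, not any particular tool’s format:

```python
import csv

# Hypothetical sketch: compare per-query organic visits before and after the
# update to isolate the queries that lost traffic. Column names ("query",
# "visits") are assumptions about your analytics export, not a real API.
def queries_that_lost_traffic(before_csv, after_csv, min_drop=0.5):
    def load(path):
        with open(path, newline="") as f:
            return {row["query"]: int(row["visits"]) for row in csv.DictReader(f)}

    before, after = load(before_csv), load(after_csv)
    hit = []
    for query, old_visits in before.items():
        new_visits = after.get(query, 0)
        if old_visits > 0 and (old_visits - new_visits) / old_visits >= min_drop:
            hit.append((query, old_visits, new_visits))
    # Largest absolute losses first: these are the queries worth searching for
    # manually, to study which pages now rank and what qualities they share.
    return sorted(hit, key=lambda r: r[1] - r[2], reverse=True)
```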

From the discussion at the Google I/O session, this appears to be a long-term change, so if your site has been impacted by it, you’ll want to do some creative thinking about how to make these types of pages more valuable (which should increase user engagement and conversion as well).

Original Story by Vanessa Fox at Search Engine Land

Five Steps to Online Marketing Success

Using competitive intelligence strategies and tools will be key to launching successful email marketing campaigns in 2010, according to a new white paper from Compete.

In the white paper “Five Simple Steps to Online Marketing Success,” Compete recommends that marketers take the following five steps to integrate competitive intelligence into their online campaigns:

1. Know the Competition: First, Compete advises email marketers to create a list of competitors in their space and identify the specific reasons each poses a competitive threat. After identifying their competitive set, marketers are advised to dig into metrics and figure out their standing in the ones that matter most.

Frequently used metrics include unique visitors, page views, time on site, average stay, and pages per visit. All of these will give marketers a sense of real norms for their competitive landscape, thus allowing them to make logical business decisions and maximize ROI. Some competitive intelligence solutions provide visibility into metrics for competitors’ sites that are much like the ones analyzed using local web analytics tools.
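As a concrete example, a couple of those benchmark metrics can be derived from raw counts as in the sketch below; the site names and figures are fabricated, and this is not Compete’s methodology or API:

```python
# Sketch with fabricated numbers: deriving common benchmark metrics from raw
# counts so a site can be compared against its competitive set.
sites = {
    "yoursite.com":   {"unique_visitors": 120_000, "visits": 180_000, "page_views": 540_000},
    "competitor.com": {"unique_visitors": 200_000, "visits": 320_000, "page_views": 800_000},
}

for name, s in sites.items():
    pages_per_visit = s["page_views"] / s["visits"]
    visits_per_visitor = s["visits"] / s["unique_visitors"]
    print(f"{name}: {pages_per_visit:.1f} pages/visit, {visits_per_visitor:.1f} visits/visitor")
```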

2. Cover Search Marketing Bases: Increase search traffic and campaign performance through analysis of competitors’ search marketing trends and keywords. In addition, using local web analytics tools, marketers can see what keywords are sending traffic to their sites, how much of that traffic is paid as compared to natural, and what percentage of traffic the individual search engines contribute to a site overall.

Through optimization and testing of paid search campaigns, SEO keyword research, and content creation, marketers can maximize conversions and increase search ROI, though optimization and testing are not guaranteed to produce beneficial results. Competitive search analytics tools let marketers quickly gain insight into their competitors’ search strategies so they can capitalize on their competitors’ success.
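Concretely, splitting keyword referral data into paid versus natural traffic and per-engine share might look something like the rough sketch below; the record format is an assumption, since real analytics exports vary:

```python
from collections import Counter

# Hypothetical keyword referral records; the field names are assumptions,
# not any particular analytics tool's schema.
referrals = [
    {"keyword": "running shoes", "engine": "google", "paid": False, "visits": 400},
    {"keyword": "running shoes", "engine": "google", "paid": True,  "visits": 150},
    {"keyword": "trail shoes",   "engine": "bing",   "paid": False, "visits": 60},
    {"keyword": "buy shoes",     "engine": "yahoo",  "paid": True,  "visits": 30},
]

total = sum(r["visits"] for r in referrals)
paid = sum(r["visits"] for r in referrals if r["paid"])
by_engine = Counter()
for r in referrals:
    by_engine[r["engine"]] += r["visits"]

print(f"Paid share: {paid / total:.0%}, natural share: {(total - paid) / total:.0%}")
for engine, visits in by_engine.most_common():
    print(f"{engine}: {visits / total:.0%} of search traffic")
```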

3. Copying is a Sign of Flattery: Identify which websites are sending competitors traffic and get in on the action. Competition for web traffic will continue to grow as more everyday activities move online. Website traffic behaves much like search traffic: just as a few keywords (the head) drive most of a site’s search visits while thousands of others (the long tail) each drive only a little, a small percentage of the millions of websites out there account for the bulk of referral traffic. The relationships marketers forge with other online businesses can determine the fate of their marketing success.

By analyzing referral (upstream) and destination (downstream) traffic for their competitive sets, marketers can easily identify new affiliate websites, business relationships, and link-building opportunities. If competitors are receiving traffic from specific sites, it may be worth reaching out to those sites.

Compete advises marketers to use competitive intelligence tools to create a list of websites that send traffic to their competitors, excluding websites they are already working with. Marketers can then determine which sites they want to reach out to by analyzing how much traffic those websites receive. Analyzing traffic referral reports is a great way to generate new relationships and increase online reach with minimal effort.
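Finding the competitor referrers you are not already working with is essentially a set difference sorted by traffic, as in this sketch (the domains and visit counts are made up):

```python
# Sketch: find sites that refer traffic to competitors but not to you yet,
# ranked by estimated referral visits. All domains and figures are made up.
competitor_referrers = {
    "dealblog.example":   12_000,
    "shoesforum.example":  8_500,
    "coupons.example":     3_000,
    "partner.example":     2_200,
}
already_working_with = {"partner.example"}

prospects = {site: visits for site, visits in competitor_referrers.items()
             if site not in already_working_with}

for site, visits in sorted(prospects.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{site}: ~{visits:,} referred visits -> candidate for outreach")
```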

4. Fix Your Leaky Bucket: Local web analytics tools let marketers identify which websites are sending traffic to their sites. Based on page views, bounce rates, and conversion rates, they can easily identify which referring websites deliver the most benefit. Think of a website as a bucket: various marketing tactics scoop up visitors who may be interested in the products or services on offer, and conversion funnels then attempt to lead those users toward a specific action.

Using destination or downstream traffic tools, marketers can easily see where their users go when they leave the site. Are they going directly to a competitor, a search engine, or back to their favorite social network? Maybe they are searching for coupons on their favorite deal sites or conducting research before deciding whether to purchase. By regularly running a destination traffic report on their websites, marketers can easily find opportunities to quickly "patch the holes" in their "leaky buckets."
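A destination (downstream) report of this kind can be tabulated from exit data roughly as follows; the destinations and counts are invented rather than the output of any real tool:

```python
from collections import Counter

# Fabricated exit data: where visitors went after leaving the site.
exits = (
    ["competitor.example"] * 340
    + ["google.com"] * 520
    + ["facebook.com"] * 410
    + ["couponsite.example"] * 130
)

report = Counter(exits)
total = sum(report.values())
for destination, count in report.most_common():
    print(f"{destination}: {count / total:.0%} of exiting visitors")
# A large share exiting to a coupon site, for instance, might suggest adding an
# on-site promotion to "patch the hole" before those visitors leave.
```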

5. Stay on Top: Marketers should monitor their performance, set benchmarks, and understand how their site enhancements and marketing strategies affect their position relative to competitors. In order to stay on track and continue to grow their business, marketers need an online marketing strategy that can adjust to ever-changing economic, technological, and social environments. Incorporating competitive intelligence into that marketing strategy can give them the critical information they need to minimize risk and ensure success.

Marketers should monitor their competitive set and use competitive intelligence tools on a regular basis. They will find much of the information they collect to be critical in making decisions about their overall marketing strategies. Then, they should set benchmarks and monitor changes across their entire competitive sets.

Google Dominates Search Engine Activity
US web users prefer to conduct online searches with Google by a wide margin, according to recent data from The Nielsen Company. In March 2010, Google Search maintained its comfortable lead in search engine usage, with 6.39 billion searches, or 65.7% of the 9.72 billion total searches.

March 2010 Search Rankings Change Little from February


Americans’ usage preference for online search engines changed little between February and March 2010, according to The Nielsen Company.

Google Search Maintains Dominance
Google Search maintained its comfortable lead in search engine usage during March 2010, with 6.39 billion searches, or 65.7% of 9.72 billion total searches. Yahoo Search came in a distant second with 1.3 billion searches, or 13.4% of the total. MSN/Windows Live/Bing Search followed with 1.2 billion searches, or 12.2% of the total.

[Chart: Nielsen US search engine rankings, March 2010]

No other search engine had a search total in the billions or double-digit market share. AOL Search, the fourth-most-popular search engine for the month, accounted for 245.8 million searches, 2.5% of the total. Total searches increased 5.8% from 9.18 billion in February 2010, which is likely at least partly due to the additional three days in March.

February 2010 Numbers Were Similar
Google Search led all search providers in February 2010 with a 65.2% search share, or about 5.98 billion searches, according to previous Nielsen rankings. Yahoo Search came in second with a 14.1% search share, or about 1.29 billion searches. MSN/WindowsLive/Bing followed with 12.5% search share, or 1.14 billion searches. AOL Search, the fourth-most-popular provider last month, had a 2.3% share, or about 207 million searches.

MSN/WindowsLive/Bing experienced approximately 15% growth in its share of US searches in February 2010, increasing from a 10.9% share and 1.12 billion searches. March 2010 figures indicate this growth has at least temporarily stalled.

comScore Results also Similar
comScore’s core search rankings use different metrics than Nielsen’s, but they produced similar results in March 2010. There was little change in comScore’s market-share statistics for the five leading US online search providers between February and March 2010. Google Sites led the core search market with 65.1% market share, down from 65.5%. Yahoo Sites rose slightly from 16.8% to 16.9% market share. Microsoft Sites also grew slightly, from 11.5% to 11.7% market share. Ask Network and the AOL LLC Network remained virtually unchanged in the low single digits.

Google Page Indexing Creates Leads


Getting your site ranked highly on search engines is one of the fundamentals of SEO. Just how you go about that, however, generates a lot of discussion: which method works best, is off-page optimisation more effective than on-page optimisation, and how valuable are the links or votes directed to your site from other websites?

I’ve always worked on the premise that I should get as many of these factors as possible working for me when optimising a site, and relevancy to the subject you’re trying to rank for is a very effective way to earn high rankings. Getting as many of your URLs or pages as possible indexed by the search engines can lead to an increase in both traffic and conversions. So I thought this research from HubSpot would be of interest to you.

The more pages a company has indexed by Google, the more leads it tends to generate, according to research by internet marketing firm HubSpot.

Incremental Indexed Pages Can Cause Double-digit Lead Growth
There is a strong positive correlation between the number of Google indexed pages and median leads. An incremental increase of 50-100 pages indexed by Google can produce double-digit percentage growth in leads. For example, going from 60-120 indexed pages to 121-175 indexed pages can increase a company’s median leads from seven to 12, roughly 71% growth.

[Chart: HubSpot, Google indexed pages vs. median monthly leads, April 2010]

The most significant improvement in median lead growth comes when a company increases its indexed pages from the 176-310 range to the 311-plus range. Median leads skyrocket from 22 to 74, representing triple-digit 236% growth. After exceeding the 311 indexed pages mark, median lead growth subsides.
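For reference, those growth figures are simply the relative change between the reported medians, which a quick calculation reproduces:

```python
# Quick check of the relative growth in median leads reported above.
def growth(old, new):
    return (new - old) / old

print(f"{growth(7, 12):.1%}")    # ~71.4%: 60-120 -> 121-175 indexed pages
print(f"{growth(22, 74):.1%}")   # ~236.4%: 176-310 -> 311+ indexed pages
```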

Size Not Critical Factor
Overall, HubSpot’s research indicates that company size is not a critical factor in achieving a significant volume of Google indexed pages. Company size and number of indexed pages are only mildly positively correlated, mostly in the extreme categories of indexed pages.

[Chart: HubSpot, Google indexed pages by company size, April 2010]

While HubSpot’s large customers formed the biggest group with 311 or more indexed pages in Google, small and medium-sized customers together outnumbered large ones in this category, 57% – 43%. In addition, small customers formed the largest group with 176 to 310 Google indexed pages (39%).

As might be expected, small customers do form the largest group within the less than 60 (53%) and 61-120 (54%) indexed pages categories.

Marketing Takeaways
HubSpot advises marketers considering a Google page indexing program to use the following techniques:

  • Build page volume: Consider starting a blog to quickly increase the number of indexed pages.
  • Improve each page’s optimization as per Google’s methodology to maximize chances of having all corporate web pages included in the index.
  • On-page search engine optimization: Place keywords in the right places on web pages such that Google and other search engines know what each page of a company’s web site is about, and what keywords to rank it for.
  • Off-page search engine optimization: Build inbound links from reputable sites, thus demonstrating a company’s popularity to search engines.
  • Inbound links do not generate more leads, but do generate more unique visitors.

Google Dominates Core Search
Google Sites clearly dominate US internet users’ core search activities, according to comScore. In March 2010, Google Sites led the core search market with 65.1% market share and 10.05 billion core searches, up 6% from February 2010. Both of these statistics represent the continuation of long-term core search trends.