Friday, April 30, 2004

Profile: The Google founders

By Will Smale
BBC News Online business reporter

The founders of the Google internet search engine - Larry Page and Sergey Brin - are the type of young men most parents would dream of their daughters bringing home.
And not simply because they will both be billionaires following a stock market flotation of Google.

Instead, most mums and dads would also be drawn to the fact that both men are clean-cut in appearance, undeniably hard-working and intelligent, and seem, well, just "nice".

They are your textbook, well-presented, quietly well-behaved "boys next door" from a smart middle-class American suburb.

Only a lot richer.

Yet far from living an extravagant lifestyle complete with yachts and private jets, like fellow software boss Larry Ellison of Oracle, Mr Page, 31, and Mr Brin, 30, are both reported to lead modest, unassuming lives.

They don't even have sports cars; instead, each is said to drive a Toyota Prius, a plain-looking but environmentally friendly petrol-electric hybrid that is growing in popularity among green-minded Americans.

Mr Brin's father even claimed recently that his son still rents a modest two bedroom apartment.

Garage industry

Mr Page and Mr Brin just happen to be geniuses with computers and, by extension, the founders of the world's most popular internet search engine.

Both barely into their thirties today, the two first met at Stanford University in the mid-1990s, where they were studying for doctorates in computer science.

Apparently, they did not immediately hit it off, but they became friends while developing a new kind of internet search engine from their college dormitory rooms.

Their system, initially called BackRub, ranked search results according to the popularity of the pages, after the pair realised that more often than not the most popular result would also be the most useful.
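For the technically minded, the idea can be sketched in a few lines of Python. This is only a toy illustration of popularity-based ranking in the spirit of what became PageRank - the three-page link graph and damping factor below are invented for the example, not Google's actual data or algorithm.

links = {
    "page_a": ["page_b", "page_c"],  # page_a links to pages b and c
    "page_b": ["page_c"],
    "page_c": ["page_a"],
}
damping = 0.85  # standard damping value in the PageRank literature
scores = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    scores = {
        page: (1 - damping) / len(links)
        + damping * sum(
            scores[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        for page in links
    }

# Pages with more, and better-connected, inbound links score higher.
for page, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))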

So after changing its name to Google they dropped out of college (although Mr Brin is officially still on leave) and the rest, as they say, is history.

Pulling together $1m from family, friends and other investors, on 7 September 1998 Google was commercially launched from a friend's garage.

Growth was quick.

Initially, Google handled 10,000 search queries per day; today it handles 200 million.

Family influences

Both Mr Page and Mr Brin come from academic families with backgrounds in computer science and mathematics.

Larry - or Lawrence - Page was born and raised in Michigan, the son of Carl Page, a pioneer in computer science and artificial intelligence.

Page Senior earned a doctoral degree in computer science in 1965, back when the subject was still in its infancy, and went on to become a computer science professor at Michigan State University.

His wife, and Larry Page's mother, also worked in computers, teaching computer programming.

Perhaps unsurprisingly, Larry Page says he fell in love with computers at the tender age of six.

Mr Brin is a Muscovite by birth, the son of a Soviet mathematician and economist.

His family, who are Jewish, emigrated to the US in 1979 to escape persecution, and Mr Brin went on to get a degree in mathematics and computer science from the University of Maryland, before enrolling at Stanford University as a postgraduate.

Hippy mantra

Google today has its headquarters at Mountain View in the heart of California's famous Silicon Valley, where a number of quirks are in place to keep staff happy.

These include weekly games of roller-hockey in the car park, an on-site masseuse and a piano.

And each member of the team is given one day a week to spend on their own pet projects.

In a nod to the area's hippy past, the company's head chef is said to have previously cooked for the rock band the Grateful Dead.

There is also something very 1960s California about what Mr Page and Mr Brin say is their philosophy.

As Mr Page recently explained to ABC News: "We have a mantra: 'Don't be evil', which is to do the best things we know how for our users, for our customers, for everyone.

"So I think if we were known for that, it would be a wonderful thing."

Nice boys, you see. Wealthy yes, and maybe a little quirky too, but still very nice.

fr.: http://news.bbc.co.uk/1/hi/business/3666241.stm

Does Google feel lucky?

By Clare Matheson
BBC News Online business reporter

When the world's favourite search engine announced it was to float on the stock market, it said: "Google is not a conventional company".

Well, it's stuck to its guns and declared it won't be heading to market in the traditional manner.

Usually, when a privately owned company offers stock for the first time - to fund either early development or expansion - the sale is made through brokers, and institutional investors such as mutual funds and insurance companies get most of the shares.

But not Google.

Instead the group has said it will take the alternative route of making its initial public offering (IPO) online.

Google hopes to raise up to $2.7bn (£1.62bn) by auctioning its shares - a move that could see the group valued at more than $20bn.

Unconventional

That way, it explained, the general public will have a greater chance of grabbing a share in the company.

And in a further flourish, the company remained close to its geeky roots - apparently the share offering is something of a mathematical joke.

The exact offering, $2,718,281,828, is the product of "e" and $1bn, where "e" is the base of the natural logarithm - a constant that is particularly useful in calculus - and equals 2.718281828 (to nine decimal places).

In other words, Google took "e", shifted the decimal point nine places to the right, and asked for that many dollars. And if you're not rolling around in the aisles after that one, well, it is a mathematical gag after all.
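For anyone who wants to check the joke, two lines of Python do the arithmetic:

import math
print(round(math.e * 1_000_000_000))  # 2718281828 - i.e. $2,718,281,828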

Google has been very tight-lipped about its plans and finances in the past.

Strangely, it was a 70-year-old law that triggered the public sale of the six-year-old firm's shares.

Under US rules, once a firm has $10m in assets and 500 shareholders it must release detailed financial information - much the same as a public company has to.

Google is thought to have crossed that threshold last year when it issued shares to staff.

Ahead of filing its accounts, speculation was rife that Google would see the event as a timely opportunity to announce an IPO of its shares.

And those rumours proved to be true. Why, after all, would Google go to the time and expense of revealing its details to the public - and its rivals - with no benefit at the end of it?

Joining the big boys

Now founders Sergey Brin and Lawrence, or Larry, Page - who are known for their quirky ways of running their company - have joined the world of big business.

Good news from a Google float could renew interest in the technology sector, which is in the tentative stages of a recovery from the dotcom crash of 2000.

A Google IPO would "display confidence" in the tech sector and the economy, said Joel M Bernstein, partner in LA-based law firm McDermott, Will & Emery.

It may even push more firms to go public, with Ernst & Young predicting that a successful flotation would "help pull the sector forward".

According to IPO Monitor, just 16 companies have taken the plunge and gone public this year.

None of them was anywhere near as big a name as Google, a company that came to the fore just as the dotcom bubble was heading for a crash.

High returns

Employees and venture capitalists who helped build Google - such as Sun Microsystems co-founder Andy Bechtolsheim and former Amazon business development vice president Ram Shriram - stand to make a killing from a public offering of their shares.

Ironically, the list of beneficiaries would include rival Yahoo, which invested $10m in the firm back in 2000 in exchange for a stake that could now be worth hundreds of millions of dollars.

The same goes for Silicon Valley venture capital firms Sequoia Capital and Kleiner Perkins Caufield & Byers.

They invested a total of $25m and could see billion-dollar returns for their efforts.

Then there are the celebrities: Tiger Woods, Henry A. Kissinger and Arnold Schwarzenegger are also in on the act, albeit with small stakes, according to the New York Times.

Stanford University, where the two men met, could also see a substantial windfall.

The university still owns the search technology behind Google, which it leases back to the firm under a deal that saw it take some stock and an annual royalty payment.

And of course the founders - thought to own between a third and half of the company - will not just be taking home pocket money.

Bill Gates made his multi-billion dollar fortune when Microsoft's shares were sold to investors.

Straightening up

But what would a stock market flotation mean for the group?

On the plus side, it would raise even more money to put into research and development, or even expansion.

In recent months, Google has launched the G-Mail webmail service and Froogle - an online price comparison service, the equivalent of shoppers haggling for the best price in the marketplace.

"These are guys who have one of the best brands ever created in such a short time," said Ask Jeeves chief executive Steve Berkowitz.

"There's so much value in Google, it's hard to think they're not going to want to unlock some of that."

On the downside, the group might have to sit up straight and ditch its laid back Californian attitude.

A hit?

Wall Street would want to know what is happening on the money and the business side.

And there would be more regulatory hoops to jump through.

Investors would also want updates on the latest innovations the Google team is working on - something that could jar with their usual "test behind the scenes until we're happy with it" ethic.

Peter Thiel, who took internet payments firm PayPal public in 2002 before selling it to eBay, said a flotation would change "the emphasis from building a great business to trying to meet the quarterly (earnings) figures".

The windfall could also see staff desert in droves as they choose to bag their cash and retire, or even set up their own firms, possibly threatening Google's crown.

fr.: http://news.bbc.co.uk/1/hi/business/3664339.stm

Google plans $2.7bn share auction

Internet search powerhouse Google has confirmed it is to float on the US stock market.
The firm has filed papers proposing to issue stock via an online auction, rather than the traditional allocation by big banks.

The move, experts predict, could raise an estimated $2.7bn (£1.62bn) and value Google at as much as $20bn.

Founders Sergey Brin and Larry Page are set to become billionaires, while many workers hope for million-dollar riches.

The initial public offering (IPO) will be led by Morgan Stanley and Credit Suisse First Boston.

Unorthodox choice

In an unusual move, the group confirmed it would sell its shares through an auction - which will determine the final price of the offering.

The unorthodox process is designed to give the general public a better chance to buy its stock before the shares begin trading.
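Google has not published the fine detail of its auction at this stage, but the uniform-price ("Dutch") auctions analysts expect work roughly as follows: bids are sorted from highest to lowest, shares are allocated down the list until the offering is covered, and every winner pays the lowest accepted price. A simplified sketch in Python, with invented bids:

shares_offered = 1000

# Each bid: (price per share, number of shares wanted) - illustrative only.
bids = [(88.0, 400), (85.0, 500), (90.0, 200), (80.0, 700)]

remaining = shares_offered
clearing_price = None
for price, quantity in sorted(bids, reverse=True):  # best price first
    allocated = min(quantity, remaining)
    remaining -= allocated
    if allocated:
        clearing_price = price  # lowest price at which shares were allocated
    if remaining == 0:
        break

print("All winning bidders pay:", clearing_price)  # 85.0 in this example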

IPOs are traditionally run by investment banks, which make their own decisions about who should end up with the shares.

The banks say this allows them to build a base of solid long-term investors.

But many have been accused of taking advantage of the system to get hot shares at low prices into the hands of favoured customers.

Shares in the firm are expected to begin trading either on the Nasdaq or the New York Stock Exchange in the late summer or early autumn.

Profits double

Details of the widely-anticipated flotation emerged in papers filed with the US Securities and Exchange Commission (SEC).

Those documents also revealed key financial details about the firm, which is notorious for remaining tight-lipped about its cash situation.

Google revealed it made a net profit of $105.6m in 2003, on revenues of $961.9m.

The web group added it had seen a sharp rise in net profits in the first three months of 2004 to $63.9m - more than double the figure for the same period last year.

The documents also showed Google had cash on hand of $454m at the end of March.

Unconventional

But they also showed Google taking care to stress that going public would not turn its Silicon Valley culture into a buttoned-up one.

"Google is not a conventional company," said a letter included with the SEC filing.

"We do not intend to become one."

Anyone expecting a focus on smoothing performance to satisfy quarterly expectations at the expense of long-term investment and risk-taking would be disappointed, the "owner's manual" for prospective investors said.

"Now the time has come to move the company to public ownership. This change will bring important benefits for our employees, for our present and future shareholders, for our customers, and most of all for Google users."

One big change will be that the group's financial backers and stock-holding employees could be in line for huge windfalls - at least on paper.

Silicon Valley-based Google was set up in 1998 by Stanford University students Sergey Brin and Larry Page, who had met a few years earlier during a research project.

Google's underlying technology is still owned by the university and leased back to the firm.

Competition heats up

Google's position in the market place has been cemented through its search technology, based on highly sophisticated algorithms.

Over the past five years this helped it to become the search engine of choice for a majority of internet users.

However, the company's rivals are working hard to displace it from its dominant position.

Internet portal Yahoo recently dumped Google as its search technology provider, replacing it with its own technology after buying a string of search companies - Inktomi, AltaVista and Overture.

Microsoft, whose founder Bill Gates has publicly acknowledged the superiority of Google's technology, says that it will overtake its Californian rival after about two years.

To counter its rivals, Google in recent months launched or announced a string of services, among them the free webmail service G-Mail and Froogle, an online shop price comparison service.

fr.: http://news.bbc.co.uk/1/hi/business/3670951.stm

Mozilla, Gnome mull united front against Longhorn

Last modified: April 28, 2004, 4:00 AM PDT
By Paul Festa
Staff Writer, CNET News.com

As Microsoft focuses on merging its Web browser and operating system software, open-source competitors are mulling a proposal to join forces and beat the software giant to the punch.

Representatives from two open-source foundations, Mozilla and Gnome, met last week to consider a joint course of action aimed at keeping their respective Web and desktop software products relevant once Microsoft releases the next major overhaul of its Windows operating system, known as Longhorn.

Microsoft now has "a single team for Web and native desktop rendering," noted one participant, according to meeting minutes posted on the Gnome Web site. "Gnome and Mozilla need to align to counter this."

Mozilla is an open-source browser development project. Gnome, which stands for GNU Network Object Model Environment, is an open-source user interface for use with Linux and other Unix systems.

The April 21 meeting, attended by veteran Mozilla and Gnome organizers including JavaScript inventor Brendan Eich and Ximian co-founder Nat Friedman, is but one manifestation of the open-source community's Longhorn jitters. Microsoft has promised that Longhorn will fuse Web browsing and desktop computing to an unprecedented degree.

Microsoft said last year that it would discontinue standalone versions of its Internet Explorer browser to focus development energies on Longhorn.

Competitors fret that when Longhorn launches, standalone browser and desktop applications may find themselves consigned to the computing paradigm scrapheap.

The open-source developers may have time on their side. Microsoft earlier this month said it won't release Longhorn until at least the first half of 2006, having decided instead to focus this year on getting out a major security upgrade, known as Windows XP Service Pack 2, for its current operating system.

Microsoft also faces unknown fallout from a decision last month by the European Union to force the software maker to supply a version of its Windows operating system without its Media Player software. Microsoft has appealed the ruling, and a final decision could be years away. But it could set a precedent on how the company builds its software that could affect Longhorn, which will introduce many new features.

While Microsoft has delayed Longhorn's release repeatedly, the company has advanced vital components and related technologies, including the Extensible Application Markup Language (XAML), the Avalon graphics and user interface technology, and the .Net Web services framework.

A dangerous combination

Taken together, that arsenal is costing open-source competitors sleep.

"What makes Longhorn dangerous for the viability of Linux on the desktop is that the combination of Microsoft deployment power, XAML, Avalon and .Net is killer," Ximian co-founder Miguel de Icaza wrote in a recent blog posting. "It is what Java wanted to do with the Web--but with the channel to deploy it and the lessons learned from Java's mistakes. The combination means that Longhorn apps get the Web-like deployment benefits: (You can) develop centrally, deploy centrally and safely access any content with your browser."

A key weapon in any planned counterattack could be Mozilla's XUL (XML User Interface Language), a 5-year-old scheme for building desktop applications' user interfaces out of lightweight Web markup languages like XML (Extensible Markup Language) and CSS (Cascading Style Sheets).

The original impetus for XUL was to make the Mozilla browser itself lighter and faster by creating its interface with Web standards. But out of the resulting technology Mozilla developers speculated they could spark a "programming revolution."

So far, XUL has failed to catch on, and Microsoft questioned whether Mozilla's technology would do much to help Gnome ward off Longhorn's promised threat.

XAML, Microsoft warned, is more potent than XUL in its ability to reflect exactly what's in the operating system.

"XUL is not the multipurpose declarative language that Gnome probably wants," said Ed Kaim, product manager for the Windows developer platform. "People say that when all you've got is a hammer, everything looks like a nail. In the same way, people are trying to figure out how to crush XUL into an OS it really wasn't designed for. The browser is great for a lot of things, but when it comes to robust client side applications, it's not the best."

Another trick will be reconciling XUL with Gnome's existing user interface technology.

"There are ways to marry them," said Bruce Perens, an open-source consultant who serves as executive director of the Desktop Linux Consortium, a marketing organization. "But it's very difficult to get the two teams working in the same direction. They both went on a several-year tour of technical creation where they sat down and created everything they needed to do GUI (graphical user interface) applications--and they didn't create the same thing. Now to get them together it would take some number of years to resolve the technical diversions."

Gnome already relies on some Mozilla software and produces a Mozilla-based browser called Epiphany.

Mozilla also produces a version of its Firefox browser for Linux and Gnome, and one of the points of discussion between the two groups is to produce a browser that combines the native Gnome interface elements of Epiphany and the cross-platform capabilities and 200 extensions or plug-ins that come with Firefox.

But it is the development framework that poses the greater challenge and holds the higher stakes.

"As we look at the challenges coming our way, we must remain competitive and retain an aggressive agenda to provide a rich user experience on all platforms," said Mozilla spokesman Bart Decrem. "XUL has come a long way since it first came out, and the combination of Gecko and XUL is a great starting point for delivering rich applications to the desktop."

fr.: http://news.com.com/2100-1032-5201325.html

Computer password protection widely weak

By Chung Tsui-ling (鍾翠玲), Taipei / 29/04/2004

A survey by security vendor RSA Security has found that the passwords computer users choose are too simple, and could become a hole in information security.

RSA Security has published its second consumer e-security behaviour survey. The results show that although consumers lack confidence in electronic transactions and worry about identity theft, most still rely on low-security password protection.

RSA Security polled more than 1,000 US consumers on questions including their e-security awareness, their confidence in the safety of electronic transactions, and the measures they take against identity theft and computer attacks. On identity theft, about half of respondents did not feel safer this year than last, and some felt less safe than they did in 2003.

Yet users do not appear to have tightened their defences in response. The survey shows that about two thirds of respondents (63%) use fewer than five passwords to log in to all the systems that hold their electronic information - web services, computer systems, cash machines and other electronic services - and 15% use a single password for everything. The RSA report notes that most people still use low-security passwords for electronic transactions, which greatly increases the risk of identity theft.

The survey was conducted in the US, but the company says the same weakness in password protection applies equally in Taiwan.

Huang Hui-mei (黃惠美), technical consultant for RSA Security in Taiwan, points out that many Taiwanese companies have adopted the BS 7799 information security management standard and so require frequent password changes, which pushes users towards overly simple passwords. To cope with the constant changes, she says, users fall back on easily remembered sequences such as "1234" or "0123", which can actually make passwords easier to guess.

She also points to another threat that renders all static passwords useless: viruses carrying keyloggers can quietly record every password a user types and spread them far and wide. "In that situation, it doesn't matter how often users change their passwords."

One-time passwords, she suggests, offer stronger protection. Because a one-time password is used only once, it does not matter if a hacker learns it. RSA Security is itself a supplier of one-time password solutions.

Many financial institutions in Taiwan, such as banks, are already introducing two-factor authentication built from tokens and an authentication server, or sending one-time passwords to users' mobile phones by SMS, so that customers can log in to online banking.
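For readers curious how a one-time password can be safe to reveal, the Python sketch below is a generic HMAC-based construction along the lines of the later HOTP standard - not RSA Security's proprietary SecurID algorithm, and the shared secret is invented. Token and server share a secret and a counter; each login derives a fresh code that is useless once spent:

import hashlib
import hmac
import struct

def one_time_password(secret: bytes, counter: int, digits: int = 6) -> str:
    # Derive a short numeric code from a shared secret and a counter.
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    mac = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                     # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"secret-provisioned-on-the-token"    # hypothetical shared secret
for counter in range(3):
    # Token and server advance the counter together; each code is valid once.
    print(counter, one_time_password(secret, counter))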

fr.: http://taiwan.cnet.com/news/software/0,2000064574,20089201,00.htm

Thursday, April 29, 2004

Spotting Search Engine Spam: Q&A

BY Shari Thurow | April 12, 2004

I received so many responses to my column on search engine spam and doorway pages, I consolidated your most frequently asked questions to alleviate confusion.

Q. I believe there's a niche for microsites. We began offering microsite development when clients wanted search engine traffic but wouldn't allow us to change the main site. The microsites used to be sub-domains of our own site, but now we offer to secure keyword-dense domain names for clients. Is this search engine spam?

A. I understand why companies are unwilling to change their Web sites. After a company pays a Web design firm thousands of dollars to develop a site, the last thing it wants to hear is design, copy, or both need modification.

Perhaps a company has its own staff develop a site. After months of focus groups, project management meetings, wire-frame approval, and so forth, it launches a new site. The company might not want to modify it after all that hard work.

Enter microsites, satellite sites, or whatever you wish to call them. They serve to get search engine traffic without modifying the main site. Companies can easily find search engine marketing (SEM) firms that will create microsites.

But niche or no, most microsites are considered search engine spam for the following reasons:

Microsites are primarily built to rank well in the search engines.


Microsites are not built primarily for end users.


Most microsite content is redundant. Similar content is available on the main site.


To rank, the microsite is usually a part of a link farm. Creating invisible links in HTML and CSS is very easy.


No site should have to link to an SEM firm's site, and vice versa.

Even if the SEM firm states it will write unique content for the microsite, the content might be gibberish. Or the SEM firm might create good-looking microsites with complete sentences and full navigation schemes. Though microsites might look and sound good at first glance, search engine software engineers consider them spam. Optimization should always occur on the main site.

One of my favorite quotes comes from Google:

"Webmasters who spend their energies upholding the spirit of the basic principles... will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit."

Additionally, if a microsite hasn't been penalized or banned, it's not necessarily because the search engines find it acceptable. It may be that software engineers haven't detected it yet.

Search-engine friendly microsites are rare.

Q. Our site maintains good Google rankings, even with all the changes Google has undergone. However, my site is nowhere to be found in Yahoo! Is Yahoo penalizing my site for ranking well with Google?

A. Google doesn't penalize sites that rank well in Yahoo. Likewise, Yahoo doesn't penalize sites that rank well in Google (or any other search engine).

Unfortunately, many unethical SEM firms prey on the belief one site can't possibly rank well in all search engines. Don't fall for this. Building a search-friendly Web site is pretty straightforward if you follow these guidelines:

Use words and phrases on your pages that prospects might type into search queries.


Ensure relevant, keyword-rich phrases are visible in the SERPs by placing them within your HTML title tags and meta-tag descriptions (for search engines that use meta-tag descriptions in their SERPs).


Place keyword-rich, relevant phrases above the fold on your site. Visitors should know they've landed on a page that contains what they searched for.


Have at least one spider-friendly navigation scheme.


Make your URL structure spider-friendly.


Regularly request high-quality, relevant links to your Web site.

As long as a site follows these guidelines, simplistic as it may sound, the site should not have any problems receiving search engine traffic.

Q. I want to hire you for link development services. We need 500-600 links to our site within three months.

A. Link development is a slow, cumulative process that generates long-term results. Companies want to know they'll receive a specific number of links to their sites within a given time. A solid deliverable is far more appealing than "I don't know."

Unfortunately, link farm (spam) companies promise that type of deliverable because they control the links. With true link development, directories are usually the best start because the link quality is very high. Link requests can be sent on a regular basis, but there's no guarantee the links will be added.

Remember, any time an SEM firm promises "instant" link popularity or gives a specific number of guaranteed links to your site, you're probably dealing with a firm that practices link farm spam.

Thanks for your questions!

fr.: http://www.clickz.com/experts/search/results/article.php/3338211

Design Matters

BY Shari Thurow | March 29, 2004

Many search engine marketers don't understand how important design and information architecture are to search engine marketing (SEM). They may remember to throw in a link here, add some keywords there, mix it up, and (as Emeril Lagasse would say) -- BAM! A fully optimized page that ranks in the top 10 of every search engine.

Effective search engine optimization (SEO) campaigns require effective site design. Keywords are useless if search engines and site visitors don't have easy access to them.

What's an Effective Web Site?

An effective Web site is user friendly, search friendly, and persuasive enough to convert. In other words, search friendliness is an essential component of site design. Usability experts such as Jared Spool, Jakob Nielsen, and Steve Krug stress the importance of creating clear categories and navigation hierarchies. Spool coined the term "trigger words." The definition is very similar to the common SEM term, "keywords."

Site visitors should be able to browse your site and easily find what they're searching for. In an ideal situation, visitors would go directly to the page that contains the information they need. However, since no two people think alike, Web developers and usability professionals constantly test and determine better designs, HTML code, and programming to make delivering needed information as easy as possible.

Optimization is not only a marketing process. It's also a design and usability process.

SEM USPs

To differentiate themselves, search engine marketers present their own unique selling propositions (USPs) to close a sale. One of my esteemed colleagues publishes an online newsletter about search engine friendly copywriting. Her company feels the core of a SEO campaign is copy. The company's USP is writing keyword-rich copy that sells.

It's a great USP. Many copywriters don't know how to write keyword-rich copy that converts.

Another SEM company has this interesting sales pitch:

As SEO is a specialty-marketing tool, you'll want to be sure that you hire a company whose only business is SEO. Relying on a company who specializes in a variety of services will only detract from the services that you want most.

This company's USP is that it specializes in optimization only (though after reading its entire site, it appears it specializes in all types of SEM).

This USP is good, too. Many Web hosting companies offer SEO services. A company I use for hosting publishes a regular newsletter on the topic. I cringe every time I read it. Many hosting companies have discovered SEM is a hot topic. They want to cash in and offer it as a new service.

Here's the flip side. At the last Search Engine Strategies conference, in the session entitled "Search Engine Marketing & Advertising Agencies," one of my colleagues pointed out ad agencies have issues with search engine marketers. One pet peeve is search engine marketers make pages or sites ugly.

Maybe slapping a keyword here and throwing a link there can make a page search friendly. But it can ruin a site's design, information architecture, and branding.

At the conference, a few of my colleagues approached my company's CEO asking, "Are you still really doing site design?" The answer is a resounding yes. Guess what? Design matters.

I'll repeat that: Design matters.

Design and SEO

Building an effective Web site requires a team. Copywriters and content managers can write persuasive, keyword-rich copy that converts. If search engines and site visitors don't have easy access to that copy, the pages won't appear at the top of search results.

Some site designs can be fixed by simply adding text-link navigation and a site map. Other design solutions aren't so simple.

Many CMSs and shopping cart packages pass too many parameters in the query string, resulting in a URL structure that makes search engine indexing problematic. Only a programmer or developer experienced in a variety of Web programming languages can analyze a site and fix the problem.
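One common remedy is to map parameter-heavy URLs onto short, static-looking paths that keep only the parameters identifying content. The Python sketch below uses hypothetical parameter names; in practice the same mapping is implemented as a rewrite rule on the web server itself, so each product has one canonical, parameter-free URL:

from urllib.parse import parse_qs, urlparse

# Hypothetical CMS URL with too many query-string parameters.
messy = "http://example.com/catalog.php?cat=12&prod=345&sess=9f3a&ref=home"

def clean_path(url: str) -> str:
    # Keep the content-identifying parameters; drop session and tracking ones.
    params = parse_qs(urlparse(url).query)
    return f"/catalog/{params['cat'][0]}/{params['prod'][0]}"

print(clean_path(messy))  # /catalog/12/345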

Sites must also generate or help generate revenue. To do so, the site must be persuasive. Both copy and graphic images can make content more persuasive. Likewise, an appropriate color scheme and layout can make a site more persuasive.

As stated above, ad agencies don't like it when search engine marketers make sites ugly. Unfortunately, many ad agencies deal with SEM firms staffed mostly by programmers and other technical people. Many programmers lack design, branding, and marketing skills.

In contrast, many copywriters and marketers lack technical skills. Sure, finding a person (or company) with technical, artistic, design, and marketing skills isn't easy. But those are the skills needed for an effective SEO campaign.

Conclusion

Am I saying the most important SEM component is site design? Of course not. Copywriting is equally important, as is information architecture. But written content must be presented in a user-friendly, search-friendly, persuasive manner.

How's that accomplished? Simply put, site design. Design matters.

fr.: http://www.clickz.com/experts/search/results/article.php/3331381

How to Spot Search Engine Spam: Doorway Pages

BY Shari Thurow | March 15, 2004

The search engine marketing (SEM) industry is teeming with unethical practitioners. Does your site need a number-one position in Google? They'll guarantee that position. Don't want to change your site but still want top positions in "natural" search results? No problem. Unethical search engine marketers will create doorway pages or mirror sites to redirect traffic to your site.

The problem with using doorway pages and mirror sites is search engines consider these practices spam. If you or your SEM firm violates the terms and conditions set forth by both the human-based (Yahoo! directory, Open Directory, Business.com) and spider-based (Google, Inktomi, Teoma) search engines, both your site and the SEM firm's site will be penalized or banned.

If you don't know how to spot a spam doorway page, here are some general guidelines.

Doorway Page Characteristics

Doorway pages come in all shapes and sizes. Some are very easy to spot. They're often computer-generated, text-only pages of gibberish. If human visitors viewed the page, they wouldn't purchase from the company. The page is ugly and nonsensical.

Yet some doorway pages are very, very attractive and difficult to spot. They're designed to visually blend in to a Web site. They have graphic images and navigation schemes. The content contains complete sentences, paragraphs, even reasonable calls to action (CTA). Spam doorway pages aren't necessarily computer-generated. Staff at SEM firms can and do write doorway page content.

Indications your SEM firm might be creating spam doorway pages:

Doorway pages often reside on an SEM firm's server under a different domain, not as a part of your own site.

Doorway pages are often redirected from the SEM firm's server to your Web site. Once search engine software engineers detect the doorway pages, SEM spammers abandon that domain and put the doorway pages on another one. It's a cat-and-mouse game.

Caveat: Unethical search engine marketers sometimes remove the redirect and claim the spam doorway pages are no longer spam. If pages are created deliberately to trick the search engines into delivering inappropriate, redundant, or poor-quality search results, they're spam. With or without a redirect.

Visitors don't see the same page search engine spiders do. In other words, one doorway page is presented to search engines, and a different one is presented to visitors (a quick fetch-and-compare test appears below).

Listen for any phrase that resembles "instant link popularity." Even if you don't change your site, unethical SEM firms may build hundreds, even thousands, of doorway pages that point to your site to artificially boost link popularity.

Doorway pages are often created for individual search engines. They can be created for each search engine in different languages.

If you're concerned about your SEM firm's integrity, you can always visit the search engines and read their terms and guidelines (URLs listed below). If your SEM firm does anything that remotely violates their terms, don't hire the firm.
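One rough test for the "different page for spiders" trick mentioned above is to fetch the same URL twice - once with a browser-like User-Agent and once claiming to be a crawler - and compare the responses. A minimal Python sketch (the URL is hypothetical, and cloaking keyed to spider IP addresses will slip past this check):

from urllib.request import Request, urlopen

def fetch(url: str, user_agent: str) -> bytes:
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req) as resp:
        return resp.read()

url = "http://example.com/suspect-page.html"  # hypothetical page to test
as_browser = fetch(url, "Mozilla/5.0")
as_spider = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

# Dynamic pages can differ harmlessly between fetches; treat this as a hint.
if as_browser != as_spider:
    print("Content differs by User-Agent - possible cloaking; look closer.")
else:
    print("Same content served to both - no User-Agent cloaking detected.")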

Information Pages

In November 1998 at the first Search Engine Strategies conference, I introduced the concept of information pages in my presentation, "Designing Search-Engine Friendly Web Sites." My colleagues Detlev Johnson and Marshall Simmonds and I came up with this concept earlier that year.

The primary goal of an information page is to provide useful information to your target audience. Examples of information pages include tips, set-up instructions, FAQs, and glossaries. Your target audience determines what information is useful, not the SEM firm.

Information pages reside on your Web server and are a natural part of your Web site. In fact, links to information pages tend to be in the main navigation scheme.

Even though spam doorway pages can also reside on your server and be a natural part of your site, doorway pages can be detected by looking for hallway pages. A hallway page is created specifically to link to doorway pages. A hallway page sounds like a modified site map, doesn't it? Unfortunately, a hallway page is exactly that -- a modified site map that links to doorway pages.

Information pages do not redirect to a page with different content. With information pages, visitors often click on the link in the SERP and land on the Web page that gives them the precise information they were searching for.

Site visitors and search engine spiders always view the same information. Though there are many legitimate uses for cloaking, many doorway page companies use cloaking to hide spam doorway pages.

More Red Flags

Can doorway page companies take the characteristics of information pages and make spam pages? Absolutely. They do it all of the time. As I said, detecting and eliminating search engine spam is a cat-and-mouse game.

Since many unethical SEM firms know the phrase "doorway page" might not close the sale, they come up with other names for spam pages. Here's a partial list:

Attraction pages

Mini- or microsites

Satellite sites

Magnet sites

Channel pages

Shadow domains

Directory information pages (DIPs)

Search engine entry (SEE) pages

Advertising pages

One way to determine if you may be dealing with an SEM firm that spams is to search for the firm's Web site on Google or another spider-based search engine. If the site is no longer in the index, it was probably banned.

Searches I like to use on Google are site:companyname.com, inurl:companyname.com, and link:companyname.com. If you don't see pages from, or links to, the SEM firm's site in the index, that site is probably under a spam penalty.

Bottom line: Information pages don't violate the terms and guidelines set forth by the search engines.

Below are the links for your review. Bookmark them. As spammers find more creative ways to spam, these guidelines will surely evolve.

Google:

http://www.google.com/Webmasters/index.html
http://www.google.com/terms_of_service.html

Yahoo:
http://docs.yahoo.com/info/guidelines/spam.html
http://docs.yahoo.com/info/terms/

Teoma/Ask Jeeves:
http://sp.teoma.com/docs/teoma/about/terms_of_service.html
http://ask.ineedhits.com/programterms.asp#spam

fr.: http://www.clickz.com/experts/search/results/article.php/3325301

How to Survive Search Engine Changes

BY Shari Thurow | March 1, 2004

With all the hoopla following the latest Google updates and Yahoo! changes, advertisers and search engine marketers (SEMs) are scrambling for the latest and greatest strategies.

Should firms practicing search engine marketing rush into those so-called latest-and-greatest strategies? If you have a well-planned search engine marketing campaign, your site can not only survive the recent search engine changes but come out ahead.

The Client Panic Attack

Below, a typical client phone conversation after Google or Yahoo updates its index.

SEM: Hi. How can I help you?

Client: My site's positions dropped in Google! Help! What should we do?

SEM: OK, don't panic. Let's see what happened. We should look at your site statistics. Will you please open that up for me?

(Searches for client's site in Google and Yahoo.)

I see both Google and Yahoo indexed the pages on your site. While you're opening your site statistics, I'll check to see if Google or Yahoo dropped any pages from their indices.

No pages dropped - both still index the pages on your site. No linking problems, either. Your site doesn't appear to be penalized for spam. You've been following all the search engines' guidelines, right?

Client: Right, no search engine spam.

SEM: Great! If you have your statistics open, tell me what traffic looks like. Has Google or Yahoo traffic decreased, increased, or stayed the same?

Client: It stayed the same.

SEM: Good! Look at the section of your statistics that shows which search engine spiders visited the site. See Googlebot anywhere?

Client: Yes.

SEM: Has Googlebot regularly visited your site within the past month or so?

Client: It looks like Google spidered our site a couple of days ago.

SEM: Fantastic. Since I see pages in Google's index, and your logs show Googlebot regularly spidered your site, it means Google has no problem accessing the information on your site. Have you received fewer inquiries or sales as a result of the Google or Yahoo changes?

Client: No.

SEM: Just so we're clear. Your Google positions have changed. Your traffic from Google hasn't changed, or changed very little. The number of inquiries from your Web site is more or less the same. Is that accurate?

Client: Yes.

SEM: Well, if your site is receiving high-quality traffic from the search engines, it really doesn't matter that your positions changed, does it?

Client: No. I guess positioning isn't as important as I thought.

SEM: If you got high-quality traffic but low conversions from Yahoo and Google, I'd tell you the problem was likely landing page usability issues, calls-to-action (CTAs), or your value proposition. Do your statistics and sales indicate we should usability-test some pages or use focus groups to make CTAs clearer?

Client: No, conversions are fine.

SEM: OK, then you probably have nothing to worry about. If you notice any significant dips in site traffic, give us a call. We'll be more than happy to help.

Client: Thanks so much! Bye.

Aftermath

I was tempted to hire some actors and produce this little "play" for our office voicemail: For sales, press 1. For design, press 2. Site affected by latest Google and Yahoo updates, press 3.

We've been fortunate. The latest Google and Yahoo changes didn't adversely affect our sites or our clients', save one (due to technical issues). Others haven't shared our experience.

When Google, Yahoo, or any search engine suddenly updates its index, some tips to get you through it:

Don't panic. Search engines change their algorithms all the time. They partner with other search engines. They evolve to have better spiders. Search engine changes often have nothing to do with what you're doing on your own site.

I know how difficult it is if your boss starts breathing down your neck when positioning disappears. Panicking won't help the situation. Accept that search engine positions always change. You will survive them.

Check your analytics software or log files to understand how your site is affected. In many cases, search engine changes don't affect sites at all. Sometimes you might see a brief dip in traffic before it returns to normal. (A short log-checking sketch follows these tips.)

Understand what you can change and change it. Again, analytics software is very helpful. If the search engines can spider your site and do so, then you, your programmers, and your developers didn't do anything to negatively impact your site's search engine visibility.

When in doubt, buy ads. Search engine advertising provides something "natural" or "organic" optimization doesn't: a guarantee. If your company absolutely needs search engine visibility, search engine advertising should be in your marketing plan.

Use paid inclusion for a quick fix -- but understand paid inclusion is evolving. You may want to wait for the dust to settle before going down that road. The main benefit of paid inclusion is it's the fastest route to appearing in the Web page sections of search engine results pages (SERPs).
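As promised in the log-file tip above, a few lines of Python are enough for a first look at spider visits. This sketch assumes a combined-format Apache access log at a hypothetical path; the User-Agent sits in the final quoted field of each line, so a simple substring match is a reasonable first pass:

from collections import Counter

LOG_FILE = "/var/log/apache/access.log"        # hypothetical log location
spiders = ["Googlebot", "Slurp", "msnbot"]     # Google, Yahoo and MSN crawlers
visits = Counter()

with open(LOG_FILE) as log:
    for line in log:
        for spider in spiders:
            if spider in line:
                visits[spider] += 1

for spider in spiders:
    print(f"{spider}: {visits[spider]} requests")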
If you know you'll use search engine marketing as part of your overall online marketing plan, assess the values of each strategy. That way, you'll be prepared for any search engine change.

fr.: http://www.clickz.com/experts/search/results/article.php/3318701

Titles and Search Engine Marketing

BY Shari Thurow | February 2, 2004

When you hire a search engine marketing (SEM) firm, you'll hear the word "title" used frequently. Are search engine marketers referring to the HTML title tag or to a title and hyperlink in a search engine ad?

This column will help you understand the different types of titles and why they're important to SEM.

HTML Title Tags

One of the most important tags for "natural" or "organic" search engine optimization (SEO) is the HTML title tag. Title text is the text placed between the <title> and </title> tags. A title tag and its content look like this:

<title>Organic green teas from ABC Tea Company</title>

HTML title-tag content is important for positioning and for providing a call to action. All major crawler-based search engines (Google, Inktomi, etc.) use title-tag content to determine relevancy. Additionally, much of the title-tag content becomes the hyperlink in search engine results pages (SERPs).

A keyword-stuffed title tag does not give potential customers the right impression. Do you want people to buy organic green teas from your company? Or do you want them to buy green teas; green tea; organic green tea; and organic green teas? Write for your target audience when you create HTML title-tag content.

CMS

A problem with many CMSs is they place the same title-tag text on multiple pages, sometimes on every page. Since every page on a Web site is unique, identical title-tag content doesn't showcase each page's individuality. Worse, it won't help with positioning.

Sometimes, a CMS will pull the title-tag text from actual content on the page. This can be good, if the page has naturally keyword-rich content. If the main headline contains targeted keyword phrases, the CMS can pull the title tags from the main headlines.

However, I find it more effective when professional copywriters write unique title tags for every page. Whenever possible, I try to make the HTML title tag its own field.
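A quick way to catch the duplicate-title problem is to collect every page's title and flag repeats. The Python sketch below assumes a folder of exported HTML files (the folder name is invented) and uses only the standard library; it also warns when a title runs past the roughly 70 characters most crawlers display:

import re
from collections import defaultdict
from pathlib import Path

titles = defaultdict(list)
for page in Path("site_export").glob("*.html"):
    html = page.read_text(errors="ignore")
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if match:
        titles[match.group(1).strip()].append(page.name)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title {title!r} on: {', '.join(pages)}")
    if len(title) > 70:  # likely truncated in search engine results
        print(f"Overlong title ({len(title)} chars): {title!r}")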

Title Attribute

Many Web marketers and designers mix up the title attribute and the HTML title tag. In HTML, an attribute contains information about the data in the HTML tag, but it's not the actual data. Width and height are common attributes in an image tag:

<img src="images/logo.gif" width="150" height="50"/>

A title attribute is commonly placed on the anchor tag, like so:

<a href="page.html" title="Click here for information on green tea.">

What's great about the title attribute is that it adds a little pop-up bubble in Internet Explorer displaying this text. So title-attribute text can be used to encourage people to click the link. This text is also good for accessibility purposes.

The bad news? None of the crawler-based search engines use this text to determine ranking. So don't waste time with a search engine marketer who spends a great deal of time on this attribute.

Web Directories

When you submit your site to a Web directory (a human-edited search engine) such as Yahoo!, Open Directory, or Business.com, the form will ask you for a title. The form is not asking for an HTML title.

Most often, a Web directory title is your official business name or the name of the Web site. Think of it this way: Most directories want the title of a Web site; crawler-based search engines want the title of a Web page.

Most crawler-based search engines will display the first 40-70 characters of an HTML title tag, but directory titles tend to be shorter. Be prepared to write a variety of titles for directory submission.

Search Engine Advertising

With search engine advertising (such as Google's AdWords and Overture), the title isn't the same as the HTML title tag. Rather, it's the text used as the hyperlink in the ad.

This particular title is often shorter than the HTML title-tag text and the directory title. Overture's current limit is 40 characters. These titles usually cannot contain certain words, such as "best" or "number one." Unlike natural SEO, an ad's title text doesn't necessarily have to contain a keyword phrase to be effective.

Ranking for search engine advertising has nothing at all to do with the ad's title text. Position is based on the bid amount.

Multimedia Files

Multimedia files such as Flash movies, audio files, video files, and so forth can have titles in their metadata.

Karen Howe, general manager at Singingfish, the rich media search engine recently purchased by AOL, recommends giving descriptive, keyword-rich titles to all multimedia files. The software used to create multimedia files has a title field.

Be sure your multimedia designers put descriptive, keyword-rich information in other metadata fields as well.

"Title" has many different meanings in the SEM industry. Make sure you and your SEM firm are speaking the same language when marketing your site.

fr.: http://www.clickz.com/experts/search/results/article.php/3306541

Why Ad Agencies Fail at Search Marketing

BY Shari Thurow | January 5, 2004

Marketing blitzes are quite common in the ad industry. Produce a great TV spot, mention a Web address, and deliver a great user experience. What if your audience searches for your products on Google or Overture? Ensure your company appears in the top position. Make sure affiliate sites can appear in all top results for increased sales and branding.

From an ad agency's perspective, it's a great scenario.

According to an affiliate marketer who recently e-mailed me, having the main site and affiliate sites top-ranked "forces" consumers to the advertiser's site rather than to a competitor's. Agencies aren't too concerned with potential negative effects.

Yet search engine marketing (SEM) is very different from traditional marketing. The above scenario might be great for ad agencies, but it could spell big trouble with the search engines.

When Less Is More

Redundant content is problematic from a search engine perspective. Search engine representatives understand no one wants to go to a search results page and click on the top 10 or 20 links, only to find the links all direct to the same company. From the end users' point of view, if they weren't interested in the product or service the first time, they aren't interested the second, third, fourth, or fifteenth time.

Most affiliate sites are considered spam by search engines. As discussed in an earlier column, search engine spam is defined as "pages created deliberately to trick the search engine into offering inappropriate, redundant, or poor-quality search results." Reasons affiliate sites are considered spammy include the fact they can:

Contain thousands (or millions) of illegible doorway pages


Contain redirects to pages on your Web site


Cloak from human viewing


Participate in free-for-all (FFA) link farms to gain link popularity

Not all affiliate sites use FFA link farms for link popularity, but a considerable number do. Amazon.com affiliates are especially problematic. As cloaking can become expensive and time-consuming, many affiliate sites don't cloak.

If affiliate sites offer unique content, it's a different story. Search engines and end users want unique content. But most affiliates don't take the time to produce unique, useful content.

Search engines don't want a single company to dominate "natural" search results. If a single company dominates, Google will eventually penalize or ban the sites, affiliates and all. In the past few months, Google removed many sites with affiliates, doorway pages, and mirror pages. In other words, it removed a large number of sites with redundant content.

Less is more as far as search engine results pages are concerned.

Copywriting

My high school English teachers actually helped me in SEM. Teachers would give their classes vocabulary lists. We had to write essays using these words. It was a great exercise in vocabulary-building and grammar.

Extrapolate this example to SEM. To write effective copy for both search engines and your target audience, you must learn to write in keywords. This applies equally to search engine optimization (SEO) and search engine advertising. With optimization, a Web page must appear focused on a specific topic or category. With ads, target audiences respond to specific words and phrases. In other words, if you offer a 30-day free trial of your CRM software, people want to see those words both in the ads and on your Web pages.
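To make "writing in keywords" concrete, a simple automated check is to verify that each target phrase actually appears in the page copy. A minimal Python sketch, with invented copy and phrases:

copy = """Try our CRM software free for 30 days.
The 30-day free trial includes full support."""

target_phrases = ["crm software", "30-day free trial", "customer database"]

for phrase in target_phrases:
    status = "present" if phrase in copy.lower() else "MISSING"
    print(f"{phrase!r}: {status}")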

SEM is so new, colleges and universities don't yet offer courses in search engine copywriting. Many ad agency copywriters don't feel they must write in keywords. Instead, they rely on traditional copywriting, using flowery words people just don't type into search queries.

Ego Sites

The number one reason ad agencies fail at SEO is ego. Building what I call "ego sites" isn't confined to ad agencies, of course. Many businesses' first Web sites are often ego sites rather than user-friendly ones.

I can always tell when an ad agency has designed a Web site. Usually, the color scheme is outstanding. If there's one thing I'll give agencies credit for, it's color schemes. All those years dealing with print and TV provide a good sense of which colors people respond to. Agency production artists certainly understand the psychology of color.

The problem is many agencies do both print and Web design for their clients. There's nothing wrong with that -- except when agencies mistakenly believe a brochure and a Web site must follow the same format. A Web site should never be a brochure site, especially when it comes to SEM. People don't search to look at brochures. They don't search to look at ads. They search to find information.

Creative directors and production artists can have conflicting issues. Many are more interested in building neat, pretty ads and sites for their portfolios. Their excuse? Branding first, sales will follow. Never mind the vast majority of people who become irritated.

Ad agencies and business owners must put their personal preferences in check when designing and copywriting for sites. Fact is, ad agencies won't spend thousands or millions of dollars to buy your products and services. The CEO of your company won't spend thousands or millions of dollars on your products or services. Your target audience will. And if your target audience prefers a site that doesn't quite fit the CEO's idea of how it should be categorized, who will you satisfy? Ad agencies (and many SEM agencies) like those million-dollar checks. They make the CEOs happy.

At whose expense? You guessed it: the target audience.

Believe me, I've designed plenty of sites that aren't included in my portfolio. I don't like the color selection. Or maybe the layout. But if my clients make thousands or millions of dollars from their sites? Great! My personal preferences aren't important. Professional expertise and guidance are. Personal preferences should never be part of the design equation.

Conclusion

Not all ad agencies and marketing firms fail at SEM. But for many agencies, formal education and traditional marketing experience are barriers to SEM success.

If ad agencies truly comprehended SEM's uniqueness and kept personal preferences in check, maybe the search experience would improve for everyone.

fr.: http://www.clickz.com/experts/search/results/article.php/3293791

Web-Positioning Software: Good or Bad?

BY Shari Thurow | April 26, 2004



Last week, Fredrick Marckini lauded NetIQ for acquiring popular search engine positioning software, WebPosition Gold (WPG). Fredrick feels NetIQ cornered a brand new search engine marketing (SEM) and advertising analytics market.

Many search engine marketers don't share his joy over NetIQ's acquisition, nor do they feel Web analytics is a new market. My own attitude toward marketers and Web analytics is, "Where have you been?"

Search Engine Positioning Vs. Web Analytics Software

I'll just come right out and say it: I don't endorse any position-checking software.

Never mind that WPG and other well-known position-checking software are commonly used by marketers. Just because a product is popular or common doesn't mean it's effective.

Since 1997, we've provided clients with reporting from Web analytics software (a.k.a. site statistics software). Our designs, copywriting, and site architecture are always based on visitor behavior. After careful data analysis, we can redesign and rewrite a site based on visitor data and, thus, create more effective Web sites.

Sure, Web analytics software has evolved since 1997. It now includes more detailed keyword analysis and search engine advertising data. This makes our analyses more thorough. In that sense, I agree with Fredrick and laud Web analytics companies for responding to the search industry.

However, top-10 search engine positions alone don't communicate user behavior. Site owners should focus more on those who will spend money on their products and services: their target audiences.

Instead of overzealously maintaining a top-10 position (unfortunately, still a search engine optimization (SEO) obsession), online marketers should spend more time and money on visitor behavior.

Moreover, no position-checking software company I know of has received permission from search engines to perform automated queries on their indexes.

Server Loads and Advertisers

Search engine representatives have consistently declined to endorse position-checking software. One explanation they give is the demand such automated queries place on their server load.

Suppose your site has over 100 targeted keyword phrases, and you want to use position-checking software to measure your site's search engine visibility. With 100 keyword phrases, the software performs 100 queries per search engine. If you wish to check positioning on 10 search engines, the software performs 1,000 search queries for just one site (10 x 100 = 1,000).

In the real world, thousands of companies want to know their site's search engine visibility. To keep things simple, let's say 100 companies with 100 unique, targeted keyword phrases want to perform position-checking:

100 companies x 1,000 queries = 100,000 search queries

If these companies check their positions weekly (something I commonly see in RFPs), the total becomes:

100,000 search queries x 52 weeks = 5,200,000 queries per year

Extrapolate these simple calculations to real-world business, and imagine the server load when thousands of companies are all running position-checking software.
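In code form, the arithmetic looks like this - a minimal Python sketch using the illustrative figures above (assumptions for the example, not measured data):

keyword_phrases = 100   # targeted phrases per site
search_engines = 10     # engines checked for each phrase
companies = 100         # companies running position checks
weeks_per_year = 52     # weekly checks, as commonly requested in RFPs

queries_per_site = keyword_phrases * search_engines        # 1,000
queries_per_week = queries_per_site * companies            # 100,000
queries_per_year = queries_per_week * weeks_per_year       # 5,200,000

print(f"{queries_per_year:,} automated queries per year")  # 5,200,000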

That server load interferes with the search engines' ability to provide users with relevant information. Further, advertisers pay for space on the search engines. How can they differentiate a position-checking query from a qualified visitor's query?

Even Google states directly it doesn't endorse position-checking software:

Don't use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our terms of service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.

Doorway Pages

Another problem with WPG is its doorway-page generator. I've written about doorway pages before.

The SEM industry has advanced beyond computer-generated doorway pages, but ad agencies still use them. One of my colleagues, who also specializes in search-friendly Web design, recently told me about four ad agencies whose entire SEO program is based on WPG doorway pages. He constantly turns those doorway pages in to the spam police.

Granted, not all ad agencies are ignorant when it comes to SEM. But ad agencies must understand the optimization market more thoroughly. WPG might provide a quick fix, but that quick fix is often search engine spam.

Conclusion

I'm not enthused about NetIQ's acquisition. Rather than turn to an SEM company for guidance on Web analytics development, you should instead work directly with search engines and usability professionals. The SEM industry might be more effective if companies worked with usability professionals who understand search behavior.

Shari Thurow is marketing director at Grantastic Designs, a full-service search engine marketing, Web and graphic design firm. Acknowledged as the leading expert on search engine friendly Web sites worldwide, she is the author of the top-selling marketing book, Search Engine Visibility, published through New Riders. Shari's areas of expertise include site design, search engine optimization and usability.

fr.: http://www.clickz.com/experts/search/results/article.php/3344231

Sorting Out SiteMatch

By Barry Lloyd - April 23, 2004

i) Introduction:

At the beginning of March, Yahoo rolled out a new and somewhat controversial form of paid inclusion called SiteMatch. They also introduced a new Trusted Feed program called SiteMatch XChange™ to replace the existing Inktomi, AltaVista and AllTheWeb feed programs, combining them into one.

In this article, we will be looking at how the standard SiteMatch program works, the impact (if any) it has on results, and other issues that have caused concern around the new Yahoo! search engine.

ii) History:

Yahoo has been acquiring search technology aggressively over the past 2 years, starting with Inktomi at the end of 2002. The search engine industry was then shocked when Yahoo announced their purchase of Overture, the leading PPC provider, in the middle of 2003. Overture had itself been busy buying search engines, having purchased both AltaVista and AllTheWeb earlier in the same year.

By purchasing Overture, Yahoo gained control of three major crawling search engines, along with their various technologies, as well as the leader in pay-per-click search. It was obvious that Yahoo were gearing themselves for a major re-launch into the lucrative field of search.

In February 2004, Yahoo stopped showing Google results for their web page searches (the default search) and replaced them with results from their own database. These appeared very similar to the old Inktomi data, but with additional results that seemed to come from fresher crawls undertaken by both AllTheWeb and AltaVista. The ranking criteria, however, appeared a little different from the old Inktomi formula - but all-in-all, people who had done well with Inktomi tended to do well in the new Yahoo results.

Were we seeing a revamped Inktomi? According to Yahoo, no - this was an entirely new search technology created by combining the best elements of their recently acquired crawlers. In fact, Inktomi as a brand came to an end. Yahoo Search, along with their crawler Slurp (hmmm - seem to remember that name from Inktomi), replaced the Inktomi crawler.

iii) The Launch of SiteMatch:

At the beginning of March, Yahoo announced that they would be replacing the 3 paid inclusion programs for the 3 crawling engines they had purchased with a single paid inclusion program, SiteMatch. Administered by Overture, SiteMatch could be purchased via Overture directly or through a selection of partners, including the majority of those who had previously resold the replaced paid inclusion programs.

In the past, each of the engines had different prices per URL, and submitting a page for a year to all 3 engines could cost around $115 per page for guaranteed inclusion. SiteMatch offered an inclusion price of $49.00 for the first page, $29 for pages 2-10 and $10 for all subsequent pages of the same domain. On the face of it, this appeared to be a substantial discount for sites that wanted the fast inclusion and 48-hour refresh provided by paid spidering. Instead of an initial outlay of $1,150 for a 10 page site, you paid $310! This would get you inclusion in all the search portals previously covered by AltaVista, AllTheWeb and Inktomi - with the addition of Yahoo! Too good to be true? Well, yes - there was a catch!

Depending on the category your website/pages fell into, you would also have to pay an additional 15 cents or 30 cents per click!

Webmasters world-wide were shocked. Pages receiving hundreds of referrals per day could cost hundreds of dollars per week instead of the fixed annual fee.
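To see how quickly the clicks come to dominate, here is a minimal Python sketch of the fee structure (the tiered inclusion fees and per-click rates are those described above; the page count and traffic figures are hypothetical illustrations):

def inclusion_fee(pages):
    # Annual SiteMatch inclusion fee: $49 for the first page,
    # $29 each for pages 2-10, $10 for every page after that.
    fee = 49.0 if pages >= 1 else 0.0
    fee += 29.0 * max(0, min(pages, 10) - 1)
    fee += 10.0 * max(0, pages - 10)
    return fee

def annual_cost(pages, clicks_per_day, cpc):
    # Inclusion fees plus per-click charges; cpc is $0.15 or $0.30,
    # depending on the category (see the table later in this article).
    return inclusion_fee(pages) + clicks_per_day * 365 * cpc

print(inclusion_fee(10))             # 310.0 - the $310 quoted above
print(annual_cost(10, 200, 0.15))    # 11260.0 for a modestly busy site

At 200 clicks a day in a $0.15 category, the click charges alone come to around $210 a week - exactly the webmasters' complaint.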

To make things worse, Yahoo announced that existing Inktomi paid inclusion listings, which had started to show on Yahoo since it dropped Google, would no longer be included in Yahoo results after 15th April, although they would continue to appear on other Yahoo partner sites (such as MSN and HotBot) for the duration of the unused part of the Inktomi subscription (as long as those partners used Yahoo results). Similarly, AltaVista PFI would continue to show on AltaVista, and AllTheWeb PFI on AllTheWeb, until those PFI subscriptions expired. If webmasters wished for their PFI pages to continue to be included in Yahoo, it was suggested they upgrade to SiteMatch! People were not happy, but Yahoo started rolling people out to answer some pretty hostile questioning, primarily on WebmasterWorld and JimWorld's Search Engine Forums.

iv) The Main Questions and Answers from our own Observations:

Q: Is SiteMatch the only way to get indexed by Yahoo?

A: Yahoo state that 99% of all Yahoo's listings are derived from freely crawling the web. This is done by following links, although, to assist webmasters, Yahoo have now introduced a free submission location. We have tested crawl speeds by putting a new site into the Yahoo directory in March to see how long it would take to get a site crawled from this single link. The root directory was crawled within 2 weeks and we were getting referrals from pages on this site from MSN and Yahoo by week 5. We also put another site into free submission with no incoming links at the same time. Although the home page has been crawled (and no further pages as I write), I have yet to see it rank anywhere.

Q: Will your existing Inktomi PFI listings be removed from Yahoo after April 15th?

A: People who did not sign up for SiteMatch will no longer have their Inktomi PFI listings in Yahoo results after April 15th. But, provided your pages have been crawled by Yahoo's free crawler, naturally crawled pages will not be dropped. Certainly we have had no pages which were in Inktomi PFI removed from Yahoo listings after April 15th - all pages are there, although they have no PFI tracking code, which indicates that they are there through "natural" inclusion! This includes pages put into Inktomi PFI as recently as 20th February this year.

Q: Will penalties from Inktomi carry forward to Yahoo Search?

A: Inktomi started imposing some heavy penalties last year via both manual reviews and automated methods. If you were in paid inclusion, this meant that you would appear last on any targeted search term. Another indication is that Slurp will read only your robots.txt file and proceed no further. These penalties have been carried forward to Yahoo. Yahoo have stated that they intend to start a re-review process and have offered the e-mail address reportsearchspam@yahoo-inc.com for webmasters who wish to appeal a penalty. SiteMatch users can also contact their SiteMatch vendor to find out if they have a penalty, and most will try to find out what caused it and offer assistance. We have used the standard appeal procedure for a site that we felt had received an unfair penalty. We received a reply confirming Yahoo would review the site within 3 weeks of us making the request and (fortunately) saw the penalty lifted within another 7 days. So, it is possible to appeal their penalties, but ensure the site is squeaky clean first!

Q: Will use of SiteMatch drop my other site pages or prevent pages from being crawled?

A: The simple answer is: not in our experience.

Q: Will SiteMatch pages receive a boost in the results?

A: This used to be asked about all PFI programs. SiteMatch pages get crawled and re-indexed every 48 hours. A good SEO or webmaster will use this frequent re-spidering to "adjust" a page and see how that affects its rankings. That is the sole advantage SiteMatch (or any PFI program) gives you. A poorly optimised page will get poor results, and vice-versa! So SiteMatch gives no inherent ranking advantage, but it does give the webmaster the opportunity to adjust their pages frequently to find the ideal mixture of content to ensure decent listings. It's in your hands!

Q: If Yahoo is crawling the web for free, why do you need SiteMatch?

A: It all depends on how quickly you want to see results! If you have a commercial site with content changing frequently and you want to be showing this content in search engine results within a couple of days of publication, then SiteMatch gives you a great advantage. If you have static content and you are happy to wait a few weeks for the natural crawl and updates then SiteMatch is of no real benefit. Certainly, we have sites where we use SiteMatch on some pages, yet others remain updated only by the free crawl as they change infrequently.

Basically, it all boils down to Return on Investment. For some sites/pages SiteMatch makes economic sense, for others it would be far too costly an exercise.

Q: How long does it take for Yahoo to crawl a site in full?

A: Currently, this is a difficult question to answer, as Yahoo Search has been out crawling for only a few weeks! Yahoo seems to crawl a site in "layers". First the root directory and the pages linked from the home page seem to be crawled, then index pages from sub-directories, and then some internal pages. The initial crawl seems to take around 2-3 weeks, the second "level" crawl another 2 weeks, and we have just started to see Slurp nibbling at some more internal pages. Updates seem to occur on a rolling basis, but spidering is not as aggressive as Google's can be. As soon as we have seen a sizeable site indexed in its entirety we will post an update.

Q: How does the SiteMatch submission process work?

A: SiteMatch paid inclusion is different from previous PFI programs in that pages have to go through a review process, both to check that you have selected the correct category and to check for spam. This is done manually by human reviewers.

When putting a page into SiteMatch, you are asked to select the most appropriate category for your site or pages. Categories (and their associated CPC rates) are:

Adult: $0.15
Apparel: $0.30
Automotive: $0.15
Books: $0.15
Computers & Software: $0.15
Dating: $0.15
Education & Career: $0.15
Electronics: $0.30
Entertainment & Attractions: $0.15
Financial Services: $0.30
Flowers, Gifts & Registry: $0.30
Health, Beauty & Personal Care: $0.30
Home & Garden: $0.30
Jewellery & Watches: $0.15
Music & Video: $0.15
Office: $0.15
Professional Services: $0.30
Real Estate: $0.30
Reference: $0.15
Sports & Outdoors: $0.15
Telecom & Web Services: $0.30
Toys & Baby Equipment: $0.15
Travel: $0.30
Other: $0.15


Having selected the correct category, you have the option of selecting the areas or countries you wish to have your results displayed in. As such, you can exclude your listings from markets where you do not want visitors or enquiries. As a default, all listings are displayed worldwide - so make sure you select just the UK or Europe if this is what you want!

After giving SiteMatch the pages you want included, you are asked to pay the appropriate fee along with a $50.00 deposit to be used for click-through costs. Your pages are then put in a queue for editorial review.

In our experience, this takes around 5 working days. The review process appears to consist of not only checking the pages that were submitted, but also other aspects of the site. This can include contact information, links to other websites and (probably) a search for related sites or possible cross-linking. So this is not purely a check for spam on the pages submitted: your whole site is checked out and (if you have a network of sites) possibly your whole network. If you have other sites with what the reviewer may consider substantially the same content, you could have them all penalised! Sites that have affiliate links may also receive penalties. Any attempt to submit old-fashioned doorway pages is doomed to failure. The full list of rules is laid out in the SiteMatch Guidelines.

In summary, unless you know that your site is squeaky clean, don't use SiteMatch. If you have any doubts as to what constitutes "squeaky clean" get a second opinion from a professional SEM company who can show that they understand SiteMatch fully! If you attract a penalty, remember it can affect your whole site and (possibly) other web sites you may own! Be careful.

However, there is some good news! If you are accepted, pages appear within a matter of hours on all Yahoo partner sites and Yahoo itself. For sites we have submitted, the average has been around 12 hours from acceptance. So expect your pages to appear within 7 working days from initial submission.

As far as ranking criteria are concerned, we have noticed the following:

On previous Inktomi partners (such as MSN and HotBot), the old Inktomi algorithm still appears to be the main criterion in ranking well. Pages laid out to the formula we wrote about in Optimising for Inktomi still work just as well as they did previously. On Yahoo, things do work differently. Although we still achieved top 20 rankings for our pages, it appears that rankings only increased when external links were picked up by Yahoo's crawl. A Yahoo directory listing appears to have consistently boosted rankings by several places over the initial placement in the Yahoo results, but other directories (such as the ODP) could well have the same effect. Results on AltaVista and AllTheWeb seem to follow the Yahoo results fairly closely, but we expect these portals to be used for algo testing, so expect some inconsistencies!

Regardless of whether you have selected particular countries or regions to show up in, geo-targeting adjustments to the algorithm are applied. For example, a .co.uk site will get a boost in MSN UK and Yahoo UK results and a slight demotion on the main .com MSN and Yahoo sites. The same occurs for other TLDs and the corresponding search portals.

All-in-all though, SiteMatch does what it says. After approval, inclusion is rapid and spidering is frequent.

Q: What happens when your deposit runs out?

A: Well, this hasn't happened to us yet! The official line is that your PFI pages will be removed from inclusion but, as happened with the Inktomi PFI drop, if your pages are included in the natural crawl, these will replace your PFI listings. If this is correct, and if you play your cards right, this could be a major way to reduce your spend on SiteMatch: use it for a 3 month period until a full crawl of a new site has been established, then allow pages that change infrequently to drop from the program.

v) Conclusion

Yahoo's new search engine has brought some healthy competition to the search engine field. Their fundamentally different way of ranking sites has, since the changes on Google some months ago, been a lifeline to many on-line businesses. Many observers have applauded their relevance, though others have complained about the ease with which they can be spammed. It is obvious that Yahoo is taking a very pro-active stance against spam (some would say way too harsh a stance), so it is going to be interesting to see how this develops over the next few months. Certainly, few can continue to claim that SiteMatch is going to flood the results with "paid spam". The criteria for inclusion are tough.

Gloomy predictions that existing Inktomi PFI customers would all be forced to pay for SiteMatch have proved incorrect. Yahoo have included the majority of those pages through their free crawl, provided they are linked to adequately and are on sites with reasonable incoming links from authoritative sites.

The much vaunted Yahoo Search crawl has yet to prove itself, however. Inktomi had a reputation for taking anything up to a year to fully crawl a site. Yahoo needs to show that their iteration of Slurp is not going to be so lazy!

Yahoo employees are making a concerted effort to engage the webmaster community. Although this obviously has great PR benefits (let's face it, SiteMatch is pitched towards webmasters), the fact that they seem to be taking webmaster comments seriously is to be applauded.

Provided Yahoo can improve the freshness of their main index, searchers could well gravitate to their search engine. They have a good start.

SiteMatch will remain a controversial program for many, and webmasters inexperienced with SEM can confuse it with PPC. You can't target specific keywords without optimising the pages - and for many webmasters this could cause dissatisfaction with the program. Similarly, many will expect a ranking boost for paying the fee. This just doesn't happen - you are paying for a service which can be used as a tool in your optimisation and SEM portfolio. This, too, may lead many webmasters to question the value of the program. For most, it appears that SiteMatch will be unnecessary; for some, it can be of immense benefit. Only time will tell if this model becomes a success. But if it does, expect other search engines to follow suit (particularly MSN).

fr.: http://www.searchengineguide.com/lloyd/2004/0423_bl1.html

Yahoo CEO: Not Worried About Google's IPO

CNET News: wire reports  28/04/2004

Yahoo CEO and chairman Terry Semel said on Tuesday that Google's initial public offering will not hurt Yahoo.

"Yahoo has always had competitors, and Google is one of them," Semel said. "There is still plenty of room for both Yahoo and Google to grow."

Google is expected to announce its initial public offering (IPO) plans this week; once it goes public, Google's market value could reach US$20 billion. Yahoo holds a 5% stake in Google.

Meanwhile, Yahoo is expanding its own core business. Last year, Yahoo spent US$1 billion on acquisitions.

Google recently announced that it will offer a free, extra-large 1GB e-mail service.

Semel said that as more and more users turn to comparison-shopping search engines when buying online, the nature of search engines is changing. (Li Hai)

fr.: http://taiwan.cnet.com/news/ce/0,2000062982,20089163,00.htm

Riding the Search Boom, Lycos Is Put Up for Sale

CNET News: Jim Hu  28/04/2004

Spanish internet company Terra Lycos has hired investment bank Lehman Brothers to sell off its US internet business, including its Lycos.com site.

The sale of this division effectively unwinds the US$12.5 billion merger agreement Lycos and Terra Networks signed in 2000 at the height of the dot-com boom. According to a Lehman Brothers buyer-solicitation document that has leaked out, Terra hopes to sell the Lycos unit while the online advertising market is hot and refocus the company on its Spanish- and Portuguese-language businesses.

"Lycos is one of the few remaining well-known internet search and content sites. With online advertising budgets rising, paid content gaining mainstream acceptance, and the market bullish on search-industry consolidation, acquiring Lycos offers a uniquely attractive opportunity to create value," the document states.

The document gives no asking price, but people familiar with the matter say Terra Lycos's bottom line is US$200 million.

Spokespeople for Terra Lycos and Lehman Brothers declined to comment.

Although internet companies are unlikely to return to the glory days of the late 1990s, the sector has been heating up again as search draws renewed attention. Yahoo has led the buying spree by a wide margin, paying US$1.6 billion for Overture, US$579 million for French price-comparison site Kelkoo, and US$120 million for Chinese search site 3721.

When the dust settled on the portal wars of the mid-1990s, the winners were Yahoo, AOL and Microsoft's MSN. Second-tier search sites such as Lycos were also popular, but their traffic lagged far behind the front-runners.

Walt Disney earlier spent heavily to build its Go.com portal, but ultimately shut down Infoseek when it could not compete.

Excite.com, which @Home had bought for US$6.7 billion, was sold by a bankruptcy court in 2001 to iWon and InfoSpace for US$10 million.

Ask Jeeves said in March that it would pay US$343 million for Interactive Search Holdings, the parent company of iWon, Excite and My Way.

After several rounds of layoffs, Terra Lycos has finally decided to sell its US search sites. According to Lehman Brothers' sale document, Lycos currently has 170,000 paying subscribers. Lycos lost US$24 million in 2003 but returned to profit in the fourth quarter and expects to be profitable this year. (Chen Shih-tsung)

fr.: http://taiwan.cnet.com/news/software/0,2000064574,20089166,00.htm

Netsky Variants Set Speed Record

Reporter Chung Tsui-ling, Taipei  28/04/2004


Yet another Netsky variant appeared today (the 28th). Having exhausted every single-letter English name within two months, the virus is now the fastest-mutating on record.

Security firm Trend Micro today named the new variant Netsky.AB. Because new variants have appeared so quickly that naming reached two letters within two months, Netsky is considered the fastest-spawning virus family in history.

Antivirus companies name variants of a virus with the letters A, B and so on. Trend Micro notes that viruses have rarely needed more than single-letter names; most that did were posted openly on websites for anyone to modify, and were either obscure viruses that never reached alert level or took as long as a year to evolve. Netsky is the first well-known virus that regularly reaches alert level to do so, and, most importantly, the first to use up every single-letter name within two months.

Netsky is burning through names so quickly because of a contest between the hackers behind it and the Bagle camp. Trend Micro says the later variants of both viruses differ so much from the originals that, were it not for the taunting text the hackers embed in the code, they would probably be classified as different viruses.

For example, Trend Micro's antivirus research center TrendLabs found the following text in a decoded Netsky file: "S-k-y-n-e-t--A-n-t-i-v-i-r-u-s-T-e-a-m._-\/ Hey Bagle, feel our revenge!"

Bagle (also known as Beagle) produced its Bagle.W variant on April 26, while Netsky's previous variant, Netsky.X, had appeared only on April 20.

Trend Micro has issued a medium-level alert for Netsky.AB, although actual damage has been limited. Trend Micro says infections in Taiwan are not serious, mostly affecting consumers, and are easy for enterprises to block.

Symantec has named the virus W32.Netsky.AB@mm but classifies it as low-prevalence. Symantec said that because the threat falls below its alert threshold, the company has not issued a virus advisory.

fr.: http://taiwan.cnet.com/news/software/0,2000064574,20089175,00.htm

Tuesday, April 27, 2004

US Congress Debates Whether to Tax the Internet

CNET News: Declan McCullagh  27/04/2004

The US Senate on the 26th began a fierce debate over how much tax Americans should pay for internet access.

By a vote of 74 to 11, senators agreed to begin a procedure that will end later this week with a formal vote on whether to extend a moratorium permanently barring state and local governments from levying additional taxes on internet access providers, whether access is via dial-up, DSL (digital subscriber line), cable modem, wireless or satellite.

During roughly two hours of floor debate, senators backing a permanent ban warned that the alternatives could strangle the internet. Senator Ron Wyden, a Democrat from Oregon, said the vote "will more or less determine whether the e-mail we use, Google searches, websites and instant messaging services are singled out for discriminatory tax treatment." He said: "I do not believe the Senate will allow e-mail, BlackBerrys and technologies of every kind to be subjected to discriminatory taxation."

While the senators spoke their minds in the chamber, their aides were privately and actively negotiating possible compromises. Aides to Senator John McCain, an Arizona Republican, tried to broker a middle course: extend the moratorium by four years rather than grant it permanent legal status, while amending some definitions so that states could tax voice over IP (VoIP) services.

But a spokeswoman for Senator Lamar Alexander, a Tennessee Republican, said in an interview that McCain's proposal might not satisfy a group of senators who warn that barring state and local taxes on internet access could cost states and localities billions of dollars in revenue.

"As phone calls move onto the internet, if the definition of internet access keeps state and local governments from collecting taxes, those calls effectively become tax-exempt," said Alexander spokeswoman Alexia Poe.

Alexander is a co-sponsor of a rival bill that would extend the temporary moratorium, which expired on November 1, through the end of 2005. But he stressed that now that internet businesses are on solid footing, "they should pay the same sales and business taxes as everyone else."

President Bush also joined the debate on the 26th, backing a broad extension of the moratorium. "Broadband technology must be affordable," Bush said in a speech in Minneapolis. "To make sure broadband reaches every corner of America, it must be cheap. We should not tax broadband access. If you want broadband connections to spread through society, Congress must ban taxes on internet access."

The Senate's heated debate over a permanent ban contrasts sharply with the House of Representatives, where the bill passed smoothly on September 17. (Tang Hui-wen)

fr.: http://taiwan.cnet.com/news/software/0,2000064574,20089125,00.htm

Spam Rules Require Effective Spam Police

By Danny Sullivan, Editor
April 27, 2004

What's spam? Search columnist Kevin Ryan wasn't quite correct when he suggested in his recent article that there are no standards. Indeed, the two major search providers each publish standards about what they find unacceptable. Here they are: Google's and Yahoo's.

Ryan's summary of things to avoid, like others that have appeared recently from Shari Thurow (here and here), Dave Wallace and Jill Whalen, is a good one.

Clearly, knowing what's widely considered spam isn't hard. So why aren't people "playing by the rules?" Here are just a few reasons:

The rules for each search engine aren't necessarily the same, though they do seem closer today than in the past.


Some people have a "the cheaters are winning" mentality. If they see spam slip through, they feel like they should do the same (regardless of the fact that the "cheat" technique might not actually be what's helping a page rank well).


Some people simply don't agree with the search engine rules. For example, they might justify the use of "hidden text" via cascading style sheets to make up for the fact that their home page has no text. To them, search engines are simply stupid for not accepting this.


Some people simply don't care. As far as they're concerned, as long as a person gets to something relevant to their search topic, who cares how it came up?

How About Standards?
Ryan's commentary sees one solution to spam as lobbying for standards. This idea has been floating around for ages and has gone nowhere.

SEM pioneer Paul Bruemmer pushed for search engine optimization certification back in 1998. But as I wrote then, just having a "rule book" doesn't mean an end to spam.

We also had a push in 2001 for search engine marketing standards, which also has gone nowhere in terms of reducing spam in search engines.

How About Better Search Engine Disclosure?
Want a real solution to the spam problem? Then let's have the search engines agree to publish lists of firms and companies that they have banned. That would help the consumer seeking an SEM firm to understand which firms to avoid. Or, if they do use a banned firm, at least they've been warned about the consequences if they still want to go with a "rule" breaker.

It's something I've suggested before, and the search engines themselves have discussed the idea at various times in the past. It's never gone forward, because the search engines seem fearful of legal actions, should they out-and-out call a firm a spammer.

Given this, it's with some sympathy that I've defended the still new SEMPO search engine marketing industry group, when it has come under fire for not trying to ensure its members adhere to search engine spam guidelines.

SEMPO recently posted a FAQ explaining why it has declined to do this. I'd put it even more forcefully: if the search engines aren't brave enough to enforce their own laws, why should the onus be on a third-party group that doesn't even create these rules?

They Do Enforce!
Of course, search engines do police for spam. If they catch it, a page might be penalized or banned entirely. But that's not the same, or as effective, as providing an offenders list, for a variety of reasons.

An offenders list ensures that people who aren't ranking well or aren't listed for perfectly innocent reasons can discover that their problems are NOT due to a spam penalty. Far too often, people assume that they've "accidentally" spammed when they haven't. That leads them to make changes they needn't make.

Sometimes spam is simply allowed to continue. Google is especially famous for this, preferring to seek "algorithmic" solutions to removing spam rather than react immediately to spam reports and yank material. Why? Google says it wants to detect overall patterns and come up with more comprehensive solutions. Unfortunately, the waiting period fuels the "anything goes" fears that some search marketers have when they see spam escaping prosecution.

Disclosure helps searchers, not just companies that want to be listed. We've seen press outcry over filtering of adult content or filtering content in response to national laws, with the idea that perhaps Google and other search engines should disclose what they remove. But as I've written several times before, no search engine discloses what they've removed for spam reasons. That's something a searcher might want to know.

Disclosure...
To further expand on my last point, Google currently provides you with two ways to discover if they've removed material because of the Digital Millennium Copyright Act, a US law that can cause search engines to bar listings. A DMCA case involving the Church of Scientology in 2002 is probably the most famous example of this.

Google provides DMCA takedown requests that they receive to the Chilling Effects Clearinghouse. So, do a search there for Google, and you can review the material Google's been asked to remove.

I don't believe that Yahoo does the same thing and provides copies of DMCA takedown requests. However, you can get some sense by searching for Yahoo at Chilling Effects. Ironically, this brings back requests to Google that were probably also bulk-directed to Yahoo and other search engines.

Google will also provide "inline" notification if it has removed or suppressed material that might otherwise have shown up in the results you were reviewing. For example, try a search for kazaa. At the bottom of the page, you'll see this notice:

In response to a complaint we received under the Digital Millennium Copyright Act,
we have removed 5 result(s) from this page. If you wish, you may
read the DMCA complaint for these removed results.

This is excellent disclosure (and not something I believe Yahoo or other search engines do). Google's not only telling you that they removed results; you can also click through to read the actual complaint that they reacted to.

...And Lack Of Disclosure
Now return to the spam issue. As a searcher, were you informed that material was removed from your search request? Nope. And might there be an economic incentive for search engines to ban sites? Absolutely. Removing sites may cause those sites to resort to taking out ads, exactly the accusation that Google came under (and strongly denied) at the end of last year, when many sites lost rankings.

Let's not single out Google. Long-time search engine marketer Greg Boser nearly brought an audience of search engine marketers to its feet in applause when he criticized the search engines themselves as needing better standards during a session at Search Engine Strategies in 2002.

One of the issues Boser complained about was how sites would get pulled, only to then hear from various search engines about how they could get back in through advertising or paid inclusion.

Real-Life Disclosure Example
Want a real-life example of the need for disclosure? Earlier this month, NetIQ, the maker of WebTrends, purchased rank checking tool WebPosition. It was a pretty big deal. Perhaps someone may have gone to Google and done a search for webposition to find out more about the software.

Good luck finding the official WebPosition site. It's not in the top results, unusual given that Google built its reputation largely on providing good navigation to official sites. Why not? Because WebPosition has no pages listed in Google at all. For ages, I've not seen Google list pages from WebPosition. It's probably banned, but of course Google doesn't confirm these things. And as a searcher, it's not something that was disclosed to you.

The Google-WebPosition problem? Google doesn't like the burden the popular rank checking software places on its system, so explicitly warns people not to use it. But ironically, you'll notice that Google has no problems taking ads for the software from WebPosition's many resellers.

Let me stress -- similar things are happening at other search engines as well. The need for better spam disclosure is universal, not just a Google-specific problem.

Protect Yourself
Better disclosure would be helpful for so many reasons. It would help confused web site owners. It would help guide those seeking the many good, search engine marketing firms that diligently try to avoid trouble and play by the search engine rules. I'd love to see it happen.

Until it does -- or more likely, given that it may never come -- here are some additional suggestions to guide you.

When it comes to spam, the search engines are the lawmakers. You operate within their borders and are subject to whatever rules they may or may not publish. Break the rules, and you can get tossed out of the country. So learn the rules, as much as you can -- assuming you want to avoid trouble.

Outsourcing? Knowing the rules may not help if the SEM company simply makes up its own euphemisms to cover up things search engines don't like. So get references. You might also try contacting the search engines themselves for advice about the company, but don't hold your breath waiting for a response. Still do it. Should you get banned, at least you have some evidence that you tried to do due diligence with the search engines themselves.

Looking for more on the topic of spam? Search Engine Watch members have access to the Search Engine Spamming page, which has a summary of things commonly considered bad to do, as well as a compilation of articles on the topic. The Search Engine Optimization Articles and Search Engine Marketing Articles pages also have a compilation of articles that touch on spam and past guidelines and standards attempts.

Another new spam resource is a scholarly paper that's just appeared out of Stanford, Web Spam Taxonomy. It's a nice summary, though not entirely accurate. Google still stands by the fact that it does not give out "URL spam" penalties for URLs that contain too many keywords (though I personally wouldn't do this, despite such assurances). And the meta keyword tag spam example given in the report actually doesn't look like spam at all. But only a search engine could tell you for certain -- and none of them do.

For fun, you might also look at the new Black Hat SEO site. This is a humorous directory to unsavory search engine marketing pitches that have been received by Aaron Wall.

Finally, the Outsourcing Search Engine Marketing page is a compilation of articles covering advice on seeking out SEM firms to work with, such as the recent SEO outsourcing piece we published in SearchDay last month.

fr.: http://searchenginewatch.com/searchday/article.php/3344581

E-Paper Readers Reach the Market; Taiwan Remains Cautious

Reporter Han Hsu-erh, Taipei  26/04/2004

Royal Philips Electronics, Sony, E Ink and Toppan Printing have jointly announced the world's first e-book reader to use an electronic-paper display module, the Sony LIBRI'e. The product is already on sale in Japan and is claimed to deliver a reading experience like that of real paper.

The electronic-paper display used in the LIBRI'e was jointly developed and manufactured by Philips, E Ink, Toppan Printing and Sony. E Ink produces the electronic ink, and Toppan Printing applies the ink to a film; Philips then laminates E Ink's film onto an active matrix backplane and adds the driver electronics.

In fact, using small devices such as PDAs as e-book readers is already quite common. PDAs, however, are ill-suited to extended reading of small-type novels and newspapers: staring at a PDA's small screen for long periods strains the eyes, and longer documents are an even greater burden.

Compared with a PDA, Philips says, electronic paper has the advantage of a larger screen and a reflective display that reads easily in bright daylight or dim surroundings; from every angle it looks much like real paper.

Within Taiwan, however, views on the reading quality of e-paper book readers and on their system compatibility remain quite conservative.

The E-Schoolbag Development Association is a unit under Taiwan's national e-learning science and technology program. Commenting on the launch, association chairman Yang Chung-ho said the development of e-book readers is tied to each country's habits and culture. The Japanese, for example, like to read books, newspapers and comics on the train, so major Japanese companies such as Sony, Panasonic and NEC have all been developing such products aggressively, hoping e-books will displace traditional ones.

Accordingly, the products they develop are mostly light and compact, mainly with monochrome screens, designed around low power consumption, with content swapped in and out on cards. E-book readers built on electronic-paper display modules therefore have room to develop in Japan.

For Taiwan, Yang said, the e-book readers developed early on also had monochrome screens and few functions, and so never won the market's favor. Most products now on the market have strengthened multimedia functions; some can already play MP3 music.

Yang believes e-book readers in Taiwan should develop toward multimedia and open architectures: color screens, multimedia functions, internet access, searchable e-book content, and so on.

For that reason, when the government promotes its e-schoolbag policy, the platform it has adopted is the Tablet PC. Its content can include everything students need for class - timetables, the home-school contact book, online tests, internet access, dictionaries and more - and can even build in multimedia audio-visual functions, GEPT English-test review question banks, plus entertainment and game software, using multifunctionality to lighten students' schoolbags while raising their motivation and results.

Yang said that for e-book reading to become a real trend, hardware is not the point; the digital content of e-books matters most. Taiwan has few companies working in this area, and no common standards have yet been established for operating platforms, document formats or encryption schemes.

Hsu Cheng-lung, a sales and marketing manager in Acer's e-services business unit, said the e-book digital texts Acer currently offers support only ordinary PCs, notebooks and Tablet PCs running Windows or Mac OS; a non-mainstream platform such as Sony's LIBRI'e is unlikely to be supported.

fr.: http://taiwan.cnet.com/news/ce/0,2000062982,20089114,00.htm

Google fires back at Digital Envoy

Last modified: April 26, 2004, 5:25 PM PDT
By Stefanie Olsen
Staff Writer, CNET News.com

Google has countersued Digital Envoy, which claims that the search leader misappropriated its geo-targeting technology to deliver sponsored results.

Mountain View, Calif.-based Google filed a complaint against Digital Envoy on April 16 in the U.S. District Court for the Northern District of California, asking for a declaratory judgment that it did not misuse its technology license. It also claims that Digital Envoy wrongly filed its suit in Georgia, where the geo-targeting software company is based.

Google was responding to a lawsuit filed by Digital Envoy--its longtime technology partner--only weeks ago. The lawsuit, filed in U.S. District Court in Atlanta, alleges that Google overstepped the bounds of its contract to use Digital Envoy's IP Intelligence, software that pinpoints Web surfers' physical location and is used to better deliver local search results and relevant ads.

Google has held a license for the technology since November 2000 for its popular search engine. But when Google expanded its search-related advertising network to third-party publishers and non-search Web pages, it failed to secure appropriate licenses, according to the complaint.

"Google's going on the offensive," said Timothy Kratz, an attorney representing Digital Envoy.

Google representative David Krane said the company believes that Digital Envoy's allegations lack merit. "We have always and continue to act fully within our rights under our agreement with Digital Envoy."

Google currently pays $8,000 a month for its use of the Digital Envoy technology and has offered to increase that amount by 50 percent, Kratz said. But he suspects that Google is making millions from syndicating its ads to third parties, and that the court discovery process will determine just how much - and, with it, what appropriate fees would be.

Google's complaint argues that the case is a contractual dispute, requiring that it be filed in the state where the company is headquartered, California. But Kratz said the complaint is not contractual and therefore can be decided in Georgia. He said it's not a breach-of-contract claim but rather a set of tort claims: that Google has engaged in unfair business practices and misused technology outside the scope of the contract.

fr.: http://news.com.com/2100-1024_3-5200584.html?tag=st.lh