What Can Yahoo Learn From Malware?

Search engines would be more efficient if they were decentralized, reports MIT's Technology Review. In other words, they'd be faster if they were run like bot-nets.

Bot-nets are networks of zombie computers whose processing resources have been possessed by a virus, ready to be exploited by whoever spread it. Just two years ago, it was thought that the most powerful supercomputer on earth was actually a bot-net created by the Storm worm. (Below, a modern data center.)

[Image: a modern data center]

Search engines like Google and Yahoo, by comparison, usually centralize their processing in enormous data centers. Researchers at Yahoo have begun to re-think the approach, according to MIT, and theorize that it might be better to spread the engine's index around a smattering of smaller data centers. According to MIT: "With this approach, smaller data centers would contain locally relevant information and a small proportion of globally replicated data. Many search queries common to a particular area could be answered using the content stored in a local data center, while other queries would be passed on to different data centers."
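
The report is light on implementation details, but the routing idea is simple enough to sketch. The toy Python below is my own illustration, not anything Yahoo has published: the DataCenter class, the index layout, and answer_query are all stand-ins for whatever the researchers actually built.

```python
from dataclasses import dataclass

@dataclass
class DataCenter:
    """One small, regional data center holding a slice of the search index."""
    name: str
    local_index: dict[str, list[str]]   # locally relevant pages
    global_index: dict[str, list[str]]  # small proportion of globally replicated data

def answer_query(query: str, home: DataCenter, others: list[DataCenter]) -> list[str]:
    """Answer from the local data center when possible; otherwise pass the query on."""
    # Queries common to the area are served from local or replicated-global content.
    if query in home.local_index:
        return home.local_index[query]
    if query in home.global_index:
        return home.global_index[query]
    # Everything else gets forwarded to the other data centers.
    for dc in others:
        if query in dc.local_index:
            return dc.local_index[query]
    return []
```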


To make the distributed model work, Yahoo researchers have designed a scheme in which statistical information about page rankings is shared between data centers. Each small data center would run the query concurrently, and the ones with the statistically better results would respond to the user.
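
Again, a rough sketch under my own assumptions: the dictionaries, the score function, and best_answer below are hypothetical stand-ins for Yahoo's shared ranking statistics, just to show the "run everywhere, let the best result answer" shape of the idea.

```python
import concurrent.futures

# Toy stand-ins: each data center is just a mapping from query to results.
DATA_CENTERS = {
    "us-east": {"pizza near me": ["Joe's Pizza", "Lombardi's"]},
    "eu-west": {"pizza near me": ["L'Antica Pizzeria"]},
}

def score(results):
    """Hypothetical stand-in for the shared ranking statistics; more hits = better."""
    return len(results)

def best_answer(query):
    """Run the query on every data center concurrently; the best-scoring one answers."""
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(index.get, query, []) for index in DATA_CENTERS.values()]
        results = [f.result() for f in concurrent.futures.as_completed(futures)]
    return max(results, key=score)

print(best_answer("pizza near me"))  # -> ["Joe's Pizza", "Lombardi's"]
```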

This is not so far from another search engine's approach. ChaCha answers user queries in a similarly distributed fashion, but instead of data centers it uses humans.

Of course, Yahoo's research may all come to naught; its search engine is about to be replaced by Bing here in the U.S., and possibly on Yahoo sites the world over.
