Bright Planet, Deep Web

A movie database and an online bookstore are web sites in the sense that a file is downloaded to the user's browser when he or she surfs to these addresses. But that is where the similarity ends. These web pages are front-ends, gates to underlying databases. The databases contain records regarding the plots, themes, characters and other features of, respectively, movies and books. Every user query generates a unique web page whose contents are determined by the query parameters. The number of distinct pages that can thus be generated is mind-boggling. Search engines operate on the same principle: vary the search parameters slightly and entirely new pages are generated. It is a dynamic, user-responsive and chimerical sort of web.
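To make the mechanics concrete, here is a minimal sketch of such a front-end, not drawn from any of the sites alluded to above: the page is assembled from a database at request time, so every combination of query parameters produces a page that exists only for that query. The Flask framework, the films.db file, the films table and the theme parameter are all assumptions made for illustration.

    # Minimal sketch of a database front-end that builds a page on the fly
    # from query parameters, so each distinct query yields a distinct "page"
    # that never exists as a static file for a crawler to fetch.
    # Assumes a hypothetical SQLite file "films.db" with a table films(title, plot).
    import sqlite3
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/search")
    def search():
        theme = request.args.get("theme", "")  # the query parameter drives the content
        conn = sqlite3.connect("films.db")
        rows = conn.execute(
            "SELECT title, plot FROM films WHERE plot LIKE ?", (f"%{theme}%",)
        ).fetchall()
        conn.close()
        # The "page" below exists only for this particular combination of parameters.
        items = "".join(f"<li>{title}: {plot}</li>" for title, plot in rows)
        return f"<h1>Results for '{theme}'</h1><ul>{items}</ul>"

    if __name__ == "__main__":
        app.run()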

These are excellent examples of what http://www.brightplanet.com calls the "Deep Web" (previously and inaccurately described as the "Invisible Web"). BrightPlanet estimates that the Deep Web is 500 times the size of the "Surface Web" (the portion spidered by traditional search engines). This translates to c. 7,500 terabytes of information (versus 19 terabytes in the entire known surface web, excluding the databases of the search engines themselves), or 550 billion documents organized in 100,000 deep web sites. By comparison, Google, the most comprehensive search engine ever, stores 1.4 billion documents in its immense caches at http://www.google.com. The natural inclination to dismiss these pages as mere re-arrangements of the same information is wrong. In fact, this underground ocean of covert intelligence is often far more valuable than the information freely available or easily accessible on the surface. Hence the ability of c. 5% of these databases to charge their users subscription and membership fees. The average deep web site receives 50% more traffic than a typical surface site and is far more linked to by other sites. Yet it is transparent to classic search engines and little known to the surfing public.

It was only a question of time before someone came up with a search technology to tap these depths (www.completeplanet.com).

LexiBot, in the words of its inventors, is…

"…the first and only search technology capable of identifying, retrieving, qualifying, classifying and organizing "deep" and "surface" content from the World Wide Web. The LexiBot allows searchers to dive deep and explore hidden data from multiple sources simultaneously using directed queries. Businesses, researchers and consumers now have access to the most valuable and difficult-to-find information on the Web and can retrieve it with pinpoint accuracy."

It places dozens of queries, in dozens of threads, simultaneously, and spiders the results (rather as a "first generation" search engine would do). This could prove immensely useful with massive databases such as the human genome, weather patterns, simulations of nuclear explosions, thematic multi-featured databases, intelligent agents (e.g., shopping bots) and third-generation search engines. It could also have implications for the wireless internet (for instance, in analysing and generating location-specific advertising) and for e-commerce (which amounts to the dynamic serving of web documents).
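The paragraph above describes the general technique rather than LexiBot's actual implementation. The rough Python sketch below shows the idea: place directed queries against several database front-ends in parallel threads, then collect whatever pages come back for later qualification and classification. The endpoint URLs and the "q" parameter name are invented for illustration.

    # Sketch of multi-threaded directed querying of database front-ends.
    # Not LexiBot's code; the endpoints and parameter names are hypothetical.
    from concurrent.futures import ThreadPoolExecutor
    import urllib.parse
    import urllib.request

    ENDPOINTS = [
        "https://example-movie-db.test/search",
        "https://example-book-db.test/search",
        "https://example-weather-db.test/search",
    ]

    def directed_query(endpoint: str, term: str) -> tuple[str, str]:
        """Send one query to one deep-web front-end and return the raw page."""
        url = f"{endpoint}?{urllib.parse.urlencode({'q': term})}"
        with urllib.request.urlopen(url, timeout=10) as resp:
            return endpoint, resp.read().decode("utf-8", errors="replace")

    def search_all(term: str) -> list[tuple[str, str]]:
        """Query every endpoint simultaneously, one thread per source."""
        results = []
        with ThreadPoolExecutor(max_workers=len(ENDPOINTS)) as pool:
            futures = [pool.submit(directed_query, ep, term) for ep in ENDPOINTS]
            for f in futures:
                try:
                    results.append(f.result())
                except OSError:
                    pass  # an unreachable source simply contributes nothing
        return results

    if __name__ == "__main__":
        for source, page in search_all("nuclear winter"):
            print(source, len(page), "bytes retrieved")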

This transition from the static to the dynamic, from the given to the generated, from the one-dimensionally linked to the multi-dimensionally hyperlinked, from the deterministic to the contingent, heuristically-generated and uncertain content: this is the real revolution and the future of the web. Search engines have lost their efficacy as gateways. Portals have taken over, but most people now use internal links (within the same web site) to get from one place to another. This is where the deep web comes in. Databases are all about internal links. Hitherto they existed in splendid isolation, universes closed to all but the most persistent and knowledgeable. This might be about to change. The flood of quality, relevant information this will unleash will dramatically dwarf anything that preceded it.
