
Search engine for the Deep Web

Logicgateone Corp. 9 years ago
The United States' Defense Advanced Research Projects Agency (DARPA) is reportedly working on a search engine of sorts for the Deep Web, intended to aid law enforcement in combating human trafficking.

DARPA's three-year research program, dubbed Memex, has now been running for six months in coordination with 17 contractors and 8 partners. Carnegie Mellon was awarded a USD 3.6 million contract this year to partner with DARPA on the Memex project, and according to a research professor at its Robotics Institute, the program has huge potential.

The name "Memex" refers to Vannevar Bush's 1945 essay, which imagined a device that would serve as a collective memory for humanity. Today's Memex, by contrast, is a tool for bypassing typical search processes in order to pinpoint users who create or spread illegal content on the Deep Web.

The mainstream webpages we can access through traditional methods make up what is called the Surface Web. Because of factors like ranking and advertising, results from leading search engines account for only around 10% of the Web. By some estimates, the Deep Web comprises the remaining 90% of the online world, with much of its content accessible only through Tor. Memex's goal is to search this Deep Web for signs of illegal activity.

The problem with the darknet is that its webpages are fleeting: content disappears before authorities can examine it. Memex addresses this by logging each page's content and source, then tracing any personal information about the poster and mapping locations. This makes it easier to track online patterns that help law enforcement uncover illegal activity, as in the case of Silk Road.
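The core idea described above — capturing a page before it vanishes, along with where and when it was seen — can be sketched in a few lines. The following Python snippet is a minimal, hypothetical illustration only; the function names and storage scheme are assumptions, not DARPA's actual design:

```python
import hashlib
import time

def archive_page(url, html, store):
    """Record a snapshot of a page: its source URL, a hash of its content,
    and the capture time, so evidence survives even if the page disappears."""
    digest = hashlib.sha256(html.encode("utf-8")).hexdigest()
    snapshot = {
        "url": url,
        "content": html,
        "sha256": digest,
        "captured_at": time.time(),
    }
    # Keep every snapshot per URL so changes over time can be compared.
    store.setdefault(url, []).append(snapshot)
    return digest

# Example: two visits to the same (hypothetical) onion address
store = {}
h1 = archive_page("http://example.onion/listing", "<html>listing v1</html>", store)
h2 = archive_page("http://example.onion/listing", "<html>listing v2</html>", store)
print(len(store["http://example.onion/listing"]))  # 2 snapshots logged
print(h1 != h2)  # True: the content changed between visits
```

Comparing hashes across repeated crawls is one simple way such a system could detect that a fleeting page was edited or replaced.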

Evidently, this won't be useful only in law enforcement; it can also be repurposed for the business sector and other fields such as journalism.

DARPA program manager Chris White said in a post, "We're envisioning a new paradigm for search that would tailor indexed content, search results and interface tools to individual users and specific subject areas, and not the other way around. By inventing better methods for interacting with and sharing information, we want to improve search for everybody and individualize access to information."