Onion.City, the new search engine for the Deep Web you can use from a common browser

Pierluigi Paganini February 21, 2015

Onion.City is a new search engine specialized in the black markets of the Deep Web and accessible from a common browser.

We have described several times the non-indexed portion of the web known as the Deep Web, an impressive amount of content that the majority of netizens completely ignore.

The Deep Web is also known for the anonymity it offers; for this reason, among the actors that use it daily there are groups of cybercriminals.

Law enforcement and intelligence agencies are spending a great deal of effort trying to de-anonymize users on the Deep Web and index its content. Recently the Defense Advanced Research Projects Agency (DARPA) publicly presented the Memex Project, a new set of search tools that will also improve research into the “Deep Web”.

Recently a new search engine dubbed Onion.City appeared on the Surface Web; it’s a Google-like tool that allows users to easily search for content on the Deep Web.

Onion.City is a new search engine for online black markets that allows users to easily find and buy illegal goods in the underground.


It seems very easy to buy drugs, stolen credit cards, and weapons using only common browsers, including Chrome, Internet Explorer, or Firefox, without installing and browsing via the Tor Browser.

The Onion.City Deep Web search engine was presented by Virgil Griffith on the Tor-talk mailing list; the tool is able to search nearly 650,000 pages on the Tor network, displaying the results in a normal browser.

The Onion.City search engine is based on the Tor2web proxy; the author exposes all the tor2web onion URLs in his sitemap, so Google is able to crawl them and index their content.
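
The mechanism is easy to illustrate. The following is a minimal sketch, not Onion.City’s actual code, of the tor2web-style URL rewriting that makes hidden services reachable (and crawlable) through a clearnet gateway; the onion address in the example is invented for illustration.

# Illustrative sketch of tor2web-style URL rewriting (assumed, not Onion.City's code).
from urllib.parse import urlsplit, urlunsplit

GATEWAY = "onion.city"  # clearnet gateway domain

def to_gateway_url(onion_url: str) -> str:
    """Rewrite http://<foo>.onion/<path> into http://<foo>.onion.city/<path>."""
    parts = urlsplit(onion_url)
    host = parts.hostname or ""
    if not host.endswith(".onion"):
        raise ValueError("not a hidden-service URL")
    label = host[: -len(".onion")]            # e.g. "examplexyz123"
    gateway_host = f"{label}.{GATEWAY}"       # e.g. "examplexyz123.onion.city"
    return urlunsplit((parts.scheme, gateway_host, parts.path, parts.query, parts.fragment))

# A sitemap listing such rewritten URLs is what allows an ordinary crawler to reach the content:
print(to_gateway_url("http://examplexyz123.onion/market/index.html"))
# -> http://examplexyz123.onion.city/market/index.html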

“Everything available on the Google Custom Search is also available on a regular google search with the qualifier: ‘site:onion.city’,” said Griffith.

The current Onion.City implementation, as explained by the author, is suboptimal because clients connect directly to Google.

“Alas no. I’m aware this is suboptimal. I see GOOG search engine as a temporary-ladder just to get the ball rolling. I am open to using any other index. For what it’s worth I’m very pleased with GOOG’s performance—right now it’s searching an index of 650k onion pages and the number grows every day.”

Another issue debated about Onion.City is that the search engine uses only plain HTTP; this means it lacks traffic encryption, exposing users to eavesdropping.

“It’s especially crazy if you allow your clients to submit HTTP forms over onion.city, since it basically means that onion.city gets to see *all* the usernames and passwords. I bet there are many people out there who don’t really get the tor2web threat model, and it’s nasty to read their passwords.” said one user in the discussion. 
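
To make the concern concrete, here is a minimal sketch of the threat model described above; the hostname and form fields are invented for illustration. Logging in to a hidden service through the clearnet gateway is an ordinary HTTP POST to the proxy, so the proxy operator, and without HTTPS anyone between the browser and the proxy, can read the submitted credentials before the request is relayed into Tor.

# Sketch of the tor2web threat model (hostname and credentials are made up).
import requests

resp = requests.post(
    "http://examplemarketxyz.onion.city/login",          # plain HTTP to the gateway
    data={"username": "alice", "password": "hunter2"},   # sent in cleartext, readable by the gateway
    timeout=30,
)
print(resp.status_code)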

Griffith explained that Onion.City doesn’t maintain logs of users’ traffic and that he understands users’ concerns; unfortunately, he doesn’t have sufficient funds at the moment to implement HTTPS.

Onion.City isn’t the first ever Deep Web search engine; last year Grams, the first search engine specialized in black markets, appeared on the Surface Web.

I close with a curiosity: looking at the Frequently Asked Questions (FAQs) on the Onion.City website, the author explains that users can report content that may be illegal.

Enjoy Onion.City …

(Security Affairs – Onion.City, Black Market)


