
What’s Dark Web Crawling

Dark Web Crawling is an indexer project that can perform address discovery in the Tor network and on the surface web.

What is Dark Web Crawling?

In-Depth Analysis and Intelligence Area

  • As Simple As Possible
  • Methodology Of The DarkMap
  • Smart Bot Technology
  • Unnoticed Crawling Technology
  • Periodic Web Page Controls
  • Archiving Of All Content
  • Everything At Your Server
  • Expandable Structure

What Is Dark Web Crawling?

It is a crawler system that allows you to search in a simple interface.

Dark Web Crawling is an indexer project that can perform address discovery in the Tor network and on the surface web.

Our main goal is to explore and index all of the layers that make up the internet.

Archiving the targeted pages without missing any content is one of our main goals.

We automate the content tracking and discovery processes performed manually by cyber intelligence analysts.

As Simple As Possible

It is a crawler system that allows you to search in a simple interface.

The DarkMap Module is a crawler system that allows you to search the contents collected from all layers of the internet in a simple interface. What sets it apart from other crawling tools is that it presents the content collected from the web, deep web and dark web in a functional, simple and holistic way, so that analysts can research it easily.
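
A minimal sketch of this idea, assuming nothing about the DarkMap module itself: collected pages from any layer go into a single full-text index and can be queried with one call. It uses SQLite's FTS5 extension (available in recent Python builds); the schema, file name and function names below are illustrative, not part of the product.

import sqlite3

# One shared full-text index for content collected from any layer
# (surface web, deep web, dark web). Schema and names are illustrative.
db = sqlite3.connect("darkmap_demo.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS pages USING fts5(url, layer, content)")

def index_page(url, layer, content):
    db.execute("INSERT INTO pages VALUES (?, ?, ?)", (url, layer, content))
    db.commit()

def search(query):
    # ranked full-text search across everything that has been collected
    return db.execute(
        "SELECT url, layer FROM pages WHERE pages MATCH ? ORDER BY rank",
        (query,),
    ).fetchall()

index_page("http://example.com/post", "surface", "leaked credential discussion")
print(search("credential"))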

Methodology Of The DarkMap

In-Depth Analysis and Intelligence Area

  • DarkMap explores new connections with a tree-branch methodology (a rough sketch follows this list).
  • DarkMap builds rich content by adding every link on a page it visits to the visit list and then visiting those pages in turn.
  • As the content collection task proceeds from the given target pages, the collection grows day by day in a controlled, manageable and meaningful way.
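
As a rough illustration of the tree-branch approach (not DarkMap's actual code), the sketch below performs breadth-first link discovery using the third-party requests and beautifulsoup4 packages; the seed URL, page limit and proxy hint are assumptions.

from collections import deque
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def discover(seed_url, max_pages=100):
    """Breadth-first link discovery: every link found on a visited page
    is added to the visit list and then visited in turn."""
    visited = set()
    queue = deque([seed_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            html = requests.get(url, timeout=30).text
        except requests.RequestException:
            continue  # an unreachable page is skipped, not fatal
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if link not in visited:
                queue.append(link)  # grow the tree one branch at a time
    return visited

# For a Tor (.onion) target, requests would additionally need a SOCKS proxy,
# e.g. proxies={"http": "socks5h://127.0.0.1:9050"} with requests[socks] installed.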

Unnoticed Crawling Technology

DarkMap is configured to go unnoticed by web page restriction algorithms. It has not been detected by any of the platforms that can detect bot movements, including Facebook, or by sites that block crawlers.

DarkMap, which can already save the data of open pages and groups, can also obtain and save the necessary information from closed groups using the username and password defined for it.
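
Purely as an illustration of low-profile crawling in general, not of DarkMap's own technique, the sketch below combines irregular request pacing, rotating User-Agent headers, and a logged-in session for pages behind credentials; the login URL and form field names are placeholders.

import random
import time

import requests

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (X11; Linux x86_64)",
]

def polite_get(session, url):
    # irregular pacing and a varying User-Agent look less like a fixed-beat bot
    time.sleep(random.uniform(2.0, 8.0))
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    return session.get(url, headers=headers, timeout=30)

def crawl_closed_group(login_url, member_url, username, password):
    # a Session keeps the login cookies, so members-only pages stay reachable
    session = requests.Session()
    session.post(login_url, data={"username": username, "password": password})
    return polite_get(session, member_url).text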

Periodic Web Page Controls

In-Depth Analysis and Intelligence Area

  • The page visit frequency is determined by the users.
  • Target pages can be checked and saved monthly, weekly, daily, hourly or by the minute (see the loop sketched after this list).
  • In this way, up-to-date content for the target page is tracked.
  • Because of these periodic web page visits, cyber intelligence analysts can obtain the page contents in full.
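
A bare-bones version of such a periodic check, using only Python's standard library; the interval table, snapshot folder and file naming are assumptions made for illustration.

import time
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

# visit frequencies selectable by the user; "monthly" approximated as 30 days
INTERVALS = {"minutely": 60, "hourly": 3600, "daily": 86400,
             "weekly": 604800, "monthly": 2592000}

def watch(url, frequency="hourly", archive_dir="snapshots"):
    Path(archive_dir).mkdir(exist_ok=True)
    while True:
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        try:
            html = urllib.request.urlopen(url, timeout=30).read()
            # every visit is saved, so the page's history can be replayed later
            Path(archive_dir, f"{stamp}.html").write_bytes(html)
        except OSError:
            pass  # a failed visit is simply retried on the next cycle
        time.sleep(INTERVALS[frequency])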

Archiving Of All Content

In-Depth Analysis and Intelligence Area

  • DarkMap keeps the pages and applications it visits as HTML in its discovery content.
  • If desired, all of a page's content (text, images, video and audio files) can be archived, regardless of which component of the page it belongs to (a sketch of this follows the list).
  • These components remain preserved in the archive, ready to be sorted and analyzed historically.
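
As a sketch of whole-page archiving (assumed layout, not the product's storage format): the HTML is saved together with every image, video and audio file it references, under a timestamped folder so snapshots can later be sorted and compared. It relies on the third-party requests and beautifulsoup4 packages.

from datetime import datetime, timezone
from pathlib import Path
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def archive_page(url, root="archive"):
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    folder = Path(root, stamp)
    folder.mkdir(parents=True, exist_ok=True)

    html = requests.get(url, timeout=30).text
    (folder / "page.html").write_text(html, encoding="utf-8")

    # pull down every media component the page references
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(["img", "video", "audio", "source"], src=True):
        src = urljoin(url, tag["src"])
        name = Path(urlparse(src).path).name or "component"
        try:
            (folder / name).write_bytes(requests.get(src, timeout=30).content)
        except requests.RequestException:
            continue  # a missing component does not abort the snapshot
    return folder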

Everything At Your Server

In-Depth Analysis and Intelligence Area

  • DarkMap hosts all of its high-importance discoveries on the servers it runs on.
  • Because it can save pages and channels, DarkMap provides analysts with a realistic usage experience and 24/7 support.

Expandable Structure

In-Depth Analysis and Intelligence Area

  • DarkMap is built on a micro-service architecture, so it can be scaled both horizontally and vertically (pictured in the sketch below).
  • System resources and storage can be increased easily.
  • The system adapts easily as capacity and storage needs grow.
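
The scaling idea can be pictured with a queue-and-workers sketch: crawl targets go through one shared queue, and adding capacity simply means starting more worker processes (or, in a real micro-service deployment, more containers). Everything below is illustrative and uses only Python's standard library.

from multiprocessing import Process, Queue

def worker(queue):
    while True:
        url = queue.get()
        if url is None:            # sentinel: no more work for this worker
            break
        print(f"would crawl and index {url}")  # a real worker fetches here

if __name__ == "__main__":
    queue = Queue()
    # horizontal scaling: raise the worker count (or run workers on more hosts)
    workers = [Process(target=worker, args=(queue,)) for _ in range(4)]
    for w in workers:
        w.start()
    for url in ["http://example.com/a", "http://example.com/b"]:
        queue.put(url)
    for _ in workers:
        queue.put(None)
    for w in workers:
        w.join()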