DEX8

DEX8 is a data extraction and task automation SaaS platform with infinite possibilities. Use its web robots to scrape data by creating your own scripts or by using premade scripts from the Turnkey Solutions library. This tool is for everybody: a beginner, intermediate or experienced data extractor can quickly start a project collecting public data. JavaScript developers can create their own data extraction scripts by utilising the rich DEX8 Helper Library.
Sign up and get all basic platform features for free.

How does the DEX8 platform work?

The process of extracting data is easy and can be carried out in a few simple steps.

Create a Robot Task

Develop your own crawler script in JavaScript, or select one from our Turnkey Solutions library.
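
As a rough sketch only (the actual task structure and helpers come from the DEX8 Helper Library, which is not shown here), a crawler script in plain JavaScript could look like the following; the target URL and function names are made up for illustration:

    // Illustrative crawler sketch in plain JavaScript (Node 18+, built-in fetch).
    // NOTE: this is not the DEX8 Helper Library API; all names here are hypothetical.
    async function crawl(url) {
      const response = await fetch(url);      // download the page
      const html = await response.text();     // raw HTML as a string
      // Naive extraction: collect the text of every <h2> heading on the page.
      const titles = [...html.matchAll(/<h2[^>]*>(.*?)<\/h2>/gis)]
        .map(match => match[1].trim());
      return { url, titles, scrapedAt: new Date().toISOString() };
    }

    crawl('https://example.com')
      .then(doc => console.log(JSON.stringify(doc, null, 2)))
      .catch(err => console.error('Crawl failed:', err));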

Deploy a Robot Task

Deploy a crawler script as a robot task on one or many servers.

Start a Crawler

Start a crawler - also known as a robot - and it will extract data from the Web. Use the DEX8 Web Panel to start, stop or pause the robot with a single click.

Store Data

Save the extracted data in the MongoDB database as a JSON document.
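
Purely to illustrate the shape of such a document (the field names below are made up for this example, not a fixed DEX8 schema), a stored JSON document could look like this:

    {
      "url": "https://example.com/products/123",
      "title": "Example product",
      "price": 19.99,
      "currency": "EUR",
      "scrapedAt": "2024-05-01T10:30:00Z"
    }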

Basic Features & Platform Components

  • Web Panel - A control panel where you can log in and manage your robots, robot tasks, databases, task scheduler, etc. from your web browser.
  • Robots - Servers that run JavaScript code from different IP addresses. Also called crawlers, scrapers or data extractors.
  • Tasks - Scripts written in JavaScript and executed by your robots.
  • Turnkey Solutions - Premade tasks, which eliminate the need for scripting and complicated programming.
  • Task Runner - Use this to start, stop or pause your robots from your browser with only one click.
  • NoSQL Database - Save extracted data as a JSON document.
  • Exporter - This exports saved JSON documents into other formats: for example, XLS, CSV, PDF (a conceptual sketch follows this list).
  • Scheduler - This enables you to start/stop robot tasks within a specified time-span.
  • Shop - Buy more DEX8 resources and expand your crawler capacity.
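
The Exporter mentioned above performs this conversion inside the platform; purely as a conceptual sketch of what an export step does (plain JavaScript, not the DEX8 Exporter itself), turning an array of JSON documents into CSV could look like this:

    // Illustrative only: a tiny JSON-to-CSV conversion in plain JavaScript.
    // The real DEX8 Exporter runs inside the platform; this just shows the idea.
    function toCsv(docs) {
      if (docs.length === 0) return '';
      const headers = Object.keys(docs[0]);   // column names from the first document
      const escape = value => `"${String(value).replace(/"/g, '""')}"`;
      const rows = docs.map(doc => headers.map(key => escape(doc[key])).join(','));
      return [headers.join(','), ...rows].join('\n');
    }

    // Hypothetical documents, just to show the output shape.
    const docs = [
      { url: 'https://example.com/a', title: 'Page A' },
      { url: 'https://example.com/b', title: 'Page B' },
    ];
    console.log(toCsv(docs));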