Overview

Your research. Your data. Your cloud.

We enable research teams to analyze, share and archive material from various data sources. Our secure and managed research platform is aimed at organizations that do not want to or cannot operate their own technology stack.

Big data projects are becoming increasingly important for newsrooms and research organizations to enable high-quality publications and investigative disclosures.


More and more data sources are available, and the analysis and maintenance of a data catalog is becoming increasingly complex.


The infrastructure required for this is cost-intensive and complex to manage, and many organizations therefore cannot implement it due to a lack of resources.


Our solutions form the complete package for your data-driven investigations.

Our Platform

Part of an established ecosystem

Our platform is built with state-of-the-art technology and connects to established tools such as Aleph and the followthemoney toolkit. On top of that, we build our own open source solutions.

The Secure Research Hub

Our platform combines established search software into a secure analysis and archive system. We deploy exclusive and independent instances for each of our clients based on their individual needs.

  • Securely search and store data
  • Import external datasets on schedule
  • Manage users and group permissions
  • Independent and secure servers
  • Tailored to each organization's needs

We call our solution the Secure Research Hub. It is a managed "software as a service" solution, yet exclusive and independent for each client. Security, backups and everything else needed for smooth operation are included. This is what it is made of:

Aleph

A data platform that archives all your documents, leaks, and investigations.

Nextcloud

We integrate Nextcloud so that organizations can upload documents easily and sync folders into Aleph collections.

Investigraph

Our framework to scrape and manage collections of structured data and import them into Aleph on schedule.

Infrastructure

Rest assured knowing that you have security and data sovereignty. We own our servers and operate them in an ISO-certified data center in Germany.

Additionally, we partner with FlokiNET, a secure hosting provider, to offer you alternative jurisdictions in and outside of the EU.

Data engineering

Empowering your data journalism projects

Our framework investigraph helps you to set up a pipeline that can extract, transform and load data from many formats into various target systems.

A research platform without data is useless. That's why a large data catalog is part of the platform, from which editorial teams can assemble their own individual package. And, of course, they can securely add their own data at any time.

  • Data extraction
    Scrape data from public websites, APIs, JSON or CSV data dumps, SQL databases, and more.
  • Data transformation
    Map the source data to the common followthemoney model to create connections between persons of interest, companies, and public entities.
  • Data loading
    Store your datasets on your computer or in the cloud, share them with your team or external collaborators, or import them into your Aleph instance on a schedule.
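The three steps above can be sketched in a few lines of plain Python. This is a minimal, self-contained illustration, not the actual investigraph API: the inlined CSV source and the simplified followthemoney-style entity dicts are assumptions for the example.

```python
import csv
import hashlib
import io
import json

# Extract: a CSV data dump, inlined here; in practice this would be
# fetched from a website, API, or database.
CSV_SOURCE = """name,company
Jane Doe,Acme Holdings Ltd.
John Roe,Acme Holdings Ltd.
"""

def extract(source: str):
    """Yield raw rows from a CSV data dump."""
    yield from csv.DictReader(io.StringIO(source))

def make_id(*parts: str) -> str:
    """Derive a stable entity id from identifying fields."""
    return hashlib.sha1("|".join(parts).encode()).hexdigest()

def transform(row: dict) -> list[dict]:
    """Map a source row to simplified followthemoney-style entities."""
    person = {"id": make_id("person", row["name"]),
              "schema": "Person",
              "properties": {"name": [row["name"]]}}
    company = {"id": make_id("company", row["company"]),
               "schema": "Company",
               "properties": {"name": [row["company"]]}}
    # The connection between the two is itself an entity, as in the
    # followthemoney model.
    link = {"id": make_id("employment", person["id"], company["id"]),
            "schema": "Employment",
            "properties": {"employee": [person["id"]],
                           "employer": [company["id"]]}}
    return [person, company, link]

def load(entities) -> str:
    """Serialize entities as JSON lines, ready for a bulk import."""
    return "\n".join(json.dumps(e, sort_keys=True) for e in entities)

entities = [e for row in extract(CSV_SOURCE) for e in transform(row)]
print(load(entities))
```

Because the ids are derived deterministically from the source fields, both rows resolve to the same company entity, so re-running the pipeline on a schedule updates rather than duplicates data.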

We strive to empower data-driven investigations. That's why we contribute to editorial projects and build our own public databases. Building upon our data platform and infrastructure, we help our clients bring journalism projects to life, with a decade of experience in:

  • Creating datasets from publicly available sources
  • Designing and building data exploration platforms
  • Finding more data to enrich existing projects
  • Finding missing links between different data sources

Have a look at some of our recent projects to get a glimpse of what we can do for you.

Consulting

Streamline your Data Workflow

Our consultation services can empower your newsroom and editorial teams, from project inception to rapid data organization, ensuring efficiency and accuracy every step of the way.
  • How to organize data-driven investigations in newsrooms and teams
  • How to manage datasets within a cross-border investigation
  • How to build reproducible data pipelines
  • How to bring editorial projects and their data to life

Need it fast?

Whatever happens during a project, we can jump in and help out quickly.

We can set up a temporary Aleph instance for a few hours or days so your organization can look into a leaked dataset, and wipe it afterwards. Or we can use our servers and storage to temporarily provide extra space to park huge datasets.

Whatever you need, don't hesitate to get in touch and we'll try to make it happen.


We would love to hear from you!

Book a call