Overview
Your research. Your data. Your cloud.
Big data projects are becoming increasingly important for newsrooms and research organizations, enabling high-quality publications and investigative disclosures.
More and more data sources are available, and building and maintaining a data catalog is becoming increasingly complex.
The infrastructure required for this is costly and complex to manage, so many organizations lack the resources to run it themselves.
Our solutions provide the complete package for your data-driven research.
Our Platform
Part of an established ecosystem
The Secure Research Hub
Our platform combines established search software into a secure analysis and archive system. For each client, we deploy a dedicated, independent instance tailored to their individual needs.
- Securely search and store data
- Import external datasets on schedule
- Manage users and group permissions
- Independent and secure servers
- Exclusive to each organization's needs
We call our solution the Secure Research Hub. It is a managed software-as-a-service solution, but exclusive and independent for each client. Security, backups, and everything else needed for smooth operation are included. This is what it is made of:
Nextcloud
We integrate Nextcloud so that organizations can upload documents easily and sync folders into Aleph collections.
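As a rough sketch of what this integration does behind the scenes, the open-source alephclient library can push the contents of a synced folder into an Aleph collection. The host, API key, folder path, and collection foreign_id below are placeholders, not our actual production setup:

```python
# Minimal sketch, assuming the open-source alephclient library; the host,
# API key, folder path and foreign_id are placeholder assumptions.
from pathlib import Path

from alephclient.api import AlephAPI

api = AlephAPI(host="https://aleph.example.org", api_key="secret-api-key")

# Get (or create) the collection that mirrors the Nextcloud folder.
collection = api.load_collection_by_foreign_id("nextcloud_inbox")

# Upload every file from the synced folder into the collection.
for path in Path("/data/nextcloud/inbox").rglob("*"):
    if path.is_file():
        api.ingest_upload(collection["id"], path, metadata={"file_name": path.name})
```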
Investigraph
Investigraph is our framework for scraping and managing collections of structured data and importing them into Aleph on a schedule.
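Investigraph itself is driven by per-dataset configuration; the snippet below is only a hedged illustration of the recurring import step it automates, again using the open-source alephclient library. All names and credentials are made up:

```python
# Illustration only: not investigraph's internal API. This shows the
# recurring "push structured entities into Aleph" step that investigraph
# schedules; host, API key, foreign_id and the sample entity are made up.
from alephclient.api import AlephAPI

api = AlephAPI(host="https://aleph.example.org", api_key="secret-api-key")
collection = api.load_collection_by_foreign_id("example_dataset")

# Entities in FollowTheMoney JSON format, e.g. the output of a scraper run.
entities = [{
    "id": "company-001",
    "schema": "Company",
    "properties": {"name": ["Example Holding GmbH"], "jurisdiction": ["de"]},
}]

# Bulk-write the batch into the collection; running this on a schedule
# (cron, CI, etc.) keeps the dataset in Aleph up to date.
api.write_entities(collection["id"], entities)
```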
Infrastructure
Rest assured that your data remains secure and sovereign. We own our servers and operate them in an ISO-certified data center in Germany.
Additionally, we partner with FlokiNET, a secure hosting provider, to offer you alternative jurisdictions inside and outside the EU.
Data engineering
Empowering your data journalism projects
A research platform without data is useless. That's why a large data catalog is part of the cloud, from which editorial teams can put together their own individual package. And, of course, they can add their own data at any time, in a protected manner.
- Data extraction
Scrape data from public websites, APIs, JSON or CSV data dumps, SQL databases, and more.
- Data transformation
Map the source data to the common FollowTheMoney model to create connections between persons of interest, companies, and public entities (see the sketch after this list).
- Data loading
Store your datasets on your computer or in the cloud, share them with your team or external collaborators, or import them into your Aleph instance, on repeat.
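To make the transformation step concrete, here is a minimal sketch of mapping one scraped record to FollowTheMoney entities with the open-source followthemoney library. The record and its field names are hypothetical:

```python
# Hedged sketch of the "data transformation" step: mapping one scraped
# record to FollowTheMoney entities. The record and its keys are
# hypothetical; the followthemoney library and its schemata are real.
from followthemoney import model

record = {"company": "Example Holding GmbH", "country": "de", "owner": "Jane Doe"}

company = model.make_entity("Company")
company.make_id("example-register", record["company"])
company.add("name", record["company"])
company.add("jurisdiction", record["country"])

person = model.make_entity("Person")
person.make_id("example-register", record["owner"])
person.add("name", record["owner"])

# Link the two via an Ownership entity, so Aleph can draw the connection.
ownership = model.make_entity("Ownership")
ownership.make_id("ownership", person.id, company.id)
ownership.add("owner", person.id)
ownership.add("asset", company.id)

for entity in (company, person, ownership):
    print(entity.to_dict())
```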
We strive to empower data-driven investigations. That's why we contribute to editorial projects and build our own public databases. Building upon our data platform and infrastructure, we help our clients bring journalism projects to life, with a decade of experience in:
- Creating datasets from publicly available sources
- Designing and building data exploration platforms
- Finding more data to enrich existing projects
- Finding missing links between different data sources
Have a look at some of our recent projects to get a glimpse of what we can do for you.
Consulting
Streamline your data workflow
- How to organize data-driven investigations in newsrooms and teams
- How to manage datasets within a cross-border investigation
- How to build reproducible data pipelines
- How to bring editorial projects and their data to life
Need it fast?
Whatever happens during a project, we can jump in and help out quickly.
We can set up a temporary Aleph instance for a few hours or days, so your organization can look into a leaked dataset, and wipe it afterwards. Or we can use our servers and storage to temporarily provide extra space to park huge datasets.
Whatever you need, don't hesitate to get in touch and we'll try to make it happen.