Berlin, Germany

We are looking for a developer with Go experience to join a new cross-functional team. The Datacore team builds machine learning and data products for internal users. For example, one of our current projects is migrating all teams' data-related processes and tools away from Postgres to our Presto/Parquet-based data lake. Future projects include building an integrated simulation environment to help us fully automate releases. If you're an experienced developer and enjoy a challenge, join us!

What we're looking for:
  • Solid experience in Go, and the ability to write it idiomatically
  • Working knowledge of SQL
  • Exposure to containerized environments such as Kubernetes
  • Experience with Airflow, Presto, or similar ETL and big data tools is a big plus
  • Experience with Kafka would be a plus

Current and prospective projects we're working on:

  • An S3-based data lake queried using AWS Athena (we wrote our own Parquet library in Go).
  • Converting experimental machine learning projects to production-grade code.
  • Scaling a Kubernetes- and Kafka-based batch processing system for simulating the analytical performance of our models and classifiers (we want to push past tens of thousands of transactions per second so we can re-score all of our transactions nightly).
  • Building the infrastructure and new apps for end-to-end training-validation-deploy cycles (think CI/CD for machine learning).
  • Performance analysis and optimization of analytical models.
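To give a flavor of the kind of Go work involved, here is a minimal sketch of a concurrent batch re-scoring loop using a worker pool. The `Txn` type, the `score` function, and the transaction amounts are purely illustrative stand-ins, not our actual data model or scoring code:

```go
package main

import (
	"fmt"
	"sync"
)

// Txn is a hypothetical transaction record to be re-scored.
type Txn struct {
	ID     int
	Amount float64
}

// score is a stand-in for a model's scoring function.
func score(t Txn) float64 {
	if t.Amount > 100 {
		return 0.9
	}
	return 0.1
}

func main() {
	txns := make(chan Txn)
	results := make(chan float64)

	// Fan out scoring work across a fixed pool of workers.
	var wg sync.WaitGroup
	const workers = 4
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for t := range txns {
				results <- score(t)
			}
		}()
	}

	// Feed a small illustrative batch, then close the input channel.
	go func() {
		for i := 1; i <= 8; i++ {
			txns <- Txn{ID: i, Amount: float64(i * 25)}
		}
		close(txns)
	}()

	// Close results once all workers have drained the input.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Aggregate: count transactions scored as high risk.
	var high int
	for r := range results {
		if r > 0.5 {
			high++
		}
	}
	fmt.Println("high-risk transactions:", high)
}
```

In production this fan-out/fan-in shape would sit behind Kafka consumers rather than an in-memory channel, but the channel-and-WaitGroup pattern shown here is the idiomatic Go core of it.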

We are looking forward to receiving your application!

How to apply

Please apply via the following link:
