Professional Experience
Software Engineer / Datadog, Inc
May 2021 - Present, New York, U.S. (remote)
Datadog is an observability platform.
Core Responsibilities
- Software engineering in the Application Performance Monitoring space
Key Learnings
- Software Engineering: Java, Kotlin, Micronaut
- DevOps: Kubernetes, AWS, GCP, CNAB, Bazel, Terraform
Data Engineer / FlixMobility Tech GmbH
Jul 2019 - May 2021, Berlin, Germany
FlixBus is a German brand offering intercity bus service in Europe and the United States. It is owned by FlixMobility GmbH, which also operates FlixTrain and FlixCar; FlixMobility Tech is the company's technology organization.
Core Responsibilities
- Owned the Kafka cluster used as the central messaging and streaming platform by 250+ engineers in 50+ technologically heterogeneous teams
- Redesigned the existing Kafka cluster to support 45,000 partitions on EC2, using Terraform, Datadog, Lenses, and PagerDuty
- Enabled GDPR compliance by integrating Kafka with Azure AD / OpenID Connect
- Designed, implemented, and deployed Kafka extensions for authentication, authorization, and configuration validation
- Established a company-wide Avro schema standard
- Consulted other teams on Kafka and streaming technologies
Key Learnings
- Data engineering: Kafka, Kafka Streams, MirrorMaker, Scala/Java, Python
- DevOps: Kubernetes, AWS, Terraform, Datadog
Data Engineer / mbr Targeting GmbH
Oct 2017 - Jun 2019, Berlin, Germany
mbr targeting is a real-time bidding company. It provides a demand-side platform and a data management platform that allow advertisers to run campaigns on multiple ad exchanges. The company specializes in user-centric optimization using machine learning algorithms. The system scales to more than 100,000 requests per second at latencies below 100 ms.
As a data engineer I used and maintained a 1,000-core, petabyte-scale Hadoop cluster and a separate 250-core, 500 TB HBase cluster, and helped ingest terabytes of data per day. I was also involved in planning, designing, and implementing services for the real-time bidding system and a data management platform, both with strict requirements for low latency and high throughput. The demanding performance requirements of these components taught me a lot about production JVM systems. As part of a small team of data engineers I shared responsibility for our software components as well as for organizational tasks such as hiring and mentoring.
Core Responsibilities
- Prototyped a data management platform using a three-dimensional data model in HBase
- Maintained the Hadoop / HBase clusters
- Extended the core functionality of HBase with coprocessors and custom filters
- Took part in interviewing applicants
- Organized trial days for applicants
- Coached data scientists and other users of the Hadoop cluster
Key Learnings
- Data engineering: Java/Scala, Flink, Hadoop/MapReduce, HBase, Hive/Impala
- Backend: Scala, Aerospike, Elasticsearch, Kafka, Kubernetes, AWS, Docker, Jersey
Software Engineer / mbr Targeting GmbH
Jan 2016 - Sep 2017, Berlin, Germany
As a software engineer I helped build a distributed real-time bidding engine that uses machine learning approaches to calculate bids for more than 30,000 events per second. With a hard response-time requirement of 100 ms per request, the system has to be highly performant and scalable.
Core Responsibilities
- Implemented a high-performance bidding engine
Key Learnings
- Backend: Java/Scala, Node.js, Vert.x, Guice, Aerospike, Redis, Kafka
Software Engineer / adscale GmbH
Apr 2015 - Oct 2015, Christchurch, New Zealand
Adscale Labs is the development team of adscale GmbH. Adscale is an ad exchange providing a supply-side platform to publishers.
At adscale in New Zealand I worked on the supply-side real-time bidding infrastructure, implementing features in the real-time components as well as data aggregation and reporting jobs. The ad exchange runs auctions that request and respond to 140,000 bids per second.
Core Responsibilities
- Implemented and extended features and components
Key Learnings
- Backend: Java, Spring, Hibernate, Postgres
- Data engineering: Spark
Software Engineer (Working Student) / mbr Targeting GmbH
Apr 2013 - Apr 2015, Berlin, Germany
When I started working at mbr targeting I bootstrapped and built a campaign management web console and worked on related services. I also worked on the event tracking infrastructure and the bidding engine.
Core Responsibilities
- Implemented and extended features and components
Key Learnings
- Fullstack: Node.js, AngularJS, Postgres
- Backend: Node.js, Redis, Aerospike, C++
Web Developer / freelance
2010 - Mar 2013, Berlin, Germany
As a freelance web developer I collaborated with web designers, implementing backends for the websites of small businesses.
Core Responsibilities
- Implemented websites and backends
Key Learnings
- PHP/JavaScript
- WordPress
Education
Bachelor of Science Applied Computing / Hochschule für Technik und Wirtschaft
2012 – 2017, Berlin, Germany
At the Hochschule für Technik und Wirtschaft (HTW) Berlin I studied Applied Computing, specializing in mobile applications. I enrolled in courses ranging from distributed systems to functional programming. In my thesis, “Implementing a Data Management Platform: Segmentation of Internet Users based on Tracking Data in a Wide-Column-Database”, I built a prototypical data management platform.