Senior Data Engineer (m/f/nb) - Big Data Platform
At Team Conrad Big Data, we develop and operate a platform that serves as the basis for data exchange, reporting, analysis, and AI. Topics such as high performance, scalability, centralized data storage, company-wide usability, data provisioning for internal and external apps, and self-service drive us forward every day. Agility, iterative approaches, continuous improvement, and open feedback are part of our daily routine.
Responsibilities
- Design and implement scalable big data solutions with a focus on event streaming platforms (Confluent Kafka) for real-time data processing
- Apply your in-depth Java knowledge to develop complex data services and drive innovation by designing and implementing AI agents
- Design efficient data models and orchestrate ETL processes (Extract, Transform, Load)
- Integrate heterogeneous data sources into our data warehouse and establish strict standards for data quality and consistency
- Proactively monitor the performance of our platform (GCP/BigQuery), optimize queries, and ensure scalability
Requirements
- Several years of professional experience as a big data engineer, ideally with a focus on Google Cloud Platform (GCP)
- In-depth experience in backend development with Java (must-have), knowledge of Python is a plus
- Proven expertise in event streaming architectures, especially in working with Confluent Kafka
- Hands-on experience developing and deploying AI agents, and an understanding of how to integrate LLMs and agentic workflows
- Solid experience in designing data models and ETL pipelines, preferably using Matillion ETL
- Strong analytical skills and the drive to independently solve complex technological challenges
Benefits
- Competitive compensation package
- Flexible and family-friendly work environment
- Opportunities for professional development and growth