Change the way the world travels
Join the GetYourGuide journey to connect people with unforgettable travel experiences around the world. Millions look to us for unique activities they can trust, and it's all powered by our commitment to make every single journey extraordinary - including yours.
Team mission
The Core Data Platform team is at the heart of GetYourGuide's data landscape, empowering the entire engineering department and enabling data-driven innovation across the company. We architect, build, and operate the foundational data platform that powers everything from A/B experimentation and AI-powered use cases to advanced analytics and business intelligence.
Your mission
- Design, build, and evolve our core data and analytics platform, enabling seamless scaling to multiple petabytes of data.
- Optimize and enhance our real-time clickstream data pipelines, collaborating closely with data scientists, engineers, and product managers to unlock new business value.
- Expand and maintain our change data capture (CDC) solutions, ensuring timely and reliable data mirroring to the data lake.
- Ensure system performance, reliability, and stability by proactively monitoring, maintaining, and upgrading our platform to consistently meet or exceed service-level objectives (SLOs).
- Safeguard our data lake by keeping it secure, compliant, and cost-effective, while supporting a growing variety of use cases across the company.
Your toolkit
- 3+ years of experience as a Data Engineer or Software Engineer, working with large-scale distributed systems.
- Proven experience designing, developing, and maintaining scalable data platforms or backend systems.
- Strong programming skills in Scala or Java.
- Hands-on experience with big data frameworks such as Apache Spark and backend frameworks like Spring Boot.
- Ability to write efficient, maintainable, and well-tested code, with a focus on scalability and reliability.
- Strong analytical mindset, using data to guide technical decisions and solve complex problems.
What sets you apart
- Experience with Python to support data engineering workflows.
- Familiarity with workflow orchestration tools such as Apache Airflow.
- Experience with AWS and infrastructure-as-code tools like Terraform.
- Hands-on experience building stream-processing systems using Kafka, Spark Streaming, or similar technologies.
- Exposure to data governance, security, and compliance best practices.
How we'll make your career journey extraordinary
- Annual personal growth budget and mentorship programs for continuous learning and development
- Work from anywhere in the world for 40 days per year
- Flexible working arrangements to support work-life balance
- Opportunities to collaborate and socialize with team members through quarterly team events and yearly company-wide events
- Monthly transportation and fitness budget
- Discounts for you, your friends, and family on GetYourGuide activities
- Language reimbursement program
- Health and wellness benefits