Raisin is the world's leading platform for savings and investment products. Founded in 2013, the FinTech connects consumers with banks in the EU, the UK and the US, giving consumers better interest rates and banks a diversified form of refinancing. The company's vision is to offer savings and investments without barriers and thus open up the global €150 trillion market.
Your Responsibilities
- Build and maintain scalable data pipelines to efficiently process and analyze large volumes of data, using Snowflake, Looker, Airflow and dbt.
- Collaborate with stakeholders to translate their requirements into technical steps and coordinate the projects you drive.
- Monitor and improve the health of our data pipelines.
- Promote knowledge sharing within the team to foster collaboration and continuous learning.
- Stay up to date on emerging technologies and best practices in data engineering, and bring new ideas to enhance our technical setup.
Your Profile
- Solid experience with Airflow, Python programming and SQL/dbt.
- Experience with cloud data warehouses such as Snowflake is a plus.
- Knowledge of AWS services and Docker is a plus.
- Bachelor's degree in Mathematics, Computer Science or another relevant quantitative field.
- Strong analytical and quantitative approach to problems.
- Familiarity with Data Engineering best practices, including Data Quality and Observability.
- Comfortable working in a dynamic and changing environment, with a strong sense of responsibility.
- Eagerness to learn and grow through on-the-job learning.