Your daily tasks will include:
2. Designing and building data warehouses and data lakes
2. Designing and developing data pipelines
4. Establishing sound software development lifecycle practices (CI/CD, automated testing and validation, version control)
4. Collaborating with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions
5. Staying up to date with the latest trends and technologies in the data engineering field
6. Mentoring and guiding less experienced developers in data engineering
The following will help you succeed in this role:
1. A senior mindset
2. Excellent communication and planning skills
3. Proficiency in data modelling and a strong understanding of data management concepts and their applications
4. Understanding of data architecture and modelling patterns such as star and snowflake schemas, the medallion architecture, and data vault modelling
5. Hands-on experience with Airflow, PySpark, Python, dbt, Databricks, or Snowflake
6. Deep expertise in cloud computing technologies and architectures, with hands-on experience in at least one of Azure Synapse Analytics, Microsoft Fabric, Amazon Redshift, Google BigQuery, or Google Cloud Dataflow
7. Experience with Infrastructure as Code frameworks (Terraform, Pulumi, CloudFormation)
8. Fluency in German and English