Your daily tasks will include
• designing, building, and developing data warehouses and data lakes
• designing and developing data pipelines (a minimal pipeline sketch follows this list)
• setting up proper software development lifecycle practices (CI/CD, automated testing and validation, version control)
• collaborating with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions
• staying up to date with the latest trends and technologies in the data engineering field
• mentoring and guiding less experienced developers in data engineering
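To give a flavour of the pipeline work described above, here is a minimal sketch of a daily batch pipeline, assuming Airflow 2.x; the DAG id, task names, and callables are purely illustrative, not part of our actual codebase.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: a real task would pull data from a source system.
    print("extracting")


def load():
    # Placeholder: a real task would write to the warehouse or lake.
    print("loading")


# Hypothetical daily ingestion pipeline (all names are illustrative).
with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ argument; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract runs before load
```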
The following will make you succeed in this role
• a senior mindset
• excellent communication and planning skills
• proficiency in data modelling and a very good understanding of data management concepts and their applications
• understanding of data architecture patterns such as star schema, snowflake schema, medallion architecture, and data vault modelling (a brief star-schema sketch follows this list)
• hands-on experience with
1. Airflow, PySpark, Python, dbt, Databricks or Snowflake
2. cloud computing technologies and architectures
3. deep, data-engineering-relevant expertise in at least one of Azure Synapse, Microsoft Fabric, AWS Redshift, GCP BigQuery, or GCP Dataflow
• experience with Infrastructure as Code frameworks (e.g. Terraform, Pulumi, CloudFormation; a short Pulumi sketch follows this list)
• fluency in German and English
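As a hedged illustration of the modelling patterns listed above, the following PySpark sketch joins a fact table to a conformed dimension (star schema) and publishes an aggregate to a gold layer (medallion); every table and column name here is hypothetical.

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

# Hypothetical silver-layer tables; names are illustrative only.
orders = spark.read.table("silver.orders")        # fact-grain data
customers = spark.read.table("silver.customers")  # dimension data

# Star-schema style join: fact table keyed to a conformed dimension.
fact_orders = (
    orders.join(customers, on="customer_id", how="left")
          .select("order_id", "customer_id", "order_ts", "amount", "country")
)

# Aggregate into a gold-layer mart (medallion: bronze -> silver -> gold).
revenue_by_country = (
    fact_orders.groupBy("country")
               .agg(F.sum("amount").alias("total_revenue"))
)

revenue_by_country.write.mode("overwrite").saveAsTable("gold.revenue_by_country")
```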
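And as one possible Infrastructure as Code example, a minimal Pulumi program in Python that provisions a versioned S3 landing bucket; the resource and output names are illustrative, and any of the frameworks listed above would serve equally well.

```python
import pulumi
import pulumi_aws as aws

# Hypothetical landing bucket for raw data; the name is illustrative.
raw_bucket = aws.s3.Bucket(
    "raw-data-landing",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# Expose the generated bucket name as a stack output.
pulumi.export("raw_bucket_name", raw_bucket.id)
```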