Senior Data Engineer
We work and play together. We value work-life balance and create a culture of respect, trust and equality. If these values matter to you too, there is a good chance you will find your place with us.
Your role and main tasks
- Development of the Central Lakehouse Data Platform based on the AWS / Databricks ecosystem
- Managing data infrastructure (Terraform for AWS / Databricks) and ETL code (PySpark, Delta Live Tables, Spark SQL)
- Implementing monitoring, data quality tests, unit tests for code, and automated alerts in the system
- Refactoring legacy AWS solutions (AWS Glue, Redshift) into a monitored, CI/CD-deployed Lakehouse ecosystem with data expectations and data quality testing
- Actively supporting the build-out of a medallion lakehouse architecture for ML and analytics data and the introduction of data-mesh concepts
- Actively supporting the business analytics and ML teams working on the data platform
You will need
- Terraform and PySpark experience is a must, at least at a basic level
- A degree in Information Systems, Computer Science, Mathematics/Physics, Engineering, or a related field
- Several years of experience building data engineering solutions (ideally in e-commerce)
- Very good knowledge of Python programming (building applications, automated testing, packaging, and deployment)
- Hands-on experience with PySpark data processing and a good understanding of the Spark engine architecture
- Advanced SQL skills
- Experience with data lake architectures and data modelling paradigms
- Experience with Terraform infrastructure-as-code and CI/CD deployments
- Experience building and deploying data engineering ETL applications via CI/CD pipelines in Git environments
- Fluency in English – we work in an international environment
- Team player attitude and willingness to learn modern technologies / frameworks
- Familiarity with AWS Cloud architecture and the Databricks platform will be very welcome
- Experience with AWS Glue and Redshift warehousing
- Experience with CloudFormation templates
- Experience with AWS Kinesis and Spark streaming processing
- Good knowledge of Scala coding, functional programming and application development
- Experience with MLOps environments (e.g., MLFlow)
Benefits
- Private health care: We provide access to the best specialists for you and your loved ones.
- Language classes: English and German lessons in small groups, tailored to your skills.
- Remote work and flexible working hours: Possibility of partial remote work, as well as adjusting working hours to your daily schedule.
- Office in the center of Wrocław: Nearby cinema, fitness club and a large selection of lunch places.
- Fruit Mondays: There is no shortage of coffee, fruit, pizza, sweets and healthy snacks in our office.
- Company events in the best company: After hours we often organize interesting outings or meetings in our office.
- Interesting and challenging work in the dynamic environment of the Internet industry; you will not get bored with us!
- A real opportunity to shape the business; we value independence and delegate responsibility.
- Experience working in an international team operating in different European markets.