(Senior) Data Engineer - 100% remote possible
- Work experience
- IT
- Full-time
Join the limango IT team!
- In the limango IT team you get the chance to contribute your own ideas and know-how to maintaining and developing our highly frequented, self-built online shop for our markets in Poland, Germany, Austria, and the Netherlands.
- As a Senior Data Engineer, you will play a key role in one of the company’s most strategic initiatives led by the Data Platform Team (DPT): migrating legacy data solutions to a modern, cloud-based environment and building a new enterprise data warehouse based on the Data Vault methodology.
- Contribute to the development of a centralized Lakehouse Data Platform built on AWS and Databricks.
- Manage and evolve data infrastructure (Terraform on AWS / Databricks) and ETL pipelines (PySpark, Delta Live Tables, SparkSQL).
- Implement monitoring, data quality testing, unit tests, and automated alerting within the platform.
- Refactor legacy AWS solutions (e.g., Glue, Redshift) into a modern, CI/CD-deployed Lakehouse environment with proper observability and data quality controls.
- Actively support the design and implementation of a medallion lakehouse architecture to support ML and analytics use cases, incorporating data mesh principles.
- Collaborate closely with analytics and ML teams working within the data platform.
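To give candidates a feel for the quality gates mentioned above, here is a minimal, pure-Python sketch of a bronze-to-silver promotion with a data quality check. It only illustrates the medallion idea; the actual platform uses PySpark and Delta Live Tables on Databricks, and the record schema and validation rules shown here are invented for the example:

```python
# Minimal sketch of a bronze -> silver promotion with a quality gate.
# NOT the production PySpark / Delta Live Tables implementation;
# the order schema and rules below are hypothetical.

def is_valid(record: dict) -> bool:
    """Hypothetical rule: an order needs an id and a non-negative amount."""
    return (
        record.get("order_id") is not None
        and isinstance(record.get("amount"), (int, float))
        and record["amount"] >= 0
    )

def promote_to_silver(bronze: list) -> tuple:
    """Split raw (bronze) records into cleaned (silver) rows and quarantined rejects."""
    silver = [r for r in bronze if is_valid(r)]
    rejected = [r for r in bronze if not is_valid(r)]
    return silver, rejected

bronze = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": None, "amount": 5.0},   # missing key -> quarantined
    {"order_id": 2, "amount": -3.0},     # negative amount -> quarantined
]
silver, rejected = promote_to_silver(bronze)
```

In the real pipeline the same split would be expressed as PySpark filters with automated alerting on the quarantine volume.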
You'll need:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Mathematics, or a related field
- ~5 years of hands-on experience in building modern data engineering solutions (experience in E-commerce is a plus)
- Strong proficiency in PySpark and a solid understanding of the Spark processing engine architecture (required)
- Proven experience with Python for building applications, automated testing, and deployment
- Advanced SQL skills for working with structured and semi-structured data
- Familiarity with data lake architecture and data modeling concepts
- Experience with Terraform and Infrastructure as Code (IaC) principles
- Proficiency in CI/CD tools and Git-based development workflows
- Fluency in English (our team works in a fully international environment)
- Team player mindset and eagerness to learn and adopt new technologies and frameworks
- Experience with the AWS ecosystem and the Databricks platform is highly welcome
- Hands-on experience with AWS Glue, Redshift, or other AWS-native data services
- Familiarity with CloudFormation, Kinesis, or Spark Streaming
- Working knowledge of Scala, especially in a functional programming context
- Experience in MLOps environments (e.g., MLflow)
- Knowledge of Data Vault modeling concepts is a strong plus
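For candidates unfamiliar with the Data Vault concepts mentioned above: hubs are typically keyed by a deterministic hash of the business key, so loads stay idempotent. A minimal sketch of one common convention (the normalization and hashing scheme here is an assumption, not limango's actual model):

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Data Vault-style hub hash key: MD5 of the trimmed, upper-cased
    business key. (A common convention; an actual warehouse may use a
    different normalization or hash function.)"""
    normalized = business_key.strip().upper()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key always yields the same hub key,
# regardless of incidental whitespace in the source system.
k1 = hub_hash_key("customer-42")
k2 = hub_hash_key(" customer-42 ")
```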
What you can count on:
- Challenging tasks and real impact – you’ll be directly involved in bringing new projects to life and influencing how we grow as an e-commerce business.
- Fast-paced learning environment – we’re an international team, so you’ll constantly pick up new skills and insights.
- Flexible working hours and B2B contract – choose when and how you work.
- Lots of room to grow – through hands-on experience, training, and working with experts from different parts of the world.
Benefits
- Private health care
- We provide access to the best specialists for you and your loved ones.
- Language classes
- English and German lessons in small groups, tailored to your skills.
- Remote work and flexible working hours
- Possibility of partial remote work, as well as adjusting working hours to your daily schedule.
- Company events in the best company
- After hours, we often organize outings or get-togethers at our office.
Sound good?
- Klaudia Grabelska
- Recruiting Manager