




### **Job Summary**

If you're seeking a strategic, high-impact challenge, this role is for you! Here, you'll have the opportunity to operate at the heart of the business, transforming data into efficient and innovative tech solutions. You'll be a key player in building scalable data pipelines and ensuring data governance, quality, and high performance of data platforms. This is your chance to work with cutting-edge technologies, lead initiatives, and grow within a dynamic and collaborative environment!

### **Key Responsibilities**

* Understand business problems end-to-end to propose efficient and innovative tech solutions;
* Design, implement, and manage efficient, scalable data pipelines using ETL/ELT tools (Airflow);
* Ensure data quality, integrity, and governance across all pipeline stages;
* Build solutions that promote high availability, security, and performance of data platforms;
* Conduct code reviews and enforce data engineering best practices;
* Support cross-team collaboration and scale data governance best practices.

### **MUST-HAVE \| Key Competencies and Skills**

* Advanced knowledge of Python and SQL;
* Experience with data pipeline orchestration (Airflow);
* Knowledge of cloud platform services (AWS, GCP).

### **NICE-TO-HAVE \| Key Competencies and Skills**

* Knowledge of Docker and Kubernetes;
* Experience with continuous integration and continuous delivery (CI/CD);
* Experience with agile methodologies.


