




**Description:** We are seeking a senior professional to join an innovative and engaged team, developing high-impact data solutions in the financial sector. If you are proactive, passionate about programming, and ready for new challenges, this opportunity may be perfect for your career. Here, you will have room to grow, collaborate within a dynamic team, and stand out.

**Responsibilities and assignments:**
- Design, develop, and support robust data infrastructure solutions, with a focus on scalability, security, and efficiency.
- Lead the implementation and maintenance of ecosystems such as Trino, AWS Glue, Redshift, Airflow, Kafka, and Spark on Kubernetes (K8s).
- Collaborate with engineering, data science, and product teams to ensure seamless system integration.
- Diagnose and resolve performance, scalability, and reliability incidents.
- Apply and promote best practices in data architecture, governance, and monitoring.

**Requirements and qualifications:**
- Solid experience with data infrastructure ecosystems: Trino, AWS Glue, Terraform, CI/CD, Apache Airflow, Kafka, and Spark on Kubernetes.
- Advanced proficiency in Python (including PySpark) and object-oriented programming.
- Proven experience with distributed systems, data warehouses, and large-scale ETL/ELT pipelines.
- Practical knowledge of data governance, security, and storage (e.g., S3, Redshift).
- Analytical ability and problem-solving skills focused on operational efficiency.
- Strong communication skills and experience in collaborative work.


