




**Job Summary:** We are seeking a professional with experience in Apache Spark, Python/Scala, AWS, and data pipeline orchestration with Airflow, as well as expertise in REST APIs, to design and develop data solutions.

**Key Highlights:**
1. Practical experience with Apache Spark
2. Proficiency in Python and Scala (or Java)
3. Experience in data pipeline orchestration, preferably with Airflow

**Description:**
- Practical experience with Apache Spark
- Proficiency in Python and Scala (or Java)
- Solid knowledge of AWS services (e.g., EMR, RDS, S3)
- Experience in data pipeline orchestration, preferably with Airflow
- Expertise in data structures and REST API development
- Familiarity with CI/CD practices (Continuous Integration and Continuous Delivery)

**Desirable:**
- Advanced understanding of AWS resource configuration and optimization
- Experience with Data Lake architecture

**Responsibilities:**
- Developing APIs and applications focused on data manipulation and exposure
- Designing solutions collaboratively with business areas and technical teams (Architecture, Security, Infrastructure)
- Ensuring software quality, with emphasis on automated testing, version control, and code maintainability
- Proactively and autonomously resolving technical issues
- Collaborating with other engineers and stakeholders to build effective and sustainable solutions


