




Job Summary: The professional will apply modeling, ETL, and ELT techniques in DW and Data Lake environments, developing code in SQL, Python, and Spark to generate insights and support decision-making, and managing streaming services.

Key Highlights:

1. Autonomy in developing solutions and data analysis
2. Focus on technology, development, and innovation
3. Inclusive work environment with emphasis on well-being

We are more than a machine: we are people who transform and **create infinite possibilities.** We work to **simplify and accelerate businesses for everyone**, offering intelligent financial solutions. Here, we invest in **technology**, promote **development**, and foster **innovation** to create new possible paths and generate positive impact worldwide.

At Cielo, we work with **autonomy** to write our own journey, **freedom** to be who we are, and the opportunity to **make things happen**. We are a team that **dreams collectively**, delivering a comprehensive experience while focusing on the physical and mental well-being of our 7,000+ employees and their families. We believe in **inclusion and embracing** all individuals, honoring their uniqueness and diversity of life experiences.

Let’s achieve your dreams together!
**Responsibilities and Duties**
-------------------------------

* Apply modeling techniques and model generation, sharing development and processes with other team members to ensure quality standards;
* Apply ETL and ELT concepts in DW and Data Lake environments to work with structured and unstructured data, and identify and resolve issues based on data analysis to obtain meaningful insights supporting decision-making;
* Develop code in programming languages such as SQL, Python, and Spark with autonomy in solution development and analysis aligned with operational needs, ensuring correct interpretation and resolution;
* Perform high-complexity ETL and ELT tasks (e.g., data recovery, aggregations, and filtering) to support the team in executing its tasks and to create and manage streaming services;
* Develop refined data tables in Databricks using Parquet or Delta tables to keep our databases up to date.

**Requirements and Qualifications**
-----------------------------------

**What does the #TimeCielo expect from you?**

* Completed undergraduate degree;
* Experience developing data ingestion and transformation pipelines in DW/Lake/Lakehouse environments;
* Code versioning and Infrastructure as Code (GitLab and CI/CD);
* Programming languages and tools (e.g., SQL, Python, PySpark);
* Cloud-based Big Data environments (e.g., Databricks, AWS, Athena).
**Additional Information**
--------------------------

**Why live infinite possibilities alongside us?**

* Medical and Dental Assistance;
* Annual Variable Compensation (PPR);
* Meal and Food Allowance;
* Commuter Bus/Transportation Voucher or Parking;
* Hybrid Work Model;
* Remote Work Allowance;
* Life Insurance;
* Home and Auto Insurance;
* Family Funeral Assistance;
* Private Pension Plan;
* Support Channel with Specialists (Nutrition, Psychology, Gynecology, etc.);
* Vaccination Campaign;
* Access to various courses on our Educa platform;
* Wellhub;
* Healthy Pregnancy Program;
* Extended Maternity and Paternity Leave;
* Childcare Assistance;
* Birthday Day Off;
* Flexible Dress Code;
* Flexible Working Hours;
* Short Fridays;
* Extended Lunch Break (1h30).


