




We are seeking a Mid-Level Data Engineer to strengthen our data infrastructure on Google Cloud Platform. You will be responsible for developing and optimizing data pipelines while ensuring governance, security, and efficiency. The role involves working with Airflow, PostgreSQL, BigQuery, and modern data architectures, enabling the creation of advanced risk and credit intelligence models for real estate lending. We operate within a multidisciplinary team comprising product specialists, engineers, data scientists, and credit experts.

**Responsibilities**
--------------------

* Design, develop, and maintain scalable pipelines using Airflow.
* Build and optimize ETL and ELT processes for ingesting structured data into PostgreSQL and semi-structured data, ensuring data quality and reliability.
* Implement and manage modern architectures (Data Lake, Data Warehouse) with a focus on scalability and governance.
* Automate, integrate, and orchestrate internal and third-party data sources.
* Develop data quality, monitoring, and observability processes.
* Collaborate with data scientists to support advanced modeling and analytics.
* Ensure security and compliance with data governance policies and the LGPD.

**Requirements**
----------------

* Advanced proficiency in Python, Airflow, and SQL.
* Experience with PostgreSQL, Data Warehouses, and Data Lakes.
* Hands-on experience with GCP.
* Familiarity with data security, the LGPD, and data governance.

**Nice-to-Have**
----------------

* Knowledge of MLOps and DataOps.
* Scalable data modeling for analytics and machine learning.
* Experience with data streaming (Pub/Sub, Kafka, or similar).
* Infrastructure as Code (Docker, Cloud Build, CI/CD for data).
* Experience in the financial sector, construtech, proptech, or fintech industries.


