




We are seeking a Senior Specialist Data Engineer to serve as the technical reference for data architecture, development, and modeling. This professional will play a central role in supporting the Data Engineering and BI teams, leading structural decisions, ensuring governance, and influencing both technical and business stakeholders. Knowledge of AI platforms and advanced data solutions will be considered a significant differentiator.

**Responsibilities and Duties**

* Design and evolve the enterprise data architecture, including Data Lake, Data Warehouse, and integrations across multiple layers (Raw, Refined, Curated).
* Develop, orchestrate, and monitor scalable and resilient data pipelines in cloud environments (GCP and AWS).
* Serve as the technical leader of the team, supporting engineers and analysts in best practices, modeling standards, versioning, and governance.
* Structure batch and streaming ingestion processes, integrating diverse internal and external data sources.
* Ensure data quality, security, cataloging, and observability.
* Implement solutions using PySpark, Spark, Python, SQL, and tools such as Airflow, dbt, Glue, and Step Functions, among others.
* Optimize solution performance, cost, and scalability.
* Collaborate with business units, BI, and Data Science teams to enable data products and actionable insights.
* Contribute to DataOps and CI/CD initiatives, promoting automation and standardization.

**Requirements and Qualifications**

**Mandatory Requirements**

* Solid hands-on experience with GCP (BigQuery, Cloud Storage, Dataflow, Pub/Sub) and AWS (S3, Glue, Athena, Redshift, EMR).
* Practical experience with Databricks or equivalent platforms.
* Proven experience building and managing multi-layered Data Lakes.
* Proficiency in Python, SQL, Spark, and PySpark.
* Experience with data pipelines, orchestration (Airflow, dbt, Step Functions), and DataOps.
* Knowledge of version control (GitHub/GitLab) and best practices in security and governance.
* Strong ability to communicate effectively with both technical and business teams.

**Preferred Qualifications**

* GCP and/or AWS certifications.
* Familiarity with Data Mesh, domain-driven architectures, and lakehouse architectures.
* Experience with Infrastructure-as-Code (Terraform).
* Exposure to generative AI platforms and ML pipeline architectures.

**Behavioral Competencies**

* Technical leadership and systemic thinking.
* Clear, assertive, and business-oriented communication.
* Ability to prioritize and operate effectively in highly complex environments.
* Proactivity, collaboration, and ownership.

**Additional Information**

**As a Contractor (PJ), you will receive:**

* **Paid vacation** after 12 months of contract.
* **Starbem:** a platform offering **4 free monthly sessions** with a psychologist.

At **Febracis**, we believe in the transformative power of meaningful connections. We are not just a business school; we are a movement inspiring change and building extraordinary stories. Our mission is simple yet profound: **to develop leaders, transform potential into results, and generate positive impact in the world.**

We are people who believe that where limits exist, opportunities are born; that connection is essence, excellence is the path, and transforming lives is our greatest legacy.

Being a **FebraLover** means living this every day. It means acting with integrity, having purpose in every action, being the protagonist of your own story, and believing that success only makes sense when shared. If you also believe that work and purpose go hand in hand, then this is your place.

**Join us in building an extraordinary world. Join us and become a FebraLover.**


