




**What are we looking for?**

**Responsibilities:**

* Manage and optimize the Data Lakehouse architecture in AWS environments (S3, Glue, RDS).
* Develop, monitor, and maintain large-scale ingestion (ETL/ELT) and processing pipelines.
* Explore and implement new technologies to increase efficiency and reduce costs.
* Create and maintain technical documentation.
* Ensure data integrity and improve queries and processing jobs.

**Requirements:**

* Solid experience with Apache Spark and AWS Glue.
* Knowledge of Cloud Computing (AWS, Google Cloud, and Azure).
* Proficiency in Python and advanced SQL for data transformation.
* Experience with relational and non-relational (NoSQL) databases.

**Nice-to-have:**

* Experience with Terraform.
* Basic experience with BI tools (Tableau, Power BI, or Looker).
* Understanding of data integration for analytical models.
* Experience with forecasting models.

**Location:** On-site 4 days per week in Campos Elíseos, São Paulo - SP

**Why build your career at Meta?**

We offer autonomy, clear goals, and a dynamic, challenging environment where professionals have the opportunity to work with diverse technologies, participate in all types of projects, bring new ideas, and work from anywhere in Brazil and, why not, the world. Additionally, we are one of the best companies to work for in Brazil according to Great Place to Work, and one of the 10 fastest-growing companies in the country for three consecutive years, according to the Informática Hoje Yearbook.

**What are our values?**

* We are people serving people.
* We think and act like owners.
* We have a drive for performance.
* We grow and learn together.
* We pursue excellence and simplicity.
* Innovation and creativity are in our DNA.

All individuals are welcome regardless of their condition, disability, ethnicity, religious belief, sexual orientation, appearance, age, or similar factors. We want you to grow with us in a welcoming environment full of opportunities. Do you identify with this? Then #JoinMeta!


