





**Job Mission:** Build and optimize robust, scalable data infrastructure that supports the efficient collection, storage, and processing of large volumes of data. Ensure data integrity, quality, and security by implementing automated pipelines in cloud or on-premises environments. Collaborate with team members and cross-functional departments to analyze data, generate insights, and provide reliable, well-structured data foundations for advanced analytics, while monitoring and tuning solution performance to meet evolving business needs.

**Key Responsibilities:**
* Stakeholder Engagement: Engage with business units and technical teams across data projects to understand challenges and propose solutions;
* Innovation and Technological Trends: Research and apply emerging technologies and approaches in data ingestion to keep the organization at the forefront of technological innovation;
* ETL/ELT Pipeline Development: Design and maintain automated processes for extracting, transforming, and loading data, ensuring data is cleansed, transformed, and ready for consumption;
* Data Architecture: Adhere to standards for data storage, movement, accessibility, and management within the organization, proactively proposing innovative improvements where feasible;
* Data Governance: Follow policies and procedures for data management and ingestion, including data quality, privacy, and regulatory compliance;
* Data Modeling: Create conceptual and physical data models aligned with business requirements and supporting system integration;
* Data Security: Define and implement security policies to protect data against unauthorized access and data breaches.

**Specific Knowledge:**
* Data Modeling: In-depth understanding of data modeling concepts, including relational, dimensional, and NoSQL models, to design effective data structures that meet business needs;
* Databases: Comprehensive knowledge of various database management systems (RDBMS, NoSQL, NewSQL, data warehouses, etc.), their characteristics, advantages, and disadvantages; preferably SQL Server, Cloud SQL (MySQL), Firestore, and BigQuery;
* Query Languages: Proficiency in SQL and other query languages for data manipulation and analysis, as well as programming languages specialized for data handling, such as Python or R;
* ETL/ELT Technologies: Familiarity with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) tools and techniques to integrate data from multiple sources, transform it as needed, and load it into a target system;
* Data Governance and Security: Knowledge of data governance practices, security policies, data privacy, and regulatory compliance (e.g., GDPR, LGPD), essential for protecting data and ensuring its ethical and lawful use;
* Data Analysis and Visualization: Familiarity with data analysis tools and techniques, including Business Intelligence (BI) and data visualization (preferably Power BI), to support evidence-based decision-making;
* Cloud Computing and Cloud Data Services: Deep understanding of cloud computing platforms, preferably Google Cloud Platform (GCP), and how to leverage their data services to build and scale cloud-based data applications;
* Agile Methodologies and Project Management: Familiarity with agile software development methodologies and project management principles to efficiently lead and implement data architecture initiatives.

**Requirements:**
* Minimum Education: Bachelor’s Degree;
* Languages: English (Advanced).

**Benefits:**
* Baby on Board – extended maternity leave of 6 months;
* Paternity Leave – 5 days off at home with your newborn;
* Transportation Allowance (VT) – covers the cost of commuting to work;
* Diversity and volunteer programs;
* Extra-Happy Day – birthday day off;
* Corporate University – Ancar Ivanhoe University;
* Discounts with partner institutions (CONQUER, DESCOMPLICA, among others);
* Meal & Food Allowance (VR & VA) – for everyone to eat well!
* Health Insurance;
* Dental Insurance;
* Life Insurance;
* Complimentary parking credential;
* Childcare Assistance;
* TotalPass.


