




Description:

To make this happen, we need you to bring:
* Organization to work in a self-managed format;
* Solid knowledge of relational and non-relational databases;
* Experience with multidimensional modeling;
* Advanced SQL/T-SQL (stored procedures, performance optimization);
* Experience in Business Intelligence projects;
* Microsoft Fabric (Pipelines/ADF, Lakehouse & Warehouse, Mirroring/Shortcuts/OneLake, deployment pipelines);
* Databricks (Jobs/Lakeflow, Notebooks, Asset Bundles as IaC);
* Version control with Git.

If you want to stand out, it's good to have:
* Advanced Spanish and/or advanced English;
* Infrastructure as Code (IaC) with Terraform;
* Experience with dimensional/star-schema modeling, SCD, CDC, and performance tuning in Data Warehouses;
* Experience with Linux OS commands;
* Experience with Unity Catalog (governance in Databricks) and Delta Sharing;
* Experience working with agile methodologies.

What will your day-to-day look like?
* Align business requirements with project sponsors;
* Map the data sources used in projects (MSSQL, Postgres, MongoDB, etc.);
* Design data structures that support the Retail Analytics and Smart Task products;
* Develop data integration processes for the Data Lakehouse;
* Help resolve issues related to data quality and timeliness in the Lakehouse;
* Analyze, monitor, and identify opportunities to improve the performance of data integration processes;
* Identify ways to improve data trustworthiness, efficiency, and quality;
* Design, monitor, and implement customer migration plans from legacy technology to modernized database platforms, aligned with business requirements and management needs;
* Deliver updates to stakeholders based on your analysis;
* Investigate and propose new solutions for the continuous evolution of data engineering processes;
* Work on deliverables and document actions taken in project management tools.


