





Description:

Requirements:
* Solid knowledge of SQL and relational databases (PostgreSQL);
* Experience with Apache Airflow for pipeline orchestration;
* Practical experience with dbt (Data Build Tool) for data transformation;
* Knowledge of code versioning using Git and GitLab;
* Familiarity with data cataloging tools (OpenMetadata or similar);
* Ability to clearly document processes and data structures;
* Knowledge of BI tools (Metabase, Power BI);
* Experience with Python for automation and data processing;
* Knowledge of dimensional modeling (Star Schema, Snowflake Schema);
* Familiarity with Data Quality and Data Governance concepts.

Responsibilities:
* Perform preventive and corrective maintenance on the current data infrastructure;
* Execute migrations of new tables and data structures to the Analytics environment;
* Serve as the technical focal point for the creation and modeling of new tables in the analytical layer;
* Monitor and track executions of data migration and transformation pipelines;
* Develop and optimize data transformations using dbt;
* Orchestrate data workflows through Apache Airflow;
* Ensure data quality, consistency, and documentation through OpenMetadata;
* Collaborate with business teams to understand requirements and translate them into data solutions;
* Implement best practices for code versioning using GitLab.
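To illustrate the dimensional modeling skill listed above, here is a minimal star-schema sketch: one fact table joined to two dimension tables. All table and column names are hypothetical, and SQLite is used only for portability; the role itself targets PostgreSQL.

```python
import sqlite3

# Star schema: a central fact table (measures) referencing dimension tables
# (descriptive attributes) via surrogate keys. Names here are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date TEXT,
    year      INTEGER,
    month     INTEGER
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240115, 1, 3, 29.97)")

# Typical analytical query: join the fact table to its dimensions, then aggregate.
row = cur.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""").fetchone()
print(row)
```

In a dbt project, each of these tables would typically live as a separate model, with the fact model depending on the dimension models via `ref()`.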


