




Description:

Requirements:
* Solid knowledge of SQL and relational databases (PostgreSQL);
* Experience with Apache Airflow for pipeline orchestration;
* Practical experience with DBT (Data Build Tool) for data transformation;
* Knowledge of code versioning using Git and GitLab;
* Familiarity with data cataloging tools (OpenMetadata or similar);
* Ability to clearly document processes and data structures;
* Knowledge of BI tools (Metabase, Power BI);
* Experience with Python for automation and data processing;
* Knowledge of dimensional modeling (Star Schema, Snowflake Schema);
* Familiarity with Data Quality and Data Governance concepts.

Responsibilities:
* Perform preventive and corrective maintenance on the current data infrastructure;
* Execute migrations of new tables and data structures to the Analytics environment;
* Act as the technical focal point for creating and modeling new tables in the analytical layer;
* Monitor and track the execution of data migration and transformation pipelines;
* Develop and optimize data transformations using DBT;
* Orchestrate data workflows through Apache Airflow;
* Ensure data quality, consistency, and documentation using OpenMetadata;
* Collaborate with business teams to understand needs and translate them into data solutions;
* Implement best practices for code versioning using GitLab.


