





Description:

Requirements:
* Proficiency in SQL;
* Knowledge of the Python programming language;
* Experience or interest in PySpark and Spark;
* Familiarity with the Databricks platform;
* Mastery of ETL and ELT concepts;
* Experience with relational databases (e.g., MySQL, PostgreSQL) and non-relational databases (e.g., MongoDB, Cassandra);
* Experience with data workflow automation tools;
* Degree in Computer Science, Data Engineering, or related fields.

Responsibilities:
* Embody the organizational identity of the Brisanet Group;
* Prepare reports, forms, or spreadsheets as requested;
* Propose improvements to the area's routines and processes;
* Keep the area's metrics up to date;
* Actively participate in organizational meetings and commitments when requested;
* Provide support to employees on matters related to the area;
* Use individual and collective safety equipment when necessary;
* Design and maintain data pipelines (ETL/ELT) to ensure data is cleaned, transformed, and loaded correctly into the appropriate systems for analysis (a minimal sketch follows this list);
* Ensure data is collected from various internal and external sources and integrated into a format suitable for analysis;
* Implement data quality validation and verification processes to ensure data accuracy, consistency, and reliability;
* Optimize data system performance, ensuring scalability and efficiency when handling large volumes of data;
* Create and maintain documentation for developed data processes;
* Collaborate closely with data scientists, data analysts, and IT teams to provide the infrastructure required for analysis and reporting;
* Continuously monitor data systems, promptly identifying and resolving issues to ensure uninterrupted data flow.
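As a rough sketch of the kind of PySpark ETL pipeline described above (the paths and column names order_id, order_date, and amount are illustrative assumptions, not Brisanet systems), an extract-transform-load job might look like:

    # Minimal PySpark ETL sketch: extract raw CSV data, clean and type it,
    # and load the result as partitioned Parquet for downstream analysis.
    # All paths and column names below are assumptions for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw data from an assumed source location
    raw = spark.read.option("header", True).csv("/data/raw/orders.csv")

    # Transform: deduplicate, cast types, and drop rows that fail basic validation
    clean = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount").isNotNull() & (F.col("amount") >= 0))
    )

    # Load: write the curated data, partitioned by date, to an assumed target path
    clean.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")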


