




Job Summary:
This professional will be responsible for developing and maintaining the cloud-based Data Warehouse, creating and managing data pipelines, and collaborating to foster a data-driven culture by translating business requirements into data solutions.

Key Highlights:
1. Develop and maintain a scalable, high-performance cloud-based Data Warehouse.
2. Create and orchestrate automated data pipelines via Airflow.
3. Collaborate across departments and promote the use of self-service platforms.

Description:

Ingredients for the Combo:
* Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related field;
* Advanced knowledge of SQL, data modeling, and ELT processes;
* Practical experience with Python, especially for scripting, automation, and data integration;
* Practical experience with pipeline orchestration using Airflow;
* Knowledge of or prior experience with data integrations via APIs;
* Experience with cloud platforms (preferably Google Cloud Platform: BigQuery, Cloud Storage, Composer, etc.);
* Familiarity with visualization tools (Power BI, Looker Studio, or equivalents);
* Nice-to-have: knowledge of dbt.

What Will Be Your Challenge?
* Develop and maintain the cloud-based Data Warehouse, ensuring a scalable, reliable, and high-performance architecture;
* Perform analytical modeling and build Data Marts that meet business unit needs;
* Create, document, and maintain automated, orchestrated ELT data pipelines via Airflow;
* Implement and maintain data integrations with APIs and other external sources, ensuring data quality and consistency;
* Collaborate with both technical and non-technical teams, translating business requirements into data solutions;
* Support the dissemination of data culture by contributing to the implementation of best practices, processes, and training;
* Design and maintain business metrics and promote the use of self-service platforms in a decentralized environment.
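To illustrate the ELT pattern the role centers on (land raw API data first, then transform it into a Data Mart with SQL), here is a minimal Python sketch. Everything in it is hypothetical: the table names, the sample payload, and the use of stdlib sqlite3 stand in for the BigQuery-and-Airflow stack the posting actually describes.

```python
import json
import sqlite3

# Hypothetical raw API payload; in production this would come from an HTTP call.
RAW_PAYLOAD = json.dumps([
    {"order_id": 1, "amount": "19.90", "status": "paid"},
    {"order_id": 2, "amount": "5.00", "status": "canceled"},
])

def load_raw(conn, payload):
    """E+L steps: land the API records as-is into a raw landing table."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_orders (doc TEXT)")
    for rec in json.loads(payload):
        conn.execute("INSERT INTO raw_orders VALUES (?)", (json.dumps(rec),))

def transform(conn):
    """T step: model the raw data into an analytics-ready mart using SQL."""
    conn.execute("""
        CREATE TABLE mart_paid_orders AS
        SELECT json_extract(doc, '$.order_id') AS order_id,
               CAST(json_extract(doc, '$.amount') AS REAL) AS amount
        FROM raw_orders
        WHERE json_extract(doc, '$.status') = 'paid'
    """)

conn = sqlite3.connect(":memory:")
load_raw(conn, RAW_PAYLOAD)
transform(conn)
rows = conn.execute("SELECT order_id, amount FROM mart_paid_orders").fetchall()
# Only the paid order survives the transform: [(1, 19.9)]
```

In a real pipeline, each function would typically be one Airflow task, and the transform step is the kind of SQL model the nice-to-have dbt experience would formalize.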


