




Description:

Requirements:
* Bachelor's degree in Computer Science, Computer Engineering, Information Systems, or a related field;
* Python;
* SQL;
* Database administration concepts;
* Data Warehouse (DW) concepts;
* Shell scripting;
* An Infrastructure-as-Code (IaC) tool/language: Terraform, Ansible, Puppet;
* Infrastructure concepts (network, disk, CPU, etc.);
* Proficiency with containers (Docker, Kubernetes);
* Software engineering concepts (API protocols, SoC, SOLID, etc.);
* Concepts of data-intensive systems: lakehouse, data lake, MPP DW;
* Proficiency with stream processing tools: Kafka, Spark, Flink;
* Data streaming concepts: hopping, session, and tumbling windows;
* Proficiency with NoSQL databases: MongoDB, Cassandra (ScyllaDB), Druid;
* Distributed systems architecture concepts: ring, master/worker;
* Machine learning concepts: training, serving, features, reinforcement learning;
* Proficiency with cloud infrastructure.

Responsibilities:
* Seek inspiration from the market to bring new ideas that can improve the data platform;
* Actively participate in identifying and resolving issues, opportunities, and improvements related to the data discipline within technology;
* Ensure the team is aligned with the data platform's evolution and that everyone understands the motivations behind it;
* Identify and anticipate the impacts of approaches on other cross-squad products;
* Provide a holistic perspective while collaborating on the planning, monitoring, and organization of initiatives aimed at delivering value, supporting the analysts responsible for those initiatives through to delivery, including risk and impact mapping;
* Guide, engage, and proactively foster a data-driven culture across the entire company.


