




Job Description

Role in the Company:
The Senior Data Engineer at Dataside provides technical leadership on projects, supporting the Tech Lead and contributing to team development and knowledge sharing. They have autonomy over end-to-end technical decisions, ensuring deliverables align with client needs. They also run mentoring sessions, perform rigorous code and architecture reviews, and drive the company's technical evolution. With a strategic view of data value, they connect their deliverables to business impact. More than building solutions, they act as a consultative partner, provoking reflection and strengthening client relationships through well-founded proposals and trust.

Responsibilities:
* Serve as lead engineer on contracts and drive technical decisions autonomously.
* Support the Tech Lead in technical and strategic project leadership.
* Conduct in-depth technical reviews and mentoring for the team.
* Propose scalable, efficient, and business-aligned batch and streaming architectures.
* Develop performance-optimization and cost-reduction plans.
* Diagnose technical and operational bottlenecks and propose sustainable solutions.
* Communicate solutions clearly to diverse audiences (business, technical, management).
* Work with multiple integrations, high-volume data, and performance-critical requirements.

Requirements:
* Proven experience in highly complex data engineering projects.
* Ability to operate effectively in environments without prior familiarity with the tech stack, learning quickly and adapting.
* Strong advanced Spark and Python expertise, applying sound software engineering practices (SOLID principles).
* Advanced SQL, including query tuning and execution-plan interpretation.
* Experience integrating via RESTful APIs and working with distributed systems.
* Proficiency in the AWS analytics stack: S3, Athena, Redshift, Glue, EMR, Kinesis, Lambda, Step Functions, Lake Formation, Secrets Manager, CloudWatch, SNS, Cost Explorer, Glue DataBrew, and Amazon Macie.
* Hands-on experience with Databricks: cluster and policy management, Unity Catalog (lineage, access control, versioning), SQL Warehouse, Workflows, and Delta Live Tables (DLT).
* Familiarity with various data modeling techniques: Data Vault, dimensional modeling, event-driven pipelines.
* Knowledge of NoSQL databases such as DynamoDB, MongoDB, and Cassandra.
* Expertise in cluster sizing, distributed computing, and detailed cost analysis.

Hard Skills:
* Advanced Spark: Catalyst Optimizer, persistence, broadcast joins, caching, and bucketing.
* Python following SOLID principles and component reuse.
* Advanced SQL: tuning, partitioning, and execution-plan optimization.
* Consumption of and integration with RESTful APIs.
* Advanced Databricks: Unity Catalog, SQL Warehouse, Workflows, DLT, Pools, and cluster configuration.
* AWS analytics stack: S3, Athena, Glue, EMR, Redshift, Kinesis, Lambda, Step Functions, Lake Formation, Glue DataBrew, Macie, CloudWatch, SNS, Cost Explorer.
* Advanced Amazon Redshift: sort-key optimization, compression, vacuum tuning, advanced WLM, SVL/SVV system logs, Redshift Spectrum, and hybrid architecture with S3.
* Data modeling: Data Vault, 3NF, dimensional, event-driven pipelines.
* NoSQL databases (DynamoDB, MongoDB).
* CI/CD using Git, GitHub Actions, or GitLab CI.
* Unit-testing concepts applied to data pipelines.
* Observability and data-quality frameworks and practices: Great Expectations, Airflow, Prometheus, Grafana.
* Building reusable, composable solutions focused on Data Products and Data Mesh strategies.
* Cloud cost assessment and planning: Cost Explorer, sizing with the AWS Pricing Calculator.

Soft Skills:
* Technical leadership and strategic influence.
* Ability to learn how to learn.
* Clarity and depth in communication with both technical and business stakeholders.
* Consultative mindset, active listening, and proactivity.
* Collaborative spirit, mentoring attitude, and team-development orientation.
* Organization, delivery focus, and a strong sense of responsibility.
* Intermediate English.

Desired Certifications:
* AWS Certified Data Engineer – Associate.
* AWS Certified Solutions Architect – Associate.

Advantages:
* AWS Certified Data Analytics – Specialty.
* AWS Certified Solutions Architect – Professional.
* Databricks Certified Professional Data Engineer.

Our Benefits:
* Health insurance allowance: monthly financial support to help cover your private health plan.
* Wellhub: stay physically and mentally active, your way.
* Fully company-funded online therapy, because mental health matters.
* Online nutrition counseling, with up to two monthly consultations to support your dietary health.
* Life insurance policy valued at R$125,000, providing greater security for you and your family.
* Birthday day off, because your day deserves to be special.
* Paid time off, so you can recharge.
* Internal gamification, turning achievements into rewards and recognition.
* Educational partnerships with institutions such as FIAP, Anhanguera, and Instituto Infnet, to support your growth and learning.
* Technical certification bonus, recognizing and rewarding your effort to learn.

We value every voice and every individual, because we know diversity makes us more innovative and stronger.


