
Mid-level Data Engineer (1)

Indeed
Full-time
Onsite
No experience requirement
No degree requirement
100 - 4 1201 - Plano Piloto, Brasília - DF, 70714-900, Brazil

Description

Job Summary

We are seeking a Data Engineer to join our team responsible for evolving an AWS-based Data Lakehouse platform. This person will play a key role in integrating data from multiple sources, orchestrating reliable pipelines, and delivering high-performance analytical layers for consumption via Amazon Redshift and Power BI.

Key Highlights

1. Solid experience with AWS and with languages such as Python and advanced SQL.
2. Design and implement data pipelines using Airflow/MWAA.
3. Build and optimize analytical models in Amazon Redshift.

Work Model: Hybrid (2–3 days per week on site)
Location: Brasília

Technical Requirements (Must-have)

* Solid AWS experience: S3, Glue Data Catalog, Airflow (preferably MWAA), Python DAGs, Amazon Redshift.
* Languages: Python and advanced SQL.
* Power BI: semantic modeling, DAX, incremental refresh, gateway, and best practices for consuming Redshift data.
* Knowledge of Data Lakehouse architecture, columnar formats (Parquet), partitioning, and metadata management.
* Engineering best practices: Git, testing, code reviews, documentation, and reliable pipelines.
* Security and governance: IAM, encryption, the principle of least privilege, and LGPD applied to data.

Key Responsibilities

* Design and implement data pipelines using Airflow/MWAA with Python and SQL, following best practices for modularity, testing, and versioning.
* Model Bronze/Silver/Gold layers (Medallion architecture) in S3 + Glue Data Catalog, defining partitions, formats (Parquet/Delta), and tables optimized for querying.
* Build and optimize analytical models in Amazon Redshift, ensuring performance and cost efficiency.
* Publish and maintain reliable datasets for Power BI, including gateways, incremental refresh, aggregations, and efficient use of DirectQuery/Import.
* Collaborate with analysts and business teams to translate requirements into consumable datasets, KPIs, and analytical layers, while documenting data catalogs and data contracts.
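For candidates unfamiliar with the Medallion layering and Hive-style partitioning mentioned in the responsibilities, a minimal sketch of how S3 prefixes for the Bronze/Silver/Gold layers are typically laid out (bucket and table names here are hypothetical, not from the posting):

```python
from datetime import date

# Medallion layers as described in the responsibilities above.
LAYERS = ("bronze", "silver", "gold")

def partition_path(layer: str, table: str, run_date: date,
                   bucket: str = "my-lakehouse") -> str:
    """Build a Hive-style S3 partition prefix, e.g.
    s3://my-lakehouse/silver/orders/dt=2025-01-15/
    (the dt= key is what Glue and Redshift Spectrum use for partition pruning)."""
    if layer not in LAYERS:
        raise ValueError(f"unknown layer: {layer}")
    return f"s3://{bucket}/{layer}/{table}/dt={run_date.isoformat()}/"

print(partition_path("silver", "orders", date(2025, 1, 15)))
# s3://my-lakehouse/silver/orders/dt=2025-01-15/
```

Partitioning Parquet data by a date key like this keeps queries cheap, because engines reading the Glue Data Catalog can skip every prefix outside the requested date range.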

Source: Indeed
João Silva
Indeed · HR

Company

Indeed
© 2025 Servanan International Pte. Ltd.