
Data Engineer

Indeed
Full-time
Onsite
No experience limit
No degree limit

Description

Job Summary: We are seeking a professional with deep expertise in scalable and secure data architectures, collaborating to co-create high-value solutions in AWS and GCP environments.

Key Highlights:
1. Experience with AWS and GCP in scalable and secure architectures
2. Strong experience in ETL/ELT, data modeling, and data orchestration
3. Focus on designing scalable, high-performance, and reliable solutions

Description:
* Deep expertise in AWS and Google Cloud Platform (GCP), with hands-on experience in scalable and secure architectures;
* Solid experience with ETL/ELT processes, data modeling, ingestion, transformation, and orchestration;
* Ability to design scalable solutions while maintaining performance and reliability;
* Technical and strategic communication skills to collaborate across different teams;
* Interest in engaging with strategic business areas to co-create solutions that deliver real value.

Tools and Technologies:
* Data integration: Pentaho, Debezium, Apache Airflow;
* Automation and manipulation: Shell Script and Python;
* Visualization and monitoring: Power BI and Grafana;
* Databases: PostgreSQL, MySQL, MongoDB, DynamoDB, Redis, and others.

Nice-to-have:
* Experience with non-relational databases such as HBase, DynamoDB, Cassandra, or MongoDB;
* Knowledge of Data Governance;
* Experience with the Hadoop ecosystem (HDFS, HBase, MapReduce, Apache Spark, Apache Hive);
* Familiarity with infrastructure-as-code provisioning tools such as Terraform and CloudFormation;
* Understanding of agile methodologies, especially Scrum and Kanban.
Responsibilities:
* Design, develop, and maintain scalable and efficient data architectures to support the company’s data analysis and processing needs;
* Architect and implement Data Warehouses, Data Lakes, and Data Marts using Parquet, Delta Lake, and data governance best practices;
* Orchestrate data workflows using Pentaho, Apache Airflow, and Glue, and model transformations with DBT;
* Integrate and transform data from multiple sources using Pentaho, Python, and Shell Script;
* Ensure data quality, security, and performance in AWS and GCP environments;
* Work with relational databases (PostgreSQL, MySQL) and non-relational databases (MongoDB, Redis, DynamoDB, etc.);
* Build dashboards and monitoring systems using Power BI and Grafana;
* Collaborate with DBAs, Data Analysts, and business teams to understand requirements and propose high-impact solutions;
* Participate in defining technical standards and data engineering best practices.
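The ETL/ELT work described above (extract, transform with Python, load into a target store) can be sketched in miniature. This is a minimal illustration only; the record fields and the in-memory target are hypothetical, not part of the posting:

```python
# Minimal ETL sketch: extract raw records, transform them, load into a target.
# Fields ("id", "amount", "region") and the list-based "warehouse" are
# hypothetical stand-ins for a real source system and warehouse table.

def extract():
    """Simulate pulling raw rows from a source system."""
    return [
        {"id": 1, "amount": "10.50", "region": "south"},
        {"id": 2, "amount": "7.25", "region": "north"},
    ]

def transform(rows):
    """Cast types and normalise values (the T in ETL)."""
    return [
        {"id": r["id"], "amount": float(r["amount"]), "region": r["region"].upper()}
        for r in rows
    ]

def load(rows, target):
    """Append transformed rows to a target store (here, a plain list)."""
    target.extend(rows)
    return target

warehouse = []
load(transform(extract()), warehouse)
```

In practice the same extract/transform/load steps would be wrapped as tasks in an orchestrator such as Apache Airflow, with the load targeting a real database rather than a list.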

Source: Indeed
João Silva
Indeed · HR

Company

Indeed
© 2025 Servanan International Pte. Ltd.