Senior Data Engineer
Negotiable Salary
Indeed
Full-time
Onsite
No experience limit
No degree limit
Description

**About the Challenge**

At CashMe, data is not just about reports—it’s the engine driving our decision-making. We’re looking for a Senior Data Engineer who will take **ownership** of our data platform. Our environment is hybrid: we have a mature modern stack (Snowflake, dbt, Airflow, AWS), market-standard ingestion tools (Skyvia, Stitch), and a limited legacy system (Pentaho) that must be migrated and decommissioned.

We seek someone who doesn’t merely maintain existing systems, but leads modernization efforts—collaborating with Cloud, Security, and DevOps teams to resolve infrastructure challenges, and partnering with business areas to remove blockers and drive results.

**What We Expect from a Senior Profile Here:**

* **Ownership:** Don’t wait for a task card—investigate problems, propose technical solutions, and execute them, always aligning with peers.
* **Resilience and Autonomy:** If an AWS access permission is missing, proactively coordinate with the Cloud/Security team to resolve it. Treat external dependencies as project milestones—not reasons to halt progress.
* **Architecture:** Understand how a poorly written query impacts Snowflake costs and know how to orchestrate complex dependencies in Airflow.
* **End-to-End Vision:** From raw CRM data to availability in BI or AI/ML models.

**Your Responsibilities:**

* Maintain and evolve the Data Lake (Snowflake), orchestration (Airflow), and transformations (dbt).
* Manage ingestion pipelines using SaaS tools (Skyvia, Stitch) and develop custom Python/AWS pipelines when needed.
* Lead legacy decommissioning: execute the final migration of Pentaho processes to the new stack.
* Interact with AWS services (ECS, Batch, Lambda, S3, etc.), supporting the Cloud team in designing data-oriented infrastructure.
* Ensure engineering best practices: CI/CD, version control (GitHub), and code reviews.
* Support governance and administration of the Power BI workspace and dataset orchestration.
* Monitor pipeline costs and bottlenecks, implementing corrective and optimization actions.

**Requirements:**

* Bachelor’s degree in Computer Science, Computer Engineering, Information Systems, or related fields.
* Strong experience with Python and advanced SQL.
* Proficiency in modern data architecture: Snowflake, dbt, and Airflow.
* Experience with AWS compute and serverless services (Batch, ECS, Lambda, S3, etc.).
* Familiarity with ingestion tools (Stitch, Skyvia, or similar).
* Hands-on experience with Git/GitHub and CI/CD processes.

**Desirable Skills (Nice-to-Haves):**

* Knowledge of the Microsoft ecosystem: experience administering Power BI Service (workspaces, gateways) or Microsoft Fabric.
* Understanding of MS Dynamics 365 data structures.
* Experience migrating legacy tools (e.g., Pentaho) to modern stacks.

**What You’ll Find Here:**

A collaborative environment with autonomy to propose solutions, constant challenges, and genuinely great people. We value knowledge sharing, continuous learning, positive impact, and excellence in data engineering. If you believe in this too—and want to actively contribute to our growth—join us!

Source: Indeed
João Silva
Indeed · HR

© 2025 Servanan International Pte. Ltd.