




Description:
* Clear and effective communication.
* Active collaboration with developers and stakeholders.
* Ability to solve complex problems and adapt to change.
* Efficient time management and high resilience under pressure.
* Continuous learning and rigorous attention to detail.

Hard Skills:
* Knowledge of SQL and NoSQL databases (e.g., DynamoDB).
* Experience with agile methodologies (Scrum, Kanban) and with security and performance-optimization practices.
* Familiarity with cloud platforms such as AWS, Azure, and GCP.
* Experience with Azure DevOps pipelines.
* Proficiency with Python, Spark, and Airflow.
* AWS certification is a plus.

Responsibilities:
* Data Pipeline Development and Maintenance: Design, build, and maintain scalable and efficient data pipelines that collect, transform, and load (ETL) data from diverse sources (a minimal sketch of such a pipeline follows this list).
* Data Modeling: Create and maintain data models that ensure data integrity, consistency, and availability.
* Data Ingestion: Implement data ingestion processes so that raw data is collected and stored appropriately.
* Data Governance: Ensure compliance with data quality and governance policies, including security, privacy, and regulatory requirements.
* Monitoring and Support: Monitor the performance of data systems and provide proactive support to resolve issues.
* Performance Optimization: Identify and resolve performance bottlenecks and apply best practices for data-processing optimization.
* Documentation: Document processes, systems, and workflows to ensure continuity and a comprehensive understanding of data systems.
* Actively participate in team meetings for project review and improvement.
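For illustration only, the sketch below shows the kind of daily ETL pipeline this role involves, written with Python and Apache Airflow's TaskFlow API (Airflow 2.4+), both of which are named in the Hard Skills above. The DAG name, task names, and sample data are hypothetical placeholders, not part of the posting.

```python
# A minimal, illustrative daily ETL pipeline (assumes Airflow 2.4+).
# "orders_etl" and the sample rows are hypothetical examples.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["etl"])
def orders_etl():
    @task
    def extract() -> list[dict]:
        # Extract: in practice this would read from an API, a database, or object storage.
        return [{"order_id": 1, "amount": "19.90"}, {"order_id": 2, "amount": "5.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Transform: normalize types and derive the fields the data model expects.
        return [
            {"order_id": r["order_id"], "amount_cents": int(float(r["amount"]) * 100)}
            for r in rows
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Load: in practice this would write to a warehouse or a DynamoDB table.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


orders_etl()
```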


