




**Job Summary:** We are seeking a Data Lake specialist to structure, organize, and govern the company's data. This person will be responsible for ensuring the data ecosystem is reliable, scalable, and secure, supporting BI, Analytics, and Product teams in data-driven decision-making.

**Key Highlights:**
1. Structure, organize, and govern the company's data
2. Ensure a reliable, scalable, and secure data ecosystem
3. Support BI, Analytics, and Product teams in data-driven decision-making

**Key Responsibilities:**
- Design, implement, and maintain data ingestion and transformation pipelines for the Data Lake;
- Integrate internal and external data sources in an automated and secure manner;
- Work with cloud-based storage (AWS S3, Azure Data Lake, GCP Cloud Storage);
- Implement best practices for data governance, versioning, and data quality;
- Provide reliable data to BI and Analytics teams;
- Monitor the performance, costs, and security of the data environment.

**Requirements:**
- Experience with cloud-based Data Lakes (AWS, Azure, or GCP);
- Proficiency in ETL/ELT and tools such as Apache Airflow, AWS Glue, Databricks, or similar;
- Strong knowledge of Python and advanced SQL;
- Experience with semi-structured and columnar data formats (JSON, Parquet, Avro, etc.);
- Experience in data architecture, modeling, and integration;
- Familiarity with cloud security, governance, and cost management.

**Nice-to-Have:**
- Experience with Data Mesh or Data Governance frameworks;
- Knowledge of Kafka (real-time data streaming);
- AWS, Azure, or GCP data certifications;
- Experience in complex or highly scalable enterprise environments.


