




Description:

Requirements and qualifications:
* Bachelor's degree in Computer Science, Data Engineering, or a related field.
* Minimum of 3 years of experience with production data pipelines.
* Proficiency in Python and SQL.
* Proven experience with Airflow or similar orchestration tools.
* Solid knowledge of AI and machine learning models applied to data.

Desirable:
* Experience with Docker and Kubernetes.
* Knowledge of BigQuery, Snowflake, or Redshift.
* Familiarity with MLOps and ML pipelines.
* Certifications in Google Cloud or AWS.

Benefits: Life insurance, birthday day off, health insurance, paid vacation

Working hours: 08:00 to 18:00, Monday to Friday (hybrid work model)

Knowledge:

Education: Bachelor's degree (completed) in Data Science, Data Engineering, or Software Engineering

Technical skills: APIs, AWS, Airflow, Azure, Dagster, Excel, GCP, LLMs, Power BI, Prefect, Python, Webhooks

Behavioral skills: Analytical thinking, attention to detail, leadership, clear communication, discipline, critical thinking, proactivity, relationship building, problem solving, business acumen


