




The **Machine Learning Engineer** will be responsible for integrating Machine Learning models into the corporate infrastructure, ensuring **reliability, scalability, monitoring, and governance** throughout the entire model lifecycle, from pre-deployment to production. The role focuses on transforming models developed by Data Scientists into **operational and sustainable solutions**.

* **Employment type:** Temporary, 2 months; hybrid, 3 days onsite
* **Work address:** Santo Amaro, São Paulo - SP
* **Contract type:** PJ (Pessoa Jurídica)

**Key Responsibilities:**

* Operationalize ML models, ensuring **automated testing, standardized pipelines, logging, and versioning**.
* Convert prototypes and notebooks into **production-ready training and inference pipelines**.
* Ensure **consistency, traceability, and stability** of data and features between training and production.
* Deploy models in production environments and integrate them with corporate systems.
* Support **safe deployment strategies** (blue/green, canary, and versioning).
* Configure and maintain **technical, data, and model monitoring**.
* Ensure **observability, governance, and traceability** across the model lifecycle.
* Address **production incidents**.
* Contribute to the evolution of the **MLOps / MLSecOps pipeline** and the standardization of best practices.

**Mandatory Requirements:**

* Bachelor’s degree in technology, data, or related fields.
* Experience in **Machine Learning Engineering, Data Engineering, Data Science, or DevOps**.
* Hands-on experience with **production pipelines, model deployment, and monitoring**.
* Strong programming skills in **Python**.
* Experience with **Spark / PySpark** and **SQL**.
* Experience with **CI/CD, inference APIs, feature stores, or ML pipelines**.
* Experience with **monitoring, governance, and incident handling**.

**Preferred Qualifications:**

* Knowledge of **MLOps tooling** (MLflow, Docker, Kubernetes, CI/CD, Grafana, Prometheus).
* Experience with **feature engineering, model evaluation, and explainability**.
* Knowledge of **data lake / lakehouse architectures, Unity Catalog**, and ML frameworks (scikit-learn, TensorFlow, PyTorch).
* Experience with **Airflow, Prefect, Metaflow, Dask, or advanced PySpark**.
* Knowledge of **SAS Enterprise Guide**.

**Minimum Education Level:** Bachelor’s Degree
**Desired Education:** Information Technology, Bachelor’s Degree


