**Job Summary:** A Data Professional responsible for collecting, interpreting, and communicating data insights to support strategic decision-making, participating in the end-to-end lifecycle of Data Science projects.

**Key Highlights:**

- Contribute to strategic decision-making through data analysis
- Develop and train AI and Machine Learning models
- Actively participate in all Scrum ceremonies

At Aquarela, we develop technology that surprises the market, so we are constantly seeking the best talent to join our team. Here, you will be responsible for gathering, interpreting, and communicating the information contained in data, contributing to strategic and accurate decision-making.

**Responsibilities:**

- Perform exploratory data analysis
- Create mathematical, statistical, analytical, bottom-up, and top-down models
- Train AI models across various paradigms, with a focus on meeting business requirements
- Interact directly with clients and domain experts
- Transform data to feed machine learning estimators
- Design and document database tables
- Assist in building ML models
- Participate in developing ETLs for data preparation and model pipelines
- Actively participate in all Scrum ceremonies, including Sprint Planning, Daily Meetings, Sprint Reviews, and Sprint Retrospectives
- Collaborate with commercial teams to assess the feasibility of new projects, products, and solutions
- Support departmental processes by executing all related tasks and other duties assigned by the department head

**Requirements:**

- Experience with the end-to-end lifecycle of a Data Science project
- Experience handling time-series data
- Proficiency with data visualization tools
- Experience building Machine Learning models
- Knowledge of statistics and statistical inference
- Experience analyzing structured and unstructured data
- Knowledge of data modeling methodologies such as Kimball and Inmon

**Preferred Qualifications:**

- Academic background in Mechanical Engineering, Production Engineering, or Economics
- Master's or PhD in the exact sciences
- Distributed processing (PySpark)
- Familiarity with various types of databases (SQL, NoSQL)
- Experience with model performance evaluation tools
- Experience in highly complex projects
- Familiarity with Scrum
- Familiarity with pipeline development tools such as dbt and Dataform


