




Job Summary: A data professional responsible for designing and architecting scalable solutions on Databricks/cloud, translating business requirements, and ensuring data governance and quality.

Key Highlights:
1. Lead the design and architecture of scalable data solutions
2. Serve as a bridge between business and technology by translating requirements
3. Ensure data governance and quality in pipelines

Desired Experience:
* Completed undergraduate degree
* In-depth knowledge of Databricks: mastery of the Databricks platform and its components, such as Delta Lake, MLflow, and collaborative notebooks
* Big Data and Cloud Computing: experience with big data processing and cloud data architecture (Data Lake, Lakehouse, DWH)
* Programming Languages: proficiency in languages supported by Databricks, such as Python, SQL, Scala, and/or R
* AI/MLOps Experience: knowledge of machine learning operations (MLOps) and of integrating AI into data solutions

Preferred Qualifications:
* Certifications: relevant cloud certifications (e.g., AWS Certified Solutions Architect, Azure Solutions Architect, GCP Professional Cloud Architect) and/or Databricks certifications are preferred

Differentiating Skills and Behaviors:
* Holistic Vision: ability to maintain a high-level view of the final product, integrating security, infrastructure, and quality aspects
In this role, you will be responsible for:
* Solution Design and Architecture: create and evolve scalable, resilient, and secure data solutions on the Databricks/cloud platform (AWS, Azure, GCP), aligned with client needs
* Business-Technology Bridge: translate business requirements into effective technical specifications and solution architectures, communicating with stakeholders, product teams, and security and infrastructure teams
* Data Governance and Quality: ensure data governance and pipeline quality, managing data flow and defining policies for system operation
* Resource Optimization: manage and provision compute resources, such as Spark clusters, ensuring system efficiency and performance
* Implementation Support: support projects from initial design through delivery, providing implementation assistance, though typically not directly responsible for code development
* Automation: automate deployment and configuration of resources using infrastructure as code (IaC)


