Job Description:

- A minimum of 10 years of experience in data architecture and analytics.
- Databricks: Proficiency in Databricks for data processing, machine learning, and analytics.
- Cloud Platforms: Strong experience with one or more cloud platforms such as AWS, Azure, or GCP, and the ability to integrate Databricks with them.
- Data Engineering: Strong grasp of data engineering concepts, data modelling, and ETL processes.
- SQL: Advanced SQL skills for data querying, transformation, and analysis.
- Python: Proficiency in Python programming for data manipulation, scripting, and automation.
- Apache Airflow: Strong experience with Apache Airflow for orchestrating and scheduling ETL workflows.
- Big Data Technologies: Familiarity with big data technologies such as Spark, Hadoop, and related frameworks.
- Data Security: Knowledge of data security best practices and compliance standards.
- Data Modelling: Understanding of data modelling principles and best practices.
- Data Warehousing: Knowledge of data warehousing concepts and best practices.
- Version Control: Proficiency in version control systems such as Git.
- Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Relevant certifications in Databricks and cloud platforms are advantageous.
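As a small illustration of the kind of SQL transformation and analysis skills listed above, the sketch below ranks products by revenue within each region using a window function. The table, column names, and data are hypothetical, and SQLite stands in for a warehouse engine purely so the snippet is self-contained and runnable.

```python
import sqlite3

# Hypothetical sample data; table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("East", "widget", 120.0),
        ("East", "gadget", 300.0),
        ("West", "widget", 80.0),
        ("West", "gadget", 200.0),
        ("West", "gizmo", 50.0),
    ],
)

# Window function: rank products by revenue within each region --
# a typical analytical transformation on tabular data.
query = """
SELECT region,
       product,
       amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS revenue_rank
FROM orders
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
# -> ('East', 'gadget', 300.0, 1)
#    ('East', 'widget', 120.0, 2)
#    ('West', 'gadget', 200.0, 1)
#    ...
```

The same `RANK() OVER (PARTITION BY ...)` pattern carries over to Databricks SQL and most warehouse dialects.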