This role plays a critical part in establishing a unified, governed data foundation that standardises core entities, definitions, and relationships across the organisation, enabling scalable analytics, reporting, and AI/ML use cases.
You will work across architecture, data engineering, analytics, and business teams to rationalise existing schemas, translate embedded business logic, and define canonical domains aligned to modern Lakehouse and cloud data architecture principles.
Key Responsibilities
- Design conceptual, logical, and physical data models supporting enterprise analytics and AI
- Define canonical data domains, entities, and relationships
- Map legacy SAS NEO datasets and unstructured sources to target Databricks models
- Translate embedded business logic into governed, reusable data structures
- Define OLTP, OLAP, and dimensional models (star/snowflake)
- Establish metadata, lineage, and documentation standards
- Ensure alignment with Lakehouse, ELT, and cloud architecture patterns
- Collaborate with engineers, architects, and stakeholders on target-state design
Skills & Experience
- Strong experience in Data Architecture, Data Modelling & Design
- Proven experience migrating to Databricks or Lakehouse platforms
- Conceptual, logical and physical modelling expertise
- Dimensional modelling (star/snowflake) and OLTP vs OLAP design
- Advanced SQL
- Experience with relational databases (SQL Server, Oracle, Postgres, MySQL) and SAS data environments
- Cloud platforms: Azure, AWS or GCP
- Data integration & ETL/ELT design patterns
- Metadata management, lineage, and governance frameworks
- Python, Scala, or Spark
- Informatica, ADF, Talend, Boomi, SSIS
- Kafka / Event Hubs / streaming
- Power BI, Tableau, Qlik
- MDM and data quality tooling
- Infrastructure-as-code exposure (Terraform / ARM)