Data Architect – Azure Databricks (Insurance Domain)
Job ID: JOB_53397631121021
Job type: Contract
Location: Toronto
Profession: Other/tbc
Industry: Insurance
Pay: N/A
Data Architect in Toronto, ON
Role: Data Architect
Location: Toronto, ON
Type: Contract
Job Description:
- Design scalable and secure data architecture solutions using Azure Databricks, Azure Data Factory (ADF), and Power BI.
- Build and optimize ETL/ELT pipelines for structured and unstructured data using PySpark, Spark SQL, and Delta Lake.
- Review and finalize data models before implementation.
- Translate business requirements into technical specifications and data structures.
- Implement Unity Catalog for fine-grained access control and data lineage.
- Ensure compliance with data governance, privacy, and security policies.
- Lead migration efforts from legacy platforms (e.g., Synapse, Hadoop) to Azure Databricks.
- Support AI/ML model migration and MLOps setup for insurance analytics use cases.
- Use Photon engine and indexing strategies to improve performance and reduce latency.
- Conduct rigorous testing including unit, integration, and performance tests.
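As a sketch of the ETL/ELT pattern described above (table and column names are hypothetical, not from the posting; assumes a Databricks runtime with Delta Lake), an incremental upsert plus one indexing strategy might look like:

```sql
-- Hypothetical incremental upsert from a staging table into a Delta table.
MERGE INTO silver.claims AS target
USING staging.claims_raw AS source
ON target.claim_id = source.claim_id
WHEN MATCHED AND source.updated_at > target.updated_at THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *;

-- OPTIMIZE with Z-ordering is one Delta indexing strategy for reducing
-- scan latency on frequently filtered columns.
OPTIMIZE silver.claims ZORDER BY (policy_id);
```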
Technical Environment
- Cloud & Tools: Azure Databricks, ADF, Azure SQL Data Warehouse, Azure Data Lake, Power BI, Azure DevOps, PolyBase
- Languages: Python, SQL, Scala, Spark SQL
- Frameworks: Delta Lake, Unity Catalog, Feature Store
- Certifications Preferred: Azure Fundamentals (AZ-900), DP-200/DP-201, Azure AI Engineer
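For the Unity Catalog work listed above, fine-grained access control is typically expressed as SQL grants; a minimal sketch (catalog, schema, table, and group names are hypothetical):

```sql
-- Hypothetical Unity Catalog objects and principals for illustration.
CREATE CATALOG IF NOT EXISTS insurance;
CREATE SCHEMA IF NOT EXISTS insurance.claims;

-- Read-only access for an analyst group; Unity Catalog captures lineage
-- automatically for queries against governed tables.
GRANT USE CATALOG ON CATALOG insurance TO `claims_analysts`;
GRANT USE SCHEMA ON SCHEMA insurance.claims TO `claims_analysts`;
GRANT SELECT ON TABLE insurance.claims.payments TO `claims_analysts`;
```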
Insurance-Specific Use Cases
- Guidewire Integration: Deep understanding of Guidewire PolicyCenter, BillingCenter, and ClaimCenter data models.
- Claims & Risk Analytics: Design AI environments for document scanning, fraud detection, and integrated claims analytics.
- Data Warehousing: Build and manage EDW, ODS, and DSS using dimensional modeling (Star/Snowflake schemas).
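A minimal star-schema sketch for the claims warehouse described above (all table and column names are illustrative assumptions):

```sql
-- Hypothetical dimensional model: one fact table, two dimensions.
CREATE TABLE dim_policy (
  policy_key   BIGINT,
  policy_id    STRING,
  product_line STRING
) USING DELTA;

CREATE TABLE dim_date (
  date_key  INT,
  full_date DATE,
  year      INT
) USING DELTA;

CREATE TABLE fact_claims (
  claim_key    BIGINT,
  policy_key   BIGINT,        -- FK to dim_policy
  date_key     INT,           -- FK to dim_date
  claim_amount DECIMAL(18,2)
) USING DELTA;
```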
#LI-DNI
JOB_53397631121021 | 2025-08-29 to 2025-11-26
Talk to Ashley Hyslop, the specialist consultant managing this position
Located in Toronto, 8 King Street East, 20th Floor. Telephone: 416-203-1834