Job Description
Software International (SI) supplies technical talent to a variety of clients, ranging from Fortune 100/500/1000 companies to small and mid-sized organizations, across Canada, the US, and Europe.
We currently have a contract role as a GCP Data Architect with our global consulting client, working remotely. This is a 6-month contract initially, but it could be extended.
Role: GCP Data Architect
Type: Contract
Duration: 6 months to start + potential extension
Location: Toronto, ON - remote with occasional office visits
Rate: $110 - $140 CAD/hr C2C, depending on overall experience
GCP Data Architect - Role Overview
We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate combines deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role collaborates with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.
Key Responsibilities
1. Data Strategy, Security & Governance
- Define and implement an enterprise-wide data strategy aligned with business goals.
- Establish data governance frameworks covering data classification, retention, and privacy policies.
- Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
2. Data Architecture & Modeling
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
- Implement S/4HANA CDS views in Google BigQuery (a minimal sketch follows this list).
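As a purely illustrative example of this kind of modeling work, the sketch below exposes a table replicated from an S/4HANA CDS view as a star-schema-style fact view in BigQuery. All project, dataset, table, and column names are assumptions for the example, not details from this posting.

```python
# Illustrative only: expose a table replicated from an S/4HANA CDS view
# (here I_SalesDocument, assumed replicated into a "sap_raw" dataset) as
# a star-schema-style fact view in BigQuery. Names are assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # assumed project ID

DDL = """
CREATE OR REPLACE VIEW `my-analytics-project.sales_mart.v_fact_sales` AS
SELECT
  SalesDocument  AS sales_document,
  SoldToParty    AS customer_key,    -- join key into a customer dimension
  CreationDate   AS order_date_key,  -- join key into a date dimension
  TotalNetAmount AS net_amount
FROM `my-analytics-project.sap_raw.I_SalesDocument`
"""

client.query(DDL).result()  # run the DDL and wait for completion
```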
3. Google Cloud Platform Expertise
- Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Implement cost-optimization strategies for GCP workloads (see the sketch after this list).
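One concrete cost-control lever, sketched below with assumed table names: capping the bytes a BigQuery query may bill via QueryJobConfig's maximum_bytes_billed, so an accidental full scan fails fast instead of running up cost.

```python
# Sketch of a BigQuery cost guardrail: a query that would bill more than
# the cap fails immediately instead of scanning (and billing) the table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(
    maximum_bytes_billed=10 * 1024**3,  # hard cap at 10 GiB billed
)

query = """
SELECT customer_key, SUM(net_amount) AS total_net
FROM `my-analytics-project.sales_mart.v_fact_sales`  -- assumed view name
WHERE order_date_key >= '2024-01-01'                 -- date filter to prune partitions
GROUP BY customer_key
"""

for row in client.query(query, job_config=job_config).result():
    print(row.customer_key, row.total_net)
```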
4. Data Pipelines & Integration
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow (see the sketch after this list).
- Integrate data from multiple systems, including SAP BW, SAP HANA, and SAP BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.
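For illustration, a minimal Cloud Composer (Airflow) DAG of the shape this role would own might look like the sketch below: load a daily SAP extract from Cloud Storage into BigQuery, then run a transform. The bucket, dataset, table, and stored-procedure names are invented for the example.

```python
# Hypothetical daily ELT DAG: stage an SAP extract from GCS into BigQuery,
# then refresh a reporting table. All object names are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="sap_sales_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_raw = GCSToBigQueryOperator(
        task_id="load_sap_extract",
        bucket="sap-extracts",  # assumed bucket
        source_objects=["sales/{{ ds }}/*.parquet"],
        destination_project_dataset_table="sap_raw.sales_{{ ds_nodash }}",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    transform = BigQueryInsertJobOperator(
        task_id="transform_to_mart",
        configuration={
            "query": {
                "query": "CALL `sales_mart.sp_refresh_fact_sales`('{{ ds }}')",  # assumed procedure
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform  # load first, then transform
```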
5. Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.
- Apply a working knowledge of CDS views and the ABAP language.
6. System Migration
- Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
- Manage the migration of SAP datasets to GCP, ensuring data integrity and minimal downtime (a reconciliation sketch follows this list).
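A simple data-integrity gate of the kind used during such migrations, sketched with invented table names: compare row counts between an SLT-replicated staging copy and the BigQuery target.

```python
# Hypothetical post-migration check: row-count reconciliation between an
# SLT-replicated SAP staging table and its BigQuery target. Table names
# are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client()

def row_count(table: str) -> int:
    """Return COUNT(*) for a BigQuery table."""
    result = client.query(f"SELECT COUNT(*) AS n FROM `{table}`").result()
    return next(iter(result)).n

source = row_count("my-analytics-project.sap_staging.vbak")       # replicated source
target = row_count("my-analytics-project.sales_mart.fact_sales")  # migrated target

if source != target:
    raise RuntimeError(f"Row-count mismatch: source={source}, target={target}")
print(f"Reconciled {source} rows")
```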
7. DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform (see the sketch after this list).
- Apply infrastructure-as-code principles for reproducible and scalable deployments.
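Terraform and GitHub Actions definitions are beyond a short sketch, but one CI gate such a pipeline could run is shown below, under an assumed repo layout with one .sql file per transformation: dry-run each file against BigQuery so broken SQL fails the build before deployment.

```python
# Hypothetical CI gate (runnable from GitHub Actions or Cloud Build):
# dry-run every SQL file so syntax or schema errors fail the pipeline
# before anything is deployed. A dry run validates without billing a scan.
import pathlib

import pytest
from google.cloud import bigquery

SQL_DIR = pathlib.Path("sql")  # assumed layout: one .sql file per transformation

@pytest.mark.parametrize("sql_file", sorted(SQL_DIR.glob("*.sql")), ids=str)
def test_sql_compiles(sql_file: pathlib.Path) -> None:
    client = bigquery.Client()
    config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    client.query(sql_file.read_text(), job_config=config)  # raises on invalid SQL
```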
Preferred Skills
- Proven experience with GCP services including BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and data security best practices.
- Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
- Experience with the Google Cortex Framework for SAP-GCP integrations.