Description
GCP Data Architect
Inside IR35
We are heading up a project on behalf of a global IT consultancy that requires a GCP Data Architect to join their team on a major government project; the role is fully remote.
Experience with Google Cloud data services (BigQuery, Bigtable, Dataproc, Dataflow, Airflow/Cloud Composer (DAGs), Cloud SQL, Cloud Spanner, etc.)
Experience in data modelling, data warehouse modernization, and cloud-based data lakes
Create and maintain optimal data pipeline architecture.
Assemble large, complex data sets that meet functional/non-functional business requirements.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google Cloud big data technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Ability to translate business requirements into technical specifications
Ability to build tactical and strategic roadmaps on the GCP platform
Experience in defining technical architecture on GCP involving Cloud Storage, BigQuery, Dataproc, Cloud Functions, and Dataflow
Hands-on experience migrating on-premises data pipelines and data warehouses to Google Cloud Platform
Experience in data migration from various databases to BigQuery
Python knowledge preferable
Should have data modelling experience in BigQuery
Should be able to implement BigQuery best practices
GCP Professional Data Engineer or Professional Cloud Architect certification is preferred