Description
We are recruiting a Data Engineer for an initial 6-month contract with a consultancy and their end client in the financial sector. For this role we need someone who has designed and implemented data processing pipelines on Google Cloud Platform (GCP), including real-time/streaming data processing. This role will be PAYE. It will also be predominantly remote, with a degree of onsite working.
Responsibilities:
- Able to manage a range of stakeholders, from the business to the wider architectural committee and external SMEs.
- Able to make pragmatic architecture and solution decisions that align with delivery timelines, available skill sets, and ease of maintenance.
- Execute project-specific development activities in accordance with applicable standards and quality parameters.
- Able to prioritize tasks in line with the wider objectives, raise risks, and make course corrections.
Experience Required
- Experience with real-time processing on GCP.
- In-depth knowledge of Java programming and its ecosystem.
- Working experience of managing distributed platforms.
- Knowledge of stream processing and ingestion frameworks such as Dataflow, Pub/Sub, Kafka, Kafka Streams, Flink, or Spark Streaming.
- Familiarity with Git, unit testing, Jenkins, and Maven.
- Experience working in the financial/banking sector is beneficial.
If you are interested and available for a new role, please apply with an updated CV for immediate review.