Description
I'm recruiting a Senior Data Engineer for a not-for-profit organisation working on a programme to help decarbonise the electricity industry.
Key tech/skill set required:
- Databricks: experience writing data transformation pipelines; experience of Structured Streaming is a bonus.
- PySpark
- Python software development (unit testing, packaging, coding standards, etc.); specific experience of automated testing of Spark code is a bonus.
- Azure (the more experience, the better)
What the role will involve:
Working as part of a team of data engineers to build data processing pipelines, using PySpark on Databricks to ingest and transform large volumes of data.
If interested, please apply with your CV, or email/message me for a quicker response.
Xpertise acts as an employment agency.