Description
PySpark Data Engineer with AWS; excellent rates, home-based, long-term programme with multiple extensions.
Experience in the following:
- PySpark data engineering
- AWS services, including AWS EMR clusters
- Python, Spark, SQL & NoSQL
- Developing data pipelines using PySpark
- Data engineering tooling
- Apache Airflow workflow automation for data pipelines
- TDD/BDD methodologies, Git
INSIDE IR35 - this assignment falls within the scope of IR35 legislation, and an appropriate umbrella company should be used for its duration.
This advert was posted by Staffworx Limited, a UK-based recruitment consultancy supporting the global e-commerce, software & consulting sectors. Services advertised by Staffworx are those of an Agency and/or an Employment Business.
Staffworx operates a referral scheme of £500 or a new iPad for each successfully referred candidate. If you know of someone suitable, please forward their details for consideration.