Description
Hadoop Data Developer (AWS)
Remote work offered
8 months rolling
3 X Daily Rate Roles
Exciting opportunity for experienced Hadoop Developers to work on a high-profile big data project on an 8-month rolling daily rate contract.
A solid understanding of the considerations for large-scale delivery, solutioning and operationalisation of data lakes, data warehouses, data services and analytics platforms on AWS is a must.
This is a hands-on development contract, and candidates need a strong data engineering background with Hadoop and AWS (EMR, EC2, Kinesis, integration tools etc.).
Requirements
- Extensive Data Engineering background
- Hands-on Hadoop and AWS (EMR, EC2, Kinesis, integration tools etc.)
- Minimum 3 years designing and building production data pipelines, from ingestion to consumption, within a hybrid big data architecture, using Podium, Java, Python, Scala, C++ etc.
- 3-4 years' experience with SQL against relational databases, preferably SQL Server or Oracle (10g and above) on Linux/Unix
If you have an expert understanding of the considerations for large-scale delivery, solutioning and operationalisation of data lakes, data warehouses, data services and analytics platforms on AWS, this could be the right contract for you.
These roles are based on site and can offer partial remote work after an initial period. We cannot provide sponsorship for this daily rate contract; candidates need an EU passport, Stamp 4 or Stamp 1G visa as a minimum.
These are urgent requirements as the project has already begun, and first-round interviews will start next week. If you are interested in applying, please forward your updated CV to (see below)