Data Engineer

GB - Onsite

Description

Start date: ASAP
End date: 30 September 2021, with a 50% possibility of extension

Public Sector
End User: Client
IR35 status: TBC; please assume Inside for now to mitigate risk

Job title: Data Engineer
Rate: £676/day
Clearance required: BPSS

Nationality requirements: SC eligibility (which requires five years' continuous residency in the UK) would be beneficial, but is not a requirement.

Interview process:
First interview: skills assessment by a Capgemini Data Engineer
Second interview: cultural fit with a Capgemini Engagement Director

Location: Fully remote for now

Job spec
Previous GOV.UK experience is desirable.

Person specification
You will work in the Product Delivery Team, developing data pipelines to support data science and visualisation products for a variety of stakeholders, data scientists and engineers. You will analyse and document requirements, turning these into data models and data flow designs, as well as developing data loads and transformations, including metadata layers for publishing for visualisation and analysis. You will also monitor production data management daily and support issue resolution with internal and external third parties.
You will have the core data engineering skills to enable and deliver data and analytical solutions on our analytics and data platforms. You will be able to act as a technical expert in all aspects of delivery and be prepared to provide leadership and guidance to small project teams/resources in a collaborative way.
You will develop and maintain the technology infrastructure that integrates new and existing sources of data into JBC analytical processes and workflows, so that data can be exploited with maximum efficiency and efficacy. Key skills required include: data engineering, data architecture, data analysis, security architecture, data wrangling and DevOps.
Core Responsibilities:
Data engineering and manipulation - You can work with other technologists and analysts to integrate and separate data feeds in order to map, produce, transform and test new scalable data products that meet user needs. You have a demonstrable understanding of how to expose data from systems (for example, through APIs), link data from multiple systems and deliver streaming services. You know how to work with other technologists and analysts to understand and make use of different types of data models. You understand and can make use of different data engineering tools for repeatable data processing, and you can compare different data models.
Programming and build - You can design, write and iterate code from prototype to production-ready. You understand security, accessibility and version control. You can use a range of coding tools and languages.
Technical understanding - You know about the specific technologies that underpin your ability to deliver the responsibilities and tasks of the role. You can apply the required breadth and depth of technical knowledge.
Testing - You can plan, design, manage, execute and report tests, using appropriate tools and techniques, and work within regulations. You know how to ensure that risks associated with deployment are adequately understood and documented.
Problem resolution - You know how to log, analyse and manage problems in order to identify and implement the appropriate solution. You can ensure that the problem is fixed.
Understanding analysis across the product life cycle (data science). You understand the different phases of product delivery and can plan and run the analysis for these. You can contribute to decision-making throughout the product life cycle. You know how to work in collaboration with user researchers, developers and staff in other roles throughout the product life cycle. You understand the value of analysis, how to contribute with impact and what data sources, analytical techniques and tools can be used at each point throughout the product life cycle.

Essential skills/experience/knowledge:
Strong Python skills, unit testing (pytest) and PEP 8 standards
Writing robust data pipeline code that can run unattended
Pandas data validation, manipulation, merging, joining and at times visualisation
Unix environment, server health and management of ongoing running processes
GitHub, git, pull requests, CI and code review
Logging and reporting pragmatically
Ability to troubleshoot and solve numerical and technical problems
Data engineering experience using Python, SQL, Spark and AWS
Hands-on ETL development experience using the Microsoft enterprise stack/Azure
Knowledge of data management platforms and development with SQL Server
Experience with publishing data sets for visualisation and analysis
Experience with supporting design of data models/data flows
High attention to detail
Excellent communication and facilitation skills evidenced through verbal and written means to a wide range of stakeholders
Experience with Agile delivery
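The first few items above (robust unattended pipeline code, pandas validation/merging, pytest unit tests) can be illustrated with a minimal sketch. The column names, validation rules and test data here are hypothetical, chosen only to show the style of work the role describes:

```python
import pandas as pd

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Fail fast on bad input so an unattended run stops loudly, not silently."""
    if df["id"].isna().any():
        raise ValueError("null ids in feed")
    return df.drop_duplicates(subset="id")

def transform(orders: pd.DataFrame, customers: pd.DataFrame) -> pd.DataFrame:
    """Validate, join the two feeds on id, and add a derived column."""
    merged = validate(orders).merge(customers, on="id", how="left")
    merged["total"] = merged["qty"] * merged["unit_price"]
    return merged

# pytest-style unit test (run with `pytest`); exercised inline here
def test_transform():
    orders = pd.DataFrame(
        {"id": [1, 2, 2], "qty": [3, 1, 1], "unit_price": [2.0, 5.0, 5.0]}
    )
    customers = pd.DataFrame({"id": [1, 2], "name": ["A", "B"]})
    out = transform(orders, customers)
    assert list(out["total"]) == [6.0, 5.0]  # duplicate id 2 dropped
    assert len(out) == 2

test_transform()
```

The same pattern, i.e. small pure functions with explicit validation and a unit test per transformation, scales naturally to the Spark and AWS pipelines mentioned below.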

Individual skills/experience/knowledge:
Ability to work as part of a team to develop and deliver end-to-end data warehouse and data pipeline solutions
Able to work effectively as part of a wider BI/IT team internally and externally to support the delivery of projects
Ability to apply appropriate prioritisation and good time management skills to deliver work to standard and deadline
Excellent communication skills that promote an effective and collaborative team-working environment
Analytical skill set with an ability to understand data requirements and support the development of data solutions
Strong attention to detail with an ability to work effectively in a team and independently

How you will be assessed:
You will be assessed at interview on the technical skills and Civil Service behaviours below. A test will be sent one hour before the interview and will be assessed through technical questions during the interview.
Behaviours:
Working Together
Making Effective Decisions
Delivering at Pace
Seeing the Big Picture

Technical Skills:
Programming and build (data engineering). You can design, write and iterate code from prototype to production-ready. You understand security, accessibility and version control. You can use a range of coding tools and languages.
Technical understanding (data engineering). You know about the specific technologies that underpin your ability to deliver the responsibilities and tasks of the role. You can apply the required breadth and depth of technical knowledge.
Data development process. You can integrate and separate data feeds in order to map, produce, transform and test new data products.
Data integration design. You can develop fit-for-purpose, resilient, scalable and future-proof data services to meet user needs. You have a demonstrable understanding of how to expose data from systems (for example, through APIs), link data from multiple systems and deliver streaming services.
Problem resolution (data). You know how to log, analyse and manage problems in order to identify and implement the appropriate solution. You can ensure that the problem is fixed.

Start date: ASAP
Duration: 2 months
From: fortice
Published at: 30.07.2021
Project ID: 2174493
Contract type: Freelance