Data Engineer

Lithuania - Onsite

Description

REMOTE Opportunity

You will become part of the ML Engineering CoE, working in the Advanced Commercial Analytics squads. The focus will be to productionize the developed ML models and ensure continuous integration of the model code developed by the Data Scientists. The project is seeking a contractor to speed up delivery of the developed solutions across various business segments.

Responsibilities

The main responsibilities are to:

- Help deploy and operate prioritised models to PROD.

- Educate the CoE in DE and MLOps best practices and help draw up internal guidelines.

- Assist in pipeline foundation development, or tailoring to ACA CoE needs, together with the infrastructure department:
  a. Data pipelines
  b. CI/CD
  c. Scheduling
  d. Monitoring
  All this to ensure the Advanced Commercial Analytics department reaches its commercial goals in 2021.
- Build the pipelines for the statistical models, utilizing the existing tech stack as well as bringing your experience into play to leverage new technologies. All tasks will be solved as part of multifunctional DevOps squads with broad capabilities delivering end-to-end Data Science solutions. You will therefore be working closely with experienced Data Scientists and Machine Learning Operations Engineers, as well as the business units who consume the data science services.

Requirements

We are looking for an MLOps Engineer with solid experience in CI/CD within Data Science and experience working in an agile setup.

Required competencies:
  • Version control (Git/Bitbucket)
  • CI/CD tools (Azure DevOps, Jenkins, or similar)
  • Code analysis and unit testing toolkit (e.g., PyLint & PyTest, SonarQube)
  • Model registry (e.g., Artifactory, MLflow)
  • Workflow orchestration using Airflow or similar
  • Monitoring (e.g., Prometheus, Grafana, AppDynamics)
  • Experience with Public Cloud (Either Azure/AWS/GCP)
  • Understanding of Feature stores (e.g., Hopsworks/Iguazio/FEAST)
  • Container technologies (Docker/Kubernetes/OpenShift)
  • Strong Python knowledge (incl. PySpark)
  • Linux/Bash
Start date
06/2021
Duration
6 Months
From
Source Technology
Published at
15.06.2021
Project ID
2136011
Contract type
Permanent