Big Data Engineer - Freelance - Antwerp

Antwerp - Onsite

Description

The Data Excellence team is responsible for the life cycle of Data, Business Intelligence and Internet of Things (IoT) related information systems.

We are responsible for the development of new applications, the continuous improvement of existing applications, as well as the day-to-day IS/IT support.

To secure competence in modern data and cloud technologies, we are looking for a multi-disciplinary big data engineer.

You will implement selected experimental use cases from a technical perspective using various big data technologies and tools.

You will analyze and map their strengths and weaknesses, and assess the fit with the future company data architecture landscape.

In a later phase you will operationalize the selected technologies, designing and implementing data pipelines.

Project responsibilities

- You are able to work hands-on with the involved systems on a deep technical level, and document their strengths and weaknesses.

- You are able to design and implement data pipelines in such a way that they can be properly operationalized, monitored and supported.

Maintenance and monitoring responsibilities

- You maintain and monitor implemented data pipelines

- You provide and maintain technical documentation

- You provide 2nd and 3rd level support when required

Various

- You will report on your activities, status, budget and timing to all stakeholders

- Occasional interventions outside office hours may be required

- You will report to the Data Excellence team leader, within the IT department

Additional requirements

- You are fluent in English: both written and spoken.

Knowledge

Important areas of interest are:

Data Storage: Investigate (cloud) storage solutions, both for unstructured and structured data (HDFS, Azure Data Lake Store, Azure SQL) serving a multitude of use cases (BI, Analytics, Operational)

Data Ingestion: Investigate streaming and batch ingestion (e.g. StreamSets, Kafka, Azure IoT Hub, CDC) from operational source systems (ERP, IoT, CRM) both in the cloud and on-premises.

Data Processing: Investigate appropriate ETL/ELT, scheduling/orchestration and monitoring technologies (e.g. Airflow, Azure Data Factory, Azure Data Lake Analytics, Databricks)

Data Warehousing and exposure/integration: Investigate technologies for exposing raw and curated/refined data to a broad audience of data scientists, data analysts, BI tools and operational systems (e.g. APIs, Azure PolyBase, Azure SQL Data Warehouse, Tibco Data Virtualization Server)

Education

- Bachelor/Master in related domains or equivalent by experience

- Minimum 5 years of experience in related domains

Personality

- You have a passion for innovation and technologies.

- You have a "can-do" approach, and are decisive. You are not afraid of making errors and are willing to learn by doing.

- You are a good communicator and able to maintain excellent relationships with your stakeholders and colleagues, but are not afraid to challenge when necessary.

- You are flexible and prepared to work outside of business hours if required to meet a deadline.

- You are customer focused, enthusiastic and professional.

- You are proactive.

- You have very strong analytical skills.

- You are result oriented and quality focused in terms of processes.

- You are stress-resistant.

g2 Recruitment is committed to equality of opportunity for all. Applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief, marriage and civil partnership, or any other characteristic protected by law.

Start date
ASAP
Duration
12 months
From
G2 Recruitment Solutions
Published at
05.12.2018
Project ID:
1682833
Contract type
Freelance