Big Data Architect / Data Lake Architect

Rhineland-Palatinate, Ludwigshafen am Rhein - Onsite

Description

Should have 10+ years of overall IT experience in DW/BI, with a minimum of 4 years of experience as a Big Data Hadoop ecosystem consultant.

Mandatory skills in Big Data:

• Should have a minimum of 4 years of hands-on experience with Hadoop/Big Data and data lake architecture (HDFS, Hadoop MapReduce and streaming, Spark and Spark Streaming)
• Should have conceptual know-how of data lake design, especially data zones (landing, raw, hot, cold, etc.) and meaningful user rights
• Hands-on experience with ingestion (message queuing, Kafka, NiFi) and downstream data provisioning
• Hands-on experience with data governance and data lineage
• General support for applying best practices for data-lake-based big-data analytics
• Good knowledge of Elasticsearch, Neo4j, and traditional relational databases (Oracle, PostgreSQL)
• Knowledge of data processing tools (e.g. Spark, NiFi, HiveQL, Impala, R-Statistics, KNIME, PipelinePilot)
• Good experience with agile methods and languages such as Java
• Knowledge of firewall concepts and firewall rule sets, identity and access management, and security topics
• Good knowledge of ITIL concepts and CI/CD in DevOps
• Fair knowledge of Linux, Windows OS, and middleware components such as Apache and Tomcat
• Should preferably have support knowledge in the following areas of the Apache Hadoop framework:
o Apache Knox
o Apache Ranger
o Apache Atlas
o Apache Avro
o Apache Hive
o Apache Druid
o Apache Oozie
o Apache NiFi
Start date: ASAP
From: Adroit People Ltd
Published at: 31.05.2019
Contact person: Roshini Agarwal
Project ID: 1778834
Contract type: Permanent