Description
Intuition IT is currently looking for an English-speaking Hadoop Support Engineer for a 12-month contract in Hungary (remote). Please review the specification below and let me know if you would be suitable for and interested in the position. If so, please forward your updated CV.
Start: ASAP
Duration: 12 Months
Location: Hungary
Languages: English
Remote: Yes, based in the EU
Candidates must currently be eligible to work in the EU (no sponsorship is provided)
Key expectations:
Big Data (Hadoop, Hive), Kafka, Chef, Rundeck, Linux, Public Cloud (Azure)
Job Duties
Work in an agile environment to manage and operationalize a multi-datacenter (MDC) Hortonworks Hadoop deployment, with core components HDFS, MapReduce2, YARN, Tez, Hive, Flink, Oozie, Kafka, and ZooKeeper.
Optimize and tune the MDC Hadoop deployments to meet performance requirements.
Troubleshoot complex operational issues in MDC Hadoop deployments.
Collaborate with release and engineering teams to ensure requirements and enhancement goals are met for Operations.
Participate in architecture design and review to drive continuous service improvements.
Participate in Capacity Planning and ensure scalability of the Cloud environments.
Drive performance issues out of Operations, in collaboration with various internal organizations.
Identify and correct issues that have resulted from integrations, customizations, and migrations.
Work with and enhance tools, operational processes and knowledge centers to ensure smooth and consistent operations.
EDUCATION AND QUALIFICATIONS/SKILLS AND COMPETENCIES:
Experience troubleshooting and supporting enterprise applications.
Hands-on experience managing, monitoring, troubleshooting, and scaling Hortonworks HDP/HDF, open-source Kafka, or other Hadoop platforms.
Experience working with Kerberos and Ranger in Hadoop clusters.
Exposure to Operational Intelligence and experience with tools such as Splunk and Elasticsearch.
Experience with public cloud platforms such as AWS, Microsoft Azure, and GCP.
Experience with Apache Flink and troubleshooting data pipeline issues.
Experience with Hadoop encryption technologies like Ranger KMS and Ranger KMS with Key Trustee Server.
Knowledge of ServiceNow and Zabbix is preferred.
Knowledge of Docker and Kubernetes is a plus.
Knowledge of DevOps and infrastructure automation tools such as Jenkins, Chef, Puppet, Rundeck, Ansible, and Terraform is a plus.
Strong interpersonal, communication and writing skills.
Ability to handle multiple projects simultaneously and meet deadlines while effectively managing priorities and communicating progress.
Knowledge of data governance regulations (e.g., GDPR).