
Prashant Gupta

available

Last update: 06.09.2022

Hadoop Admin

Company: TCS
Graduation: BE
Languages: English (Full Professional) | Hindi (Native or Bilingual)

Attachments

Prashant_Hadoop_Admin_Resume.pdf

Skills

Overall 6.1 years of IT experience as a Hadoop Administrator and with the Pentaho BI tool and MySQL, including 3.2 years of hands-on experience as a Hadoop Administrator on the MapR and Hortonworks distributions.

- Hands-on experience with the ecosystem components Hive, Sqoop, Pig, HBase, Oozie, ZooKeeper and MapReduce.
- Hands-on experience in installation, configuration, support and management of Hadoop clusters.
- Commissioning and decommissioning of nodes on a running Hadoop cluster (see the sketch after this list).
- Expertise in HDFS architecture and cluster concepts.
- Installation of various Hadoop ecosystem components and Hadoop daemons.
- Rebalancing of Hadoop clusters.
- Hands-on experience with Hadoop security using Ranger and Kerberos.
- Hands-on experience with data transfer/migration across clusters in Hortonworks.
- Hands-on experience with volume mirroring in MapR.
- Hands-on experience with Hive and HBase data migration.
- Expertise in cluster installation for POC, Dev, Staging and Production environments.
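A minimal sketch of the routine HDFS commands behind two of the items above (node decommissioning and cluster rebalancing), assuming a Hortonworks-style HDFS cluster; the hostname and the exclude-file path are illustrative placeholders, not actual project values:

    # Decommission a DataNode: list it in the hosts-exclude file referenced by
    # dfs.hosts.exclude (path is a placeholder), then have the NameNode re-read it.
    echo "node7.example.com" >> /etc/hadoop/conf/dfs.exclude
    hdfs dfsadmin -refreshNodes

    # Rebalance until no DataNode deviates more than 10% from average utilization.
    hdfs balancer -threshold 10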

Project history

05/2019 - Present
IT Analyst
TCS (Internet and Information Technology, >10.000 employees)

Project 1: HP Inc.
Organization: TCS Pvt Ltd
Technologies: MapR, Hortonworks, Hive, HBase, Ranger, Sqoop, Kerberos.
Responsibilities:
- Installed Hadoop clusters for PROD, ITG, DEV and POC on MapR and Hortonworks.
- Collaborated with multiple teams on the design and implementation of Hadoop clusters.
- Monitored cluster health and HDFS space for better performance.
- Responsible for disk repair in the cluster.
- Rebalanced the Hadoop cluster in Hortonworks.
- Worked on NameNode high availability.
- Allocated name and space quotas to users in case of space problems, based on analysis in Grafana.
- Installed various Hadoop ecosystem components and Hadoop daemons.
- Installed and configured Kerberos for the authentication of users and Hadoop daemons.
- Implemented Kerberos integration with LDAP.
- Implemented and troubleshot SSH key-based passwordless authentication.
- Implemented scripts for Kerberos keytab generation (see the sketch after this list).
- Transferred data across clusters in Hortonworks.
- Configured the mirroring setup in the MapR cluster for data transfer.
- Good knowledge of Hive, HBase and Oozie installation and the corresponding DB configuration.
- Installed and upgraded packages and patches according to client requirements.
- Changed file permissions as per client requests.
- Good exposure to coordinating with vendors on all kinds of hardware failures.
- Process administration and management: monitoring, starting, stopping and killing various processes.
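A minimal sketch of the kind of keytab-generation script mentioned above, assuming MIT Kerberos with kadmin.local access on the KDC; the realm, hostnames and keytab directory are illustrative placeholders, not actual project values:

    #!/usr/bin/env bash
    # Hypothetical keytab generation for a service principal on each cluster host.
    set -euo pipefail

    REALM="EXAMPLE.COM"                      # placeholder realm
    KEYTAB_DIR="/etc/security/keytabs"       # placeholder keytab directory
    HOSTS="node1.example.com node2.example.com"

    for host in ${HOSTS}; do
        principal="hdfs/${host}@${REALM}"
        keytab="${KEYTAB_DIR}/hdfs.${host}.keytab"

        # Create the principal with a random key, then export it to a keytab.
        kadmin.local -q "addprinc -randkey ${principal}"
        kadmin.local -q "ktadd -k ${keytab} ${principal}"

        # Verify: list the keytab entries and authenticate with them.
        klist -kt "${keytab}"
        kinit -kt "${keytab}" "${principal}" && kdestroy
    done

Note that ktadd rotates the principal's key, so any previously distributed keytab for the same principal must be replaced afterwards.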

06/2018 - 04/2019
Senior System Engineer
HCL Infosystems Ltd (Internet and Information Technology, >10.000 employees)

Project 2: UIDAI - Aadhaar
Organization: HCL Infosystems Ltd
Technologies: MapR, HDFS, Hive, MapR-DB, Oozie, Sqoop, Pig, ControlMinder
Description:
The Unique Identification project was initially conceived as an initiative that would provide identification for each resident across the country and would be used primarily as the basis for efficient delivery of welfare services.
Responsibilities:
- Cluster maintenance, commissioning and decommissioning of data nodes.
- Installed and configured the MapR Hadoop cluster; designed and developed the MapR DR setup (see the mirroring sketch after this list); managed data on the MapR cluster.
- End-to-end performance tuning of MapR clusters and Hadoop MapReduce routines against very large data sets; worked with MapR tables (creation, import, export, scan, list).
- Managed and monitored the cluster.
- Performed data balancing on clusters.
- Provided PROD application support on a roster basis, as well as Hadoop platform support.
- Managed MFS cluster users (MapR), permissions and application user access.
- Worked on NameNode high availability, customizing ZooKeeper services.
- Improved speed, efficiency and scalability of the continuous integration environment.
- Managed quotas on the MapR File System.
- Recovered from node failures and troubleshot common Hadoop cluster issues.
- Responsible for MapR File System data rebalancing.
- Responsible for backup and restoration of data from MFS to SAN and tape as per the retention policy.
- Coordinated with team members for proper resolution of tickets.
- Checked daily jobs and space alerts.
- Managed and reviewed Hadoop log files.
- Troubleshot day-to-day issues such as login problems, network issues and permission issues.
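A minimal sketch of the mirror-volume setup behind the DR work described above, assuming a recent MapR release with maprcli access; the volume, path and cluster names are illustrative placeholders, not actual project values:

    # On the DR cluster: create a local mirror of a volume on the primary cluster.
    maprcli volume create -name uid.data.mirror \
        -path /dr/uid.data \
        -type mirror \
        -source uid.data@primary.cluster

    # Start a mirror sync, then inspect the volume to check mirroring progress.
    maprcli volume mirror start -name uid.data.mirror
    maprcli volume info -name uid.data.mirror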

10/2014 - 05/2018
Software Engineer
Baryons Software Solutions (Internet and Information Technology, 50-250 employees)

Project 3: Exigent (Chameleon Post SignOff)
Organization: Baryons Software Solutions
Technologies: Java, Pentaho BI, MySQL, Apache Hadoop, JIRA
Description:
Exigent has a software product roadmap to build a Contract Management product that can be implemented both on-cloud and on-premise for Exigent’s customers.
Responsibilities:
- Loaded processed data from each upstream application into HDFS via Sqoop (see the sketch after this list).
- Analyzed structured data using Hive and Pig.
- Involved in developing Hive reports.
- Monitored and managed the Hadoop cluster through Ambari.
- Set up multi-node Hadoop clusters, Pig, Hive and HBase using Ambari.
- Used Sqoop extensively to import data from RDBMS sources into HDFS.
- Performed transformations, cleaning and filtering on imported data using Hive and loaded the final data into HDFS.
- Involved in creating external tables and in partitioning and bucketing tables.
- Responsible for environment creation, setup and re-configuration activities.
- Created reports for the BI team using Sqoop to export data into HDFS and Hive.
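A minimal sketch of the kind of Sqoop import used to load upstream data into HDFS, assuming a MySQL source; the host, database, credentials, table and target directory are illustrative placeholders, not actual project values:

    # Hypothetical import of one upstream table into HDFS with four parallel mappers.
    sqoop import \
        --connect jdbc:mysql://db.example.com:3306/contracts \
        --username etl_user \
        --password-file /user/etl/.mysql-password \
        --table signoff_events \
        --target-dir /data/raw/signoff_events \
        --num-mappers 4

The imported directory can then be exposed to Hive as an external, partitioned table, as in the external-table work listed above.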

Local Availability

Only available in these countries: India

Other

I am Prashant Gupta, and I am very interested in building a good career with your esteemed organization.


Below are my brief details for your consideration. My detailed resume can be found as an attachment. 

Experience: 6 years of overall IT experience, including 3.1 years of hands-on experience as a Big Data Hadoop Administrator on the MapR and Hortonworks distributions.

Current Company: TCS Pvt. Ltd., Bangalore


Current Role: IT Analyst

Proficient in: Hadoop, HDFS, Hive, Pig, MapReduce, Oozie, Sqoop, HBase, Ranger, Kerberos, MySQL, Pentaho BI Tool

Qualification: BE, ECE (2013) with 64%

Current Location: Bangalore

Job Role: Permanent

Please consider my application for a suitable Hadoop Admin opportunity.