Rajesh Vasam

Availability: available

Last update: 06.09.2022

Senior Big Data Engineer, Senior Software Engineer, Sr Data Engineer

Company: Zedeye Labs
Graduation: not provided
Languages: German (Native or Bilingual)
Phone: 015144920792
E-mail: rajesh.vasam@gmail.com

Skills

Apache Spark, Kafka, HDFS, Hive, Sqoop, Pig, Flume, Oozie, HBase, AWS, Google Cloud, ETL, Tableau, Apache Hadoop, GCC, analytics, cloud, EMR, Oracle, DB, Oracle database, S3, PySpark, external tables, EC2, Amazon Linux, Red Hat, backup, Amazon Glacier, Elastic, VPC, Release Management, Big Data, IBM, Cloudera, E-mail, database, UDF, storing data, MySQL, Mainframes, code review, coding, functional testing, Cassandra, SQL, Python, ELB, RDS, CloudWatch, Google BigQuery, Compute Engine, Cloud Storage, Cloud SQL, Linux, COBOL, JCL, VSAM, CICS, File-AID, Xpediter

Project history

06/2010 - 03/2018
Senior Big Data Engineer
Macy's; IBM India Pvt Ltd

Implementation and Maintenance
Organization: IBM India Pvt Ltd
Environment: Cloudera (CDH) Hadoop Distribution



Macy's is a mid-range chain of department stores owned by the American corporation
Macy's Inc. It is one of two divisions owned by the company, the other being the upscale
Bloomingdale's. The Macy's division operates 789 department store locations in the continental
United States, Hawaii, Puerto Rico, and Guam, including the prominent Herald Square flagship
location in Midtown Manhattan, New York City.

As part of the engagement, I was involved in:

* Worked on setting up the Cloudera distribution (CDH) and different big data analytic tools
including Pig, HBase and Sqoop.

* Responsible for performing queries using Hive, and developed MapReduce jobs to analyze data.

* Imported and exported data into HDFS and Hive using Sqoop.

* Experience with the HBase database.

* Implemented Hive and Pig UDFs for evaluating, filtering, loading and storing data.

* Wrote Hive UDFs to extract data from staging tables (see the sketch after this list).

* Involved in creating Hive tables, loading them with data, and writing Hive queries.

* Worked extensively with Sqoop for importing and exporting data between MySQL and HDFS/Hive.

* Responsible for managing data coming from different sources.

* Loaded and transformed large sets of structured, semi-structured and unstructured data.

* Analyzed the data by performing Hive queries and running Pig scripts.

* Involved in preparing the Unit Test Plan and documenting Unit Test Results.

* Involved in weekly project update meetings.

* Performed Release Management activities.

* Mentored team members and new resources in their assigned tasks; coordinated with the team on
work assignments; gathered status and reported to the project manager and the onsite team.

* Ensured timely, quality delivery as per the release plan.
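
For illustration, here is a minimal sketch of the kind of Hive UDF mentioned above, written in
Java against the classic org.apache.hadoop.hive.ql.exec.UDF API that ships with CDH-era Hive.
The class name CleanField and the normalization it performs are hypothetical, invented purely
for this example:

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical example: a Hive UDF that normalizes a string field
    // read from a staging table by trimming whitespace and upper-casing.
    public final class CleanField extends UDF {
        public Text evaluate(final Text input) {
            if (input == null) {
                return null; // Hive passes SQL NULL through as Java null
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Packaged into a jar, such a UDF would be registered and invoked from HiveQL roughly as follows
(jar, function, table and column names are again hypothetical):
ADD JAR cleanfield.jar;
CREATE TEMPORARY FUNCTION clean_field AS 'CleanField';
SELECT clean_field(customer_name) FROM staging_customers;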

03/2008 - 06/2010
Senior Software Engineer
Experian Credit Services; Perot Systems India Pvt Ltd

Implementation and Maintenance
Organization: Perot Systems India Pvt Ltd
Environment: Mainframes


Experian is a global leader in providing information, analytical tools and marketing services
to organizations and consumers to help manage the risk and reward of commercial and financial
decisions. Experian's products are Credit Services, Decision Analytics, Marketing Services and
Interactive. Organizations rely on Experian for help in finding new customers and developing
successful relationships with their existing customers.



As part of the engagement, I was involved in:

* Created estimates with a detailed LOE (level of effort) and defined the complexity (high,
medium, low) of work requests.
* Prepared tech specs and program specs such as the HLD (High Level Design) and DLD (Detailed
Level Design).
* Constructed and code-reviewed programs as per the client's coding standards.
* Wrote unit test cases and unit-tested to meet all functional requirements.
* Documented unit test results and had them approved by the SME (Subject Matter Expert).
* Supported QA functional testing by resolving QA defects assigned to the development team.
* Communicated one-to-one with the client, updating the status of the development, QA testing
and production deployment phases.

03/2006 - 03/2008
Sr Data Engineer
Depository Trust and Clearing Corporation; CTS India Pvt Ltd

Implementation and Maintenance
Software Engineer
Organization: CTS India Pvt Ltd
Environment: Mainframes


The Depository Trust & Clearing Corporation (DTCC) is the largest financial services
post-trade infrastructure organization in the world, handling trades from virtually every
marketplace in the U.S. Through its subsidiaries, DTCC provides clearance, settlement and
information services for equity, corporate debt, municipal debt, government securities and
mortgage-backed securities in the U.S., and emerging markets debt trades globally.

As part of the engagement, I was involved in:

* Identified the impacted program modules and estimated the code changes and unit testing.
* Constructed and code-reviewed programs as per the client's coding standards.
* Wrote unit test cases and unit-tested to meet all functional requirements.
* Documented unit test results and had them approved by the SME (Subject Matter Expert).

Local Availability

Open to travel worldwide