
InnoMinInfinite Technologies

available

Last update: 05.11.2016

IBM InfoSphere DataStage - Developer/Tester/Architect

Graduation: not provided
Languages: English (Limited professional) | Hindi (Limited professional)

Attachments

rajunath_etl-architect--v1.8.doc

Skills

  • 9+ years of working experience for IBM, both onshore and offshore, in various capacities.
  • Strong design experience; able to independently turn a physical data model into a DataStage job design.
  • Strong DataStage implementation skills, with a record of completing implementations within tight project schedules.
  • Experienced in full life-cycle development of building an Operational Data Store for large technology projects, including analysis, design, development, testing, maintenance and documentation.
  • Good hands-on experience on an SAP integration project using the CFF stage.
  • Very strong experience with various migration and conversion projects in both DataStage and mainframe.
  • Expert-level knowledge of relational databases (especially DB2 and Oracle); also worked with flat files.
  • Strong understanding of investment management systems and the trade domain.
  • Good UNIX/AWK shell scripting skills (a short sketch follows this list).
  • Good knowledge of data warehouse star schema and snowflake schema concepts.
  • Extensively worked on job sequences to control the execution of the job flow using activities such as Job Activity, Email Notification, Sequencer, Routine Activity and Exec Command Activity.
  • Complete team coordination, involving task allocation among team members and frequent discussions to monitor the progress of project work.
  • Extensively worked on parallel jobs using various stages such as Join, Lookup, Lookup File Set, Change Capture, Filter, Funnel, Copy, Sort, File Set, Sequential File, Data Set, Oracle Enterprise, Merge, Aggregator, Remove Duplicates, Pivot, Stored Procedure and CFF.
  • Good working experience with DB2 and JCL in operational database applications.
  • Good experience in conversion projects, data migrations and mainframe-to-ETL technology transformation projects.
  • Good experience with client interaction and the onshore–offshore working model.
  • Quick learner with excellent analytical and problem-solving skills.
  • Good interpersonal, written and verbal communication skills. Multilingual, highly organized, detail-oriented professional with strong technical skills.
  • Passionate about all aspects of the software development life cycle, from design through development, deployment and production support to data quality and data management.
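As an illustration of the UNIX/AWK shell scripting mentioned above, here is a minimal sketch of a pre-load flat-file check; the file names, delimiter and record layout are hypothetical and not taken from any of the projects below.

#!/bin/ksh
# Hypothetical pre-load check for a pipe-delimited flat file:
# divert structurally bad rows to a reject file and report the counts.
SRC_FILE=/data/landing/accounts_20160601.dat   # hypothetical landing file
REJ_FILE=${SRC_FILE}.rej

awk -F'|' '
  NF < 5 || $1 == "" { print > REJECTS; bad++; next }  # too few fields or missing key
  { good++ }                                           # rows that will go to the load
  END { printf("good=%d bad=%d\n", good, bad) }
' REJECTS="$REJ_FILE" "$SRC_FILE"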

Project history

SKILL MATRIX
Database:            Oracle 9i, 10g/11g, 12c, DB2, Netezza, Teradata, HP Enterprise Vertica
O/S:                 Red Hat Linux, UNIX, Windows XP, Windows 7, Windows NT/2000 Server
Tools:               DataStage 11.5, DataStage 8.x, Informatica 9.0, Cognos TM1 9.5, DataStage FastTrack, QualityStage, TeamForge, AnthillPro, Power System, BMC Remedy Mid Tier 7.6, AutoSys, Control-M
 
 KEY SKILLS
  • Service Delivery
  • Design & Deployment
  • Leadership/ Team Management
  • Process Enhancement
  • Client Relationship
  • Process Streamlining & Re-engineering
 
EDUCATION   
Master of Computer Application - 2007 from West Bengal University of Technology (WBUT), India

Astellas Pharma US, Northbrook, IL                                                                                        Jun 2016 – Present
Role:     ETL Architect / Technical Lead
 
Project Description:
Astellas Pharma US, Inc. is committed to providing patients, customers, community and employees with a bright future by changing Tomorrow. Our commitment is made possible because we are a different kind of pharmaceutical company. This is apparent not only in what we do, but how we do it.
The objective of this project is to create a full view of payment and account information for the list of payments/accounts provided.
 
Project Responsibilities:
  • Responsible for reviewing functional specifications and preparing the technical design framework for ETL DataStage. Interacted extensively with the client/onsite team to resolve technical and infrastructure issues.
  • Reviewed LLD and HLD documents against the requirements specification documents.
  • Interacted with business partners to understand the business requirements and to analyze and design the integration approach.
  • Performed ETL DataStage coding using Data Set, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested the jobs. Extensively used the available stages to redesign DataStage jobs for the required integration.
  • Used DataStage Designer for creating parallel jobs and importing and exporting jobs.
  • Used DataStage job sequences to execute the jobs in a defined order.
  • Used QualityStage stages to enrich data to meet business objectives and data quality management standards.
  • Created DataStage jobs to build business rules to load data.
  • Used DataStage Director to run jobs and monitor their status.
  • Imported table/file definitions into the DataStage repository.
  • Controlled job execution using sequencers and used the Notification activity to send email alerts.
  • Involved in performance tuning and enhancement.
  • Documented ETL DataStage test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
  • Used the Control-M job scheduler to automate the regular BAU calendar schedule cycle in both production and UAT environments.
  • Coordinated with Release Management, Data Management and Configuration Management.
  • Maintained technical relationships with third-party vendors.
  • Conducted weekly status meetings with onshore and offshore teams.
 
Environment:    UNIX, AWK, DataStage 11.5, Oracle 11g, Salesforce DB, SQL Developer, Netezza DB, PuTTY, Control-M, HPQC, PowerCenter.
Team Size:          10 to 15
 
Bank of America, Agoura Hills, CA                                                                                           Aug 2015 – Jun 2016
Role:     ETL Architect / Technical Lead
 
Project Description:
Bank of America has diverse enterprise applications servicing various lines of business such as Consumer Real Estate, Deposits, Ecommerce, Card Solutions and Transaction Services.
The objective of this project is to create a full view of customer and account information for the list of customers/accounts provided. The bank held some of its credit card account information with an external vendor, so for every inquiry against those accounts it paid a fee to the vendor. With this new ODS application hosted within the bank, the bank decided to service the online channels itself, avoiding the calls to the vendor.
 
Project Responsibilities:
  • Responsible for reviewing functional specifications and preparing the technical design framework for ETL DataStage. Interacted extensively with the client/onsite team to resolve technical and infrastructure issues.
  • Reviewed LLD and HLD documents against the requirements specification documents.
  • Interacted with business partners to understand the business requirements and to analyze and design the integration approach.
  • Performed ETL DataStage coding using Data Set, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested the jobs. Extensively used the available stages to redesign DataStage jobs for the required integration.
  • Used DataStage Designer for creating parallel jobs and importing and exporting jobs.
  • Used DataStage job sequences to execute the jobs in a defined order.
  • Used QualityStage stages to enrich data to meet business objectives and data quality management standards.
  • Created DataStage jobs to build business rules to load data.
  • Used DataStage Director to run jobs and monitor their status.
  • Imported table/file definitions into the DataStage repository.
  • Controlled job execution using sequencers and used the Notification activity to send email alerts.
  • Involved in performance tuning and enhancement.
  • Documented ETL DataStage test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
  • Used the AutoSys and Control-M job schedulers to automate the regular BAU calendar schedule cycle in both production and UAT environments.
  • Coordinated with Release Management, Data Management and Configuration Management.
  • Maintained technical relationships with third-party vendors.
  • Supported the bank's acceptance testing needs.
  • Conducted weekly status meetings with onshore and offshore teams.
 
Environment:    UNIX, Perl, AWK, XML, XSD, DataStage 11.3, MDM, Salesforce, Business Glossary, Oracle 11g, SQL Developer, PuTTY, AutoSys, Control-M, HPQC, PowerCenter.
Team Size:          15 to 20
 
Client: Verizon FiOS NBO Analytics, NJ                                                                   May 2015 – Aug 2015
Role:     Technical Architect  
 
Project Description:
Verizon Fios (formerly FiOS) is a bundled Internet access, telephone, and television service that operates over a fiber-optic communications network to over 5 million people in 13 states. Service is offered in some areas of the United States by Verizon Communications and Frontier Communications. Verizon was one of the first major U.S. carriers to offer fiber to the home, and received positive ratings from Consumer Reports among cable television and Internet service providers. Other service providers often use fiber optics in the network backbone and existing copper or coax infrastructure for residential users.
 
Project Responsibilities:
  • Responsible for leading a project team in delivering solutions to the customer.
  • Working in the Verizon FiOS NBO Analytics project as a Data Analyst.
  • Deliver new and complex high-quality solutions to clients in response to varying business requirements.
  • Responsible for managing the scope, planning, tracking and change control aspects of the project.
  • Responsible for effective communication between the project team and the customer. Provide day-to-day direction to the project team and regular project status to the customer.
  • Translate customer requirements into formal requirements and design documents, establish specific solutions, and lead the efforts, including programming and testing, that culminate in client acceptance of the results.
  • Utilize in-depth functional and technical experience in ETL/data warehousing and other leading-edge products and technologies, in conjunction with industry and business skills, to deliver solutions to the customer.
  • Establish quality procedures for the team and continuously monitor and audit to ensure the team meets its quality goals.
 
Environment:    UNIX, SAS Enterprise Miner, Teradata DB, Teradata Studio 15, Teradata SQL Assistant, PuTTY, WinSCP.
Team Size:          15 to 20
 
 
Northern Trust Corporation, Chicago, IL                                                                    Dec 2014 – April 2015
EDW - Technical Architect
 
Project Description:
The Northern Trust Corporation is an American international financial services company headquartered in Chicago, Illinois. It provides investment management, asset and fund administration, fiduciary and banking services through a network of 85 offices in 18 U.S. states and 20 international offices in North America, Europe and the Asia-Pacific region. As of June 30, 2014, Northern Trust Corporation had $106 billion in banking assets, $6.0 trillion in assets under custody and $924.4 billion in assets under management.
 
Project Responsibilities:
  • Closely worked with the Data Analyst to convert the business requirements into technical documents.
  • Created and maintained the source-to-target mapping specifications.
  • Made changes to all DataStage jobs impacted by the ICD-10 changes that load into staging and/or the ODS.
  • Modified the jobs to read the new ICD-10 fields from mainframe files based on the copybook provided.
  • Developed new jobs as part of the Informatica rewrite.
  • Upgraded the jobs from 7.5 to 8.1 using the Emblem ETL framework as part of the shutdown activity.
  • Worked with most of the stages, such as Data Set, Sequential File, Oracle EE, Filter, Funnel, Join, Lookup, Copy, Aggregator and Change Capture, during the rewrite from 7.5 to 8.1.
  • Worked with the DataStage administrator to set up and test SQL Server connectivity from DataStage 8.1.
  • Automated the process through AutoSys to eliminate manual intervention.
  • Created new JILs to execute jobs through AutoSys (a JIL sketch follows this list).
  • Worked with the MFT core team to create a new job to transfer files to the client location.
  • Worked actively with QA to test the code in the test environment.
  • Worked with the production support team to handle production issues during releases and job failures.
  • Involved in the code move and testing prior to migrating to DataStage 8.1.
  • Worked closely with DataStage administrators to resolve issues during the 8.1 migration.
  • Developed new jobs for vendor feeds that are consumed by internal Emblem users.
  • Worked closely with users during the UAT phase.
  • Created and presented documentation (CAB) for production moves.
  • Actively worked on the production deployments.
  • Created UNIX shell scripts and wrote SQL scripts.
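A minimal sketch of the kind of AutoSys JIL mentioned above; the job, machine, owner and path names are hypothetical, and the definition is loaded through the standard jil utility reading from standard input.

#!/bin/ksh
# Define a hypothetical AutoSys command job that runs a DataStage wrapper script
# every night at 02:00 and raises an alarm on failure, then load it via jil.
jil <<'EOF'
insert_job: NT_EDW_DAILY_LOAD   job_type: cmd
command: /apps/etl/scripts/run_edw_daily_load.ksh
machine: etl_prod_01
owner: dsadm
date_conditions: 1
days_of_week: all
start_times: "02:00"
std_out_file: /apps/etl/logs/NT_EDW_DAILY_LOAD.out
std_err_file: /apps/etl/logs/NT_EDW_DAILY_LOAD.err
alarm_if_fail: 1
EOF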
 
Environment     : UNIX, DataStage 9.1, Oracle 11g, SQL Developer, PuTTY, Control-M, HPQC
Team Size           : 15 to 20
 
Citigroup US      Tampa, FL                                                                                               Mar 2014 – Nov 2014
Project: Financial Back Bone (FBB)
Technical Architect
 
Project Description:
Citigroup Inc. or Citi is an American multinational investment banking and financial services corporation headquartered in Manhattan, New York City. Citigroup was formed from one of the world's largest mergers in history by combining the banking giant Citicorp and financial conglomerate Travelers Group in October 1998.
 
Project Responsibilities:
  • Responsible for leading a project team in delivering solutions to the customer.
  • Deliver new and complex high-quality solutions to clients in response to varying business requirements.
  • Responsible for managing the scope, planning, tracking and change control aspects of the project.
  • Responsible for effective communication between the project team and the customer. Provide day-to-day direction to the project team and regular project status to the customer.
  • Translate customer requirements into formal requirements and design documents, establish specific solutions, and lead the efforts, including programming and testing, that culminate in client acceptance of the results.
  • Utilize in-depth functional and technical experience in ETL/data warehousing and other leading-edge products and technologies, in conjunction with industry and business skills, to deliver solutions to the customer.
  • Establish quality procedures for the team and continuously monitor and audit to ensure the team meets its quality goals.
 
Environment:    UNIX, DataStage 9.1, MDM, Salesforce, Business Glossary, Oracle 11g, SQL Developer, PuTTY, AutoSys, HPQC
Team Size:          100 to 150
 
Wal-Mart            Bentonville, AR                                                                                  Sep 2010 – Feb 2014
Project: Global Replenishment System
Technical Lead/Technical Architect
 
Project Description:
Wal-Mart Stores, Inc., doing business as Walmart, is an American multinational retail corporation that operates a chain of hypermarkets, discount department stores and grocery stores. Headquartered in Bentonville, Arkansas, the company was founded by Sam Walton in 1962 and incorporated on October 31, 1969. As of January 31, 2016, Walmart had 11,535 stores in 27 countries, under a total of 72 banners. The company operates under the Walmart name in the United States and Canada.
 
Project Responsibilities:
  • Involved in status meetings and interacted with the Business Analyst to get the business rules.
  • Migrated outbound data feeds from SQL Server as the source to Oracle as the destination.
  • Documented the purpose of each mapping to help personnel understand the process and incorporate changes when necessary.
  • Extracted data from text files using the FTP stage and loaded it into different databases.
  • Using the CFF stage, extracted data from the SAP source system and loaded it into different database systems.
  • Designed parallel jobs using various stages such as Join, Merge, Remove Duplicates, Filter, Data Set, Lookup File Set, Modify, Aggregator and Funnel.
  • Extensively used built-in stages (Sort, Merge, Oracle, Aggregator, DB2) and plug-in stages for extraction, transformation and loading of the data.
  • Used QualityStage stages such as Investigate, Standardize, Match and Survive for data quality and data profiling issues.
  • Received the master data and populated the dimension tables, including time generation and surrogate key generation.
  • As per the Data Architect’s directions, created and executed the SCD Type 2 implementation on dimensional data (an illustrative sketch follows this list).
  • Based on the data volume, presented the sizing and partitioning requirements for ODS tables in Oracle.
  • Used MetaStage for the synchronization and integration of metadata from various data warehouse related tools and for automatically gathering process data from operational systems.
  • Developed customized routines and functions for better job performance.
  • Used AutoSys to schedule the DataStage ETL batch jobs on a daily, weekly and monthly basis.
  • Created shell scripts that invoke the DataStage jobs, passing all variables so the jobs execute with parameterized database connection information (a wrapper-script sketch also follows this list).
  • Used SQL Server Management Studio to query the SQL Server database.
  • Analyzed data with discrepancies through error files and log files for further data processing and cleansing.
  • Tested and modified jobs running in the production environment, with minimized downtime.
  • Optimized partitioning and parallelism in jobs.
  • Worked on performance tuning and enhancement of DataStage jobs.
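The SCD Type 2 work noted above was implemented as DataStage jobs per the Data Architect's design; purely as an illustration of the pattern, here is a minimal SQL sketch against a hypothetical Oracle dimension (the table, column and sequence names are invented), wrapped in a shell/SQL*Plus script.

#!/bin/ksh
# Illustrative SCD Type 2 load for a hypothetical dim_customer table:
# expire the current row when a tracked attribute changes, then insert
# a new current row for changed and brand-new customers.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF'
UPDATE dim_customer d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND (s.cust_name <> d.cust_name OR s.cust_segment <> d.cust_segment));

INSERT INTO dim_customer (cust_sk, cust_id, cust_name, cust_segment,
                          eff_start_dt, eff_end_dt, current_flg)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, s.cust_segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.cust_id = s.cust_id
                      AND d.current_flg = 'Y');

COMMIT;
EXIT;
EOF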
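And a minimal sketch of the wrapper scripts described above that invoke DataStage jobs with parameterized connection details; the project, job, parameter and environment-variable names are hypothetical, while dsjob itself is the standard DataStage command-line client.

#!/bin/ksh
# Run a hypothetical DataStage job via dsjob, passing database connection
# details as job parameters, then report its final status.
DSHOME=/opt/IBM/InformationServer/Server/DSEngine
. $DSHOME/dsenv                      # source the DataStage engine environment

PROJECT=GRS_PROD                     # hypothetical project
JOB=Job_Load_Ods_Item_Onhand         # hypothetical job

$DSHOME/bin/dsjob -run -wait \
    -param DBServer="$ODS_DB_SERVER" \
    -param DBUser="$ODS_DB_USER" \
    -param DBPassword="$ODS_DB_PWD" \
    -param ProcessDate="$(date +%Y%m%d)" \
    "$PROJECT" "$JOB"

$DSHOME/bin/dsjob -jobinfo "$PROJECT" "$JOB" | grep "Job Status"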
 
Environment:    UNIX, Mainframe, DataStage 8.5, MDM, Salesforce, Business Glossary, Oracle 11g, SQL Developer, PuTTY, CA-7, HPQC
Team Size:          70 to 120
 
Employer:           DS InfoTech/CMC, Kolkata                                                           June 2008 – Aug 2010
Projects: Electors Photo Identity Card Program (EPIC), Photo Electoral Roll Mgmt System (PERMS) 
Role:      Software Developer
 
Project Description:
CMC Limited is an information technology services, consulting and software company headquartered in New Delhi, India. CMC is part of the Tata Group and is owned by Tata Consultancy Services. CMC was incorporated on 26 December 1975 as the 'Computer Management Corporation Private Limited'; at incorporation, the Government of India held 100 per cent of the equity share capital.
 
Project Responsibilities:
  • Worked as a Software Developer.
  • Proactively sought out new tasks to make best use of available time.
  • Earned reputation for impeccable work and attention to detail.
  • Maintained good working relationship with managers and colleagues.
  • Executed unit tests to ensure proper GUI and coding standards.
  • Provided training to the client whenever an update was done.
  • After every intensive revision of the electoral roll, it can be consolidated with the supplement rolls using the software.
 
Environment:    Data warehousing, Oracle 9i, PL/SQL Developer, UNIX.
Team Size:          10 to 20
 

DIP Software Solutions, Kolkata                                                                                                 May 2007 – May 2008
Project: ProcessWare ERP System  
Software Developer
 
Project Description:
As a project member, responsible for analysis, design and development.
 
Project Responsibilities:             
  • Leading the project team in delivering quality business solutions to the client in response to their varying business requirements.
  • Managing the scope, planning, tracking and change control aspects of the project.
  • Providing daily direction to the project team and status updates to the client regarding the same.
  • Translating client requirements into formal requirements and design documents, establishing system solutions and leading efforts in programming and testing of the proposed system enhancements for client approval.
  • Establishing a quality procedure for the project team and continuously monitoring the same to ensure quality goals are met.
 
Environment:    Oracle 9i, PL/SQL Developer, UNIX.
Team Size:          10 to 15
 
TRAINING
DataStage 8.1 at IBM, QualityStage, Cognos TM1 9.5, DataStage FastTrack, Oracle 11g DBA by Oracle experts

Local Availability

Only available in these countries: USA
Anywhere within the USA