
Neha Talwar

available

Last update: 02.02.2024

Informatica ETL Developer, Data Analyst

Graduation: not provided
Languages: English (Native or Bilingual) | Hindi (Native or Bilingual)

Attachments

A-neha-lumen_020224.docx

Skills

Relational Databases, XML, Informatica Power, Oracle, SQL Server, Heterogeneous Sources, Informatica Designer, processing tasks, Workflow Manager, SQL, PL/SQL, Shell scripting, Pentaho, ETL, Data modeling, Data Management, ETL scripts, SQL databases, database, Pentaho Data Integration, Server 5.0, Dimensional Modeling, Star Schema, Pentaho Enterprise Console (PEC), Data Warehouse architecture, ETL framework, data analysis, data conversion, data integration, ETL modules, ETL software, Informatica, stored procedures, Aggregator, ODBC, connectivity, databases, Web Service, large data, partitioning, Pentaho 7.6, Oracle 10g, UNIX, SVN, Windows XP, VISIO, Business Objects, DB2, router, SQL queries, referential integrity, Power Exchange Navigator, IMS, Oracle 11g, CDC, Source Qualifier, business logic, bugs, Unit testing, metadata, Parsing, coding, Informatica powercenter 9.6x, Winscp, putty, Toad, Analytics, data visualization tools, machine learning, UAT, quality assurance, data mining, data extraction, Microsoft Azure cloud, release management, Excel Services Reports, data warehouse, data integrity, Machine learning algorithms, Regression

Project history

10/2014 - 08/2017
Informatica ETL Developer
BlazeClan Technologies

BlazeClan Technologies - Pune, India

Responsibilities:

* Worked with filter transformations and flat files to identify source and target bottlenecks
* Worked with various transformations including router transformation, update strategy,
expression transformation, lookup transformation, sequence generator, aggregator transformation
and sorter transformation.
* Used Oracle to write SQL queries that create/alter/delete tables and to extract the necessary
data
* Used UNIX to navigate the system, check for specific files, inspect file contents, change
permissions, and see who the current users are.
* Generated the required metadata on the source side in the Informatica mappings.
* Trained technical and non-technical users on the use of Informatica
* Prepared the impact analysis document and high-level design for the requirements.
* Performed performance tuning by collecting statistics and examining explain plans.
* Carried out performance tuning on both the Informatica side and the database side.
* Worked on slowly changing dimensions and operational schemas.
* Created Transformations and Mappings using Informatica PowerCenter Designer and processing
tasks using Workflow Manager to move data from multiple sources into targets and vice versa.
* Created and managed schema objects such as Tables, Views, Indexes and referential integrity
depending on user requirements.
* Created Data Maps / Extraction groups in Power Exchange Navigator for legacy IMS Parent
sources.
* Staged Data from legacy Proclaim IMS system into Oracle 11g Master Tables
* Performed CDC capture registrations
* Assisted in building the ETL source to Target specification documents by understanding the
business requirements
* Developed mappings that perform extraction, transformation and load of source data into the
Derived Masters schema using various PowerCenter transformations such as Source Qualifier,
Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored
Procedure, SQL, Normalizer and Update Strategy to implement the business logic in the mappings.
* Built reusable transformations and Mapplets wherever reuse was needed.
* Performed performance tuning at both the mapping level and the database level to increase
data throughput.
* Designed the Process Control Table that would maintain the status of all the CDC jobs and
thereby drive the load of Derived Master Tables.
* Investigated and fixed bugs in the production environment and provided on-call support.
* Performed Unit testing and maintained test logs and test cases for all the mappings.
* Maintained warehouse metadata, naming standards and warehouse standards for future application
development.
* Translated high-level design specifications into simple ETL code in line with mapping standards.

Environment: Informatica PowerCenter 9.6.x, Oracle 11g, WinSCP, PuTTY, Toad
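The SQL work described above (creating tables with referential integrity and extracting data) can be sketched with a minimal example. This is an illustrative assumption, not code from the projects: Python's sqlite3 stands in for Oracle, and the table and column names are hypothetical.

```python
import sqlite3

# Minimal sketch: create tables with referential integrity, then extract
# data with a join, as described above. sqlite3 stands in for Oracle;
# table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE fact_orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES dim_customer(customer_id),
        amount      REAL NOT NULL
    )""")

conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO fact_orders VALUES (100, 1, 250.0)")

# Extract: total order amount per customer.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM fact_orders o
    JOIN dim_customer c ON c.customer_id = o.customer_id
    GROUP BY c.name
""").fetchall()
print(rows)  # [('Acme', 250.0)]
```

The `REFERENCES` clause enforces the referential integrity mentioned in the bullets; with foreign keys enabled, inserting an order for a non-existent customer would fail.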

06/2012 - 09/2014
Data Analyst
Quantiphi Analytics

Quantiphi Analytics - Mumbai, India

Responsibilities:

* Executed data integration using ETL processes of the source systems.
* Analyzed data using data visualization tools and reported key features using statistical tools
and supervised machine learning techniques to achieve project objectives.
* Developed implementation strategies for Sales Plan integration into the system.
* Trained Business Analysts on system operation. Provided requirements gathering and UAT support
for capability development.
* Defined configuration specifications and business analysis requirements.
* Performed quality assurance and defined reporting and alerting requirements.
* Assisted in designing, documenting and maintaining system processes.
* Reported on common sources of technical issues or questions and made recommendations to the
product team.
* Communicated key insights and findings to the product team.
* Performed data mining tasks related to system break/fix issues and provided workarounds and
problem solving.
* Worked independently or collaboratively throughout the complete analytics project lifecycle,
including data extraction/preparation, design and implementation of scalable machine learning
analyses and solutions, and documentation of results.
* Developed and deployed machine learning models as a service on the Microsoft Azure cloud.
* Partnered with the Sales and Marketing teams and collaborated with a cross-functional team to
frame and answer important data questions.
* Tracked key project milestones and adjusted project plans and resources to meet the needs of
customers.
* Provided support for projects in project planning, quality plan, risk management, requirements
management, change management, defect management and release management.
* Built and maintained queries for analysis/extraction for different databases.
* Developed Excel Services Reports for the Network Team.
* Created technical documentation for each mapping to support future development.
* Maintained data warehouse tables through data loading and monitored system configurations to
ensure data integrity.
* Applied supervised and unsupervised machine learning algorithms such as Regression, Decision
Trees, Random
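The supervised regression mentioned above can be illustrated with a minimal sketch: an ordinary least-squares fit of a line to toy data. The numbers here are hypothetical and purely illustrative.

```python
# Minimal sketch of supervised regression: ordinary least-squares fit
# of y = slope*x + intercept on toy data (hypothetical numbers).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 8.0]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope = covariance(x, y) / variance(x); intercept from the means.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(round(slope, 2), round(intercept, 2))  # 1.99 0.05
```

In practice a library such as scikit-learn would be used for this, but the closed-form solution above is the same calculation for the one-feature case.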

Local Availability

Only available in these countries: USA

Other

  • Over 7 years of IT experience in analysis, design, development and implementation of software applications in data warehousing.
  • Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Experience in Designing, developing data conversion/data integration modules to support large scale system migrations and implementations.
  • Expertise in Metadata Manager, importing metadata from different sources such as relational databases and XML sources into Framework Manager.
  • Expertise in Data Warehouse development working with Extraction/Transformation/Loading using Informatica Power Center with Oracle, SQL Server and Heterogeneous Sources
  • Developed Data Warehouse architecture, ETL framework and BI Integration using Pentaho Reports and Pentaho Dashboards.
  • Experience in building the Data warehouse using the Ralph Kimball methodology.
  • Extensive experience in developing mappings for Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouse/Data Marts.
  • Experience in creating Reusable Transformations (Joiner, Lookup, Sorter, Aggregator, Expression, Update Strategy, Router, Filter, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
  • Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Experience in performance tuning of SQL queries and ETL components.
  • Extensively used SQL and PL/SQL in creation of Triggers, Functions, Indexes, Views, Cursors and Stored Procedures.
  • Proficient in writing documents, preparing presentations and presenting them.
  • Experienced in UNIX Shell scripting as part of file manipulation, Scheduling and text processing.
  • Well organized and goal oriented, with excellent troubleshooting and problem-solving skills.
  • Strong ability to quickly adapt to new applications, platforms and languages.
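The slowly changing dimension work mentioned in the project history (SCD handling) can be sketched with a minimal Type 2 example: when a tracked attribute changes, the current row is expired and a new version is inserted. This is an illustrative sketch, not project code; sqlite3 stands in for Oracle, and all table and column names are hypothetical.

```python
import sqlite3

# Minimal sketch of a Type 2 slowly changing dimension: on change,
# expire the current row and insert a new version. Table and column
# names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER,
        city        TEXT,
        is_current  INTEGER,   -- 1 = active version
        version     INTEGER
    )""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Pune', 1, 1)")

def scd2_update(conn, customer_id, new_city):
    """Expire the current row and add a new version if the city changed."""
    row = conn.execute(
        "SELECT city, version FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row and row[0] != new_city:
        conn.execute("UPDATE dim_customer SET is_current = 0 "
                     "WHERE customer_id = ? AND is_current = 1", (customer_id,))
        conn.execute("INSERT INTO dim_customer VALUES (?, ?, 1, ?)",
                     (customer_id, new_city, row[1] + 1))

scd2_update(conn, 1, "Mumbai")
history = conn.execute(
    "SELECT city, is_current, version FROM dim_customer "
    "WHERE customer_id = 1 ORDER BY version").fetchall()
print(history)  # [('Pune', 0, 1), ('Mumbai', 1, 2)]
```

Keeping every expired version with a version number (or effective-date columns) is what lets the warehouse report history as it was, which is the point of Type 2 handling.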