Keywords
Data Analysis
Information Engineering
Extract Transform Load (ETL)
Unix Shell
Acceptance Testing
Audit Trail
Unit Testing
Microsoft Azure
Cloud Computing
Data Integration
Data Warehousing
IBM DB2
Relational Databases
Electronic Data Interchange (EDI)
Apache Hadoop
Hadoop Distributed File System
Python (Programming Language)
Oracle SQL Developer
Azure Data Lake
Reverse Engineering
SQL Databases
Azure Data Factory
Informatica PowerCenter
PySpark
Splunk
Databricks
Attachments
saikotroy_deetl_4-5_v502.pdf
Skills
User Acceptance Testing, Hadoop, Audit Log, Azure Data Factory, Azure Data Lake, Cloud, Data Analysis, Data Analyst, Data Integration, Data Warehousing, Databricks, Data Exchange, ETL, HDFS, IBM DB2, Informatica PowerCenter, Data Engineer, Azure, Oracle SQL, PySpark, Python, RDBMS, Reverse Engineering, SQL, Splunk, Unit Testing, Unix Shell Scripting, Shell
Project history
The primary focus was designing the end-to-end pipeline and transforming the data per business requirements (an illustrative sketch follows this summary).
Performed source-to-target data analysis and planned the activities required for each layer.
Created and developed scripts to automate repetitive tasks.
Performed the necessary data quality testing.
Maintained a strong focus on client satisfaction and meeting expectations.
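As an illustration of the kind of pipeline work described above, here is a minimal PySpark sketch of a source-to-target transform with a simple data quality check. The tool choice reflects the PySpark and Databricks skills listed; the paths, table and column names, and the quality rule are hypothetical assumptions, not details taken from the actual projects.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: read a raw source extract, apply a business transform,
# run a basic data quality check, then write to a curated target layer.
spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Source layer (path and schema are assumptions for illustration only)
orders = spark.read.parquet("/mnt/raw/orders")

# Transform: keep completed orders and derive a net amount per an assumed business rule
transformed = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)

# Data quality check: fail the run if any key column is null
null_keys = transformed.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"Data quality check failed: {null_keys} rows with null order_id")

# Target layer (curated-zone path is also an assumption)
transformed.write.mode("overwrite").parquet("/mnt/curated/orders")
```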