Description
Essential role requirements:
Experience of working with Data Lake technologies, ideally including a mixture of Hadoop components.
Experience in modelling data for analytics, including the definition of star schemas (Kimball methodology).
Experience in the use of graphical ETL tools. Ideally this would be with Talend, although experience of similar tools such as Informatica would be considered.
Experience of writing SQL and performing SQL performance tuning.
Experience of producing design documentation and specifications.
Experience of data modelling toolsets such as Sparx EA.
Exposure to, or experience of, data mining and machine learning.
Exposure to Oracle Warehouse Builder and PL/SQL.
Exposure to working with core HANA data provisioning tools/concepts (SAP Data Services, DXC, SLT).