Assignment Description
You will be part of the team working with the client's Advanced Analytics platform in central Stockholm, dealing with data engineering tasks. The team works in a truly agile way, so priorities are set with the team on a sprint or daily basis. You will be responsible for data ingestion, which lifts gigabytes of data from several sources and in multiple formats into the data lake.
Work tasks
- Implement data pipelines from a variety of sources (streaming data, APIs, data warehouse, messages etc.)
- Coordinate with other teams to design optimal patterns for data ingestion and egress, and lead and coordinate data quality initiatives and troubleshooting
- Proactively and continuously learn new technologies, assessing their maturity and fit to our roadmap
- Design and build solutions to track data quality, stabilize data pipelines, etc., to ensure reliable operations
- Align with Infra & DevOps team regarding new releases and upgrading of cloud-based data platform
- Ensure best practices are followed across architecture, codebase and configuration
Competences
Required skills/qualifications:
- Solid experience and advanced hands-on skills in database modelling and querying in large corporate environments
- Experience in delivery of IT projects in retail
- Experience in architecting, developing and maintaining large-scale data warehouses on the IBM Netezza platform
- Experience with data models and data warehouse solutions
- Experience with Git-based development workflows
Beneficial/Nice to have qualifications:
- Programming skills in Python
- Experience with Azure Data Factory
Tools/software:
- SQL
- Azure Data Factory
- PowerShell
- Bash
- Python
Personal Competences
- Excellent communication skills in English
- Comfortable in expressing and defending your ideas to other skilled engineers
- You consider yourself a team player who is willing to give and receive feedback on a regular basis