Data Engineer

Warsaw, Masovian Voivodeship – Onsite

Keywords

Java (Programming Language), Data Engineering, Data Streaming, Testing, Airflow, Amazon Web Services, Architecture, Automation, Microsoft Azure, BigQuery, Cloud Computing, Data Processing, Continuous Delivery, Continuous Integration, Customer Demand Planning, Data Infrastructure, Data Structures, Data Warehousing, Distributed Systems, Infrastructure Management, Java Virtual Machine (JVM), Python (Programming Language), Mathematics, Scala (Programming Language), Software Engineering, SQL Databases, Scheduling, Google Cloud, Apache Spark, Cloud Technologies, Backend, Kotlin, Data Lake, Kubernetes, Apache Flink, Coaching and Mentoring, Apache Kafka, Mocking Frameworks, Terraform, Docker, Amazon Redshift

Description

Required Technical Qualifications & Experience
• Academic qualification in a computer science or STEM (science, technology, engineering or mathematics) related field or the foreign equivalent
• Professional experience working in an agile, dynamic and customer-facing environment
• At least 5 years of recent hands-on professional experience (actively coding) as a data engineer (back-end software engineers also considered)
• Understanding of distributed systems and cloud technologies (AWS, GCP, Azure, etc.)
• Understanding of data streaming and scalable data processing frameworks (Kafka, Spark Structured Streaming, Flink, Beam etc.)
• Experience with SQL (any dialect) and data tooling (e.g. dbt)
• Experience in all stages of the software development lifecycle (requirements, design, architecture, development, testing, deployment, release and support)
• Experience with large-scale datasets and data lake and data warehouse technologies at TB scale or above (ideally PB-scale datasets), with at least one of BigQuery, Redshift or Snowflake
• Experience in Infrastructure as Code (ideally Terraform) for Cloud based data infrastructure
• Solid experience with a JVM language (Java/Scala/Kotlin, preferably Java 8+) or extensive knowledge of Python



Desirable Technical Qualifications & Experience
• Experience with a scheduling system (Airflow, Azkaban, etc.)
• Understanding of (distributed and non-distributed) data structures, caching concepts, CAP theorem
• Understanding of security frameworks / standards and privacy
• Experience automating deployment, releases and testing in continuous integration / continuous delivery pipelines
• A solid approach to writing unit level tests using mocking frameworks, as well as automating component, integration and end-to-end tests
• Experience with containers and container-based deployment environment (Docker, Kubernetes, etc.)

Soft Skills
• Ability to work in a collaborative environment and coach other team members on coding practices, design principles, and implementation patterns that lead to high-quality maintainable solutions.
• Ability to work in a dynamic, agile environment within a geographically distributed team
• Ability to focus on promptly addressing customer needs
• Ability to work within a diverse and inclusive team
• Technically curious, self-motivated, versatile and solution oriented
• Excellent written and verbal communication skills in English

Start date
ASAP
Duration
6 months
(extension possible)
From
Adroit People
Published at
03.05.2023
Contact person:
Saipraneeth Godishala
Project ID:
2590892
Contract type
Freelance