Big Data Solution Engineer

81545 - Remote

Description

Big Data Solution Engineer (m/f/d)

We are looking for a Big Data Solution Engineer to work 100% remotely for our client in Munich.

Location: Remote
Start Date: January 2021
Duration: 3 months + extension (long-term project)
Rate: Daily Rate
Main Tasks:
· Roll out a data platform to internal customers and support the adoption of its technical services and data capabilities
· Design solutions for complete use-case pipelines, from data ingestion through data processing to software deployment and the feedback loop
· Understand and explain the advantages and disadvantages of the proposed solutions to internal and external stakeholders
· Build and maintain data-driven applications
· Contribute to improving the platform stack and to defining the final production environment
· Keep up with trends and evolving technology in the big data and analytics world
· Look for opportunities to improve performance, reliability, and automation
· Write technical documentation, announcements, and blog posts

Technical Skills:
Data Engineering (using tools to build data-driven solutions)
· Competence in running big data workloads in production at scale
· Data engineering patterns (ingestion, transformation, storage, consumption, etc.)
· Event-based systems (deep knowledge of Kafka, ideally Confluent Kafka)
· Databases (e.g. PostgreSQL, MongoDB)
· Cloud storage experience (e.g. Azure Blob and Azure Data Lake Storage)
· Overview of common data modeling techniques, in particular Data Vault 2.0
· Data quality patterns and tools (e.g. technical and business rules, DLQ patterns, Informatica Axon)
· Data virtualization (e.g. Denodo, Dremio, TIBCO)
· Distributed systems (e.g. Spark)

Software Development
· Proficiency in at least one core programming language (Python, Java, Scala, etc.)
· Software engineering (e.g. microservices, design patterns, version control, automated testing, concurrent programming, etc.)
· CI/CD principles and tools (e.g. GitHub/GitLab, Jenkins)

General IT Skills
· Advanced experience with Linux
· Hands-on proficiency with at least one cloud environment (preferably Microsoft Azure)
· First exposure to data science requirements, platforms, and tools (Jupyter notebooks, TensorFlow/Keras/PyTorch, MLflow, GPU-accelerated training of ML models)
· Some experience with IT architecture (users, roles, requirements, layering, functional components)
· Requirements engineering (e.g. solution strategy, application of reusable patterns, managing stakeholder requirements)
· Containerizing and distributing apps (e.g. Docker, Azure Container Registry)
· Planning, deploying, and maintaining apps on a container runtime (with Kubernetes, preferably Azure Kubernetes Service)
· Networking skills (NAT, Firewalls, Proxies, etc.)
· Solid understanding of IT security and data protection best practices on the cloud

Other Skills
· DevOps mindset (you build it, you run it; taking responsibility for your work)
· Willingness and ability to learn new technologies and tools
· Team player open to working in an agile environment
· Fluent English (written and spoken) is a must
· Ability to communicate in a results-oriented way with people from different departments and different skill sets


If you are interested, please send me your CV, rate, and availability to

Merry Christmas!
Start date: 01.2021
Duration: 3 months (extension possible)
From: Futuro Associates GmbH
Published at: 21.12.2020
Contact person: Paola Ochoa
Project ID: 2020200
Contract type: Freelance
Workplace: 100% remote