Description
ROLE OVERVIEW
We are looking for a talented Data Engineer to join our team and play a crucial role in designing, building, and maintaining robust data pipelines and infrastructure. In this role, you will collaborate with cross-functional teams to ensure efficient data flow, storage, and accessibility, empowering the organization to make data-driven decisions. Your expertise in modern data engineering tools and techniques will be key to delivering high-performance and scalable data solutions.
KEY RESPONSIBILITIES
Design, develop, and maintain data pipelines and ETL processes to ingest, transform, and deliver data efficiently.
Build and manage scalable data storage solutions (data warehouses, data lakes) to support analytics and business needs.
Collaborate with data scientists, analysts, and software engineers to ensure data accessibility and usability.
Optimize data systems for performance, reliability, and scalability.
Implement and enforce data security and governance policies.
Monitor and troubleshoot data pipelines to ensure seamless data flow and identify areas for improvement.
Stay updated with industry trends and emerging technologies in data engineering.
WHAT WILL MAKE US HAPPY?
Strong proficiency in Python and SQL, plus hands-on experience with data processing and streaming technologies such as Apache Spark or Apache Kafka.
Experience with cloud platforms (e.g., Azure, AWS, or GCP) for data storage and processing.
Expertise in relational databases (e.g., PostgreSQL, MySQL) and non-relational databases (e.g., MongoDB, Cassandra).
Familiarity with data modeling, schema design, and performance optimization.
Hands-on experience with tools like Airflow, dbt, or similar for workflow orchestration.
Knowledge of version control tools (e.g., Git) and CI/CD pipelines.
Solid understanding of data security, privacy, and compliance best practices.
Excellent problem-solving and analytical skills.
Strong communication skills to work effectively with diverse teams.
NICE TO HAVE
Experience with big data technologies like Hadoop or Databricks.
Knowledge of machine learning frameworks and their integration with data pipelines.
Understanding of data visualization tools such as Tableau or Power BI.
Familiarity with containerization and orchestration tools like Docker and Kubernetes.