Description
Dear Consultant,
We are looking for an experienced ETRM Data Architect with strong expertise in Python, Azure Data tools, and trading system integrations. This role requires hands-on development experience in data transformation and architecture within the Energy Trading and Risk Management (ETRM) domain, especially in Gas and Power trading.
Location: Poland (Remote)
Type: B2B Contract
Experience: 10+ years
Key Responsibilities:
Architect and develop data transformation workflows using Python
Build and maintain APIs and dashboards using Streamlit
Work with Azure Data Factory, Data Lake, Databricks, and Snowflake
Integrate with trading systems such as Allegro, RightAngle, and Endur
Implement robust CI/CD pipelines and ensure scalable DevOps practices
Lead data architecture planning and mentor junior developers
Optimize real-time data processing and ensure data quality
Must-Have Skills:
Strong Python development and Streamlit expertise
Kafka, FastAPI, and AKS exposure
Azure Data Factory (ADF), Data Lake, Snowflake, Databricks
Unit testing with pytest/unittest
Deep understanding of ETRM data flows
DevOps and CI/CD pipeline experience
Nice-to-Have:
Hands-on experience in ETRM tools (Allegro, RightAngle, Endur)
Knowledge of power/gas trading concepts
Certifications in Azure or related technologies
Qualifications:
Bachelor’s or Master’s in Computer Science or related field