Your mission
We are looking for a part-time Data Engineer to take ownership of our existing data pipelines and design new ones as our business grows. You will work closely with our analytics, customer success, and finance teams to make sure we have the data we need in a single, reliable source of truth. We have around 15 years of data from infrastructure construction projects.
As our Data Engineer, you will strengthen the robustness of our pipelines, making sure jobs complete successfully and run on the correct schedule. As our data ecosystem matures, you’ll also have the opportunity to research and test AI and advanced analytics use cases if that interests you.
You can plan your own schedule and working time flexibly, but we hope that you can work on average 15 hours per week. You can work full-time during the summer if you wish. This is a fixed-term contract until the end of 2026, with an option to continue on a part-time or full-time basis after that.
What you’ll do
- Take ownership of our existing ELT pipelines, ensuring they are reliable, well-documented, and scalable. You will be supported by our internal analytics team, extensive documentation, and external consultants who know our systems.
- Collaborate with analytics, customer success, and finance teams to understand data needs and improve data availability, quality, and usability.
- Design and implement new data pipelines as our business and data use cases evolve.
- Monitor, troubleshoot, and improve pipeline performance to make sure jobs run correctly and on schedule.
- Optionally research and prototype AI or advanced analytics use cases as our data ecosystem matures.
You’ll work with a modern analytics stack:
- Singer and Fivetran for data extraction and loading
- Snowflake as our data warehouse
- dbt for data transformation
- Power BI for visualization and business intelligence
- Apache Airflow for orchestration
- Python for writing small scripts