Data Engineer
Job Title: Data Engineer
Location: Birmingham or Northampton (Hybrid)
Contract Type: Full-time, Permanent
Salary: Competitive + Benefits
About the Role
We’re recruiting for a Data Engineer to join a forward-thinking organisation at a pivotal stage of its data transformation journey. With legacy systems approaching end of life, the business is investing in a modern data platform built on Microsoft Fabric, forming a core part of its long-term data strategy.
In this role you will play a key part in designing, building, and maintaining scalable data infrastructure that supports a wide range of business functions, including finance, HR, and commercial teams. You’ll work closely with both technical and non-technical stakeholders to deliver robust, reliable data solutions and reporting capabilities.
This is an excellent opportunity for someone who enjoys working in a collaborative environment, contributing to modern data architecture, and driving continuous improvement across data engineering practices.
Key Responsibilities
- Support the implementation and ongoing development of a Microsoft Fabric-based data platform.
- Design, build, and maintain data ingestion pipelines from multiple source systems.
- Develop and manage ETL/ELT processes using tools such as Azure Data Factory and Fabric Data Pipelines.
- Build scalable data models to support reporting across finance, HR, and commercial functions.
- Create and maintain a high-quality reporting layer for internal and external stakeholders.
- Ensure strong data governance practices, including compliance with GDPR and data security standards.
- Collaborate with stakeholders to translate business requirements into technical solutions.
- Contribute to documentation, knowledge sharing, and continuous improvement of engineering processes.
- Support and mentor colleagues to enhance technical capability across the team.
Candidate Profile
- Proven experience in data engineering within enterprise or complex environments.
- Strong experience with Microsoft Fabric (including OneLake, Lakehouse, Data Pipelines, Dataflows, Power BI models).
- Proficiency in SQL (including optimisation) and Python/PySpark for data transformation.
- Experience with ETL/ELT processes using Fabric Data Pipelines and/or Azure Data Factory.
- Solid understanding of data modelling, including Lakehouse and dimensional design approaches.
- Experience working with data governance frameworks, security (RBAC), and GDPR compliance.
- Familiarity with version control (e.g. Git) and CI/CD processes.
- Strong communication skills, with the ability to engage both technical and non-technical stakeholders.
- Experience with SAP data environments (e.g. HANA, BW, SuccessFactors) is advantageous but not essential.
What Our Client Offers
- Competitive salary and comprehensive benefits package.
- Hybrid working model (Birmingham or Northampton base).
- Opportunity to be part of a major data transformation programme.
- Collaborative and supportive team environment.
- Ongoing professional development and career progression opportunities.
- Exposure to modern data technologies and enterprise-scale projects.
