Job summary
The role holder is responsible for leading the design, development, and ongoing enhancement of the organisation's data infrastructure and pipelines to support advanced data collection, storage, processing, and analysis.
Main duties of the job
The role holder is accountable for a team of data engineers, fostering a culture of technical excellence and continuous improvement. The role holder will work collaboratively with cross-functional teams,
including analysts, software engineers, and key stakeholders, to ensure that data solutions are robust, scalable, and aligned with the strategic goals of delivering high-quality care for our patients.
About us
Since 2012, CHEC has been working with the NHS to increase patient choice and provide better access to exceptional, timely, locally-based ophthalmology and gastroenterology care free at the point of care.
CHEC has a nationwide portfolio of community hospitals and clinics operating in a unique hub-and-spoke model. We are proud to work alongside colleagues in the NHS, offering patients the choice of access to essential procedures and helping them achieve the best possible clinical outcomes. We continue to expand our community-based offering of vital healthcare to patients across England, including ENT (Ear, Nose and Throat) and Dermatology services.
CHEC is committed to safeguarding and promoting the welfare of children, young people and vulnerable adults and expects all employees to share this commitment; this includes being aware of and adhering to all CHEC Safeguarding policies. Recruitment checks are undertaken in accordance with the NHS Employment Check Standards, and successful applicants may be required to undertake an Enhanced Disclosure via the Disclosure and Barring Service (DBS).
Job description
Job responsibilities
- Lead the design and execution of scalable data storage solutions, including databases, data warehouses, and data lakes, ensuring efficient handling of large data volumes.
- Oversee development and optimisation of ETL pipelines for effective data extraction, transformation, and loading from diverse sources.
- Establish and manage data schemas, models, and dictionaries to promote data governance and ensure consistency across the organisation.
- Develop data integration solutions to facilitate seamless and secure data flow between systems.
- Lead data cleansing, validation, and enrichment processes to ensure data accuracy and quality.
- Engage with analysts and software engineers to convert business needs into robust data engineering solutions and provide comprehensive reports.
- Identify and address performance bottlenecks, optimising data processing and query performance for scalability.
- Implement monitoring frameworks to oversee data pipeline health, swiftly resolving issues to maintain system integrity.
- Conduct performance tuning to fulfil scalability and availability targets.
- Develop and enforce stringent data security measures, including access controls and encryption, to protect sensitive data.
- Ensure all data handling complies with relevant data protection and privacy regulations.
- Document data engineering processes and configurations to maintain a detailed knowledge base.
- Lead collaborations with stakeholders to align data services with business requirements, delivering essential reports and data visualisations.
Person Specification
Experience
Essential
- Previous experience in a similar, fast-paced environment.
- Proven experience working as a lead data engineer or in a similar role, handling large datasets and complex data pipelines.
- Previous experience managing a team.
- Experience with big data processing frameworks and technologies.
- Experience with data modelling and designing efficient data structures.
- Experience with data integration and ETL (Extract, Transform, Load) processes.
- Experience in data cleansing, validation, and enrichment processes.
- Strong programming skills in languages such as Python, Java, or Scala.
- Knowledge of data warehousing concepts and dimensional modelling.
- Understanding of data security, privacy, and compliance requirements.
- Proficiency in data integration and ETL tools.
- Strong analytical skills and the ability to understand complex data structures.
- Capable of identifying data quality issues, troubleshooting problems, and implementing effective solutions.
Additional information
Disclosure and Barring Service Check
This post is subject to the Rehabilitation of Offenders Act (Exceptions Order) 1975 and as such it will be necessary for a submission for Disclosure to be made to the Disclosure and Barring Service (formerly known as CRB) to check for any previous criminal convictions.
UK Registration
Applicants must have current UK professional registration. For further information, please see the NHS Careers website.