Inspections get transparent
The Care Quality Commission’s new single assessment framework is getting underway and private healthcare providers are wondering what it means for them.
Solicitor Philippa Doyle outlines the changes and argues that this new way of inspecting, evidence gathering and reporting will provide a far more consistent and transparent approach to regulation.
The Care Quality Commission’s (CQC’s) long-awaited Single Assessment Framework (SAF) is slowly being implemented, with providers imminently starting to receive information from the watchdog directly.
It is an important time for independent practitioners to understand what this means and how it will affect your service.
While the SAF is badged as a full-scale change in the inspection process, that change sits with the CQC, not with providers.
It is the CQC which is changing the way it inspects, the questions it asks, how it rates and how it reports.
For providers, it is business as usual. And the regulatory framework that underpins the delivery of care has not changed and is not changing.
The Health and Social Care Act 2008 (Regulated Activities) Regulations 2014 and the Fundamental Standards of Care, enshrined in Part 2 of those regulations, are here to stay.
All of the good work that you and your teams have done so far and continue to do, and all the excellent policies and procedures you have in place, can stay.
The SAF is the CQC’s new inspection regime. The CQC says it will regulate in a smarter way, adapting and responding to risk, uncertainty and demand. We have seen clear signs of that already.
CQC inspections were led by risk during and post-Covid. So direct complaints to its helpdesk, whistle-blowing allegations and concerns raised by commissioners led to poorly performing services – or those perceived as being poorly performing – being at the front of the queue when it came to re-inspections.
Long gone is the inspection programme where a good service would not expect to see the CQC again for two years.
The four ratings of ‘inadequate’, ‘requires improvement’, ‘good’ and ‘outstanding’ remain. And the five key questions or domains of ‘safe’, ‘effective’, ‘caring’, ‘responsive’ and ‘well led’ will also stay.
What’s changing
What changes is the introduction of quality statements, which replace the previous 300 or so key lines of enquiry.
These quality statements are phrased as commitments, and providers will be judged against the available evidence to show how those commitments are met.
Each statement is scored from one to four, based on the quality of the evidence submitted to the CQC.
At the end of the process, providers will be able to see very clearly exactly where their service is doing well and where there are gaps requiring extra attention.
Those providers looking to increase their rating up to the next level will also be able to see how far away from achieving that rating they are and where they need to focus their efforts.
The CQC has very helpfully shared the different types of evidence that might be required to answer each of the quality statements, and this is where the SAF differs for different sector groups.
The principal focus for the independent sector will be the independent doctors group, although you may also offer services under the other areas, so check your registration on the CQC website.
Each sector group will have slightly different evidence requirements, but all the details are available on the CQC’s website, so a service can map each quality statement against the evidence it needs to put forward.
The evidence categories are all very familiar, too, and are all based on the work the CQC currently carries out when it inspects a service.
There are six evidence categories, although Outcomes is not applicable for independent doctors:
- People’s experience of health and care services;
- Feedback from staff and leaders;
- Feedback from partners;
- Observation;
- Processes;
- Outcomes.
Number 5 (Processes) is one of the more notable ones for providers to be aware of. This is any series of steps, arrangements or activities which are carried out to enable a provider or organisation to deliver their objectives.
Measuring outcomes
CQC assessments will focus on how effective policies and procedures are. To do this, the watchdog will look at information and data sources that measure the outcome from processes.
For example, it may consider processes that measure and respond to information from audits, look at learning from incidents or notifications, and it will review people’s care and clinical records.
It might be most helpful for providers to look at the quality statements in context.
A CQC PDF on its website, called ‘Independent Health Single Services: Evidence Categories’, is well worth reviewing and considering your service against.
The quality statements are expressed as ‘we’ statements. The ‘we’ statements show what is needed to deliver high-quality, person-centred care.
Let’s use ‘learning culture’ as an example.
The ‘we’ statement reads: ‘We have a proactive and positive culture of safety based on openness and honesty, in which concerns about safety are listened to, safety events are investigated and reported thoroughly, and lessons are learned to continually identify and embed good practices.’
The evidence categories the CQC will use to judge and score your responses to those statements are:
- People’s experience of health and care services: feedback from surveys;
- Feedback from staff and leaders: individual interviews and focus groups, whistleblowing;
- Processes: duty of candour records, evidence of learning and improvement, and records of incidents, near misses and events.
The evidence submitted is then scored from 1 to 4:
4 = exceptional standard;
3 = good standard;
2 = some shortfalls;
1 = significant shortfalls.
So if, for example, the service scored a 2 in learning culture, this would feed into the overall scoring in the safe domain. The scores for all the quality statements under that key question are totalled and converted into a percentage of the maximum possible score, and it is that percentage which determines the rating.
The percentage scores are clearly laid out:
25% to 38% = inadequate;
39% to 62% = requires improvement;
63% to 87% = good;
Over 87% = outstanding.
A service can see where it needs to do better with quick reference to its lower scores. The theory is that it will then be able to submit evidence of its improvements to the CQC and, if the CQC is satisfied, the scoring would be adjusted, which for some services might mean an uplift from ‘requires improvement’ to ‘good’.
Do be mindful that there are some rating limiters, illustrated in the sketch after this list.
1. If the key question score is within the ‘good’ range, but with a score of 1 for one or more quality statements, the rating is limited to ‘requires improvement’.
2. If the key question score is within the ‘outstanding’ range, but with a score of 1 or 2 for one or more quality statements, the rating is limited to ‘good’.
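For readers who want to see the arithmetic end to end, below is a minimal illustrative sketch in Python of how a key question score might be converted into a rating, using the percentage bands and limiters quoted above. The function name, the equal weighting of statements and the example scores are my own assumptions for illustration; this is not CQC code.

def key_question_rating(statement_scores):
    # statement_scores: one score of 1-4 for each quality statement under a key question.
    total = sum(statement_scores)
    maximum = 4 * len(statement_scores)
    percentage = 100 * total / maximum

    # Percentage bands quoted in the article.
    if percentage > 87:
        rating = 'outstanding'
    elif percentage >= 63:
        rating = 'good'
    elif percentage >= 39:
        rating = 'requires improvement'
    else:
        rating = 'inadequate'

    # Rating limiters quoted in the article.
    if rating == 'good' and min(statement_scores) == 1:
        rating = 'requires improvement'
    if rating == 'outstanding' and min(statement_scores) <= 2:
        rating = 'good'

    return rating

# Example: a 'safe' key question scoring 2 for learning culture and 3 elsewhere
# gives roughly 72%, so the rating is 'good', but the 2 still shows where the gap is.
print(key_question_rating([2, 3, 3, 3, 3, 3, 3, 3]))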
More consistent approach
I am very hopeful that this new way of inspecting, evidence gathering and reporting will provide a far more consistent and transparent approach to regulation.
Evidence is key, though, even more than it ever has been. Providers will be uploading a lot of information onto the provider portal before a visit and the CQC’s analysis of a service will only be as good as the evidence submitted.
It is worth engaging with the different questions and what evidence the watchdog is looking for to ensure what you provide is specific and relevant. It is no good telling the CQC everything you know about a subject; you have to actually answer the exam question.
Hempsons has worked with providers for many years supporting them through the CQC process. Our fundamental standards of care training packages for registered managers and front-line staff can help you navigate the regulations, improve the lives of the people you support and tick CQC boxes too.
First published in the January edition of Independent Practitioner Today.