The Ministry of Health (the Ministry) and the Health Quality & Safety Commission (the Commission) are introducing patient experience measures for primary care using an online patient survey. The primary care patient experience survey, or PES, has been developed by the Commission to find out what patients’ experience of primary care is like and how their overall care is managed between their general practice, diagnostic services, specialists and/or hospital staff. The information will be used to improve the quality of service delivery and patient safety.
The survey looks at a patient’s experience of the whole health care system using primary care as a window. It focuses on the coordination and integration of care, rather than just the last visit to a GP’s surgery.
The survey is modular: patients answer questions relevant to their experiences. For example, questions on medication and chronic conditions will be answered only by patients for whom these are relevant.
Patient feedback is voluntary and anonymous.
Yes. The PES will be adopted by all practices as part of the PHO Services Agreement. However, there will be a phased roll-out.
The survey gives patients a voice and provides the health teams caring for them with a direct and timely way to hear it. Participation in the survey gives PHOs and practices access to real-time reporting via a secure log-in through any internet browser. This is funded by the Ministry.
Participation in the survey will also be recognised as a source of evidence that can be used towards meeting Indicator 9 of the Foundation Standard and Aiming for Excellence: The practice includes patients’ input into service planning.
Survey invitations will be emailed or texted to patients through Cemplicity, the provider of the national survey and reporting system. Patients will receive a website link and be asked to complete the survey within three weeks. Their anonymous responses will be reported to practices and PHOs in real time via a secure online report portal. Summarised information will be reported to DHBs, the Commission and the Ministry the same way. The process is automated to minimise administrative burden to practices.
If not already doing so, we encourage practices to start routinely asking for patients’ email contact information. Individual rather than family contacts are preferred.
Cemplicity has worked with public health sector clients for a number of years. Their data protocols have been developed in close consultation with government agencies responsible for the protection of patient privacy and the data security of public health records.
We are testing and considering in situ surveying, where patients can complete the online survey using a tablet while at the practice. We will let you know once this method has been validated.
Patients aged 15 years and over who are enrolled with a participating practice and were seen during the survey sample week each quarter will receive a survey invitation via email or SMS. Children under 15 are not included in the survey.
The survey is conducted nationally every three months. Patients won’t be asked to participate more than once every six months. In fact, there is only a 19.8 percent chance that a patient who sees a doctor or nurse once a month will receive the survey twice in a year.
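As a rough illustration of how the six-month exclusion limits repeat invitations, the sketch below simulates a patient who visits during every quarterly sample week. The per-round invitation probability `p_invite` is an assumed figure for illustration only; the published 19.8 percent figure comes from the survey’s actual sampling design, which is not reproduced here.

```python
import random

def p_invited_twice(p_invite, quarters=4, trials=100_000, seed=1):
    """Monte Carlo sketch: probability that a patient who visits in every
    quarterly sample week is invited twice within a year. `p_invite` is an
    assumed per-round invitation probability (not the published rate); the
    six-month rule is modelled by skipping the round after an invitation."""
    random.seed(seed)
    twice = 0
    for _ in range(trials):
        invites, cooldown = 0, 0
        for _ in range(quarters):
            if cooldown:
                cooldown -= 1          # still inside the six-month window
            elif random.random() < p_invite:
                invites += 1
                cooldown = 1           # the next quarterly round is skipped
        if invites >= 2:
            twice += 1
    return twice / trials
```

Under these assumptions, even a patient eligible in every round can be invited at most twice a year, and any realistic sampling probability pushes the chance of a repeat invitation well below certainty.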
The length of the survey depends on how many health care services were accessed in the past year. Most people can complete the survey within 20 minutes.
Survey length was considered during the cognitive testing phase. While some questions were shortened or removed, patients wanted more opportunities to provide comments and both patients and medical staff felt the value of the comments outweighed the additional time spent to complete the survey.
Patients can opt out of the survey permanently in two ways:
Instructions for opting patients out via PMS are available here.
As it is possible that people may wish to provide feedback on their experience at another time, we encourage people not to opt out of the survey permanently. It is better to ignore the invitation if they have no comment to make at the time, but retain the option to provide feedback in the future.
As part of the pilot phase, test surveys were sent to a subset of patients from July to October 2015. The PHOs involved in the pilot have been running the survey each quarter since February 2016.
Five PHOs were involved in the pilot phase: Procare Networks, National Hauora Coalition, Whanganui Regional Health Network, Compass Health and Pegasus Health.
Additionally, Midlands Health Network participated in the cognitive testing process.
For practices whose PHO was not involved in the pilot, roll-out of the survey will occur as the National Enrolment Service (NES) is implemented.
Resources for PHOs and practices, including detailed instructions for joining the survey, are available to download on the Commission’s website. Once ready to participate in the survey, practices need to inform their PHO, and PHOs should provide collated information to Cemplicity for each of their practices wishing to join. Practices that have begun using NES can start recording patient preferences using the functionality in NES, but need to contact their PHO to participate in the survey. The next survey period is 7–13 February 2017.
Understanding patients’ experience is vital to improving patient safety and the quality of care, and helps us understand the quality of health and disability services. Currently, New Zealand does not have a consistent national approach to the regular collection, measurement and use of primary care patient experience information.
The survey is framed in four domains that are closely aligned with current international best practice: coordination, partnership, physical and emotional needs, and communication. The selection of these four domains emphasises that a high-quality experience for patients depends on effective communication, a real partnership, and seamless coordination of care that meets both the physical and emotional needs of the patient.
The survey results are reported in an online dashboard that is designed to show summarised results at the national level. Each item shown in the dashboard is available in more detail in the trend and comments reports.
Survey results are summarised into four key domains – coordination, partnership, physical and emotional needs and communication. The overall scores for each of the four domains are reported in the portal. More information on the questions and scoring is available in our reporting portal user guide.
First and foremost, the survey results are a tool for practices to use in their quality improvement activity to improve patient outcomes.
The PES, along with the adult inpatient experience survey, is an important component of the patient experience of care System Level Measure. The uptake of the PES should form part of the contributory measures in the System Level Measures improvement plan. The pilot PHOs may use PES results to set their improvement milestone for this System Level Measure.
The Royal New Zealand College of General Practitioners has confirmed that the survey will be recognised as a source of evidence towards meeting indicator nine of the Foundation standard: the practice includes patients’ input into service planning.
Practices may choose to make their survey results available to their patients, for example, displaying a poster showing key results or improvements made following feedback.
There are no plans to publish practice-level survey results, and only PHOs will be able to see their practices’ results. The Commission may publish an annual report on the survey findings; this will not report results below the PHO level. Responses with the potential to be identifiable, whether of a practice, practitioner or patient, will not be reported.
A sector governance group determines the data access and reporting levels (who can see what). The table below shows who might have access and what their level of access is by organisation.
| Organisation | Who has access | Level of access |
| --- | --- | --- |
| General practice | General manager, practice manager, general practitioner, nurse, admin team | Own practice results; other practices are anonymous. Can only view comments for their practice. |
| PHO | Quality manager/lead, clinical director, primary care manager | See all practices in their PHO by name. |
| DHB | Planning and Funding, Quality and Risk Managers, possibly DHB Alliance representative | Results only for PHOs where they are the lead DHB. Can see comments by unnamed practices in their area. |
| National | Commission (three people from Health Quality Evaluation) and two Ministry of Health staff | Can see all comments, although only the PHO and DHB are identifiable. |
The survey was cognitively tested in three phases and in six PHOs providing a North Island, South Island, urban and rural spread. The PHOs were: Procare (Auckland); National Hauora Coalition (national but mainly providing services in Auckland, Waikato and Tairawhiti); Midlands Health Network (Gisborne, Taranaki, Taupo-Turangi and Waikato); Whanganui Regional PHO (Whanganui and rural areas such as Taihape); Compass (Wellington, Kapiti and Wairarapa); and Pegasus (Canterbury).
Fifteen focus groups were undertaken nationally, covering the following population groups: Adults aged 25–64 years (2); Māori (2); Pasifika (2); Asian (1); Refugee and new migrant (1); Older adults aged 65 years and over (2); Younger adults aged 18–24 years (2); Women (2); People with disabilities (1).
The survey was extensively changed following this testing and the Commission is confident that it is understandable for these populations. Download the PES cognitive testing report.
While a wide range of patients are sent the survey, there is concern that Māori and Pacific people could be under-represented in the survey. For this reason the Commission is testing the use of tablets that are given to patients in the waiting room so they can complete the survey before or after their appointment. Because the survey covers experience over the preceding months rather than just the most recent 15 minutes, this in situ method is theoretically valid. The use of tablets in practices with high Māori and Pacific populations is likely to be an effective way to address possible under-representation, and early results of in situ testing support this view. Tablets can also be used to get feedback from other populations with low access to devices with internet access.
The other way that lower response rates in certain populations are handled is to weight the overall results to correct this. This method is described in our methodology and procedures document.
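The weighting idea can be sketched as simple post-stratification: each demographic group’s responses are re-weighted so that the group mix matches the enrolled population. The function and example figures below are invented for illustration; the Commission’s actual procedure is the one set out in its methodology and procedures document.

```python
def poststratify(responses, population_share):
    """Sketch of demographic weighting: up-weight groups that are
    under-represented among responders so the weighted mix matches the
    enrolled population. `responses` maps group -> list of scores;
    `population_share` maps group -> its share of the enrolled population.
    (Illustrative only, not the Commission's published method.)"""
    n_total = sum(len(scores) for scores in responses.values())
    weighted_sum = weight_total = 0.0
    for group, scores in responses.items():
        sample_share = len(scores) / n_total
        w = population_share[group] / sample_share  # weight per response
        for s in scores:
            weighted_sum += w * s
            weight_total += w
    return weighted_sum / weight_total

# Hypothetical example: group B under-responds but holds an equal share
# of enrolment, so its scores are up-weighted in the overall average.
avg = poststratify(
    {"A": [8, 9, 9, 8], "B": [6, 7]},
    {"A": 0.5, "B": 0.5},
)
```

In this toy example the weighted average equals the mean of the two group means (7.5), rather than the raw average (about 7.8), which would over-count the group that responded more often.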
The survey is currently available in English, and it has been tested and adjusted for people with English as a second language. The survey cannot be translated until a full evaluation has been completed. The evaluation is planned to start in March 2017.
Some practices have patient satisfaction surveys they like to use. Local patient satisfaction surveys can be used as contributory measures in the System Level Measures improvement plan. However the patient experience survey must be used as well, as an important way to consistently measure patient experience across the health system.
It is helpful to make a distinction between patient satisfaction and patient experience. Satisfaction surveys typically ask patients whether or not they were happy with their care. Research has shown that patients can report high satisfaction while describing suboptimal experience. Satisfaction is subjective and is influenced by differences in expectations, previous experience and disposition. Patient experience surveys instead ask whether or not things that should happen did happen: for example, were things explained in a way the patient could understand, and how long did they wait for an appointment?
To date, the primary care patient experience survey has had a national average completed response rate of 14 percent: 23 percent via email and 6–8 percent via SMS. There are a number of ways to improve response rates:
There are three related issues of overall sample size:
Only between 30 and 50 responses per practice are needed for the responses to be useful. Because we are starting with a large sample size (in August, 16,600 people across the five pilot PHOs were invited to participate), the current response rate is statistically valid. In the patient comments section, a much lower number of comments, eg, five or ten with a common thread, is also considered valid.
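One way to see why the overall sample remains informative is a standard margin-of-error calculation for a proportion. This is a textbook back-of-envelope check, not the Commission’s own method, and it uses the approximate response figures quoted above.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95 percent margin of error for a reported proportion from n
    responses, using the normal approximation (worst case at p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Roughly 2,300 completed responses nationally (14% of 16,600 invitations)
# gives a margin of about +/-2 percentage points, while a practice with 40
# responses is closer to +/-15 points.
national = margin_of_error(2300)   # about 0.02
practice = margin_of_error(40)     # about 0.155
```

On these assumed figures, national-level results are quite precise, while practice-level results with 30–50 responses are wide enough that they are best read as indicative signals for local quality improvement rather than precise estimates.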
The second issue is demographic representativeness of responders. The literature is divided as to whether this leads to invalid patterns of responses. There are well established ways of dealing with this through demographic weighting which are set out in the methodology document.
Finally, there is the risk that responders hold unrepresentative views, regardless of their demographic representativeness. This risk increases with low response rates. However, the Commission, in conjunction with Victoria University, has undertaken a study of the attitudes of non-responders to the adult inpatient experience survey and found that the attitudes of those who didn’t respond are identical to those of responders. A copy of this report will be published on the Commission’s website.
A primary care patient experience survey governance group has been established to act as the decision-making body for the implementation of the survey across PHOs and practices, and over the information the survey collects. The group meets regularly and consists of representatives from PHOs, DHBs, general practices, General Practice New Zealand, the Ministry of Health, consumer groups and the Commission.