RPID Customer Satisfaction Survey to Inform the Futures Programme

The Scottish Government’s Rural Payments and Inspections Division (RPID) commissioned Ipsos MORI Scotland to measure customer satisfaction with the services of the division and its partner organisations Forestry Commission Scotland and Scottish Natural Heritage.


Appendix 1 - Technical details

Assembling the sampling frame

The Scottish Government wanted to be able to analyse the survey results geographically, and also to structure the sample to limit the number of agents included in the survey and to ensure a robust number of interviews with crofters. To achieve this sample structure, a number of information sources needed to be combined and any ineligible contacts, overlap and duplication removed.

We were provided with customer files to allow our samples to be drawn. These were:

  • A file of all Agents, which was filtered to leave only those with a current mandate. This allowed us to obtain valid records for agents active in 2012.
  • A file of 'SAF 2012 customers'. This had to be cross-referenced with the 'SAF 12 application submitted by agent' file using the Business Reference Number (BRN) and the 'submitted by agent' column. Customers who had submitted an application through an agent were excluded from the sample.

Although the intention was to stratify the sample by region, the region variable was missing from the 'SAF 2012 customers' spreadsheet. It was added by matching in the area office from other spreadsheets and then aggregating area offices to regions.
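For illustration, the assembly steps above might be coded along the following lines in pandas. All file and column names here (agents.csv, brn, submitted_by_agent, area_office and so on) are hypothetical stand-ins for the actual Scottish Government extracts; only the sequence of operations mirrors the text.

```python
import pandas as pd

# Agents file: keep only those with a current mandate (active in 2012).
agents = pd.read_csv("agents.csv")
agents = agents[agents["current_mandate"] == "Y"]

# SAF 2012 customers, cross-referenced on the Business Reference Number
# with the 'submitted by agent' file; agent-submitted customers excluded.
customers = pd.read_csv("saf_2012_customers.csv")
submitted = pd.read_csv("saf_2012_submitted_by_agent.csv")
customers = customers.merge(
    submitted[["brn", "submitted_by_agent"]], on="brn", how="left"
)
customers = customers[customers["submitted_by_agent"] != "Y"]

# The region flag was missing, so match in the area office and aggregate
# area offices to the eight survey regions (mapping abridged here).
office_lookup = pd.read_csv("area_office_lookup.csv")  # brn -> area_office
customers = customers.merge(office_lookup, on="brn", how="left")
office_to_region = {
    "Golspie": "Northern & Northern Isles", "Inverness": "Highland",
    "Oban": "Argyll & Western Isles", "Elgin": "Grampian",
    "Perth": "Central", "Ayr": "South Western",
    "Galashiels": "South Eastern", "Dumfries": "Southern",
}
customers["region"] = customers["area_office"].map(office_to_region)

# Remove duplication so each business appears once in the frame.
customers = customers.drop_duplicates(subset="brn")
```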

Survey quotas

The achieved sample of 500 interviews for the telephone survey was to be structured along the following lines:

  • A broadly even spread of interviews across the 8 regions. The area office to region breakdown is as follows:
Region                      Area offices
Northern & Northern Isles   Golspie, Thurso, Kirkwall, Lerwick
Highland                    Inverness, Portree
Argyll & Western Isles      Benbecula, Stornoway, Oban
Grampian                    Elgin, Inverurie
Central                     Perth
South Western               Ayr, Hamilton
South Eastern               Galashiels
Southern                    Dumfries
  • And the following breakdowns by customer type:
    • 10% (50 interviews) with agents.
    • 15-20% (75-100 interviews) with crofters. Since crofters are not specifically identified on the RPID databases, farm size was used as a proxy: crofters were targeted by setting a quota on farmers with holdings of less than 10 hectares. Not all farms under 10 hectares would be crofts, but farm size was the best measure available for sampling, and the survey questionnaire verified whether businesses in this group were crofts or not. To obtain the count of holdings under 10 hectares, the following fields were combined: scot_lfa_hectarage, scot_nlfa_hectarage, other_lfa_hectarage and other_nlfa_hectarage (see the sketch below).
    • 10-15% (50-75 interviews) with Rural Priorities applicants. The purpose of this quota was to capture environmental and forestry businesses in the survey. Farmers can also apply for this grant but, as with crofters, this was the best measure available for these businesses at the sampling stage.
    • 60% (300 interviews) with customers who do not fall into one of the previous categories.

The leeway in some of the sample quotas reflected the use of proxies in defining the categories of interest.
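As a rough sketch of how frame records could be assigned to these quota groups (the is_agent and rural_priorities_applicant flags are hypothetical; the hectarage fields are those named above, and the targets shown take the midpoints of the quoted ranges):

```python
import pandas as pd

HECTARAGE_FIELDS = ["scot_lfa_hectarage", "scot_nlfa_hectarage",
                    "other_lfa_hectarage", "other_nlfa_hectarage"]

def quota_group(row: pd.Series) -> str:
    if row["is_agent"]:                              # from the agents file
        return "agent"
    if row[HECTARAGE_FIELDS].fillna(0).sum() < 10:   # proxy for crofters
        return "crofter (proxy)"
    if row["rural_priorities_applicant"]:
        return "rural priorities"
    return "other customer"

customers["quota_group"] = customers.apply(quota_group, axis=1)

# Interview targets implied by the 500-interview design (range midpoints).
targets = {"agent": 50, "crofter (proxy)": 88,
           "rural priorities": 62, "other customer": 300}
```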

Issued sample

In quota surveys, the main source of potential savings is that interviewers do not make repeated attempts to contact and interview a relatively small number of selected respondents. Instead, working from a large pool of potential respondents, interviewers contact and attempt to interview respondents according to quotas that reflect the structure of the population or, as in this case, the desired structure of the final sample.

Having structured the SG databases into the quota groups identified above, we selected 6,478 contacts and provided them to the telephone centre for interviewing. This gave an average of 12 leads for each achieved interview required.

Survey fieldwork was carried out from 23 July to 13 August, and the target numbers of interviews, both overall and within quota groups, were achieved.

Respondent selection

The Government's files of agents and customers contain a named contact, and this named contact was the only eligible respondent for each business.

Questionnaire and interview length

The survey questionnaire was designed by research staff at Ipsos MORI in consultation with research and policy staff at the Scottish Government to ensure that the survey met the Government's information requirements. A copy of the survey questionnaire is attached as Appendix 2.

The survey had been budgeted on the basis of a 15-minute interview and the final average interview length was 15 minutes and 1 second.

Online survey

The Scottish Government was keen to test the potential to conduct the survey online, reflecting a general move towards conducting business online and because an online survey offers the potential to reduce the cost of the survey in future waves.

Ideally, the online survey would have been conducted as a true experiment, with contacts randomly assigned between telephone and online sample groups. However, there was an overriding need to ensure that the main survey - the telephone survey - was conducted in a way that would provide reliable results for the Government, so this was given priority. This meant that when the telephone sample was being selected, no consideration was given to whether a contact had a valid email address and could have been used for the online test. In fact, 2,458 telephone contacts also had an email address. Only contacts that were not selected for the telephone survey but which had a valid email address were used for the online test.
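In code, the selection rule amounts to a simple filter on the assembled frame; a sketch, assuming hypothetical email and selected_for_telephone columns:

```python
# Contacts eligible for the online test: not selected for the telephone
# survey, and holding a valid email address.
online_pool = customers[
    (~customers["selected_for_telephone"]) & customers["email"].notna()
]
# 3,714 leads were issued; whether this was the whole eligible pool or a
# subsample is not stated, so a draw of that size is shown for illustration.
online_sample = online_pool.sample(n=3714, random_state=42)
```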

The survey questionnaire developed for the telephone survey was re-scripted for the online test. Apart from minor modifications required for online delivery of the questionnaire, the two surveys were identical, except that the online survey collected some additional information to capture the upload and download speeds achieved by each respondent. The online script instructions are shown in blue text in Appendix 2.

A total of 3,714 leads were issued, with the expectation that around 300 surveys might be completed. No geographic or customer type controls were set for the online survey: in essence, every contact was invited to participate, and the resulting sample would show how far this type of survey can generate a broadly representative sample.

Each contact was emailed a link to the online survey. The link was unique to that person.
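A minimal sketch of generating such per-contact links (the URL and token scheme here are hypothetical; real survey platforms issue their own tokens):

```python
import uuid

def survey_link(base_url: str = "https://survey.example.com/rpid") -> str:
    # One unguessable token per contact makes each link unique and lets
    # a response be tied back to the sampled business.
    return f"{base_url}?token={uuid.uuid4().hex}"

links = {row.brn: survey_link() for row in online_sample.itertuples()}
```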

Survey fieldwork and response rates

The online survey took place alongside the telephone survey, with contacts sent one reminder a week before the survey was scheduled to close. At the end of the fieldwork period, a total of 520 questionnaires had been completed, giving an unadjusted response rate of 14% (520 of the 3,714 contacts invited).

Profile of telephone and online respondents

To assess the extent to which the online survey provided a valid alternative to the telephone survey, it is necessary to compare the unweighted profile of online respondents with the weighted profile of telephone respondents, bearing in mind that the telephone sample was designed to boost certain customer groups. This disproportionate sampling needs to be corrected back to the population profile before comparison with the online sample, where no quota controls were set. After weighting, the telephone sample should therefore match the population profile on the quota variables.

In terms of key demographic information, we found that the two samples were broadly similar, with some over- and under-representation. The extent of weighting implied by this (correcting the online sample to the population's regional profile) is well within the normal range, with the largest weight (correcting under-representation) of 1.75 in the Southern region and the smallest (correcting over-representation) of 0.79 in the South Eastern region.
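These weights follow directly from the regional shares in the table below (weight = population share ÷ online share); a quick sketch:

```python
population_pct = {"Northern & Northern Isles": 15, "Highland": 11,
                  "Grampian": 13, "Argyll & Western Isles": 14,
                  "Central": 21, "South Western": 9,
                  "South Eastern": 11, "Southern": 7}
online_pct = {"Northern & Northern Isles": 14, "Highland": 9,
              "Grampian": 14, "Argyll & Western Isles": 13,
              "Central": 26, "South Western": 6,
              "South Eastern": 14, "Southern": 4}

weights = {region: population_pct[region] / online_pct[region]
           for region in population_pct}
# Southern: 7 / 4 = 1.75 (largest); South Eastern: 11 / 14 ~ 0.79 (smallest)
```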

Region                      Population (%)   Online, unweighted (%)
Northern & Northern Isles   15               14
Highland                    11                9
Grampian                    13               14
Argyll & Western Isles      14               13
Central                     21               26
South Western                9                6
South Eastern               11               14
Southern                     7                4

Most of the differences in regional profile would not be statistically significant, the exceptions being the Central and South Western areas, which are more likely to have good internet access. Having said that, the areas most likely to have poor internet access - Northern and Northern Isles, Highland, and Argyll and the Western Isles - are not significantly under-represented.
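As an illustration of how such differences can be checked (the report does not state its test, so this one-sample z-test of each online share against the population benchmark is our assumption):

```python
from math import sqrt

N_ONLINE = 520  # completed online interviews

def z_for_share(p_hat: float, p0: float, n: int = N_ONLINE) -> float:
    """z-statistic for an observed sample share against a population share."""
    return (p_hat - p0) / sqrt(p0 * (1 - p0) / n)

print(z_for_share(0.26, 0.21))  # Central: ~2.8, significant at the 95% level
print(z_for_share(0.14, 0.15))  # Northern & Northern Isles: ~-0.6, not significant
```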

Business type                        Telephone (%)   Online (%)
Croft                                26              22
Farm*                                61              50
Agent                                 6               9
Forestry or horticultural business    1               3
Estate                                3               7
Environmental Group                   1               1
Community Group                       0               *
Something else*                       2               7

Farms make up a larger proportion of the telephone sample (after weighting) than of online respondents (before weighting), but broadly the two profiles are consistent.

In terms of non-quota variables, where we do not have population statistics for comparison, there are some complementary differences between the modes. Online respondents are, for example, more likely to have sought information on rural grants and support services in the past 12 months - 62%, compared with 46% of telephone respondents. But they are just as likely to have submitted a Single Application Form (96% and 95% respectively), although online respondents are much more likely to have submitted their form online (72%) than are telephone respondents (44%).

Within their most recent SAF, the two respondent groups applied for a broadly similar mix of schemes, although applications for Land Managers' Options and Rural Priorities were higher among online respondents than telephone respondents.

Schemes applied for                              Telephone (%)   Online (%)
Single farm payment                              88              83
Land managers' options*                          48              56
Less-favoured area support                       61              57
Rural development contracts - rural priorities*  16              24
None of these                                     2               2

Both groups were similarly likely to apply for rural schemes not covered by SAF, with 35% of telephone respondents and 39% of online respondents making an application.

Importantly, the respondents' role in the business was very similar across the two groups. This is important since the online option gives a little less control over who completes the survey. Indeed, if anything the online survey seems more likely to be completed by owners / tenants whereas the telephone survey has a higher proportion of respondents who are a relative of the owner/tenant, although the difference is modest.

Respondent type            Telephone (%)   Online (%)
Owner                      45              49
Tenant                     21              20
Business partner           20              23
Manager                     3               5
Relative of owner/tenant    7               2
Other employee              1               0
Something else              1               1

By definition, online respondents are all internet users, with 94% using the internet for both work and personal use, compared with only 68% of telephone respondents. Over a fifth (21%) of telephone respondents said they never use the internet.

However, this difference has less pronounced consequences for the age and sex profiles of respondents.

Age group          Telephone (%)   Online (%)
16-40 years        10               9
41-64 years        60              69
65 years and over  29              21
Refused             *               1

Sex                Telephone (%)   Online (%)
Male               79              75
Female             21              25

There are undoubted differences in how respondents go about obtaining information, with the internet a much more prominent feature of online respondents' routes for sourcing information than it is for telephone respondents. Telephone respondents are also more likely to visit offices in person than are online respondents (37% and 25% respectively).

Should the two samples be treated as one?

The only circumstance in which the two samples should not be combined is if one is judged an acceptable, representative subset of the population while the other differs so substantially that it must be considered fundamentally biased. Every other circumstance argues in favour of combining the samples: either both can be considered representative, or both might be biased but can be considered broadly representative with appropriate weighting.

Our opinion, taking account of the similarities and differences between the samples, is that where they differ, the differences are consistent with the survey methods and are more likely to be complementary. For instance, with a sampling ratio of 12:1, the telephone survey has the potential to suffer from availability bias: potential respondents who are not available - out working, say - when the interviewer calls are more likely to be passed over, with interviews instead obtained from respondents who are at home or in the office at the time. The online survey helps to counter this by always being 'available' whenever the respondent is.

Given the similarities and differences we considered neither survey to be fundamentally biased and judged that combined, and weighted to reflect the population characteristics available to us, they provided a better indication of customers' views than either would on its own.
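A sketch of the combine-and-weight step, assuming hypothetical telephone_df and online_df frames and cell weighting on region only (the report does not spell out its actual weighting variables):

```python
import pandas as pd

combined = pd.concat([telephone_df.assign(mode="telephone"),
                      online_df.assign(mode="online")], ignore_index=True)

population_pct = {"Northern & Northern Isles": 15, "Highland": 11,
                  "Grampian": 13, "Argyll & Western Isles": 14,
                  "Central": 21, "South Western": 9,
                  "South Eastern": 11, "Southern": 7}

# Cell weight = population share / combined-sample share, per region.
sample_pct = combined["region"].value_counts(normalize=True) * 100
combined["weight"] = combined["region"].map(
    lambda region: population_pct[region] / sample_pct[region]
)
```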

Contact

Email: Angela Morgan
