As with the earlier Planning Review Consultation analysis (undertaken in 2015-16), and the analysis of the Places, People and Planning consultation, there were some broad methodological challenges in this analysis of written evidence. These included:
- First, by the very nature of the public call, participation was on a self-selection basis. The sole sampling criterion was therefore interest in the topic. This is important: it means that no generalisation to the wider population can be drawn.
- Second, the planning system relates to a very broad field, with different stakeholders likely to pursue different or contradictory agendas, challenging the analysis to compare and contrast their diverse perspectives.
- Third, the focused timescale for the work required analysis by multiple team members which, in turn, necessitated a clear methodological framework in order to achieve cross-cutting consistency.
The questions posed have resulted in a highly qualitative set of data, in which respondents give their views on the Position Statement. The responses range from brief statements of support for, or disagreement with, the proposals in the Position Statement, through qualified responses, to very detailed submissions. Many of the responses focus only on Question 1, on the four main themes in the Position Statement, with only around a third responding to Questions 2-4.
As previously, the coding of the responses has highlighted the need to consider all the evidence carefully. The 'mixed methods' approach addresses the challenge that the quantitative data cannot stand alone and must be considered alongside the qualitative data, particularly as so many submissions provided qualifications and caveats to the more overtly measurable elements, such as a yes/no response.
Furthermore, for the quantitative data, the numbers in the graphs do not represent the number of responses per se, but the frequency of each category of coded response. For example, a single response may contain agreement with one part of a proposal, disagreement or concern with another, and a request for further details. Other responses may have offered just a comment or idea related to the proposal. Some respondents, particularly those from the civil society sector, noted that tracking between the consultation document and the consultation questions was sometimes difficult, expressing the view that this made responding directly to some proposals challenging.
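To illustrate the distinction between response counts and coded-excerpt frequencies, the following is a minimal hypothetical sketch (the data and category labels are invented for illustration; the project itself used Dedoose rather than custom scripts). A single response contributes one entry per coded category, so the totals in the graphs can exceed the number of respondents:

```python
from collections import Counter

# Hypothetical coded responses: each response carries one or more
# category codes (agreement, concern, more information, idea).
responses = [
    {"id": "R1", "codes": ["agreement", "concern", "more_information"]},
    {"id": "R2", "codes": ["idea"]},
    {"id": "R3", "codes": ["agreement", "idea"]},
]

# Frequency of each category across all coded excerpts,
# which is what the graphs report.
frequency = Counter(code for r in responses for code in r["codes"])

print(len(responses))           # 3 responses...
print(sum(frequency.values()))  # ...but 6 coded excerpts feed the graphs
```

Here three responses yield six coded excerpts, which is why the graph totals are read as frequencies of each category rather than as a headcount of respondents.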
The data analysis comprises three broad stages:
Stage 1: Review of material and data processing – organisation and cataloguing of the written evidence.
Stage 2: Analysis of Evidence – using both qualitative data software Dedoose and researcher-led techniques.
Stage 3: Reporting – initial reporting of findings, followed by detailed chapters on each theme.
In stage one, we catalogued the anonymised responses by stakeholder group, constructed an Excel database, and inputted the evidence into a Dedoose database for further analysis, linking each submission to a participant case and stakeholder group (descriptor).
In stage two, we established a coding framework for sorting the responses, based in the first instance on the Government's Consultation questionnaire. A team member coded each stakeholder grouping (e.g. civil society; policy and planning; and business) using the Dedoose software. Dedoose is a software package designed specifically for the analysis of qualitative data. It allows researchers to set a coding framework and then sort the data according to it. This allowed the team to codify a large volume of data and identify patterns and emerging themes. It should be noted that the software is a tool, and the overall research still depends on the judgement and analysis of the researchers.
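The code-and-sort step can be sketched as follows (a hypothetical illustration with invented excerpts and group labels; the team performed this sorting within Dedoose, not with bespoke code). Excerpts tagged with a question code and a stakeholder descriptor are grouped under each question, allowing perspectives to be compared across groups:

```python
# Hypothetical excerpts, each tagged with a consultation-question code
# and a stakeholder group, mimicking the sort step performed in Dedoose.
excerpts = [
    {"group": "civil society", "code": "Q1", "text": "Support the outcomes focus."},
    {"group": "business", "code": "Q1", "text": "Concern over delivery timescales."},
    {"group": "policy and planning", "code": "Q2", "text": "More detail requested."},
]

def sort_by_code(excerpts):
    """Group excerpts under each question code, keeping the stakeholder label."""
    sorted_data = {}
    for e in excerpts:
        sorted_data.setdefault(e["code"], []).append((e["group"], e["text"]))
    return sorted_data

by_code = sort_by_code(excerpts)
print(len(by_code["Q1"]))  # two stakeholder perspectives filed under Question 1
```

Sorting by code rather than by respondent is what makes the cross-group comparison possible: all material on a given question sits together, labelled by stakeholder group.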
Coding and sub-coding
As the coding progressed and more submissions were reviewed, sub-codes were added under each question, enabling the team to identify how respondents were answering (i.e. agreement, disagreement, presenting an idea, asking for more information). Through Dedoose's analysis tools the team could watch the coding patterns develop, allowing observation of, and reflection on, differences and similarities between stakeholder groups. The team also held regular update meetings to talk through emerging findings.
In the third stage of the analysis, we re-coded the textual data in a second cycle under the full set of the Consultation's key themes, proposals and questions, in order to highlight areas of agreement, areas of concern, requests for more information, and suggested ideas. This was an iterative process. The report follows this structure in each of the review theme summary papers, which have informed the chapters.
Second cycle coding identified:
(1) Areas of agreement – the areas in which opinion generally agreed across the four response categories.
(2) Concerns – the key concerns, or contentions, highlighted.
(3) More information/clarification – questions and further details requested by respondents.
(4) Ideas – the ideas and opinions suggested by respondents.
It is, therefore, this third-stage material (see the process diagram below) that has formed the basis for this report, with additional supporting material placed in the Appendices.
The three stages of the analysis process