Innovation data baseline: final report

Independent consultants EKOS were commissioned to undertake a review of the methods for measuring the impacts of investments in innovation. The study is part of a wider programme of work, which primarily focuses on the innovation activities of the Enterprise and Skills agencies in Scotland.

6. Conclusions and Recommendations

6.1 Introduction

The objectives of the current study were to:

  • Identify the baseline data and methods required to better evaluate innovation support interventions in terms of both short and longer term outputs and impacts on the Scottish economy;
  • Review the existing data and methods used by the agencies to monitor and evaluate innovation support activities; and
  • Make recommendations for new approaches that will address gaps in current data collection and encourage consistent measurement and evaluation across all agencies.

It is important to be clear that the ultimate aim here is to improve the means of measuring and evidencing the impacts of innovation support rather than to improve ways of measuring innovation more generally across the economy and society.

This is an important distinction, and introduces the notion of attribution. Macro changes in wider innovation performance, however measured, cannot easily be attributed to the effects of specific support interventions. Instead, what is required is a more specific set of measures that can track more directly the impacts of various support mechanisms.

6.2 Issues and Considerations

Our work has identified a number of issues to be considered when seeking to develop a more consistent and complete approach to measuring the impact of innovation support. These are discussed below.

Definitions

As noted earlier, the definition of innovation provided by ESAU is purposely broad and inclusive. However, this left room for considerable debate across the agencies as to what constitutes innovation support, and therefore which programmes and activities should be included. The study proceeded on the basis of a pre-agreed list of support activities, some of which are changing or have been discontinued.

As such, in order to support any future implementation of a revised monitoring framework, there is a need to develop a clear definition of what constitutes innovation support (the definition of innovation itself having already been agreed for the purposes of this work). In particular, some of the programmes included within this review may be discontinued and new or different programmes may replace them. The agencies will need further guidance and clarification as to what is within and outwith the scope of assessment/ measurement.

Attribution

Perhaps the single biggest challenge is in attributing changes in innovation performance to specific support interventions. This is particularly the case where companies may have received multiple forms of support for a single innovation project at different stages of its development (as with innovation grants for example). This is also very difficult where academic knowledge may be a crucial input to a successful innovation, but is subsequently developed through multiple interactions and support before reaching application in the economy or society. Attributing eventual impacts back to the original investment in research that generated the crucial knowledge is therefore complex. There is also a need to understand what resources might be required to enable better tracking of these effects.

To a large extent, it may be necessary to live with some imperfection in relation to attribution, and to fill gaps and address possible double counting with informed assumptions. Here prior evaluation evidence may be useful, and we return to this below.
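To illustrate how such informed assumptions might operate in practice, the sketch below shows one possible apportionment approach. It is illustrative only: the programme names and attribution weights are hypothetical, and any real weights would need to be grounded in prior evaluation evidence.

```python
# Illustrative sketch only: apportioning a firm's reported impact across
# multiple support interventions to avoid double counting. Programme
# names and weights are hypothetical assumptions, not agreed values.

# Weights of this kind might be drawn from prior evaluation evidence.
ATTRIBUTION_WEIGHTS = {
    "innovation_grant": 0.5,
    "innovation_centre_project": 0.3,
    "ip_audit": 0.2,
}

def apportion_impact(total_impact: float, supports_received: list[str]) -> dict[str, float]:
    """Split a firm's total reported impact (e.g. a turnover gain in GBP)
    across the interventions it received, in proportion to the assumed
    weights, renormalised over the supports actually used."""
    weights = {s: ATTRIBUTION_WEIGHTS[s] for s in supports_received}
    weight_sum = sum(weights.values())
    return {s: total_impact * w / weight_sum for s, w in weights.items()}

# Example: a firm reports a GBP 200,000 turnover gain after receiving a
# grant and Innovation Centre support for the same project.
print(apportion_impact(200_000, ["innovation_grant", "innovation_centre_project"]))
# {'innovation_grant': 125000.0, 'innovation_centre_project': 75000.0}
```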

Timescales

There is enormous variation across the innovation landscape in the time it takes for an initial idea to reach successful application. In some cases this may be relatively quick, particularly where speed to market would confer competitive advantage. However, in many cases, and in most evaluations of innovation support that we have seen, the timescales between original support and eventual economic impact are long (five to ten years) and frequently underestimated. The time it takes for new understanding developed through academic research to find market application can be longer still.

Indeed, some realism is required about when a new approach to measurement could be expected to deliver useful impact data, as different interventions will produce impacts over different timescales.

Thus, of the various investments made in different forms of innovation support in any particular year, some may produce impacts within, say, three to five years, while others may take more than ten years to do so.

This is a complicating factor, and will need to be borne in mind when interpreting and reporting the results.
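One way to handle this when reporting is to track investments as annual cohorts and only expect impact data once an assumed lag has elapsed. The sketch below illustrates the idea; the intervention types and lag values are hypothetical assumptions, not findings of this study.

```python
# Illustrative sketch: which investment cohorts could plausibly show
# measurable impacts in a given reporting year, under assumed lags.
# Intervention types and lag values are hypothetical assumptions.

ASSUMED_IMPACT_LAG_YEARS = {
    "product_development_grant": 3,   # faster route to market
    "collaborative_rd": 5,
    "academic_research": 10,          # longest lag to application
}

def mature_cohorts(reporting_year: int, first_investment_year: int) -> dict[str, list[int]]:
    """For each intervention type, list the investment-year cohorts whose
    assumed impact lag has elapsed by the reporting year."""
    return {
        intervention: [
            year for year in range(first_investment_year, reporting_year + 1)
            if year + lag <= reporting_year
        ]
        for intervention, lag in ASSUMED_IMPACT_LAG_YEARS.items()
    }

# Example: reporting in 2030 on investments made from 2020 onwards. By
# then, academic research impacts would be expected only for the 2020
# cohort, while shorter-lag interventions report on several cohorts.
for intervention, cohorts in mature_cohorts(2030, 2020).items():
    print(intervention, cohorts)
```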

Current Measures

Across the three agencies, there is a plethora of innovation-related measures and indicators being collected in different ways. Many are similar but expressed in slightly different terms, and there is an opportunity to refine these into a more consistent set of common measures (as we have proposed in the framework). This does not preclude other measures being collected as required, and/or other approaches by the individual agencies, e.g. case studies and periodic evaluations.

There is also a degree of reliance on forecast measures of future impact. While some data on actual impacts are collected, collection is not always consistent, nor does it take place within the realistic timescales over which impacts may be realised. In other words, where follow-up data collection takes place to gather actual impacts, it is often undertaken too soon for those impacts to have materialised. As noted, however, there may be practical and resource considerations that limit longer term tracking.

Resources

There is a trade-off between the need to gather robust and detailed performance data consistently and the resources available to support this process: monitoring should be proportionate to the scale of the investments. While some in the agencies recognised the value of longer term tracking to gather actual impact data, they noted that resources could be a constraint. The key question here is whether this is a function of scarcity of resources or of their allocation (i.e. resources not being prioritised in this way).

Academic Research and the Role of Universities

As already discussed, there are some issues arising from the inclusion of investment in academic research within the overall definition of innovation support. While this is wholly consistent with the role of HE in the innovation process, it does raise specific measurement issues.

Research in the UK is assessed through the Research Excellence Framework (REF) and, by virtue of the dual support system and the national and international nature of the academic market, it would be extremely problematic for Scottish HEIs to withdraw from this process in favour of an alternative approach.

Universities are also autonomous bodies and as such make their own decisions about how to use their funds. Tracking expenditure and activity relating to specific funding can be difficult, particularly in relation to the REG and the UIF.

Global measures of innovation such as those reported in the HE BCI survey are useful but lack detail in crucial aspects (e.g. the location of businesses supported, the impacts of the support) and are not attributable to specific investments or support.

University contributions to innovation appear to be better captured where they are delivered through specific knowledge exchange projects and programmes (such as Innovation Centres).

Many HEIs have limited systems for gathering data on their interactions with industry. Previous research for Creative Scotland and the Scottish Funding Council into the role of HE in innovation in the creative industries[35] found few universities had detailed records of their interactions with external organisations. Through Edinburgh Innovations, the University of Edinburgh has been developing a CRM system to track its relationships with external partners with a view to being able to identify both the extent of the institution's engagement with specific firms and the impacts of that engagement. Although still a work in progress, this offers some potential to enhance the data that universities may be able to provide regarding the impacts of their activities on innovation performance.
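The design of the Edinburgh system is not documented in this study, but a minimal data model for engagement tracking of this kind might look like the sketch below; all type and field names are hypothetical.

```python
# Minimal, illustrative data model for tracking a university's
# engagements with external partners and the impacts reported over time.
# All type and field names are hypothetical; a real CRM would be richer.

from dataclasses import dataclass
from datetime import date

@dataclass
class Engagement:
    firm_name: str
    firm_location: str            # e.g. local authority area, enabling geographic analysis
    activity_type: str            # e.g. "contract research", "licensing", "consultancy"
    start_date: date
    linked_programme: str | None  # e.g. an Innovation Centre project, if any

@dataclass
class ImpactRecord:
    engagement: Engagement
    reported_on: date             # impacts are recorded over time, not only at project close
    jobs_created: int
    sales_from_new_products_gbp: float

# Example: one engagement and a follow-up impact record three years on.
eng = Engagement("ExampleCo Ltd", "City of Edinburgh", "contract research",
                 date(2024, 1, 15), linked_programme=None)
rec = ImpactRecord(eng, date(2027, 1, 15), jobs_created=4,
                   sales_from_new_products_gbp=350_000.0)
print(rec.engagement.firm_name, rec.jobs_created)
```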

6.3 Gaps

As set out in Section 5, there are some significant gaps in the data that are currently collected to measure the impacts of innovation support. These are as follows:

  • there is no information detailing how funding to universities for research and innovation activities is being used. This includes both the REG and the UIF;
  • the REF provides quality ratings and useful information on research outputs, but is based on self-selected samples from institutions, and does not provide consistent measures of the totality of research activity, research outputs, or the uses to which new knowledge may be put;
  • unless delivered as part of specific projects or programmes (e.g. Innovation Centres) the impacts of universities' interactions with external organisations are not captured. There are some data on the extent of these interactions, but not on their impacts;
  • while the enterprise agencies have fairly well developed measurement frameworks and data collection processes, there are issues with differences in definition and terminology (of measures), use of forecast rather than actual data, and incomplete data due to limitations in data collection processes or poor responses to requests for data from assisted companies;
  • there is a lack of long term tracking of companies in receipt of innovation support to assess the impacts. This is a particularly significant gap in light of the well-documented long term nature of returns to innovation activity; and
  • there has been little attempt to capture the extent of failure across the innovation system.

6.4 Summary Conclusions

The implicit question at the heart of the study is whether or not it is possible to measure the impacts of Scotland's current investments in innovation support. On the basis of current approaches and measurement practice, such an assessment is not possible, for the following reasons:

  • there are significant gaps where data are either not collected at all, or the available data are not sufficiently robust or complete;
  • measures are sometimes inconsistent in their definition across different agencies, even if similar in general meaning;
  • the attributional links between investments and impacts are not always clear, and there is reliance in some places on global measures of innovation performance (such as those in the HE BCI for universities or BERD for the wider economy) which cannot always be attributed to specific support or investment;
  • there is considerable potential for double counting of impacts where companies have received multiple forms of support;
  • impact data (e.g. turnover and employment gains) are frequently based on forecast rather than actual data; and
  • the long timescales between intervention and impact in innovation projects are such that impacts are not always captured by existing data collection methods as longer term tracking is limited.

Given this, the follow-up question is whether it is possible to improve and adapt current processes to ensure a consistent and evidence-driven approach that will enable better assessment of the impacts of innovation support. To this, the answer is a cautious yes, but with some caveats.

A 'perfect' system does not exist and any new approach will have to make some assumptions. Each of the five domains of innovation outlined in the conceptual model is a necessary component of an innovation system (linear or otherwise). However, the ways in which each links to the others are complex, and may not always conform to existing measures of success; patents, for example, do not always lead to new products (and may even hinder access to technology). Instead, the assumption that progress in each domain will contribute to the whole may have to suffice.

Moving to a new approach will also require the full support and cooperation of the three agencies and likely also the universities. It may also have resource implications for these organisations and may take some time to establish.

All stakeholders should be realistic about timescales. It will be many years before a clear assessment of the impacts of innovation support can be produced. The reasons for this have been extensively covered in this report.

6.5 Recommendations for a New Approach

The proposed frameworks of measures for each of the five domains are presented below. We then provide recommendations for implementation, including the implications for the agencies and stakeholders.

For this paper the measures have been kept deliberately 'high level' and will require further detailed consideration, updating and refinement with the input of the three agencies, Scottish Government and partners.

Proposed Measures Framework

Table 6.1: Knowledge Creation - Proposed Measures

| Measure | Sources | Comment |
| --- | --- | --- |
| Input Measures | | |
| HE Research Income: UK Govt/ public agencies; Industry; Other | HE BCI Survey (HESA - under review); SFC KE Metrics (under review); TRAC; HE Income and Expenditure (HESA) | Retain as now |
| Number of research active staff in HEIs | Research Excellence Framework (REF); University data; HESA HE Staff data | SFC to ask HEIs to provide total number of research active staff by Unit of Assessment (annual) |
| Number of research students (postgraduate) | HESA Student Data | Retain as now |
| Business Enterprise Research and Development (BERD) | Scottish Government via Office for National Statistics (UK) | Retain as now |
| Gross Expenditure on Research and Development (GERD) | Scottish Government via Office for National Statistics (UK) | Retain as now |
| Activity Measures | | |
| Investment by HEIs in research capacity/ infrastructure | University data | SFC to request data from universities to account for use of REG and UIF monies |
| Number of HE research projects | Research Excellence Framework (REF); University data | SFC to ask HEIs to provide total number of research projects by Unit of Assessment (annual) |
| Quality of HE research | Research Excellence Framework (REF) | Retain as now |
| Output Measures | | |
| No. of academic research publications (including peer reviewed journals, books and book chapters, and conference presentations) | Research Excellence Framework (REF) | Retain as now |
| No. of bids for funding submitted | UKRI | Under development[36] |
| Success rate of research proposals | UKRI | Under development |
| No. of researcher FTE posts funded in Scotland | Universities / HE BCI data | Under development |
| Impact Measures | | |
| HE Research Quality Ratings | Research Excellence Framework (REF) | Retain as now |
| HE Research Impacts | Research Excellence Framework (REF); HE BCI data | Retain as now |

Table 6.2: Innovation Capacity - Proposed Measures

| Measure | Sources | Details |
| --- | --- | --- |
| Input Measures | | |
| Investment in innovation capacity building activities | Agency data on project expenditure | Retain as now |
| Activity Measures | | |
| Number of innovation capacity building programmes | Agency data on project expenditure | Retain as now |
| Output Measures | | |
| No. of firms participating in innovation capacity building programmes (split by sectors; SMEs/ large companies) | Agency data on programmes | Retain as now, with splits to be agreed |
| No. of firms participating in leadership development programmes (split by sectors; SMEs/ large companies) | Agency data on programmes | Retain as now, with splits to be agreed |
| Impact Measures | | |
| No. of new innovation active firms | EU Community Innovation Survey (economy-wide and not for specific support programmes); HIE innovation ladder (uses different measures); SE programme monitoring data | Bi-annual (?); retain as now but align measures across SE/ HIE |
| Increase in firm productivity | Agency data required | Agencies to gather data from beneficiary firms |

Table 6.3: Knowledge Diffusion - Proposed Measures

| Measure | Sources | Details |
| --- | --- | --- |
| Input Measures | | |
| Investment in knowledge flows/ diffusion activities/ projects | Agency data on project expenditure | Needs to be collated across agencies |
| Investment in collaborative R&D (companies) | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies and with beneficiary firms |
| Activity Measures | | |
| No. of HE/ industry collaborative projects | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies and measures aligned |
| No. of business to business collaborative projects | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies and measures aligned |
| Output Measures | | |
| No. of firms involved in collaborative R&D/ innovation projects | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies and measures aligned |
| No. of HEIs involved in HE/ industry collaborative projects | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| HE income from collaborative and contract research | HE BCI Survey (HESA - under review); SFC KE Metrics (under review) | Retain as now |
| IP registrations (patents, disclosures, licences) | HE BCI Survey (HESA - under review); agency data required | Retain as now; agencies to collect data from beneficiary firms |
| No. of firms licensing technologies from HEIs | Agency and project data collected through ongoing monitoring; university data | Data collection to be improved across agencies; SFC to ask HEIs to provide data |
| No. of new products/ processes/ services developed | Agency and project data collected through ongoing monitoring; university data | Data collection to be improved across agencies; SFC to ask HEIs to provide data (recognising that product development may occur after HE involvement) |
| Impact Measures | | |
| R&D jobs created/ safeguarded | Agency and project data collected through ongoing monitoring; university data | Data collection to be improved across agencies; SFC to ask HEIs to provide data (recognising that impacts may occur after HE involvement) |
| No. of academic spin-outs | HE BCI Survey (HESA - under review) | Retain as now |
| Sales from new products/ processes/ services developed | Agency and project data collected through ongoing monitoring; university data | Data collection to be improved across agencies; SFC to ask HEIs to provide data (recognising that impacts may occur after HE involvement) |
| Increase in firm productivity | Agency data required | Agencies to gather data from beneficiary firms |

Table 6.4: Innovation Development - Proposed Measures

| Measure | Sources | Details |
| --- | --- | --- |
| Input Measures | | |
| Investment in innovation development projects | Agency data on project expenditure | Needs to be collated across agencies |
| Leveraged industry investment in innovation projects | Agency monitoring data | Needs to be collated across agencies |
| Activity Measures | | |
| No. of innovation projects: feasibility studies; proof of concept; R&D/ large R&D; product development | Agency monitoring data | Needs to be collated across agencies; measures to be aligned |
| No. of business to business collaborative projects | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies and measures aligned |
| Output Measures | | |
| No. of new products/ processes/ services developed | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| IP registrations (patents, disclosures, licences) | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| Follow-on investment in R&D | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| Impact Measures | | |
| R&D jobs created/ safeguarded | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies, including longer term tracking |
| R&D FDI | Agency data | Ad hoc |
| Sales from new products/ processes/ services developed | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| Increase in firm productivity | Agency data required | Agencies to gather data from beneficiary firms |

Table 6.5: Application and Exploitation - Proposed Measures

| Measure | Sources | Details |
| --- | --- | --- |
| Input Measures | | |
| Investment in application and exploitation (e.g. marketing new products, export promotion, IP protection etc.) | Some agency and project data collected through ongoing monitoring | Ongoing |
| Activity Measures | | |
| No. of IP audits | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| No. of projects taking innovations to market | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| Output Measures | | |
| No. of firms taking new products/ processes/ services to market | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| No. of new products/ processes/ services launched on the market | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| IP registrations (patents, disclosures, licences) | HE BCI Survey (HESA - under review); agency and project data collected through ongoing monitoring; annual survey of HEIs | Data collection to be improved across agencies |
| Impact Measures | | |
| R&D jobs created/ safeguarded | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies, including longer term tracking |
| Sales from new products/ processes/ services developed | Agency and project data collected through ongoing monitoring | Data collection to be improved across agencies |
| Increase in firm productivity | Agency data required | Agencies to gather data from beneficiary firms |

Wider Impacts

The ultimate impacts of innovation support will be economic, social and environmental. The measures framework above focusses mainly on economic impacts, but three main areas remain to be considered:

  • Failure: as noted, it is to be expected that projects will fail or be discontinued at different stages on the journey towards application. Indeed, this may not always be a negative outcome, as innovation development may provide evidence that a particular technology or approach will not work, thereby saving further wasted investment. However, at present there are no data to quantify the failure rate across different parts of the innovation system; this may be worth further consideration, including an agreed definition of "failure";
  • Social Impacts: many innovations will deliver social as well as economic benefits. Advances in healthcare and treatment, for example, would be expected to result in public health and quality of life benefits. However, the potential range of the social impacts is very broad indeed, and defining and capturing these within a single measurement framework would be a significant undertaking. Nonetheless, the potential for social impact should also be considered, particularly in relation to developing measures that align to the Scottish Government National Performance Framework and the ESSB's Strategic Performance Framework; and
  • Environmental Impacts: as climate change becomes an increasingly urgent global policy concern, so the focus on measuring the environmental impacts of innovation will increase. Green technologies will be an area of growth, but the expectation is that more and more of the wider economy will work to reduce carbon impacts towards a net zero economy. Work is underway within the enterprise agencies (and partners such as Zero Waste Scotland) to develop ways of measuring the environmental impacts of projects and programmes, and this should in turn inform the innovation measurement work.

Recommendations

Before setting out the next steps, there is a need to consider the costs and benefits of implementing a new approach and framework to measure the impacts of innovation support.

On the one hand, as things stand, it is not possible to measure the impact of investment in innovation support in Scotland. Wider evidence on the benefits of R&D spend (e.g. from the OECD and Innovate UK) can help provide a rationale for investment, and programme evaluation can assess the impact of some interventions (and inform action to improve delivery), but the overall impact of c £349m of annual expenditure is not known.

On the other hand, it is clear that developing a more robust and credible way of collecting the data necessary to allow this assessment will require investment of time and resources on the part of the agencies and their partners. Even then, the results will remain subject to numerous assumptions and likely issues with data quality, and it will be many years until the framework produces the anticipated outputs.

There is therefore a decision to be made about the value of making the investment of time (and money) to establish a new approach.

This notwithstanding, the steps necessary to take forward the proposed framework are discussed below.

At a high level, there are implications for each of the agencies as follows:

Scottish Funding Council

The REF and the HE BCI are both useful data sources, along with others such as HE Staff data, TRAC and HE Income and Expenditure data, but all are general rather than specifically tied to particular investments. Thus, what is proposed is some further data collection to supplement these sources and address current data gaps. It should be noted that SFC is currently undertaking a wider review of 'Coherent Provision and Sustainability' which includes consideration of funding, operations and accountability frameworks.[37] This piece of work should be considered complementary to this study and may start to identify solutions to some of the challenges set out.

In particular, SFC should seek to:

  • gather data from HEIs on the specific use of REG and UIF monies against defined categories of expenditure (these could relate to infrastructure, research activity and staff, for example) - a sketch of one possible reporting structure follows this list;
  • gather more global measures of research activity and resources, including the number (and nature) of active research projects and research active staff (although HE Staff data may suffice for the latter);
  • gather data on the number and location of businesses and other external organisations with which HEIs engage (this could be captured in the HE BCI and could be an issue to input into the HESA review process); and
  • engage with and support the universities to collect data on the impacts of their innovation activities (the University of Edinburgh's CRM project may be a useful case study example here). In particular, where a university is working with a company on an innovation project that is not supported by other innovation support programmes, the impacts of this work should be captured. Where the project is supported elsewhere (e.g. through an Innovation Centre), these impacts should already be captured.
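On the first point, a per-HEI annual return might take a simple tabular form such as the sketch below. The expenditure categories and column names are assumptions for illustration and would need to be agreed between SFC and the institutions.

```python
# Illustrative sketch of an annual HEI return on the use of REG and UIF
# monies against defined expenditure categories. Column names, category
# labels and figures are hypothetical assumptions for discussion.

import csv
import io

COLUMNS = ["hei", "academic_year", "fund", "category", "expenditure_gbp"]

EXAMPLE_ROWS = [
    {"hei": "Example University", "academic_year": "2023/24", "fund": "REG",
     "category": "research_staff", "expenditure_gbp": 1_200_000},
    {"hei": "Example University", "academic_year": "2023/24", "fund": "UIF",
     "category": "infrastructure", "expenditure_gbp": 450_000},
]

# Render the return as CSV, the simplest format agencies could collate.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(EXAMPLE_ROWS)
print(buffer.getvalue())
```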

Enterprise Agencies

While there are numerous more detailed issues to work through, for the enterprise agencies three overarching principles should be considered:

  • there is a need to adopt a consistent set of measures between the agencies such that data collection and reporting can be more aligned. The framework above provides initial suggestions and many of the existing measures collected by the agencies are similar to those specified above. However, changing the definitions of some measures may have implications for the enterprise agencies' current measurement frameworks;[38]
  • there should be greater commitment to longer term tracking of assisted firms to assess impacts over time (applied proportionately to the scale of investments). This is particularly important for innovation support, where timescales can be lengthy; and
  • data collection should move from reliance on forecasts to actual impact data - which will require the longer term tracking noted above.

Other Stakeholders

Implementing a new approach will have implications for other stakeholders and actors within the innovation landscape. Most obviously, the universities have a key role to play in collecting data that can more accurately demonstrate their contribution to innovation performance (again the Edinburgh CRM project will be a useful example here). This includes more detailed recording of research outputs and their uses, tracking interactions with external organisations and collecting data on the impacts of these interactions (again over time). This has obvious resource implications for the universities.

Similarly, other actors in the innovation system (for example, Innovation Centres, Interface etc.) should also seek to improve data collection to include longer term tracking of impacts and collection of actual rather than forecast or estimated impacts.

A more robust approach to monitoring will also have implications for businesses in receipt of innovation support, which will have to collate and report performance data in the required format. Issues of proportionality will need to be considered here, but it seems reasonable that firms receiving significant levels of public investment and support should be required to account for the impacts of that support.

Finally, there is also a question about who should be responsible for collating and reporting data on innovation support for the Enterprise and Skills Strategic Board.

Next Steps

The next steps in taking forward the new framework would be as follows:

  • further engagement with the three agencies to define in more detail the specific measures within the framework and scope out the various data collection methods - this might also extend to wider partners and stakeholders;
  • agree and develop a data governance model that identifies roles and responsibilities including responsibility for the co-ordination of the data collection and reporting processes;
  • develop a common reporting format (template) for the agencies to input data on an annual basis (a sketch of one possible record structure follows this list); and
  • develop data collection processes (including arrangements for longer term tracking), allocate the necessary resources to support them, and begin data collection to establish a baseline position for Year 1.
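As an indication of what a common reporting record might contain once the template is agreed, the sketch below sets out one possible structure. All field names and values are illustrative assumptions to be refined with the agencies.

```python
# Illustrative sketch of a common annual reporting record that each
# agency could complete for every supported innovation project.
# All field names and allowed values are assumptions for discussion.

from dataclasses import dataclass

@dataclass
class AnnualReturn:
    agency: str                  # "SE", "HIE", "SFC", ...
    domain: str                  # one of the five framework domains
    measure: str                 # e.g. "R&D jobs created/ safeguarded"
    reporting_year: int
    value: float
    is_actual: bool              # True for actual data, False for forecast
    years_since_support: int     # supports longer term tracking of cohorts

returns = [
    AnnualReturn("SE", "Innovation Development", "R&D jobs created/ safeguarded",
                 2026, 12, is_actual=True, years_since_support=3),
    AnnualReturn("HIE", "Knowledge Diffusion", "Sales from new products (GBP)",
                 2026, 800_000.0, is_actual=False, years_since_support=1),
]

# A simple quality check consistent with the recommendations: flag
# returns that still rely on forecasts rather than actual data.
forecast_only = [r for r in returns if not r.is_actual]
print(f"{len(forecast_only)} of {len(returns)} returns are forecasts")
```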

It is worth noting that some of these actions could be implemented with limited impact on resources. These include:

  • developing a consistent set of measures and common reporting template across the agencies; and
  • establishing agreed data collection processes and timeframes across the agencies.

Implementing new data collection processes may then require the allocation of resources and this is a decision to be made by the agencies and the ESSB.

Contact

Email: enterpriseandskillsPMO@gov.scot
