Building trust in the digital era: achieving Scotland's aspirations as an ethical digital nation

An expert group, supported by public and stakeholder insights, has reviewed the evidence and provided recommendations that will support and inform future policy. The report focuses on building trust with the people of Scotland by engaging them in the digital decisions that affect their lives.


Recommendation Summary

Desire for a National Digital Guardian

Fractured responsibilities and a lack of holistic oversight are leaving too many gaps in the governance of digital innovations in both the public and private sectors. Where unethical or problematic practices are witnessed, it is too often unclear to which of the plethora of regulatory or support agencies, each with its own remit (e.g. advertising, consumer protection, data privacy, cybersecurity, policing), these issues should be raised.

Beyond the thematic recommendations presented below, one requirement often mentioned in the Public Panel is the need for a ‘go-to’ digital ‘ombudsman’.

To support this recommendation, there is a need for a comprehensive overview of the laws and regulations that can be brought to bear in Scotland to ensure digital ethics, particularly in cases of online harm, as well as to understand the challenges that would be involved in aligning existing diverse bodies around a common set of digital governance objectives.

Securing Trust in Government Digital

Scotland is on a mission to be a world leader in data-driven innovation, with bold plans to harness information about citizens, government, businesses, the economy and our natural environment to serve various goals for science, society and the economy. Plans to digitise government services are also moving apace, with digital-first systems, the data intelligence network and digital identity credentialing projects at an advanced stage.

  • Ensure this is done with the consent or verified assent of all stakeholders.

Allegations of unfair procurement, unwise partnerships with Big Tech, wasteful spending and abuse of data power, levelled south of the border during COVID-19, have fuelled widespread public (including media) concern about integrity in public life.

  • The rule of law and standards in public life must be seen to be adhered to through transparent and accountable government decision making and spending around technology projects, grants, contracts etc. Open Government (a coalition of active citizens and civil society organisations committed to making Scottish government work better for people) is part of this.
  • Multiple strategies for involving citizens should be pursued, ranging from information campaigns and online surveys or consultations to long-lasting deliberative engagement and direct participation in decision-making groups and institutions.
  • Honest conversations about difficult trade-offs are needed, especially where digital decisions may cause harm to certain individuals or groups but may be ‘fair’ and necessary in other ways. A deeper understanding of the impact of digital innovation on the environment is also needed.

Thematic Presentation of Recommendations

Public Awareness of Data Use and Sharing & Harm Protection when Online

There is a strong need for actions to protect citizens from harms caused by malign online influences, particularly in the case of children. These involve both direct and indirect forms of harm, including:

  • Industrial harms such as pornography, gambling, fast-food marketing
  • Criminal harms such as grooming, blackmail, drug pushing
  • Psychological harms, such as stalking, sexting, shaming, cancelling
  • Group harms, such as racism
  • Covert exploitation such as addictive media, product placement in online games, or auctioning the web clicks of identifiable users to advertisers
  • Harms to vulnerable groups (e.g. domestic violence victims may be digitally stalked, but their privacy may also be abused during police investigations).

Tackling these many different forms of harm is no mean feat. As already noted, it can involve many different agencies and instruments and the sources and severity of harm can sometimes be hard to pin down. The major internet, social media, streaming and gaming platforms are headquartered overseas and the laws surrounding their obligations for consumer and data protection are reserved. This limits how much direct control the Scottish Government can exert over these practices. Many other steps are nevertheless possible. These include:

  • Scottish MPs advocating for change in the UK parliament
  • Scottish Government exerting soft influence by lobbying platforms for change
  • Sponsorship of support agencies and helplines
  • Appropriate information/education for different age groups aimed at raising awareness
  • Due diligence to ensure Scottish Government is not inadvertently incentivising or sponsoring Scottish business that contributes to these harms
  • Building on existing efforts on legal enforcement and prosecution, while taking into account the risks associated with surveillance and monitoring mentioned earlier in the report.

Scotland’s businesses, public sector and citizens are subject to harm from cybercrime – carried out not only by opportunists but also by organised crime groups and state actors. This can target infrastructure, money, intellectual property or personal information. These actions can directly damage vital systems, livelihoods, reputations, services, political actors and the economy as a whole. Technologies are being designed to support cybersecurity, many in Scotland, but these can also lead to more intrusive surveillance of individuals and still hold the potential to be misused.

  • Continued investment in cybersecurity capacity building is essential and, given the workforce shortage, it is inevitable that automated technologies will have to play a role. It is key that innovation is guided towards producing ethical technology that prioritises human rights, here or abroad.

There is potential for digital forms of engagement to enable democratic participation at a local, community, council or governmental level.

  • Further work to establish the best way of leveraging digital engagement methods would be worthwhile, along with the resources to support this.

The internet and social media are seen as primary sources of misinformation and disinformation, although these can equally be disseminated via traditional media and public figures. All of this is weakening trust in institutions, undermining faith in democracy and polarising communities.

  • Careful educational campaigns and better use of social media by independent democratic bodies is necessary to help overcome this
  • Ethical use of methods such as social network analysis can help to expose patterns of political influence and may prove useful as an educational resource.
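As an illustration of the kind of analysis the bullet above refers to, the sketch below counts how often content is re-shared directly from each account, a minimal form of social network analysis used to flag unusually influential ‘hubs’. The share data and account names are synthetic, invented purely for this example.

```python
# Illustrative sketch (synthetic data): flagging influential "hub"
# accounts by counting how often others share content directly
# from them -- a minimal in-degree measure from network analysis.
from collections import Counter

# Directed share events: (sharer, account_shared_from)
shares = [
    ("user_a", "hub"), ("user_b", "hub"), ("user_c", "hub"),
    ("user_d", "hub"), ("user_b", "user_a"), ("user_e", "user_c"),
]

# In-degree: number of share events originating from each account
in_degree = Counter(source for _, source in shares)
ranked = in_degree.most_common()
print(ranked[0])  # ('hub', 4) -- the most-amplified account
```

In practice such patterns would be computed over real platform data with proper graph tooling, but the underlying idea, ranking accounts by how much content flows out of them, is the same.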

A ‘Green’ Digital Scotland

Scotland is leading the way in green tech, much of it digital, offering new opportunities to address the UN Sustainable Development Goals for climate action. As we strive to become a leading nation for big data, high-performance computing, artificial intelligence, digital gaming, cryptocurrency and space technology, as well as ‘digital first’ public services and ID, there has been a failure to acknowledge the environmental cost. The power consumption of some of these innovations dwarfs that of many conventionally energy-intensive industries, such as aviation. Computers also generate heat, and storing files on the internet requires data to travel thousands of miles. Recycling facilities and repair opportunities are under-developed, leading to digital waste.

  • Close coupling the data and digital innovations strategy with the renewables agenda is vital
  • Develop a strategy for managing digital waste and sponsoring repair
  • Keep Scottish e-waste in Scotland
  • Invest in professional e-waste recycling units
  • Mandate purchase of digital products that allow for repair and recycling
  • Work with technology companies in Scotland to support efforts to reduce emissions.

Digital Inclusion

Access to digital technologies can reduce barriers to participation (e.g. supporting disabled people to work from home, or enabling rural patients to receive healthcare). It can also perpetuate or even magnify existing disparities, both through affordability gaps and through lack of empowerment and skills (e.g. in participation in online finance).

Digital technologies may also hinder some groups’ ability to exercise their rights or freedoms, either ‘by design’ or as an unintended consequence. Using fully automated systems to calculate people’s eligibility for services, likelihood of committing crimes, or ability to benefit from treatment may fail to recognise individual circumstances and have unfair outcomes. Certain technologies may be more unfair for specific groups (e.g. a Digital Identity may be convenient for many people but aversive to refugees with experience of their misuse). Use of broad-brush data mining or surveillance may unfairly stigmatise certain groups or compromise their rights (e.g. area-based labelling for health risks may impact insurance costs; facial recognition in shops violates privacy, stigmatises the innocent and can subject people to unequal treatment by ethnicity and gender).

  • Actions are needed to anticipate and mitigate any risks of exacerbating the digital divide, when planning new government projects, as well as to flag non-governmental sources of digital inequality that may require intervention
  • Ensure equal access to public and essential services is maintained by guaranteeing alternative options are always available, without this being a disadvantage in terms of cost, eligibility or quality
  • Regulate pricing on network data provision
  • Encourage businesses providing data, devices and software to take responsibility, beyond their existing very basic obligations, for ensuring access to their digital services is affordable and accessible to all consumers.

Reliable, Representative Data & Technologies Underpinning Algorithmic Decision Making

Both to build trust in algorithmic decisions and to comply with guidance in place under data protection law, organisations need to be able to demonstrate that their algorithms are robust, reliable and meet a set of required standards. If an algorithm is determining whether someone can access education, buy a house or get a job, they need to feel confident that the outcome is justified. It is not just the outputs that need to be considered, but also how the algorithms are deployed.

  • Algorithms should always be applied fairly and equitably, which is best done through human oversight and appropriate, potentially participatory, governance
  • There should always be a way to determine how a decision was made.

Total transparency may not always be appropriate. Some companies may want to keep their algorithms secret to protect competitive advantage, and sometimes the data used could be sensitive personal information that warrants high levels of protection.

  • Improve regulatory practices around transparency that will push companies to provide more accessible explanations of how their algorithms work
  • Introduce external audits to provide an extra layer of scrutiny around algorithmic decision making that can reassure the public that agreed standards are being met
  • Introduce a human ‘sense check’ to validate the decisions the algorithm is making.
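One hedged sketch of what such an external audit check might look like is below: comparing an algorithm’s approval rates across groups against a parity threshold. The decision records, group labels and the 0.8 (‘four-fifths’) threshold are illustrative assumptions, not an agreed Scottish standard.

```python
# Hypothetical audit sketch: check that an algorithm's approval
# rates do not diverge sharply between groups. The data and the
# 0.8 threshold are assumptions for illustration only.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs, approved in {0, 1}."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok
    return {g: approved[g] / totals[g] for g in totals}

def passes_parity(decisions, threshold=0.8):
    """True if the worst-off group's rate is at least
    `threshold` times the best-off group's rate."""
    rates = approval_rates(decisions)
    return min(rates.values()) >= threshold * max(rates.values())

sample = [("A", 1)] * 8 + [("A", 0)] * 2 + [("B", 1)] * 5 + [("B", 0)] * 5
print(approval_rates(sample))  # {'A': 0.8, 'B': 0.5}
print(passes_parity(sample))   # False: 0.5 < 0.8 * 0.8
```

A real audit would examine far more than one summary statistic, but even a simple check like this makes an agreed standard concrete and testable by an external party.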

Ethical Limits to Monitoring and Surveillance

Values can clash in the digital space. A company may desire the best tech to help their users, the police may wish the best surveillance cameras to help them protect the public, and a government may wish the best data to optimise public services. However, data over-reach, lack of consultation and consent can be seen as disproportionate, disrespectful and dishonest, damaging the public trust needed for success.

  • Active involvement and lay consultation and participation are needed, especially for the most sensitive types of data mining or surveillance
  • Transparent, inclusive and democratic engagement in strategic planning and decision-making, to ensure a fair deal for data usage and to confront uncomfortable trade-offs openly
  • Enforcement of legal duties around data protection and strong actions to minimise the use of person-level data and ensure anonymisation
  • Development of ethical guidance around group privacy and demographic-level harms, addressing the gap left by data protection law, which cannot adequately address the risks and harms related to the use of aggregate data.

The Future of Work in a Digital Economy

A profitable economy can have a positive influence on a number of societal factors. However, it is important to balance this with an awareness of how a digital economy promotes fairness and inclusion. There is a concern that the push to digital will come at a disadvantage to those in lower paid or lower skilled roles. Whilst efficiency and automation can be useful, human flourishing should be the priority to guard against:

  • Net job losses
  • Lack of human interaction
  • Impacts on vulnerable groups
  • Risks to safety of individuals
  • Harms to employee rights
  • Unequal access to digital technologies
  • Lack of consumer rights.

As Scotland’s homegrown tech sector thrives, much has been made of the new ‘high value’ jobs that are emerging but this affects a tiny minority of elite earners, with no guarantee of trickle-down.

  • Track whether these roles lead to broader career paths and a wider range of employment opportunities
  • Provide appropriate and accessible upskilling opportunities to minimise net job losses caused by automation and other digital advances
  • Support digital infrastructures that improve wellbeing and societal flourishing, such as cultural heritage.

New forms of employee monitoring are causing alarm due to a lack of transparency and a sense of over-reach and privacy violation. The use of data analytics may help businesses and some workers to understand performance better but could also affect their autonomy.

  • Work with organisational leaders and unions to ensure clear and consensus-based policies
  • Ensure surveillance and monitoring technologies are used in a controlled, transparent way
  • Establish regular reviews of surveillance technologies to ensure their use is still justified and proportionate
  • Develop and adopt codes of conduct that make a commitment to fair and ethical practices
  • Work with the cultural, financial and environmental agendas to research this area further.

Contact

Email: digitalethics@gov.scot
