Data is a hugely valuable resource for any business, particularly since the pandemic has accelerated the digitalisation of the economy and reduced personal interaction. Whilst the smart use of data can offer great growth opportunities for businesses, it needs to happen in a controlled manner to avoid lawsuits, regulatory fines and reputational damage.
Applied to combined data sets, artificial intelligence (AI) or “machine learning” tools can create valuable insights by cleaning, modifying and clustering the information. Simply put, AI aggregates data and creates new insights from it.
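To make the "clustering" step above concrete, here is a minimal, illustrative sketch in pure Python: a k-means routine that groups hypothetical customer records (visits per month, average spend) into segments. The data, feature names and function names are invented for illustration; real pipelines would typically use a library such as scikit-learn.

```python
# Illustrative only: grouping hypothetical customer records into segments
# with a minimal pure-Python k-means. Each point is a tuple of two
# features, e.g. (visits per month, average spend).
import random

def dist2(a, b):
    """Squared Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Component-wise mean of a non-empty list of points."""
    n = len(pts)
    return tuple(sum(coord) / n for coord in zip(*pts))

def kmeans(points, k, iters=20, seed=0):
    """Cluster points into k groups; returns (centroids, clusters)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # start from k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[idx].append(p)
        # Recompute each centroid as the mean of its cluster.
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical data: two low-activity and two high-activity customers.
centroids, segments = kmeans([(2, 10), (3, 12), (20, 90), (22, 85)], k=2)
```

On this toy data the routine separates the low-activity and high-activity customers into two segments; the centroids summarise each segment, which is the kind of aggregated insight the text describes.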
The opportunities are tangible and numerous:
a retailer can use insights to improve customer experience and personalisation
an organisation analysing emissions can assist in the reduction of its carbon footprint
a healthcare service provider developing new software tools based on new data insights can increase the impact of health services
predictive analysis of transport patterns can reduce congestion and improve public transport services, as well as guide policy decisions
the collection of pandemic data to identify the effects of lockdown on children’s wellbeing can prompt future safeguarding
data-sharing between public authorities and supermarkets can help prioritise food delivery slots during the pandemic
the analysis of housing needs across the country can help balance affordability against demand.
But to make use of personal data, companies need to consider a few rules to reduce their risk exposure.
Data owners need to be able to consent to the use of their data, and to be confident that it is being used fairly and without harmful impacts. Customers are far more inclined to share data in such circumstances, thereby enabling the benefits of data sharing to be fully realised.
Historically, private data has not always been well-managed and if misuse is confirmed this can provoke strong reactions in people.
Equally, organisations embracing the ethical aspects of data use can develop a competitive advantage as it creates trust, a critical and highly valuable asset for every business.
An important part of the concept of trust is that parties using the data have a right to know that it is accurate and can be trusted, both in form and content. IBM has estimated that data quality issues cost $3.1 trillion per year in the US alone. The A-levels grading debacle that hit the UK over summer 2020, whereby a poorly-designed algorithm downgraded around 40% of predicted results, considerably undermined the public’s trust. The outcry was tangible and, many would say, entirely warranted. Algorithms are inherently morally neutral and performed exactly as they were designed; however, algorithms can be poorly designed at the outset.
It may be useful to mention topical “data trusts” at this point. Whilst there seems to be some inconsistency around the meaning of this term, one thing appears to be universally accepted: the term “data trust” encapsulates the very concept of trust in the collection, maintenance, sharing of and access to data. The way data is used should generate trust, both in the information itself and in the data trust’s activities, thereby reinforcing the ethical principles.
Any consideration of ethics and trust starts with an analysis of how data is collected, shared and used. Business tools are available to assist with the management of this process.
Global privacy laws continue to emerge and mature, providing legal and regulatory frameworks to support these ethical considerations. Carefully managed processes are fundamental to ensure ongoing compliance with applicable data protection principles.
Understanding the full extent of these laws can be complex: the rules themselves are often complicated, and the implications are compounded by the various data parties involved and the nature of the data itself. There are many different types of data users: those who collect, store, manage, analyse, modify, or otherwise use data in various ways. A full analysis of potential legal liability against the legal framework is critical.
Consider the following:
Does the data institution “touch” the raw data, or simply extrapolate information that is provided to it in an anonymised form?
What agreements / contractual obligations / confidentiality provisions are in place with respect to data?
How was the raw data obtained: Is the data recipient confident that it has received the data ethically?
What actions have been taken to reduce the risk of re-identification?
Is the data “anonymised” or “pseudonymised” with a view to preventing re-identification?
Is it possible to de-anonymise the data? What are the risks of this happening?
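To illustrate the pseudonymisation question above, the sketch below shows one common technique: replacing a direct identifier with a keyed hash (HMAC-SHA256). This is an assumption-laden example, not a compliance recipe; the key, record fields and function names are hypothetical. Note that whoever holds the secret key can re-link records, which is why pseudonymised data generally remains personal data under regimes such as the GDPR.

```python
# Hypothetical sketch of pseudonymisation via keyed hashing.
# The key and record below are invented for illustration only.
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from an identifier using HMAC-SHA256.

    The same identifier and key always yield the same pseudonym, so
    records can still be linked; holding the key enables re-linking,
    which is the residual re-identification risk discussed above.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Key held separately by the data controller (illustrative value).
key = b"example-key-held-by-the-data-controller"

record = {"customer": "jane.doe@example.com",
          "postcode_area": "SW1", "spend": 120}
safe_record = {**record,
               "customer": pseudonymise(record["customer"], key)}
```

Because the mapping is deterministic per key, rotating or destroying the key is one way to reduce the risk of re-identification after the pseudonymised data has been shared.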
Ransomware and the effect on data privacy
During the pandemic, criminal activity threatening the safety of a business’s digital assets has increased substantially, and ransomware is the weapon of choice. 2020 has seen a huge surge in ransomware attacks, exacerbated by network vulnerabilities being more exposed as businesses operate remotely. Earlier forms of ransomware involved encrypting systems and offering a decryption key in exchange for payment of a ransom.
Some businesses were able to side-step such threats thanks to their ability to reinstate digital assets from back-ups and avoid payment of any ransom. But that came at a cost.
We are now seeing the “two-pronged” attack. Cyber-criminals deploying ransomware variants such as Ryuk and Maze often exfiltrate data as well, demanding payment of a second ransom, without which data will be released into the public domain. If the data in question has political, economic or environmental significance, cyber-terrorists may show particular interest, perhaps installing malware to monitor data, or threatening the release of that data, for commercial or political gain.
Whilst it is outside the scope of this paper to list the full range of mitigating measures available to organisations, at the very least a bespoke risk assessment, covering cyber risk identification, analysis, and potential risk transfer by way of standalone cyber / privacy insurance, would be advisable.
Appropriate “data stewardship” is vital to ensure that businesses operate within the law and reduce cyber risks, particularly as they relate to personal data and the protection of, use of and access to that data. Reputations are made and broken in this arena: the commercial sensitivities ought not to be underestimated.