UK passes ‘adequacy’ test but are calls for GDPR protection cuts a step too far?
Aoife Sexton, Chief Privacy Officer and Chief of Product Innovation at Trūata
For a nerve-racking six months, the UK's data protection practices were under close examination as the government was given time to prove it could continue to operate to the rigours of the GDPR following its exit from the EU. At the end of that review, and in recognition of how important international transfers of personal data are to the new data economy and the global privacy framework, the European Commission adopted a decision, backed by representatives of the EU member states, granting the UK data 'adequacy' and paving the way for personal data to continue to flow freely from the EU to the UK.
This is a major boost for businesses that operate on a continental scale, especially after a difficult year in which the Covid-19 pandemic had a detrimental impact across many industries. However, the adequacy ruling comes with a warning from the European Commission: this is not a permanent decision, and the UK will remain under close scrutiny.
This is a reaction to the country's continuing signals of an intention to diverge from the European consensus on data protection and, in particular, from the GDPR. There is an emerging view in the UK that it has an opportunity to cement its position as a world leader in data, and the Government is seemingly being advised that the GDPR is prescriptive and inflexible, and that reforming it could accelerate growth in the digital economy.
The battle over AI
One point of contention is the implications of the GDPR for the use of artificial intelligence (AI) such as machine learning. As a technology-agnostic and principles-based regulation, the GDPR primarily addresses top-level issues, such as the repeatable governance processes and frameworks that exist within organisations.
However, in the area of AI, the UK Government is being advised that the current legislation is hampering progress in the industry, making it costly and impractical for organisations to use AI to automate routine processes. This is because the GDPR stipulates safeguards ensuring that an individual is not subject to a decision based solely on automated processing, including profiling, which produces legal effects or similarly significantly affects that individual.
The Taskforce on Innovation, Growth and Regulatory Reform (TIGRR), set up by Downing Street, has highlighted that this requirement makes it burdensome, costly and impractical for organisations to use AI to automate routine processes, because they must also maintain a manual process for individuals who opt out of automated processing. As such, TIGRR has recommended scrapping this safeguard. Notably, the taskforce's 130-page report has received an endorsement from Prime Minister Boris Johnson. A statement made earlier this year by the Secretary of State for Digital, Culture, Media and Sport, Oliver Dowden, made it clear that he wants to encourage more businesses and organisations to use personal data for innovation, and that the next Information Commissioner he appoints will be charged with ensuring that “people can use data to achieve economic goals”.
Recognising fundamental human rights
The Government is being advised that cutting key protections from the UK's implementation of the GDPR, and overtly supporting a stripped-down, less burdensome regulatory approach to the commercial exploitation of personal data, would create a business- and innovation-friendly environment. Undoubtedly, this could be the much-needed boost for the UK private sector as it recovers from the impact of both the pandemic and Brexit.
In order to drive innovation, growth, and competitiveness for the post-Brexit UK, the taskforce suggests reforming the GDPR to place greater emphasis on the legitimacy of data processing and to shift the focus to whether or not data usage is really in the interests of the data owner and of society. However, finding the right balance between this focus and the rights of citizens and consumers is not easy. The EU Charter of Fundamental Rights views privacy and data protection as key human rights and as vital foundations on which any data economy should be built.
Indeed, around the world there has been an exponential increase in new data protection laws, many of which are modelled on the GDPR. Existing laws, such as those in Canada and Australia, are also being revised to make them fit for purpose in the new data-driven world we find ourselves in.
If the UK diverges too far from the GDPR, or indeed from the emerging GDPR-like laws, it runs the risk of finding itself out of step with the data practices of many other countries, which could complicate its future trade deals and, in particular, the international flow of data.
What consumers want
Trūata's Global Consumer State of Mind Report 2021 found that consumers feel they have been forced to expand their digital footprint during the pandemic and that they have already lost control over how much data is stored about them. In recent months, more than three-quarters (77%) of global consumers have taken steps to reduce their digital footprint for fear that they are losing control of their privacy.
Businesses and governments need to recognise that people crave and demand privacy in the era of IoT and big data, sometimes referred to as the era of surveillance capitalism. The fundamental human right to privacy needs to be protected by strong regulation, now more than ever.
Placing privacy as a priority on the business strategy agenda should stem not only from the threat of consumers looking to take back control of their data, or reserving their loyalty for those who act responsibly with it, but also from the knowledge that data privacy regulations do not curb innovation. Instead, as necessity is often the mother of invention, data protection laws that are fit for the challenges of the digital economy can fuel the search for new solutions.
Alongside the rise in global privacy laws, we have seen the emergence of tools and technologies that empower businesses to innovate with data in a privacy-compliant way; this is powerful proof that consumer rights and regulatory authorities are not innovation blockers. These privacy-enhancing technologies point to a future where progress through data-driven transformation can be, and will be, achieved without infringing the privacy rights of individuals.
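To make the idea of a privacy-enhancing technology concrete, the short Python sketch below illustrates one of the simplest such techniques, pseudonymisation, in which direct identifiers are replaced with salted hashes before data is analysed. It is an illustrative example only; the field names, salt handling and sample records are hypothetical and are not drawn from any particular product or from Trūata's own technology.

import hashlib
import secrets

# In a real deployment the salt (or "pepper") would be stored and managed
# separately from the data, for example in a key management service.
SALT = secrets.token_bytes(16)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest."""
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

records = [
    {"email": "alice@example.com", "age_band": "25-34", "spend": 120.50},
    {"email": "bob@example.com", "age_band": "35-44", "spend": 87.00},
]

# Analysts receive a stable pseudonym rather than the raw email address, so
# behaviour can still be studied per user without revealing who that user is.
pseudonymised = [
    {"user_id": pseudonymise(r["email"]), "age_band": r["age_band"], "spend": r["spend"]}
    for r in records
]

for row in pseudonymised:
    print(row)

Pseudonymised data still counts as personal data under the GDPR; stronger techniques, such as full anonymisation and differential privacy, go further by breaking the link back to the individual altogether.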
Our research shows that privacy will continue on its upward trajectory of importance to consumers around the world, and the requirement for strong data protection is not going away. Now is the time for businesses to embrace technologies that enable them to innovate and remain competitive without compromising consumer privacy. We should seek the continued evolution of data privacy laws, rather than debate lowering our standards under the guise of the “greater good”.