Data Protection in the UK

Part 01 of three

Data Regulation After GDPR

While Brexit negotiations are still underway, Britain will be at least temporarily subject to the EU's General Data Protection Regulation (GDPR) when it takes effect in May 2018. It is widely acknowledged, however, that the UK will ultimately have to craft its own privacy laws, which it hopes will continue to promote the smooth transfer of data with the EU. To that end, the UK's Parliament is currently debating the Data Protection Bill (DPB), and the focus of both corporate lobbyists and privacy advocates is on having their concerns addressed in the final form of that bill, which will shape the UK's privacy regulations for years or even decades to come.
Below is a deep dive into our findings.

BRITAIN'S CURRENT COMMITMENT TO DATA PROTECTION

At present, the primary source for the regulation of personal data in Britain is the Data Protection Act (DPA), which was passed in 1998 to bring the UK in line with the EU Data Protection Directive of 1995. One assessment reads: "Whilst the DPA has provided a sufficient regulatory framework, there is no doubt that it is somewhat dated and long overdue a refresh... The refresh has already been penned by the European Commission and is due to come into effect in early 2018 under the ‘General Data Protection Regulation (GDPR).’ The GDPR is aimed at unifying individual protection across the EU, as well as, the movement of personal data outside of the EU."

The UK "played a full and active part in negotiations for the new GDPR," and with the separate Data Protection Directive (which covers personal data in connection with law enforcement). This resulted, for example, in the GDPR imposing obligations on data controllers that are proportionate to the risk posed by the processing activity in question rather than a one-sized-fits-all approach. The GDPR will go into effect in May 2018, long before Britain will complete its negotiations to leave the EU, and therefore will become "the law of the land" for the UK. Nevertheless, due to Brexit the UK will need to develop its own "refresh" of its data protection laws.
An official position paper released by the UK government in mid-2017 states the UK's commitment to the protection of personal data: "Data flows are important for the UK and the EU economies and for wider cooperation, including on law enforcement matters. To ensure that individuals have control over and transparency as to how their personal data is being used, and that their personal data is protected from misappropriation and misuse, robust safeguards are needed."

The UK government paper goes on to state, "At the point of our exit from the EU, the UK’s domestic data protection rules will be aligned with the EU data protection framework." The UK is already taking the necessary steps, beginning with the announcement of the Data Protection Bill (DPB) in the 2017 Queen's Speech. (We will discuss the specifics of the DPB below.) Thus, the "UK starts from an unprecedented point of alignment with the EU," and the paper announces the government's intention to have the UK's Information Commissioner's Office (ICO) continue to partner with EU regulators "to maintain effective regulatory cooperation and dialogue for the benefit of those living and working in the UK and the EU after the UK’s withdrawal." This is critical because 75% of the UK's "cross-border data flows" are with EU member countries.

THE DATA PROTECTION BILL

The DPB was published on September 14, 2017, and is currently in Parliament, having passed through the House of Lords and had its first reading in the House of Commons on January 18, 2018. The official introduction to the DPB by the ICO contains more information about the specifics of the bill than is within the scope of a single Wonder request, but it may be useful for additional context. The key takeaway is that this bill, and the UK's commitment to upholding EU standards even as a partner instead of a member, mean that "the UK will be compliant with EU data protection law and wider global data protection standards on exit, and given the important role of continued regulatory cooperation as part of a future economic relationship, the UK believes that a UK-EU model for exchanging and protecting personal data could provide for regulatory cooperation and ongoing certainty for businesses and public authorities. This could build on the existing adequacy model."

FUTURE REGULATION

Despite a thorough search, we were unable to locate an article or white paper by a government official or industry expert discussing any other probable future regulations that might be passed. Unsurprisingly, the focus of nearly all current articles on data privacy protection in the UK appears to be on the details and ramifications of the DPB. While the GDPR is not completely absent from the discussion, it is no longer the focal point in discussions about the future of the UK's privacy regulation.
This is not to say that the GDPR receives no comment at all. For example, Scott Millis, CTO at Cyber adAPT, believes that the GDPR will "increase the pressure on companies to secure information," resulting in those companies using artificial intelligence and machine learning "to stay one step ahead of the criminals." Since the concern of the UK is making sure that it can continue to exchange data freely with EU members, the final form of the DPB will most likely be very similar to the GDPR, "tweaked" to reflect British terminology, concerns, and existing laws.

DATA RIGHTS ADVOCATES VS. INDUSTRY LOBBYISTS

Since the DPB is currently being debated in the House of Commons, advocates on all sides have focused their concerns on shaping the new law. The primary concern of lobbyists for corporations has been that the "free flow of data" between the UK and the EU not be disrupted by new regulations in the DPB, which as shown above is also of primary concern to the British government. This includes lobbying for a new e-privacy provision to replace the current Privacy and Electronic Communications Regulations (PECR), which require an individual's consent for direct electronic marketing by phone, text and email. ICO commissioner Elizabeth Denham has argued that too much "energy and effort is being spent on trying to find a way to avoid consent. That energy and effort would be much better spent establishing informed, active, unambiguous consent."
Incidentally, the official position of the British government is that "the UK would be open to exploring a model which allows the ICO to be fully involved in future EU regulatory dialogue." This means that the ICO's position on marketing consent could have ramifications on the Continent as well, especially since the British government estimates that 43% of all large EU digital companies started in the UK.
On the privacy advocate side, there are concerns that Schedule 2 of the DPB "would potentially remove entire industries dedicated to vetting, profiling and blacklisting private individuals from the reach of the law." This is particularly concerning because the task of vetting individuals (for example, those attempting to open a bank account) is increasingly outsourced to third-party companies. The credibility of the databases these companies rely on has come under question, yet the vetting companies would be under no obligation to allow an individual to access their records or to object to how the data was processed. Indeed, the individual would have no right to "seek any form of redress in the event that the data they hold is false, inaccurate or misleading."
Consequently, there have been calls to redraft Schedule 2 by advocacy groups such as Open Democracy, which states: "We have represented dozens of individuals and organizations who suffered devastating consequences as a result of being falsely identified as posing a terrorism risk." They further claim that these dozens of cases are just "the tip of the proverbial iceberg."
Another point of concern is whether the EU's "Right to be Forgotten" and similar laws will continue to apply in the UK. These concerns have already been largely laid to rest. The UK's "Great Repeal Bill will also provide for UK courts to refer to EU court rulings when interpreting the UK's EU-derived laws. In effect, the Bill proposes that existing case law from the Court of Justice of the European Union (CJEU) will have the same binding status as UK Supreme Court rulings, and expects that the Supreme Court will only ever depart from CJEU precedent in very rare cases." [5]

CONCLUSION

With the Data Protection Bill (DPB) under consideration by the UK's Parliament, all current discussion of the future of privacy regulation in the UK is focused on that bill. On the corporate side, lobbyists are most concerned with maintaining the free flow of data with EU member states that the UK enjoys under the General Data Protection Regulation (GDPR), while possibly loosening some regulations pertaining to marketing consent. On the privacy advocate side, the greatest concern is the phrasing of Schedule 2, which as written might exempt the entire vetting industry from regulation and deprive citizens of the right even to know what data those companies hold about them. Revisiting this topic in a few months, when the final form of the DPB is known, may yield a much clearer picture of the future of data protection regulation in the UK.

Part 02 of three

Data Regulation and the UK Government

Introduction

Current data transfers between the UK and the EU bring in billions of pounds in trade and account for approximately 75% of the UK's external data transfers. The EU's recently implemented General Data Protection Regulation (GDPR) tightens data protection requirements. The UK must implement the GDPR to maintain trade until Brexit; however, post-Brexit the UK will be considered a "third country," which requires additional approval from the EU confirming that the UK's data protection laws provide an adequate level of protection. The UK is pushing for a new data protection bill to incorporate most of the GDPR into UK law; however, this bill includes certain exemptions that the GDPR does not, and places stricter guidelines on data protection for children. This bill is currently being debated in the House of Lords. The EU must consider this new UK law as providing adequate protection for data transfers to continue post-Brexit.

The UK's Stance on Data Protection

The GDPR gives regulators the ability to fine companies that are non-compliant with its regulations; the UK has indicated that the fines are not truly about the money, but about "putting the consumer and citizen first." However, the Conservative Party appears to suggest it will abandon the GDPR, stating that the new rights it supports resemble existing UK data law rather than the GDPR. The UK government has issued statements that support this possibility, especially regarding data posted on social media by minors, claiming that the UK will enforce a new data protection law "fit for our new data age" and appointing the National Data Guardian for Health and Social Care as the enforcer.

UK Government Legislation (Parliament)

The Data Protection Bill must be passed by both the House of Commons and the House of Lords after completing a "Report Stage" in which members of each House can again review and modify the bill. It is currently in the House of Lords Report Stage. Once the bill is in place, the Information Commissioner's Office (ICO) will be responsible for upholding data rights and enforcing compliance. All consultations and press releases are handled by the Department for Digital, Culture, Media & Sport, so it is likely that data protection falls under that department's remit.

The Labour Party

The Labour Party has expressed that it recently attempted to increase data protection with the Digital Economy Act and is grateful that the government is addressing data protection, especially regarding social media data for minors. The Shadow Cabinet member responsible for this area is the Shadow Secretary of State for Digital, Culture, Media and Sport, Tom Watson.

The Labour Party's Shadow Digital Minister, Liam Byrne, has stated that the new data protection legislation is based on trust in the new system, ensuring that companies can innovate only if they use data as the regulations permit. The Shadow Minister for Exiting the EU, Paul Blomfield, states that the Party has repeatedly advocated for stronger privacy laws and data protection in the context of investigating and preventing terrorism. The Party's manifesto supported allowing young adults to remove content they posted online before they were 18. In 2016, Labour Party leader Jeremy Corbyn announced that he was committed to guaranteeing citizens' privacy rights online. Overall, the Labour Party supports data protection, specifically as outlined in the UK's Data Protection Bill but also in the GDPR.

The EU's Stance

The EU has reminded everyone involved that the UK will no longer be part of the EU as of March 2019, and that "any cross-border data flows between the EU and the UK may no longer carry automatic adequate safeguards." After Brexit, the EU will thoroughly examine the UK's legal framework. One anticipated challenge is the UK's Investigatory Powers Act, which allows for data monitoring and, with respect to sites that offer end-to-end encryption, suggests the creation of a backdoor for such sites.

Conclusion

The UK, while agreeing on the need for increased data protection, does not plan to fully incorporate all parts of the GDPR, instead adding exemptions as well as stricter regulations for child data protection. If the UK's regulations are not deemed adequate by the EU post-Brexit, the UK could lose the source of more than 75% of its data transfers.

Part 03 of three

Data Regulation and Artificial Intelligence

The British government and companies that utilize big data are the main groups discussing the relationship between data regulation and artificial intelligence (A.I.). A.I. needs to access user data to understand and learn human behavior. However, some experts fear that the implementation of the GDPR in May 2018, along with future regulation, will adversely affect the development of A.I. The British government and other experts argue that regulation will create greater trust between the populace and companies, which would provide new opportunities for A.I. development.

BIG DATA AND ARTIFICIAL INTELLIGENCE

The Information Commissioner’s Office (ICO) is leading the government's argument in favor of data regulation. It is supported by the Department for Digital, Culture, Media, and Sport as well as the Article 29 Data Protection Working Party.

The ICO produced a report on how to preserve an individual’s privacy and dignity while enabling companies to collect the necessary data.
The report highlights the following characteristics as unique to big data collection, each of which may potentially infringe upon an individual’s privacy:
1. How algorithms are used — Algorithms are being run against large swaths of data, allowing the A.I. system to find correlations and create profiles. It also allows the A.I. to learn and store relevant search criteria based on the request.
2. Opaque processing — The A.I. system’s decisions are not always understandable. This is because vast amounts of data are fed through a non-linear neural network, which classifies data based on outputs built up over successive layers. This complex process creates a ‘black box’ effect, making it difficult for us to understand the reasons the system made certain decisions.
3. Collecting all the data — Big data requires that all data be analyzed as opposed to only a statistically representative sample.

4. Re-purposing of data — Once data has been collected, it can be used for a variety of reasons different from the reason that prompted the collection. For example, by taking geolocated Twitter data, the Office for National Statistics (ONS) can infer people’s residence and mobility patterns, supplementing official population estimates. (See the sketch after this list.)
5. New types of data — Data no longer needs to be consciously provided by the user. Instead, it can be generated automatically. For example, sensors in the street can capture unique MAC addresses of the cellphones of pedestrians.
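
To make point 4 concrete, here is a minimal, purely illustrative Python sketch of how data shared for one purpose (public geotagged posts) might be re-purposed to infer something the user never volunteered, such as an approximate place of residence. The sample data, the `infer_home_cell` helper, and the night-time heuristic are assumptions made for illustration only, not the ONS's actual method.

```python
from collections import Counter
from datetime import datetime

# Hypothetical geotagged posts for one user: (ISO timestamp, coarse location cell).
# Illustrative data only; a real pipeline would ingest millions of such records.
posts = [
    ("2017-06-01T23:40:00", "cell_A"),
    ("2017-06-02T08:15:00", "cell_B"),
    ("2017-06-02T23:55:00", "cell_A"),
    ("2017-06-03T07:50:00", "cell_B"),
    ("2017-06-04T00:10:00", "cell_A"),
]

def infer_home_cell(posts, night_start=22, night_end=6):
    """Guess a 'home' cell as the location seen most often during night-time hours."""
    night_cells = []
    for timestamp, cell in posts:
        hour = datetime.fromisoformat(timestamp).hour
        if hour >= night_start or hour < night_end:
            night_cells.append(cell)
    if not night_cells:
        return None
    return Counter(night_cells).most_common(1)[0][0]

print(infer_home_cell(posts))  # -> "cell_A": a residence inferred from data shared for other reasons
```

Even this toy heuristic shows why re-purposing raises concerns about expectations: nothing in the original posts signals that a home location was ever meant to be derivable from them.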
While these different characteristics allow A.I. to learn how to spot correlations in large data sets, understand human behavior through profiling, and manipulate data to arrive at answers that were previously out of reach, they have also created a culture of suspicion and reluctant resignation amongst the population. There are concerns about the lack of transparency on what kind of data is being collected, how it is being used, and how it affects the individual. Additionally, there is concern about how the individual’s expectations are being addressed.
The ICO’s report argued that the General Data Protection Regulation (GDPR) was passed to address these concerns.

GENERAL DATA PROTECTION REGULATION

The General Data Protection Regulation (GDPR) will be implemented in the U.K. in May 2018 and will replace existing data protection legislation for corporations and law enforcement agencies. If a company sells goods and services to customers who are citizens of other E.U. countries, then it will have to comply with the GDPR. The U.K. has agreed to adhere to the GDPR for a year as a member state and will have to demonstrate compliance after Brexit. This may be difficult, because once the U.K. leaves the E.U. it will be a third country, opening it up to E.U. scrutiny of its human rights record and powers of surveillance. It is predicted that the Investigatory Powers Act and the U.K.'s refusal to incorporate the Charter of Fundamental Rights of the E.U. (of which Articles 7 and 8 enshrine privacy rights and data protection rights) will be sources of tension.

The GDPR stipulates the following (a brief sketch of the record-keeping these points imply follows the list):
1. A company must obtain explicit consent from the individual.
2. A company should be transparent on how the data will be used.
3. A company must prove that the data collection is necessary for its legitimate interests, not just because it has the potential to be interesting. It must then balance its legitimate interests with those of the individual involved.
4. A company must be able to provide logs and present details to an auditor on the use of profiling.
5. An individual has the right not to be subjected to a decision based on an automated process.
6. If an automated decision is made, the individual has a right to ask for human intervention and justification for the decision.
7. When a breach or cyber attack occurs, the individual whose data was involved must be notified.
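
To see what stipulations 1, 2, and 4 might demand of a data controller in practice, the following sketch imagines the bookkeeping they imply. It is a simplified, hypothetical consent ledger and processing log written in Python; the class names, fields, and the `may_process` check are illustrative assumptions, not text taken from the regulation or any existing compliance product.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ConsentRecord:
    # Stipulations 1 and 2: explicit consent, tied to a clearly stated purpose.
    subject_id: str
    purpose: str                      # what the data will be used for, in plain language
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

@dataclass
class ProcessingLogEntry:
    # Stipulation 4: keep an auditable trail of profiling and other processing activity.
    subject_id: str
    purpose: str
    processed_at: datetime
    used_profiling: bool

@dataclass
class ComplianceLedger:
    consents: List[ConsentRecord] = field(default_factory=list)
    log: List[ProcessingLogEntry] = field(default_factory=list)

    def may_process(self, subject_id: str, purpose: str) -> bool:
        """Processing is allowed only under an active consent for this exact purpose."""
        return any(c.subject_id == subject_id and c.purpose == purpose and c.is_active()
                   for c in self.consents)

    def record_processing(self, subject_id: str, purpose: str, used_profiling: bool) -> None:
        self.log.append(ProcessingLogEntry(subject_id, purpose, datetime.utcnow(), used_profiling))

# Example: consent is checked before processing, and every run is logged for an auditor.
ledger = ComplianceLedger()
ledger.consents.append(ConsentRecord("user-42", "product recommendations", datetime.utcnow()))
if ledger.may_process("user-42", "product recommendations"):
    ledger.record_processing("user-42", "product recommendations", used_profiling=True)
```

A real system would need far more (identity checks, purpose versioning, retention limits), but the core idea is the same: no processing without an active, purpose-specific consent, and every run leaves an auditable trace.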

HOW REGULATION WILL AFFECT DEVELOPMENT OF ARTIFICIAL INTELLIGENCE

Those most concerned about how regulations will affect big data and the development of A.I. are CIOs for companies that utilize big data collection and analysis, as well as technology experts and lawyers.
There are two schools of thought when it comes to understanding how data regulation will affect the development of artificial intelligence. One school thinks that data regulation restricts access to data and data analysis, thus slowing down or ‘killing’ A.I. development. The other school thinks that the regulations will present new opportunities for A.I. development.
Those who believe that regulation will negatively affect A.I. development argue that A.I. is too complex to conform with the GDPR’s stipulations. While requesting consent from an individual sounds feasible, big data and A.I.-based systems collect and analyze so much data that it will be hard to determine whether data was collected lawfully, let alone whether consent was given for that specific data collection.

Additionally, the A.I. decision-making process is so complicated that explaining why a decision was made will not always be possible, data logs outlining the process used by the A.I. are often not produced, and data sets that contain trade and/or business secrets may have been used to make the decision. While A.I. systems primarily use algorithms, they also utilize machine learning, which may be difficult to understand and explain. The GDPR states that when a breach occurs, the individual whose data was involved must be notified, but the sheer volume of data collected will make it difficult to discern whose data was affected. Finally, the new regulations restrict what, when, and how data can be collected, limiting the size of the data sets A.I. will be able to interact with.
Those who believe that regulations will positively affect A.I. development argue that A.I. can be used to ensure the GDPR’s stipulations are being followed. A.I. can be used to track and catalog data and data relationships, present companies with a comprehensive understanding of where they are meeting the GDPR’s regulations and where they are not, and keep track of when consent was given, for which pieces of data, and when consent expired or was withdrawn.

A.I. can be programmed to prompt for consent, to map data lineage and evaluate the decision-making process to record any bias, and to alert a human when someone requests that an automated decision be justified. Instead of killing A.I. development, regulations like the GDPR could present A.I. with an opportunity to learn new ways of dealing with data and regulating itself. They also prompt programmers and companies that deal with A.I. to be more aware of how they are programming A.I. to interact with the data it collects and the people it collects the data from.
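
As a rough illustration of the "alert a human" idea above, the sketch below shows one way an automated-decision pipeline could log each decision and flag it for human review when a data subject asks for intervention, in the spirit of stipulations 5 and 6. The `DecisionAuditTrail` class, the toy scoring rule, and the review workflow are hypothetical assumptions, not an existing library or any company's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class DecisionRecord:
    subject_id: str
    decision: str
    inputs: Dict[str, float]          # the features the automated system actually saw
    review_requested: bool = False
    human_outcome: str = ""

class DecisionAuditTrail:
    """Keeps every automated decision so a person can later be pulled into the loop."""

    def __init__(self) -> None:
        self.records: List[DecisionRecord] = []

    def decide(self, subject_id: str, inputs: Dict[str, float],
               model: Callable[[Dict[str, float]], str]) -> str:
        decision = model(inputs)
        self.records.append(DecisionRecord(subject_id, decision, dict(inputs)))
        return decision

    def request_human_review(self, subject_id: str) -> List[DecisionRecord]:
        """Flag the subject's decisions for review and return them for a human caseworker."""
        flagged = [r for r in self.records if r.subject_id == subject_id]
        for record in flagged:
            record.review_requested = True
        return flagged

# Example: a toy scoring rule stands in for a real model.
trail = DecisionAuditTrail()
toy_model = lambda features: "approve" if features.get("score", 0) > 0.5 else "refer"
trail.decide("user-7", {"score": 0.3}, toy_model)
for record in trail.request_human_review("user-7"):
    record.human_outcome = "overturned after manual check"   # human intervention recorded
```

The point is not the toy model but the trail: because every automated decision is recorded alongside the inputs it saw, a human reviewer has something concrete to re-examine when a justification is requested.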

CONCLUSION

The debate about data regulation and A.I. development has occurred mostly between companies that utilize big data and the British government. There are fears that regulation will slow down A.I. development and require companies to provide information that is impossible to supply. However, others believe that the regulations will provide new opportunities for A.I. development.


Sources

From Part 02