What is GDPR? Everything you need to know, from requirements to fines

A map of Europe with nodes to represent data hotspots
(Image credit: Shutterstock)

The EU’s General Data Protection Regulation (GDPR) came into force in May 2018 as a piece of legislation that aimed to give people more control over their own data, and draw up strict limitations on how organisations can use data.

This has become increasingly pertinent in the modern era as more and more businesses, such as retailers and social media companies, are processing data in massive quantities and monetising these activities.

GDPR also set out to ensure that data flows between all member states are safeguarded by a consistent set of rules, and that EU citizens can be assured of their data rights regardless of where the data is held.

The UK adopted the principles of GDPR into law through the Data Protection Act 2018, which supersedes the Data Protection Act 1998. Under the previous rules, organisations could only be fined a maximum of £500,000 for violations, with the new laws awarding the Information Commissioner's Office (ICO) the power to issue significantly larger fines. Businesses that violate the laws face penalties of up to €20 million, or 4% of global annual turnover, whichever is higher.

Why was the GDPR drafted?

Prior to GDPR, the 1998 legislation formed the basis for data protection and regulation in the UK. Like GDPR, these laws were also based on EU rules, specifically the EU Data Protection Directive of 1995. By today's standards, however, the principles and terms of those regulations belong to a different era. Technology has evolved drastically through the years, and the way businesses use data today might have seemed incomprehensible to legislators in the 90s. This is why a new generation of data protection rules was placed high on the agenda.

Over the course of the last quarter-century, the web and the movement of data have cemented themselves as fixtures in the operations of countless businesses. The rise of social media platforms, which host and exhibit vast amounts of personal data, has also underlined the need for a set of data protection laws that are fit for purpose in the modern age.

It’s clear why rules such as those outlined in GDPR were required, given how some of the world’s biggest tech companies conduct their business. For many years, several platforms have offered services that are free to use but request personal and private data from their users, which is then processed and monetised. People aren’t paying when they use Google’s search engine - but their actions and movements are recorded and converted into data points. These are highly valuable to third parties and are especially sought after for purposes such as targeted advertising.

In the past, this type of data collection was often masked by unclear tick boxes or opt-in buttons. You might not even remember agreeing to them, and you almost certainly wouldn't have read the associated terms and conditions, but it's the reason you receive emails that aren't completely in line with your interests.

Perhaps the most egregious example of data misuse was Facebook's Cambridge Analytica scandal, which dominated news headlines in early 2018. In that case, user data was found to have been improperly shared with a third party app, which then used this to target users with advert campaigns said to influence the outcome of the 2016 US election.

The Facebook logo on a phone in front of a large background with Cambridge Analytica

(Image credit: Shutterstock)

A separate aim of GDPR is to make it easier and cheaper for companies to comply with data protection rules. The EU's 1995 directive allowed member states to interpret the rules as they saw fit when they turned it into local legislation. This meant that data protection laws were inconsistent across the bloc, making data transfers overly cumbersome. The nature of GDPR as a regulation, and not a directive, means it applies directly without needing to be turned into law, creating fewer variations in interpretation between member states. The EU believes GDPR will not only create smooth data flows but also collectively save companies €2.3 billion a year.

When did GDPR come into effect?

GDPR came into effect on 25 May 2018, applying automatically to all member states and any international organisation that deals with customers and clients who are residents of the EU. Because GDPR is a regulation, not a directive, the UK did not need to draw up new legislation; instead, it applied automatically.

With the UK now preparing to leave the European Union, it has also introduced new data protection legislation under the Data Protection Act 2018. This act covers certain provisions that are not part of GDPR, such as processing relating to immigration and automated processing in public bodies. GDPR will be implemented into UK law as part of the European Union (Withdrawal) Act, and will sit alongside the DPA 2018 going forward. This has been necessary in order to demonstrate the UK has robust enough data protection laws in place to protect EU data - needed to secure an adequacy agreement with the EU post-Brexit.

Who does the GDPR apply to?

If you don't think you need to comply with GDPR, you're likely to find yourself in hot water sooner or later. Whether your business operates with clients in the EU or outside it, it's vital you understand the rules and make sure you're compliant.

Pretty much every business must comply with the EU's data laws, even if they're based in the US. This is because most companies have at least some data belonging to EU citizens stored on their servers. In order to process that data, the organisation must comply with GDPR principles.

However, if you truly have no dealings with the EU, you can avoid having to comply by using a traffic filter. By blocking any EU traffic to your website, you can ensure that only visitors outside Europe can reach your site and enter their details.

It's obviously a technique only relevant for businesses that do not need contact with EU citizens, such as US-based news outlets. The LA Times is one company that has implemented this GDPR avoidance scheme.
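As a rough sketch of how such a filter works, the logic below checks a visitor's country (resolved from their IP address) against the list of EU member states. The lookup function here is a hypothetical stub for illustration; a real deployment would resolve the IP against a GeoIP database, or apply the block at the CDN or load balancer.

```python
# EU member-state country codes (ISO 3166-1 alpha-2).
EU_COUNTRY_CODES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE",
}

def lookup_country(ip_address: str) -> str:
    """Hypothetical IP-to-country lookup, stubbed with sample data.
    A real site would query a GeoIP database here."""
    sample_db = {"203.0.113.7": "US", "198.51.100.2": "FR"}
    return sample_db.get(ip_address, "??")

def should_block(ip_address: str) -> bool:
    """Return True when the visitor appears to be in an EU member state."""
    return lookup_country(ip_address) in EU_COUNTRY_CODES
```

Note that geolocation by IP is approximate, which is one reason this approach only suits businesses with genuinely no EU audience.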

What are data controllers and data processors?

There's a distinct difference between a data controller and a data processor, as stipulated by the EU.

A data controller is responsible for setting out how and why data is collected, but doesn't necessarily collect or process the data itself; a data processor is the party that processes the data on the controller's behalf.

That means a controller could be any organisation, from a high street retailer to a global manufacturing giant to a charity, while a processor could be an IT services firm they employ.

It's the controller's job to make sure the processor complies with data protection law, while processors must maintain records of their processing activities to prove they abide by rules. Unlike older data protection laws, both the controller and the processor are jointly liable for financial penalties in the event of a data breach or if the processor is found to have handled data illegally.

It is possible for a non-EU-based controller to use an EU-based processor, in which case all parties need to be compliant with GDPR.

How can I process data under the GDPR?

GDPR states that controllers must ensure personal data is processed lawfully, transparently, and for a specific purpose.

That means people must understand why their data is being processed, and how it is being processed, while that processing must abide by GDPR rules.

What do you mean by 'lawfully'?

'Lawfully' covers a range of justifications, not all of which need apply. Firstly, processing is lawful if the subject has consented to their data being processed. Alternatively, it can be lawful if needed to comply with a contract or legal obligation; to protect an interest that is "essential for the life of" the subject; if processing the data is in the public interest; or if doing so is in the controller's legitimate interest, such as preventing fraud.

At least one of these justifications must apply in order to process data.

How do I get consent under the GDPR?

Consent must be an active, affirmative action by the data subject, rather than the passive acceptance under some models that allow for pre-ticked boxes or opt-outs.

Controllers must keep a record of how and when an individual gave consent, and that individual may withdraw their consent whenever they want. If your current model for obtaining consent doesn't meet these rules, you'll have to bring it up to scratch or stop collecting data under that model.

Consent is generally thought of as being the weakest legal basis for processing data, as consent can be removed at any time and thus grind processing to a halt. Unless your business is involved with media or marketing (or those similar industries where consent is required), consent should be your last option.

What counts as personal data under the GDPR?

The EU has substantially expanded the definition of personal data under the GDPR. To reflect the types of data organisations now collect about people, online identifiers such as IP addresses now qualify as personal data. Other data, like economic, cultural or mental health information, are also considered personally identifiable information.

Pseudonymised personal data may also be subject to GDPR rules, depending on how easy or hard it is to identify whose data it is.
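As an illustration of one common pseudonymisation technique (not something GDPR itself prescribes), a direct identifier can be replaced with a keyed hash. The function and key names below are invented for the example:

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The mapping is repeatable for the same key, so records can still be
    linked across datasets, but the original value cannot be read off
    the token. Crucially, whoever holds the key can re-identify the
    subject - which is exactly why GDPR still treats pseudonymised data
    as personal data.
    """
    return hmac.new(secret_key, identifier.encode(), hashlib.sha256).hexdigest()
```

How "easy or hard" re-identification is, in the article's terms, depends on who can access the key and what other data can be cross-referenced.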

Anything that counted as personal data under the Data Protection Act also qualifies as personal data under the GDPR.

When can people access the data we store on them?

The new data protection laws strengthened one key aspect of the legislation that gives citizens the right to access the data organisations hold on them. Anybody, under GDPR, can submit a subject access request (SAR) to an organisation. The data controller then has one calendar month in which to provide a full response.

The provision was already part of UK law under the Data Protection Act 1998, but the time period stood at 40 days. Failure to comply with the reduced window also exposes companies to regulatory action under the stricter terms of GDPR. Twitter, for example, was subject to a GDPR investigation for failing to provide users with the information they requested under this provision. The rule only applies, however, if the requests are deemed reasonable, as there are certain exemptions.

The data protection laws say controllers and processors must identify clearly how users' data is collected, what it's used for, and how it's processed. Any communications outlining this information, moreover, must be in clear and plain English so there's no risk of confusion on the part of users.

Submitting a SAR is, in effect, a mechanism individuals can use to exercise their power under the law and hold companies to account over how they use their data. It gives them the right to understand how their information is handled, and for what reasons. Customers can also ask for data to be removed, completed or brought up to date at any time if it's deemed incorrect.

What's the 'right to be forgotten'?

GDPR makes it clear that people can have their data deleted at any time if it's not relevant anymore - i.e. the company storing it no longer needs it for the purpose they collected it for. If the data was collected under the consent model, a citizen can withdraw this consent whenever they like. They might do so because they object to how an organisation is processing their information, or simply don't want it collected anymore.

The controller is responsible for telling other organisations (for instance, Google) to delete any links to copies of that data, as well as the copies themselves.

What if they want to move their data elsewhere?

Then you have to let them, and swiftly: the legislation means citizens can expect you to honour such a request within four weeks. Controllers must ensure people's data is in an open, common format like CSV, so that when it moves to another provider it can still be read.
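As a minimal sketch of that kind of portability export, assuming a subject's records have already been gathered as rows of field/value pairs:

```python
import csv
import io

def export_subject_data(records: list) -> str:
    """Serialise a subject's records to CSV - an open, common format
    another provider can read. `records` is a list of dicts sharing
    the same keys (hypothetical field names for illustration)."""
    if not records:
        return ""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

# Example usage with invented sample data:
rows = [
    {"name": "A. Subject", "email": "a@example.com"},
    {"name": "B. Subject", "email": "b@example.com"},
]
csv_text = export_subject_data(rows)
```

The key point is the format choice, not the code: a proprietary dump that only your own systems can parse would not satisfy the portability requirement.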

How to report a data breach under GDPR

Under GDPR, a data breach constitutes any breach of security that leads to the accidental or unlawful loss, destruction, alteration, disclosure of, or unauthorised access to personal data.

However, only those breaches likely to infringe the rights and freedoms of individuals need to be reported to the ICO; organisations are not required to report every incident.

Regardless of the nature of the breach, an organisation is required to take steps to contain it and establish its severity. As part of this process, the company is required to undertake a self-assessment.

If the breach happened to a data processor, they are required to inform the controller without delay as soon as they become aware of an incident. It's important to establish these obligations as part of a contract, as both controller and processor will be liable for any failure to communicate the facts of a data breach.

Affected parties have up to 72 hours to inform the ICO if they feel the breach poses a risk to the rights and freedoms of data subjects. Any failure to adhere to this timeframe will need to be justified.

Affected parties can call the ICO on 0303 123 1113; it's also possible to report a breach online, but only if you feel you have already dealt with the incident appropriately.

When reporting the breach, the following information must be provided:

  • A description of the breach, including (if possible) the approximate number of people affected and the types and volume of personal records involved
  • The contact details of the affected organisation's data protection officer, or those of a contact that can provide further information
  • A description of the potential consequences of the breach
  • A description of the various measures the organisation has taken to deal with and mitigate the effects of the breach

Some of this information may not be available within the 72-hour timeframe, so article 33(4) allows for affected parties to provide details in phases, provided this is done without undue delay. However, any delay will need to be explained, and the party is still required to inform the ICO of a breach within 72 hours if deemed severe enough.
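To make the mechanics above concrete, the sketch below models the 72-hour clock (elapsed hours from awareness, not working hours) and tracks which of the required details are still outstanding and may follow in phases. The field names are invented for illustration, not taken from any official form:

```python
from datetime import datetime, timedelta

# The details an initial notification must cover, per the list above.
REQUIRED_FIELDS = {
    "breach_description",   # nature of breach, approx. people/records affected
    "dpo_contact",          # data protection officer or other contact point
    "likely_consequences",
    "mitigation_measures",
}

def notification_deadline(became_aware: datetime) -> datetime:
    """The ICO must be informed within 72 elapsed hours of awareness."""
    return became_aware + timedelta(hours=72)

def missing_fields(report: dict) -> set:
    """Details not yet supplied - these may follow in phases under
    article 33(4), provided there is no undue delay."""
    return REQUIRED_FIELDS - set(report)
```

So a breach discovered at 9am on a Friday must be reported by 9am the following Monday, even if some fields in the report are still being investigated.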

If the incident is likely to result in a "high risk" to the rights and freedoms of data subjects, then companies are required by GDPR to inform affected individuals directly, without undue delay. It's important to note that this threshold is higher than that for reporting a breach to the ICO and therefore you are not required to alert data subjects to every breach, even if the ICO has been informed.

When informing data subjects, you are required to explain exactly what has happened in clear and plain language, as well as provide the name and contact details of your data protection officer or a point of contact where they can gain further information. You are also required to describe what is likely to happen as a result of the breach, as well as what steps have been taken to secure a system and to mitigate the effects of a breach.

Important: Regardless of whether you report an incident to the ICO, all security incidents must be documented fully in the event of a future investigation or for use as part of a training programme.

Failure to notify the ICO within the 72-hour window, without justification for a delay, can result in a fine of up to €10 million or 2% of your global turnover, whichever is higher.

What are the fines for breaches of GDPR?

Two tiers of fines exist under GDPR, but both are much bigger than any the UK has seen before. Under the Data Protection Act 1998, the UK regulator, the Information Commissioner's Office (ICO), was able to fine companies a maximum of £500,000.

The exterior of the ICO's offices

(Image credit: The Information Commissioner's Office)

GDPR massively increases the ceiling of fines. First of all, your organisation faces a penalty of up to 2% of its annual turnover, or €10 million, for failing to report a data breach to the ICO within 72 hours of becoming aware of it. That initial contact should outline the nature of the data that's affected, roughly how many people are impacted, what the consequences could mean for them, and what measures you've already actioned or plan to action in response. It's worth noting that the window is a fixed 72 hours after the discovery of an incident, and not 72 working hours, as some companies have been led to believe.

Then there is the fine for a breach of personal data itself. Data breaches under GDPR can be punished by a maximum fine of 4% of your organisation's annual turnover, or €20 million, whichever is higher.
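The "whichever is higher" rule for both tiers reduces to a simple calculation. The sketch below uses the regulation's euro amounts and is purely illustrative - actual fines are set well below these ceilings and must be proportionate:

```python
def max_fine_eur(annual_turnover_eur: float, tier: int = 2) -> float:
    """Upper bound of a GDPR fine: the fixed amount or the turnover
    percentage, whichever is higher.

    Tier 1 (e.g. failing to report a breach in time): EUR 10m or 2%.
    Tier 2 (e.g. the data breach itself):             EUR 20m or 4%.
    """
    fixed, pct = (10_000_000, 0.02) if tier == 1 else (20_000_000, 0.04)
    return max(fixed, pct * annual_turnover_eur)
```

For a company turning over €1 billion, the tier-two ceiling is therefore €40 million, since 4% of turnover exceeds the €20 million floor.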

You can read our article on GDPR fines for more information on this, but the regulation does make clear that fines must be "proportional", therefore you're unlikely to face the most severe penalty if it's a minor breach, or if you can demonstrate you are largely compliant with the legislation. The ICO itself has said it views fines as a "last resort".

Have companies already been fined under the GDPR?

Companies have already fallen foul of GDPR, resulting in a handful of multimillion-pound fines being issued by the Information Commissioner's Office.

The most recent, and the largest fine to date, was levelled against British Airways in July 2019. The airline was fined £183 million by the ICO for a series of data breaches in 2018 that led to the theft of data belonging to some 560,000 customers in total, including payment information relating to flight bookings.

This was immediately followed by a £99 million fine against the Marriott International hotel chain, after an unpatched vulnerability in its Starwood booking system led to the exposure of 339 million user records. It's believed that some 30 million of these belonged to residents of 31 EEA countries, including seven million UK records.

The Marriott International logo as seen on one of its hotels

(Image credit: Shutterstock)

French data protection regulator CNIL also fined Google €50 million in January 2019, following complaints of forced consent inside the company's Android operating system. In this case, the regulator ruled that forcing users to opt out of, rather than opt in to, data processing was a breach of GDPR. The largest prospective GDPR fine to be administered to date came from the authorities in Luxembourg, which issued Amazon with a £637 million fine.

Some companies narrowly avoided a GDPR-scale fine, as their data incidents occurred prior to GDPR's implementation date. Both Equifax and Facebook received the maximum fine possible - £500,000 - under the previous Data Protection Act 1998. Although GDPR has been in force for nearly two years, these cases continue to trickle through. Most recently, Cathay Pacific was fined the maximum possible under the previous legislation for a data breach in 2018.

While the ICO has issued two notices of intent to fine, the Irish Data Protection Commission (DPC) has been reluctant to issue penalties since GDPR came into force. This is despite the fact it has launched dozens of probes since May 2018, including several high-profile investigations into the biggest tech giants, such as Facebook and Google. It's widely anticipated that the Irish DPC will begin to collect fines during 2021.

But what about Brexit?

The UK government triggered Article 50 in March 2017, setting in motion a two-year process of leaving the EU, which meant GDPR came into force before the legal consequences of the Brexit vote took effect. The UK was still required to comply, and subsequently enshrined the principles of GDPR into UK law.

Brexit and the Data Protection Act 2018

The new Data Protection Act 2018, put forward by the UK government in August 2017 and granted Royal Assent on 23 May 2018, essentially replicates the tenets of GDPR but includes a number of additional provisions not covered by the EU law.

Much like the stipulations of GDPR, the act sets out sanctions for non-compliant organisations, permitting the Information Commissioner's Office (ICO) to issue fines of up to £17 million, or 4% of global turnover, whichever is higher (compared with €20 million or 4% of turnover under GDPR).

It also provides for the right to be forgotten, adding the ability for data subjects to demand social media companies erase any posts they made during childhood - a welcome opportunity for embarrassed adults to delete things they said in their teenage years.

The act also proposes to modernise current data protection regulations by expanding the definition of personal data to include IP addresses, internet cookies, and DNA.

By aligning with GDPR, the UK had hoped to build an enhanced data protection mechanism that goes beyond the adequacy model the EU imposes on 'third' countries. The aim was to ensure the EU would grant the UK a data adequacy agreement, meaning data could continue to flow from EU territories to the UK undisrupted.

Former digital minister Matt Hancock said at the time: "Bringing EU law into our domestic law will ensure that we help to prepare the UK for the future after we have left the EU. We are committed to ensuring that uninterrupted data flows continue between the UK and the EU and other countries around the world."

This agreement was finally granted on a provisional basis in February 2021, and was finalised several months later.

Data transfers post-Brexit

GDPR was signed into UK law as part of the European Union (Withdrawal) Act, which formed the basis for the adequacy agreement being struck, as expected. There had been speculation it would take many months, possibly years, to secure, although it came into force earlier this year.

Had it not been assured, UK businesses would have been required to find alternative legal mechanisms for receiving data from the EU (UK commitments mean data will continue to flow to the EU regardless of any deal).

Many businesses already relied on standard contractual clauses (SCCs) to bake data protections into deals with other organisations - and therefore comply with GDPR without a national agreement in place. The legal status of these was questionable at the time, and it was widely thought they would be ruled invalid. Thankfully for UK businesses, SCCs weren't outlawed by European authorities, meaning this avenue continues to exist. More information on the standard contractual clause ruling can be found here.

GDPR's future development

Because the UK is no longer part of the EU, the government will be unable to contribute to the development of data protection law in the bloc - particularly ironic given that the UK was one of GDPR's chief authors. Indeed, the EU's chief Brexit negotiator poured cold water on that notion by ruling out any UK involvement in the board set up to apply and regulate GDPR after the UK left the bloc in 2020.

The ICO had hoped to continue to participate in the European Data Protection Board (EDPB) post-Brexit, with information commissioner Elizabeth Denham saying a seat at the table of EU data protection authorities would be "really advantageous to business".

But Michel Barnier responded in May that Brexit "is not, and never will be, in the interest of EU business", and the UK must accept the consequences of its decision to leave, including that it cannot participate in the EDPB.

Denham spelt out what that means for the UK to the Parliamentary Committee for Exiting the EU, saying: "We will be a less influential regulator." That means the UK won't have a say on how GDPR is interpreted, how it applies to AI, or how big tech companies are regulated.

Is the Investigatory Powers Act compatible with GDPR?

What's unclear, however, is whether other UK legislation will be deemed compatible with GDPR once the UK leaves the EU. For example, under the UK's Investigatory Powers Act, ISPs are compelled to collect personal web histories and hold them for up to 12 months. The government is currently having to rewrite some of these laws after identical powers in the older DRIPA legislation were found to be illegal.

But Hancock wrote in October 2017 that "UK national security legislation should not present a significant obstacle to data protection negotiations."

Do we need a data protection officer?

Any public body carrying out data processing needs to employ a data protection officer, as does any company whose core activities involve data processing that requires it to regularly monitor individuals "on a large scale", according to the GDPR legislation. Public bodies are at an advantage in that several can share the same data protection officer. Organisations should give the contact details of this person to their data protection authority.

The data protection officer's job is to inform and advise the organisation about meeting GDPR requirements, and to monitor compliance. They'll also act as the data protection authority's primary point of contact, and will be expected to cooperate with the authority. Read a bit more about the role here.

Keumars Afifi-Sabet

Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. Although a regular contributor to other tech sites in the past, these days you will find Keumars on LiveScience, where he runs its Technology section.