French Regulator Issues First Major GDPR Violation Fine
It was the moment the global data privacy world had been waiting for. A regulator fined Google €50 million ($56.7 million USD) on 21 January for violating the European Union’s General Data Protection Regulation (GDPR)—the first major fine against a company following the regulation’s compliance deadline.
The fine amount was justified, France’s National Data Protection Commission (CNIL) said, because of the “severity of the infringements” it observed of the GDPR’s principles of transparency, information, and consent.
“Despite the measures implemented by Google, the infringements observed deprive the users of essential guarantees regarding processing operations that can reveal important parts of their private life since they are based on a huge amount of data, a wide variety of services, and almost unlimited possible combinations,” CNIL said in a press release. The commission added that the violations are “continuous breaches of the regulation as they are still observed to date. It is not a one-off, time-limited infringement.”
Two associations—None Of Your Business and La Quadrature du Net—filed complaints with CNIL in May 2018. They claimed Google did not have a valid legal basis under the GDPR to process users’ personal data for ad personalization purposes.
CNIL then began investigating Google and found that when users created Google accounts using Android smartphones, the tech company’s practices violated the GDPR in two ways: transparency and legal basis for ads personalization processing.
CNIL’s analysis found that the notices Google provided to users about what information it sought to collect on them were not easily accessible.
“Essential information, such as the data processing purposes, the data storage periods, or the categories of personal data used for the ads personalization, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information,” CNIL explained. “The relevant information is accessible only after several steps, sometimes requiring up to five or six actions.”
And when users were able to access this information about Google’s data practices, it was not always “clear or comprehensive,” CNIL said.
“Users are not able to fully understand the extent of the processing operations carried out by Google,” the regulator added. This was because the terms Google used were “too generic and vague in manner, and so are the categories of data processed for these various purposes…the information communicated is not clear enough so that the user can understand that the legal basis of processing operations for the ads personalization is consent and not the legitimate interest of the company.”
CNIL also found that Google violated the GDPR because it does not validly obtain users’ consent to process their data for ad personalization. This is because Google does not sufficiently inform users about the company’s data practices, and when users go through the process to make changes to what data is collected about them for ad personalization, the options are prechecked.
This violates the GDPR, CNIL explained, because “consent is ‘unambiguous’ only with a clear affirmative action from the user—by ticking a non-pre-ticked box, for instance.”
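The consent rule CNIL invokes can be sketched as a simple predicate. This is a hypothetical illustration, not any real consent-management library; the class and function names are made up for the example:

```python
# Sketch of the GDPR consent rule CNIL describes: consent is "unambiguous"
# only when it results from a clear affirmative action by the user.
# A pre-ticked box the user merely leaves alone does not qualify.
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    purpose: str          # e.g. "ads_personalization"
    box_preticked: bool   # was the checkbox checked by default?
    user_ticked: bool     # did the user actively tick the box?

def is_valid_consent(record: ConsentRecord) -> bool:
    """Consent counts only if the user performed the affirmative action."""
    return record.user_ticked and not record.box_preticked

# A pre-ticked box left untouched is not valid consent:
print(is_valid_consent(ConsentRecord("ads_personalization", True, False)))   # False
# An unticked box the user actively ticks is:
print(is_valid_consent(ConsentRecord("ads_personalization", False, True)))   # True
```

Under this reading, Google's pre-checked ad-personalization options fail the test regardless of whether the user ever opened the settings screen.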
Google responded to the fine with a statement that users expect transparency and control from it and that it is “deeply committed to meeting those expectations and the consent requirements of the GDPR,” according to the BBC.
At a February 2019 event in Dublin, Ireland, Google’s Chief Privacy Officer Keith Enright said the company will appeal CNIL’s finding. (It had not filed an appeal by Security Management’s press time.) Google did not respond to requests for comment on this article.
While Google plans to appeal the finding, the initial fine marks a turning point in the first year of GDPR enforcement. The regulation—drafted in 2012 and approved in 2016—went into effect on 25 May 2018.
The GDPR was designed to give EU citizens greater control over their personal data and the right to decide what data they share and allow organizations to retain. It required organizations that collected EU citizens’ data to obtain “clear and affirmative consent” for data collection, write privacy policies in clear and understandable language, inform citizens when their data was compromised, allow citizens to transfer their data to other organizations, and honor requests to have their data deleted—commonly known as “the right to be forgotten.”
Despite the two-year time frame companies were given to comply with the regulation, many were not compliant by the deadline. Others made only limited efforts to comply, preferring to wait until the first major fine was issued to see how serious regulators were about enforcement.
“…Many companies have simulated compliance with the law while manipulating users into granting them consent by means of deceptive interface design and behavioral nudging,” Alan Toner, Electronic Frontier Foundation (EFF) special advisor, wrote in an article for EFF. “If a major company is seeking to get a free pass from another national data protection authority, that decision will now be critically contrasted with the approach of the CNIL.”
One year after enforcement began, many companies are still not compliant with the GDPR, says Andrea Little Limbago, chief social scientist at Virtru, who gave a presentation at the RSA Conference 2019 on privacy laws.
A core challenge many companies face in achieving compliance is responding to Right of Access requests, she explains in an interview with Security Management.
“As consumers are increasingly wary of how data is collected and used, the number of data requests is increasing and many companies may not be able to respond securely within the deadline with data in a readable format,” Limbago says.
Companies are also struggling with the GDPR’s consent requirements, especially when it comes to transparency and accessibility.
“Under the GDPR, companies can’t bundle multiple uses of data within a single consent form, which had been the standard in many industries,” she adds.
This practice was one of the reasons Google was found to be in violation of the GDPR, and Limbago says she was not surprised that CNIL focused its first major fine on consent—and on Google’s use of data for purposes other than those users had consented to.
To avoid similar fines, Limbago suggests that companies review how they obtain consent to collect users’ personal information.
“Each data use requires its own consent, and it must be very clear to the user exactly what they are opting into,” she explains. “This transparency is essential and requires shifting away from those standard business practices, such as auto-populating boxes to assume consent, that are not considered compliant as the user must actively confirm their consent.”
And while companies continue moving towards compliance with the GDPR, California-based companies will soon face new data privacy challenges. In June 2018, then-California Governor Jerry Brown signed the California Consumer Privacy Act (CCPA), which goes into effect in 2020.
The CCPA applies to for-profit organizations that collect and process California residents’ personal information or do business in the state. Organizations must also meet one of three additional criteria for the CCPA to apply to them: generate annual gross revenue in excess of $25 million, receive or share the personal information of more than 50,000 California residents each year, or obtain at least 50 percent of their annual revenue from selling California residents’ personal information.
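The applicability test above amounts to a threshold check. The following sketch encodes the thresholds as the article states them; the function and parameter names are invented for illustration and are not any official compliance tool:

```python
# Illustrative CCPA applicability test: for-profit + California nexus,
# plus at least one of three thresholds (revenue, volume, or data-sale share).
def ccpa_applies(is_for_profit: bool,
                 does_business_in_california: bool,
                 annual_gross_revenue_usd: float,
                 ca_residents_data_per_year: int,
                 pct_revenue_from_selling_ca_data: float) -> bool:
    # Baseline conditions must both hold.
    if not (is_for_profit and does_business_in_california):
        return False
    # Any one of the three thresholds triggers applicability.
    return (annual_gross_revenue_usd > 25_000_000
            or ca_residents_data_per_year > 50_000
            or pct_revenue_from_selling_ca_data >= 50.0)

# A for-profit firm in California with $30 million in revenue is covered:
print(ccpa_applies(True, True, 30_000_000, 10_000, 0.0))  # True
# The same firm at $5 million in revenue, below every threshold, is not:
print(ccpa_applies(True, True, 5_000_000, 10_000, 0.0))   # False
```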
The CCPA shares some similarities with the GDPR, such as requiring companies to demonstrate, after a data breach, that they took reasonable steps to protect the data, but it also differs from the European regulation.
“The CCPA ignores certain categories of personal data that are covered by the GDPR because they are covered under other U.S. regulations, such as HIPAA,” Limbago says. “The penalties are also a bit different, with the GDPR focused on up to 4 percent of annual global turnover, whereas the CCPA focuses on a per violation fine.”
Unlike the GDPR, the CCPA does not require organizations to hire data protection officers or, in some cases, to conduct impact assessments.
“The good news is that for companies that are GDPR compliant, it will be much easier to be compliant with the various patchwork of laws coming into effect in the United States,” Limbago adds. “In fact, the lack of harmonized data privacy laws across the United States may pose the biggest compliance challenge given how quickly these policies are emerging.”
Vermont recently passed data privacy measures that apply to data brokers, Colorado passed a consumer data protection law, and Massachusetts is considering privacy legislation. The U.S. Senate Committee on Commerce, Science, and Transportation also held a hearing earlier this year on creating a federal data privacy framework.
“In an age of rapid innovation in technology, consumers need transparency in how their data is collected and used,” said Committee Chairman U.S. Senator Roger Wicker (R-MS). “It is this committee’s responsibility and obligation to develop a federal privacy standard to protect consumers without stifling innovation, investment, or competition.”
However, Limbago says she thinks the United States will work to federally harmonize data breach laws before adopting federal privacy provisions, which are likely to meet more resistance.
“There are currently over 50 data breach notification laws in the United States, each with slightly different provisions and requirements,” she says.
And in the international arena, nations will likely move in one of two directions—embracing the GDPR with similar regulations or adopting a more authoritarian view. Vietnam, for instance, recently enacted a law that requires Internet companies to remove content that communist authorities contend is against the state. Companies that collect data on Vietnamese citizens are also required to have an office in the country.
India is also considering data privacy legislation that would require companies that collect Indians’ user data to store that data in the country, and for personal data to be processed in the interest of the state.
“These new laws tend to limit free speech and mandate government access to data,” Limbago explains. “While China and Russia best exemplify this competing framework, Vietnam’s recent data law and the one under debate in India both reflect aspects of authoritarian versions of data protection, and it is a model that continues to spread globally.”