
Photo illustration by Security Technology; iStock

Partnering With Privacy

Like many companies, Romania-based Uttis Industries decided to implement a video surveillance system in its facilities to enhance security. Cameras were installed and a method of recording the footage was put in place in 2016.

But the owners forgot a crucial step: notifying employees in a transparent way that they were being recorded in the workplace. Uttis also used the system to capture the names and Personal Numerical Codes of its staff members, which were later placed in a training report.

A complaint was filed with Romania’s National Authority for the Supervision of Personal Data Processing, which fined Uttis roughly €2,500 for violating terms of the European Union’s General Data Protection Regulation (GDPR).

Fines like these show how critical it has become for security professionals to partner with their privacy counterparts when adopting or implementing new technology, particularly in video surveillance, says Caitlin Fennessy, CIPP, research director at the International Association of Privacy Professionals (IAPP).

“GDPR and other privacy laws require certain notice and transparency about recordings,” says Fennessy, who formerly worked in the National Security Division of the U.S. Office of Management and Budget (OMB). “They give data subjects individual access rights and deletion rights. We’ve seen—increasingly—over the past couple of years regulators are enforcing these rights in the physical security space.”

The fine in Romania centered on the inappropriate use of video surveillance—not properly informing employees that they were being filmed. Fennessy says she’s seen similar fines issued since the GDPR went into effect in 2018—sparking a transformation in the security industry in how it approaches technology and data collection to abide by privacy regulations and public expectations.

“Security has long been understood as something material to someone’s business,” Fennessy explains. “Privacy is now being included along [with] security practices and risk—and that is significant.”

Another area of GDPR enforcement actions related to the security profession has revolved around access: who has access to data and images. For instance, before the United Kingdom left the European Union, its data authority regulators provided guidance on how individuals could access or obtain a copy of their data captured by surveillance systems.


“This is a real challenge for folks working in this space; it requires building in processes, steps to enable access to that data, and provision of that data where required by some of these privacy laws—up front,” Fennessy explains. “In small ways, some of these actions and requirements are bringing the two professions together and hopefully over the longer term, it will enable more creative conversations about how to design technology upfront in a way that will enable users to act on their rights.”

One major conversation taking place as of Security Technology’s press time concerned the security and privacy challenges of using technology to enable contact tracing to prevent the spread of the coronavirus. Contact tracing is used to trace and monitor contacts of people infected with a disease so they can be notified about their potential exposure, according to the U.S. Centers for Disease Control and Prevention.

Many countries are struggling to conduct contact tracing using decades-old systems developed by public health departments and are looking for technology solutions to aid their efforts.

“Government officials and companies are grappling with how to support efforts to limit the spread of COVID-19 while complying with data protection laws and respecting users’ rights,” Fennessy says.

Those rights, and individuals’ expectations of privacy, vary greatly depending on where they live. For instance, in China, individuals are more accustomed to sharing their personal data with government officials than individuals in Western nations are.

China “has long employed technology like facial recognition to control the activity of its citizens, including to target ethnic minorities,” according to an analysis by Recode. “Now, in response to the pandemic, it is partnering with major tech companies to expand that mass digital surveillance network and tie it to people’s health data.”

However, Americans’ expectations of privacy may change because of the crisis and the need to stop the spread of the coronavirus—making them more willing to share their personal data, said Edward Davis, commissioner of the Boston Police Department during the Boston Marathon bombing and subsequent manhunt.

“The privacy issue has changed a little bit in the face of this, just as it did in the face of the Boston Marathon,” Davis explained in a webinar on privacy and security related to COVID-19. “No one contested our use of facial recognition to find the bombers.”

Tech companies are already moving to provide solutions that public health institutions could leverage to enhance their contact tracing methods. On 10 April 2020, Apple and Google announced a partnership to enable Bluetooth technology to help institutions and health agencies reduce the spread of the coronavirus.


“A number of leading public health authorities, universities, and NGOs around the world have been doing important work to develop opt-in contact tracing technology,” Google said in a statement. “To further this cause, Apple and Google will be launching a comprehensive solution that includes application programming interfaces (APIs) and operating system-level technology to assist in enabling contact tracing. Given the urgent need, the plan is to implement this solution in two steps while maintaining strong protections around user privacy.”

The first step—planned for May 2020—was to release the APIs to allow interoperability between Google’s Android devices and Apple’s iOS devices, using apps from public health authorities that users can download to their devices.

Next, the two tech giants plan to enable broader Bluetooth-based contact tracing platforms by building this functionality into underlying platforms.

“This is a more robust solution than an API and would allow more individuals to participate, if they choose to opt in, as well as enable interaction with a broader ecosystem of apps and government health authorities,” Google explained. “Privacy, transparency, and consent are of utmost importance in this effort, and we look forward to building this functionality in consultation with interested stakeholders.”
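The privacy-preserving design described above—opt-in participation, anonymity, and on-device matching—can be illustrated with a simplified sketch. This is not the actual Apple/Google specification; the key-derivation scheme, function names, and parameters below are illustrative assumptions. The core idea: each device broadcasts short-lived anonymous identifiers derived from a daily key, and only when a user tests positive and opts in is that daily key published, letting other devices re-derive the identifiers and check for matches locally.

```python
import hmac
import hashlib
import os

def rolling_ids(daily_key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive one short-lived proximity ID per 10-minute interval (144/day)
    from a device's daily key. The HMAC-based derivation here is a
    hypothetical stand-in, not the real protocol's key schedule."""
    return [
        hmac.new(daily_key, f"RPI-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    ]

def find_exposures(observed: set[bytes], published_daily_keys: list[bytes]) -> bool:
    """Re-derive IDs from keys published by infected users and check
    whether any match the IDs this device overheard via Bluetooth.
    Matching happens entirely on-device."""
    for key in published_daily_keys:
        if observed & set(rolling_ids(key)):
            return True
    return False

# Alice's device broadcasts rotating IDs derived from a random daily key.
alice_key = os.urandom(16)
alice_ids = rolling_ids(alice_key)

# Bob's device overhears a few of Alice's IDs during a nearby encounter.
bob_observed = set(alice_ids[30:33])

# Alice tests positive and opts in to publish her daily key; Bob's device
# matches locally -- the server never learns who met whom.
assert find_exposures(bob_observed, [alice_key])
assert not find_exposures(bob_observed, [os.urandom(16)])
```

Because identifiers rotate and cannot be linked without the daily key, observers cannot track a device over time, which is one way a design like this supports the anonymity that Fennessy describes as “the name of the game.”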

China, South Korea, and Taiwan are all using various smartphone-based methods to trace individuals’ movements and issue notices when they have come into contact with someone who tested positive for COVID-19.

The GDPR also clarifies how, during a pandemic, data can be collected and stored to enable such technologies. For instance, the regulation provides that a person’s data can be processed—without his or her consent—if the life of the data subject is threatened or there is substantial public interest.

“The GDPR specifically addresses epidemics and it makes clear that processing of personal data should always be necessary and proportionate,” Fennessy says. “Those are the key things that come into play with regard to this effort—anonymity should be the name of the game wherever possible.”

However, there might be concerns with the U.S. government’s ability to access the data that the proposed system by Apple and Google might generate.

“Obviously, the government would like to know if someone is going around infecting everybody,” said Michael Chertoff, former secretary of the U.S. Department of Homeland Security, in the panel discussion with Davis. “But that could easily be considered too Big Brother.”

Instead, considerations may need to be made to anonymize the data as much as possible and carefully define how it can be used—to encourage people to get tested to stop the spread of the coronavirus, Chertoff explained.

“We don’t want to use it the way they do in China, where if you were in contact with someone the regime doesn’t like, your score goes down,” he added. “We have to be very disciplined not to let that happen.”

While U.S. law does not directly address privacy parameters for how such a technology would be used, the California Consumer Privacy Act (CCPA)—which went into effect 1 January 2020—does provide some guidance.

Organizations collecting personal data of Californians related to COVID-19 would need to notify individuals of the categories of personal information collected, the sources from which that data was collected, the purpose of the collection, and who that data was shared with. There are some exceptions for deidentified and aggregate consumer information, Fennessy says, as well as for compliance with state and local laws.

“One of the biggest impacts of CCPA will be the need to understand and put protections around data sharing with service providers and third parties,” she adds. “Understanding those relationships, where data is going and by whom it can be accessed, and the protections service providers have in place will be critical. Getting that right will require conversations between security and privacy teams.”

Megan Gates is editor-in-chief of Security Technology. Connect with her at [email protected]. Follow her on Twitter: @mgngates.