A New Perspective on Protecting Personal Data

While the ability to access databases from anywhere via the Internet enhances everyone’s efficiency, it also raises the risk that personal and proprietary information will fall into the wrong hands.

A trio of computer scientists at Stanford University is developing a conceptual framework for understanding privacy expectations and their implications, built on the tenets of “contextual integrity.” Developed by Helen Nissenbaum, Ph.D., associate professor in the Department of Culture and Communication at New York University, contextual integrity defines privacy in terms of the social norms that govern information flows in particular contexts. The Stanford framework expresses those norms as formulas that can be written into software to monitor data use. The goal is to help companies comply with privacy laws and with their own corporate privacy policies.

To understand how this works, consider that individuals are typically willing to share their personal information only for certain purposes. This idea is expressed in a policy referred to as “minimal necessary disclosure.”

The Stanford scientists refer to this idea as “utility.” Led by John Mitchell, Ph.D., professor of computer science and electrical engineering, they have devised a way to analyze a business process and determine whether it discloses more information than is necessary to achieve its utility goal.
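
To make the utility check concrete, the following sketch compares the attributes a process actually discloses against those its stated purpose requires. This is a minimal illustration, not the team’s analysis method; the attribute names and the purpose-to-attributes mapping are invented for the example.

    # Hypothetical minimal-disclosure check; the mapping below is illustrative.
    REQUIRED_FOR = {
        "billing": {"name", "address", "amount-due"},
        "diagnosis": {"name", "test-results", "medical-history"},
    }

    def over_disclosed(purpose, disclosed):
        """Return attributes disclosed beyond what the purpose requires."""
        return set(disclosed) - REQUIRED_FOR.get(purpose, set())

    # A billing process that also sends test results over-discloses:
    print(over_disclosed("billing", {"name", "address", "test-results"}))
    # -> {'test-results'}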

Contextual integrity can also be applied to traditional views of access control. Mitchell, Nissenbaum, and colleagues Adam Barth and Anupam Datta, Ph.D., have developed a model that augments the familiar “allow” and “deny” access control rules with conditions on past and future actions, expressed as formulas in temporal logic.

The model uses both positive and negative norms in its temporal logic formulas. A positive norm permits communication “if” its temporal condition is satisfied. A negative norm permits communication “only if” its temporal condition is satisfied.

For example, a positive norm would state that Dr. Jones can send test results on patient Smith to Mr. Richardson, a researcher, if Richardson keeps them confidential; the condition is one sufficient grounds for sending the results, but Richardson is not otherwise obliged to meet it. A negative norm states that the test results can be shared only if Richardson keeps them confidential, making the condition mandatory.
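
The following sketch shows one plausible way to combine the two kinds of norms in code: a communication is permitted when it satisfies at least one applicable positive norm and every applicable negative norm. The Norm class and the predicate-based conditions are stand-ins for the paper’s temporal-logic formulas, not its actual machinery.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Norm:
        positive: bool                     # True: an "if" norm; False: an "only if" norm
        applies: Callable[[dict], bool]    # does this norm cover the message?
        condition: Callable[[dict], bool]  # the norm's temporal condition

    def permitted(message, norms):
        positives = [n for n in norms if n.positive and n.applies(message)]
        negatives = [n for n in norms if not n.positive and n.applies(message)]
        return (any(n.condition(message) for n in positives)
                and all(n.condition(message) for n in negatives))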

The model also relies on four variables: behavioral norms that apply to the transmission of personal information, the appropriateness of the information involved, the roles of the individuals sending and receiving the information, and the principles of transmission.

This last variable focuses on the constraints that regulate the flow of information from one entity to another. For example, Ms. Anderson might receive information from Mr. Johnson through a commercial exchange because she deserves to know it, because Mr. Johnson chooses to share it, or because Ms. Anderson promises to keep it confidential.
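
A data structure for a single communication event might capture each of these elements so that norms can inspect them; the field names below are illustrative choices, not the paper’s notation.

    from dataclasses import dataclass

    @dataclass
    class Communication:
        sender: str          # e.g., "Mr. Johnson"
        sender_role: str     # e.g., "merchant"
        recipient: str       # e.g., "Ms. Anderson"
        recipient_role: str  # e.g., "customer"
        subject: str         # whose personal information it is
        attribute: str       # the type of information, e.g., "purchase-history"
        principle: str       # e.g., "confidentiality-promised"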

The scientists put their model to the test by capturing the many notions of privacy found in American privacy laws, including the Health Insurance Portability and Accountability Act (HIPAA) and the Financial Modernization Act, commonly known as the Gramm-Leach-Bliley Act (GLBA).

HIPAA concerns the transmission of an individual’s health information by “covered entities,” such as hospitals, healthcare providers, and insurance companies. The law largely forbids the disclosure of health information, permitting it only to individuals or organizations acting in certain roles.

Most of HIPAA’s many privacy provisions can be expressed as positive transmission norms. For example, one norm allows a covered entity to provide an individual’s health information to that individual; Dr. Jones can show patient Smith an x-ray of his broken leg. Another norm allows Ms. Taylor, an x-ray technician, to provide patient Smith’s x-ray to Dr. Jones.

An example of a negative norm under HIPAA is the protection given to psychotherapy notes, which extends even to the individual patient. In this case, a covered entity such as an insurance company may not disclose psychotherapy notes to patient Smith without the prior approval of Dr. Norton, the psychiatrist.

The model uses the following formula to express this action:

IF send(covered-entity, individual, psychotherapy-notes), THEN PREVIOUSLY send(psychiatrist, covered-entity, approve-disclose-psychotherapy-notes).
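
A PREVIOUSLY condition like this one can be checked over a recorded trace of send events. The sketch below uses (sender, recipient, attribute) tuples whose names follow the article’s formula; the checker itself is an illustrative assumption, not the team’s implementation.

    def previously(trace, index, event):
        """True if `event` occurs strictly before position `index` in the trace."""
        return event in trace[:index]

    def hipaa_notes_ok(trace):
        for i, (sender, recipient, attr) in enumerate(trace):
            if attr == "psychotherapy-notes":
                approval = ("psychiatrist", sender, "approve-disclose-psychotherapy-notes")
                if not previously(trace, i, approval):
                    return False
        return True

    # Disclosure without prior approval violates the norm:
    print(hipaa_notes_ok([("covered-entity", "Smith", "psychotherapy-notes")]))  # False
    # With the psychiatrist's approval on record first, it complies:
    print(hipaa_notes_ok([
        ("psychiatrist", "covered-entity", "approve-disclose-psychotherapy-notes"),
        ("covered-entity", "Smith", "psychotherapy-notes"),
    ]))  # True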

GLBA contains privacy provisions that limit the extent to which financial institutions can share nonpublic personal information with nonaffiliated companies and affiliates. In some of its provisions, however, the law distinguishes between the information held by the institution on customers versus consumers.

For example, the law requires financial institutions to inform their customers of their privacy practices and to allow customers to opt out of certain kinds of disclosures. The temporal norm, then, requires institutions to periodically send privacy notices to customers.

GLBA’s requirements on interacting with consumers are less strict. Institutions are required to notify consumers of their privacy practices only if they share a consumer’s personal information with nonaffiliated companies, and they may do so either before or after the disclosure.

The model expresses this transaction as follows:

IF send(financial-institution, third-party, personal-information), THEN PREVIOUSLY send(financial-institution, consumer, notification) OR EVENTUALLY send(financial-institution, consumer, notification).
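
On a complete, finite trace, “PREVIOUSLY or EVENTUALLY” simply means the notification appears somewhere in the trace, before or after the disclosure; in a live system, an EVENTUALLY clause would instead remain a pending obligation until discharged. The sketch below checks the finite-trace reading, with event tuples following the earlier HIPAA sketch.

    def glba_notice_ok(trace):
        for sender, recipient, attr in trace:
            if attr == "personal-information" and recipient == "third-party":
                notice = (sender, "consumer", "notification")
                if notice not in trace:  # PREVIOUSLY ... OR EVENTUALLY ...
                    return False
        return True

    # Notification sent after the disclosure still satisfies the norm:
    print(glba_notice_ok([
        ("financial-institution", "third-party", "personal-information"),
        ("financial-institution", "consumer", "notification"),
    ]))  # True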

The scientists have also compared their model with other access control languages used to define privacy rights. In their view, the element missing from those models is the ability to express temporal conditions, particularly the negative norms that deny access and constraints on future actions.

According to Mitchell, the team is working with Vanderbilt University Hospital to incorporate the results of their formal analysis in the hospital’s MyHealth@Vanderbilt patient portal. The Stanford group has also been working with Tata Consultancy Services, a large software company involved in business process outsourcing, to apply their research to managing personal information in call centers and credit-card processing.

By Mary Alice Davidson, who heads a communications consultancy with offices in Spartanburg, South Carolina, and Tampa, Florida.

You can learn more about this issue by reading the full research paper “Privacy and Contextual Integrity: Framework and Applications,” by Barth, Datta, Mitchell, and Nissenbaum.
