
Illustration by Security Management; iStock

Redesigning Pandora’s Box: Another Reason to Leave Siloed Security Behind

The official report concerning the assassination of U.S. President John F. Kennedy—The Warren Commission Report—was published in 1964 by a special commission established by President Lyndon B. Johnson. The commission’s investigation and subsequent report determined that, acting alone, Lee Harvey Oswald shot and killed Kennedy from a sniper’s nest on the sixth floor of the Texas School Book Depository in Dallas, Texas.

However, the JFK assassination has generated one of the largest and most enduring bodies of conspiracy theories, with theorists asserting that Oswald did not act alone, or that the CIA, the Italian Mafia, the KGB, or various other individuals or organizations were the true perpetrators behind Kennedy’s death. Beyond the fact of Kennedy’s death itself, few details of the incident are treated as certain by conspiracy theorists. In fact, an Amazon search for books related to “jfk kennedy assassination” produces more than 2,000 results.

It’s safe to say that investigators and security professionals are no strangers to conspiracy theories and the attraction they can hold over many. But a growing number of modern-day conspiracy theories have been weaponized and spread widely through social media.

During the 2016 U.S. presidential election, conspiracy theorists claimed that emails to and from Hillary Clinton’s campaign chair contained secretly coded messages connecting various officials in the Democratic Party to a human trafficking and child sex ring supposedly run out of certain restaurants. One of those restaurants was Comet Ping Pong, a pizzeria in Washington, D.C.

The allegations against the pizza joint (which also sometimes hosts live music events) found fertile ground on social media sites, including Twitter, Reddit, and 4chan. Posts speculated and alleged that the pizzeria was running the trafficking operation out of its basement, giving rise to the conspiracy theory broadly known as Pizzagate.

Pizzagate has been extensively—perhaps even exhaustively—discredited and debunked by several organizations, including law enforcement. However, the theory connected with so many believers that the restaurant’s owner and staff were harassed and threatened. In December 2016, Edgar Welch traveled from his home in North Carolina to Comet Ping Pong with an AR-15-style rifle and opened fire in the restaurant, hitting the walls. Welch ultimately surrendered to responding law enforcement, and no one was injured in the incident. He received a four-year prison term and was released in May 2020.

But Pizzagate remains. Gen Z users on TikTok discovered this conspiracy theory in 2020, right at a time when many people could not go to work or school and could only connect online due to COVID-19 pandemic lockdowns.

Disinformation—false information intentionally spread—is not a new concern for businesses and organizations. However, the ability to spread disinformation globally, aggressively, and abundantly is a more recent development thanks to the Internet and social media.

Multiple recent studies have found that social media use is positively correlated with belief in conspiracy theories, misinformation, and disinformation.

“Opinion polarization and echo chambers appear as pivotal elements of communication around conspiracy theories.…The insurgence of echo platforms is a new online phenomenon that…could foster many dangerous phenomena that we observe online, including the spreading of conspiracy theories,” according to a study published in the October 2022 issue of Current Opinion in Psychology.

This is due, in part, to social media platforms’ ability to operate as echo chambers that can pull users into a specific narrative, such as a conspiracy theory.

“Disinformation is as old as time. But now, with social media, anybody has a platform,” says Jeremy Plotnik, who previously worked in corporate communications and crisis communications and has studied the ramifications of fake news on businesses. “If they’re clever and know how to use hashtags effectively and they can do some basic SEO techniques, they can make much bigger noise than they normally would.”

By June 2020, the World Health Organization had coined the term “infodemic” to describe the glut of disinformation and misinformation about COVID-19 spreading globally, Security Management reported.


In 2020, QAnon—a political conspiracy theory movement whose origins stem in part from Pizzagate—once again took up the call to save children from a Satanic cabal. This time, adherents alleged that online retailer Wayfair was trafficking children through furniture listings on its website.

The theory gained traction when price anomalies for various pieces of furniture appeared on the e-seller’s website. And just as with Pizzagate, Wayfair employees were soon receiving threats, but the company was not the only organization that suffered from the disinformation campaign. According to The Washington Post, anti-trafficking organizations were flooded with false tips and accusations, while law enforcement agencies investigating active, legitimate human trafficking rings were pulled into the allegations. This gained the conspiracy theory more media attention, exacerbating the stress on Wayfair and other organizations.

Harassment of a business or organization via social media disinformation or Internet rumors has not been directed solely at Wayfair or Comet Ping Pong. Nor are these tactics used exclusively by groups like QAnon. Instead, the Internet has been weaponized by a variety of groups and individuals with a wide array of agendas. Quite simply, Plotnik says, “Social media has really democratized the ability for people to attack a company.”

With social media, an attacker no longer has to rely solely on breaching an organization’s physical or cyber perimeter—Twitter trolls can get the ball rolling, damaging a group’s reputation while the additional stress creates cracks that other attacks can widen.

“In the social media realm, you have this environment where people are angry, are actively distrustful, cynical, and that give credence to what would otherwise be ridiculous rumors or stories without any evidence whatsoever. And then it builds on itself,” says Plotnik.

Janet Lawless, CEO and founder of the Center for Threat Intelligence, adds that any organization should assume it has a target on its back—whether the threat comes from a competitor, a nation-state, or another group fundamentally opposed to the organization.

The Center for Threat Intelligence aims to support companies as they build strategies and frameworks that can help identify and buffer against a sophisticated attack. Such attacks often include some combination of elements of cyber, social media, insider threat, and physical security.

“We try to get people to realize that it’s not just one thing,” Lawless says. With numerous tactics and potential social media sources or platforms for disinformation campaigns to leverage against an organization, it’s no easy task to protect an organization. But it is possible.

“As more and more adversaries become more sophisticated, it becomes important for companies to become more sophisticated and for boards and executives to focus attention on it,” says Lawless. “You have to anticipate things.”

Monitor Smarter, Not in Silos

Although reputation is its own form of currency for an organization, security expert Michael Gips, CPP, of Global Insights in Professional Security, LLC, notes that most companies lack a person or department that is clearly responsible for crafting and protecting brand reputation. “Reputation risk goes beyond security. It kind of touches upon every department in an organization, and no one is really responsible for it,” Gips says.

This means security must reach out to other departments as part of an organization-wide campaign to create a holistic response against threats to the organization’s reputation. The issue, Gips says, needs to be addressed as one that affects and is affected by the organization’s entire culture.

“You have to have an intelligence to track everything going on in your organization,” Lawless adds. “You can’t have silos anymore.”

Stepping out of a siloed structure can do more than show all staff how their words or actions can either support or damage an organization’s reputation. If done right, coordinating with marketing, human resources, and other departments could help identify appropriate language to use in response to various scenarios, recruit employees who can double as grassroots advocates, or even help identify and curb potential insider threats.

And when it comes to misinformation or disinformation attacks lobbed from social media platforms, Lawless suggests that an organization should create a team tasked with identifying and responding to such threats.

Given both the breadth and depth of social media platforms, security experts agreed that monitoring for threats in such environments is best left to machines and software. There are several services that can provide regular and tailored reports on emerging threats identified via social media and elsewhere on the Internet.
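
To make the idea concrete, the following is a minimal, hypothetical sketch of what such automated monitoring does at its core: scan a stream of posts for brand keywords paired with rumor language, then escalate to a human analyst once enough distinct accounts repeat the claim. The brand name, keyword lists, threshold, and sample posts are illustrative assumptions, not any particular vendor’s product or API.

# Hypothetical sketch of automated brand monitoring: flag posts that pair
# brand keywords with rumor language, then escalate once enough distinct
# accounts repeat the claim. All terms, thresholds, and posts are placeholders.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

BRAND_TERMS = {"examplecorp"}                        # hypothetical brand keywords
RUMOR_TERMS = {"trafficking", "coverup", "scandal"}  # hypothetical rumor indicators
ALERT_THRESHOLD = 3                                  # distinct accounts before escalating

def flag_posts(posts):
    """Return posts that mention the brand alongside rumor language."""
    flagged = []
    for post in posts:
        words = {w.strip("#.,!?").lower() for w in post.text.split()}
        if words & BRAND_TERMS and words & RUMOR_TERMS:
            flagged.append(post)
    return flagged

def should_escalate(flagged):
    """Escalate to a human analyst once enough distinct authors repeat the rumor."""
    return len(Counter(p.author for p in flagged)) >= ALERT_THRESHOLD

if __name__ == "__main__":
    feed = [
        Post("user1", "ExampleCorp coverup exposed!!"),
        Post("user2", "Loving my new chair from ExampleCorp"),
        Post("user3", "Is ExampleCorp linked to trafficking? #examplecorp"),
        Post("user4", "RT: the ExampleCorp scandal is real"),
    ]
    hits = flag_posts(feed)
    print(f"{len(hits)} suspicious posts; escalate: {should_escalate(hits)}")

Commercial monitoring services are, of course, far more sophisticated than this, but the output of any such tool still needs a person to interpret it.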


However, at least one person should be responsible for analyzing the intelligence received from such services. Plotnik recommends that this person or team have a security background and be accustomed to running threat assessments.

Beyond this team, all employees should be trained on how to spot and report a potential threat, even on social media, says Lawless.

“If your staff isn’t trained in how to deal with the problem and how to give accurate answers and no misleading answers, you can open a Pandora’s Box,” says Gips.

Gips points to a recent example in which medical providers and other employees of Children’s National Hospital in Washington, D.C., were harassed and threatened after a now-defunct TikTok account claimed the healthcare group performed hysterectomies on transgender minors.

Adding fuel to that fire, one of the hospital’s staff members seemed to publicly indicate that such procedures had been performed on minors. A hospital spokesperson later told Fox News that the staffer was not someone who delivered care to actual patients.

Both Gips and Lawless note that the more successful responses to such attacks are ones in which security teams coordinate with other departments, addressing the issue as a matter that affects an organization’s entire culture.

Know Thy Enemy…and Thy Stakeholders

What was true when Sun Tzu wrote The Art of War in the 5th century BCE remains true in the 21st—it is essential to understand your enemy, even on the virtual battlefield where the ammunition is tweets, likes, and shares.

“It’s important to understand who the adversaries are and what their motivations are,” says Lawless. An organization’s response to damaging disinformation or misinformation will be very different depending on whether the attack is coming from a competitor, a disgruntled employee, or a nation-state.

“There’s a plethora of channels that anybody can access; there’s a wide range of technologies that people can use, too. And the idea of making a deep fake of a corporate executive...is not outside the realm of possibility,” says Plotnik.

While any of these hypothetical attackers might spoof a CEO’s account on social media in an effort to tarnish the organization’s brand, the motivations behind such an attack can vary. If, for example, a competitor is responsible, perhaps its interest is in influencing customers.

But perhaps more importantly, once motivations behind an attack are made clear, an organization can tailor its recovery plan.

The organization needs to understand which stakeholders were targeted and impacted by a disinformation attack, as well as what platforms these stakeholders are using. With that awareness, the messaging the organization delivers after an incident can help rebuild trust and reemphasize its culture and priorities.

But this is not a problem that can be solved by simply throwing money around. True efforts to rebuild trust with stakeholders go beyond corporate social responsibility (CSR) announcements. “If I had a dime for every time an executive told me, ‘Well, we’re doing CSR to rebuild trust,’ I’d be a wealthy person,” says Plotnik. CSR efforts after an organization’s brand has been damaged might make a stakeholder see the group as generous, but liking someone is not the same as trusting that person or group.

“They might like you, they might think you’re nice people, but they might think that the local mafia boss is a nice person, too, because he gives a lot of money to the community. But they don’t necessarily trust him,” Plotnik says. “...CSR only helps you with one element of trust.”


Instead, identifying what kind of trust may have been lost can help in determining how trust is rebuilt or regained. Touting a product’s quality will not mitigate the social media blasts against a company that stands accused of violating labor laws. Directly tailoring the response to the issue will be much more effective.

Depending on the issues, organizations can also work with certain partners to help rebuild trust with stakeholders. “Coordinate and work with groups that are respected by your stakeholders,” Plotnik says.

Rapid Response to the Rabid Retweeting

Once a rumor goes viral, it’s hard to slow it down—even more so online, which is why an organization’s crisis management handbook should include a chapter on responding to social media disinformation campaigns.

“In this environment, speed is very important, which is why you should—as much as possible—have your protocols down for responding first,” Plotnik says.

This section of the handbook should detail the team tasked with responding to the incident, include any pre-written documents that can help in a response to the public or other stakeholders, and identify which significant stakeholders should be contacted and in what order.

The response team should also identify or include certain people within the organization who can act as spokespersons, depending on the nature of the incident. While it might be appropriate for a company executive to represent the organization in one instance, the organization might make greater strides in regaining public trust in a different incident if a company scientist or engineer is answering media questions. In fact, according to the 2023 Edelman Trust Barometer, people today trust scientists more than they trust CEOs, journalists, and government leaders.
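
Purely as a hypothetical illustration, not a prescribed template, the sketch below shows how these playbook elements (response team roles, pre-written holding statements, stakeholder contact order, and scenario-appropriate spokespersons) might be captured as structured data that a response team can act on quickly. All names, roles, and orderings are placeholders.

# Hypothetical structure for the disinformation chapter of a crisis handbook.
# Roles, statements, and contact order are placeholders, not a prescribed template.
from dataclasses import dataclass, field

@dataclass
class DisinfoPlaybook:
    response_team: list           # roles rather than individuals, so the plan survives turnover
    holding_statements: dict      # pre-written drafts keyed by scenario
    contact_order: list           # which stakeholders are notified, and in what sequence
    spokespersons: dict = field(default_factory=dict)  # scenario -> best-suited voice

playbook = DisinfoPlaybook(
    response_team=["security lead", "communications lead", "legal counsel", "HR representative"],
    holding_statements={
        "viral rumor": "We are aware of claims circulating online and are reviewing them.",
        "spoofed executive account": "An account impersonating one of our executives has been reported.",
    },
    contact_order=["employees", "board", "key customers", "regulators", "media"],
    spokespersons={"product safety rumor": "chief engineer", "financial rumor": "chief financial officer"},
)

if __name__ == "__main__":
    print("Notify in order:", " -> ".join(playbook.contact_order))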

But along with having an established set of protocols for a serious disinformation problem, an organization should be training its staff and running drills. When companies host crisis training events, disinformation should be one of the practiced scenarios, readying potential spokespersons and staff at large, according to Plotnik.

It’s unlikely that an organization can prepare for every aspect of a disinformation incident, but this kind of preparation can allow a company to generate a faster and more effective response.

And when it comes to determining whether something will evolve into a full-blown incident, Plotnik recommends not instantly writing off what seems to be a ridiculous post, pointing to Pizzagate, the Wayfair scandal, and other conspiracy theories. “Companies might make the mistake that people are acting rationally and thinking rationally,” he adds.

But rational thinking and behavior, especially through a security lens, should never be taken for granted. Instead, he notes, try to remember some of the more outlandish claims that went viral and damaged a business’s reputation, as with Wayfair, or undermined societal or national campaigns, like the COVID-19 vaccine rollout.

People can concoct bizarre stories, and other people will listen to them, Plotnik says. “You might have important things on your mind…and yet you’re going to have to take time to deal with what you might reasonably consider to be lunacy. But take it seriously and deal with it. Otherwise, it will not go away.” 

Sara Mosqueda is associate editor for Security Management. You can connect with her on LinkedIn and on Twitter, @ximenawrites.

 
