Disinformation attack
Disinformation attacks are strategic deception campaigns that use media manipulation and internet manipulation to disseminate misleading information, with the aim of confusing, paralyzing, and polarizing an audience. Disinformation can be considered an attack when it involves orchestrated, coordinated efforts to build an adversarial narrative campaign that weaponizes multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value-laden judgments—to exploit and amplify identity-driven controversies. Disinformation attacks use media manipulation to target broadcast media such as state-sponsored TV channels and radio stations. Because of the increasing use of internet manipulation on social media, they can also be considered a cyber threat. Digital tools such as bots, algorithms, and AI technology, along with human agents such as influencers, spread and amplify disinformation to micro-target populations on online platforms like Instagram, Twitter, Google, Facebook, and YouTube.
According to a 2018 report by the European Commission, disinformation attacks can pose threats to democratic governance by undermining the integrity of electoral processes. Disinformation attacks are used by and against governments, corporations, scientists, journalists, activists, and other private individuals. These attacks are commonly employed to reshape attitudes and beliefs, drive a particular agenda, or elicit certain actions from a target audience. Tactics include circulating incorrect or misleading information, creating uncertainty, and undermining the legitimacy of official information sources.
An emerging area of disinformation research focuses on the countermeasures to disinformation attacks. Technologically, defensive measures include machine learning applications and blockchain technologies that can flag disinformation on digital platforms. Socially, educational programs are being developed to teach people how to better discern between facts and disinformation online. Journalists publish recommendations for assessing sources. Commercially, revisions to algorithms, advertising, and influencer practices on digital platforms are proposed. Individual interventions include actions that can be taken by individuals to improve their own skills in dealing with information, and individual actions to challenge disinformation.
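The machine-learning flagging mentioned above can be illustrated with a deliberately minimal sketch: a bag-of-words Naive Bayes classifier that scores short texts as "suspect" or "reliable". Everything here—the labels, the toy training headlines, and the decision threshold—is invented for illustration; production systems use large labeled corpora, richer features, and human review rather than a few hand-written examples.

```python
# Toy sketch of ML-based disinformation flagging (illustrative only).
# A tiny bag-of-words Naive Bayes classifier trained on invented
# example headlines; real systems are far larger and more careful.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs; returns a model tuple."""
    counts = {"reliable": Counter(), "suspect": Counter()}
    totals = Counter()
    for text, label in examples:
        for tok in tokenize(text):
            counts[label][tok] += 1
            totals[label] += 1
    vocab = set(counts["reliable"]) | set(counts["suspect"])
    return counts, totals, vocab

def score(model, text):
    """Log-odds that the text is 'suspect', with Laplace smoothing."""
    counts, totals, vocab = model
    v = len(vocab)
    log_odds = 0.0
    for tok in tokenize(text):
        p_sus = (counts["suspect"][tok] + 1) / (totals["suspect"] + v)
        p_rel = (counts["reliable"][tok] + 1) / (totals["reliable"] + v)
        log_odds += math.log(p_sus / p_rel)
    return log_odds

# Invented toy training data -- the labels are purely illustrative.
examples = [
    ("shocking secret they don't want you to know", "suspect"),
    ("miracle cure doctors hate exposed", "suspect"),
    ("city council approves new budget", "reliable"),
    ("study published in peer reviewed journal", "reliable"),
]
model = train(examples)
flagged = score(model, "shocking miracle cure exposed") > 0
```

A positive log-odds score flags the text for review. In practice such scores feed a human moderation queue rather than an automatic takedown, since false positives against legitimate speech carry their own costs.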
Goals
Disinformation attacks involve the intentional spreading of false information, with end goals that include misleading and confusing audiences, encouraging violence, and gaining money, power, or reputation. Disinformation attacks may involve political, economic, and individual actors. They may attempt to influence attitudes and beliefs, drive a specific agenda, get people to act in specific ways, or destroy the credibility of individuals or institutions. The presentation of incorrect information may be the most obvious part of a disinformation attack, but it is not the only purpose. The creation of uncertainty and the undermining of both correct information and the credibility of information sources are often intended as well.
Convincing people to believe incorrect information
If individuals can be convinced of something that is factually incorrect, they may make decisions that run counter to the best interests of themselves and those around them. If the majority of people in a society can be convinced of something that is factually incorrect, the misinformation may lead to political and social decisions that are not in the best interest of that society. This can have serious impacts at both individual and societal levels.
In the 1990s, a British doctor who held a patent on a single-shot measles vaccine promoted distrust of the combined MMR vaccine. His fraudulent claims were meant to promote sales of his own vaccine. The subsequent media frenzy increased fear, and many parents chose not to immunize their children. This was followed by a significant increase in measles cases, hospitalizations, and deaths that would have been preventable by the MMR vaccine. It also led to substantial expenditure on follow-up research that tested the assertions made in the disinformation, and on public information campaigns attempting to correct it. The fraudulent claim continues to be referenced and to increase vaccine hesitancy.
In the case of the 2020 United States presidential election, disinformation was used in an attempt to convince people to believe something that was not true and change the outcome of the election. Repeated disinformation messages about the possibility of election fraud were introduced years before the actual election occurred, as early as 2016. Researchers found that much of the fake news originated in domestic right-wing groups. The nonpartisan Election Integrity Partnership reported prior to the election that "What we're seeing right now are essentially seeds being planted, dozens of seeds each day, of false stories... They're all being planted such that they could be cited and reactivated... after the election." Groundwork was laid through multiple and repeated disinformation attacks for claims that voting was unfair and to delegitimize the results of the election once it occurred. Although the 2020 United States presidential election results were upheld, some people still believe the "big lie".
People who get information from a variety of news sources, not just sources from a particular viewpoint, are more likely to detect disinformation. Tips for detecting disinformation include reading reputable news sources at a local or national level, rather than relying on social media; being wary of sensational headlines intended to attract attention and arouse emotion; fact-checking information broadly, not just on one usual platform or among friends; and checking the original source of the information. Ask what was really said, who said it, and when. Consider possible agendas or conflicts of interest on the part of the speaker or those passing along the information.
Undermining correct information
Sometimes undermining belief in correct information is a more important goal of disinformation than convincing people to hold a new belief. In the case of the combined MMR vaccine, disinformation was originally intended to convince people of a specific fraudulent claim and, by doing so, promote sales of a competing product. However, the impact of the disinformation became much broader. The fear that one type of vaccine might pose a danger fueled general fears that vaccines might pose a risk. Rather than convincing people to choose one product over another, belief in a whole area of medical research was eroded.
Creation of uncertainty
There is widespread agreement that disinformation is spreading confusion. This is not just a side effect; confusing and overwhelming people is an intentional objective. Whether disinformation attacks are used against political opponents or "commercially inconvenient science", they sow doubt and uncertainty as a way of undermining support for an opposing position and preventing effective action.
A 2016 paper describes social media-driven political disinformation tactics as a "firehose of falsehood" that "entertains, confuses and overwhelms the audience." Four characteristics were illustrated with respect to Russian propaganda: the disinformation is (1) high-volume and multichannel, (2) continuous and repetitive, (3) indifferent to objective reality, and (4) indifferent to consistency. It becomes effective by creating confusion and by obscuring, disrupting, and diminishing the truth. When one falsehood is exposed, "the propagandists will discard it and move on to a new explanation." The purpose is not to convince people of a specific narrative, but to "Deny, deflect, distract".
Countering this is difficult, in part because "It takes less time to make up facts than it does to verify them." There is evidence that false information "cascades" travel farther, faster, and more broadly than truthful information, perhaps due to novelty and emotional loading. Trying to fight a many-headed hydra of disinformation may be less effective than raising awareness of how disinformation works and how to identify it, before an attack occurs. For example, Ukraine was able to warn citizens and journalists about the potential use of state-sponsored deepfakes in advance of an actual attack, which likely slowed its spread.
Another way to counter disinformation is to focus on identifying and countering its real objective. For example, if disinformation is trying to discourage voters, find ways to empower voters and elevate authoritative information about when, where and how to vote. If claims of voter fraud are being put forward, provide clear messaging about how the voting process occurs, and refer people back to reputable sources that can address their concerns.
Undermining of trust
Disinformation involves more than just a competition between inaccurate and accurate information. Disinformation, rumors, and conspiracy theories call into question underlying trust at multiple levels. Undermining of trust can be directed at scientists, governments, and media, and can have very real consequences. Public trust in science is essential to the work of policymakers and to good governance, particularly for issues in medicine, public health, and the environmental sciences. It is essential that individuals, organizations, and governments have access to accurate information when making decisions.
An example is disinformation around COVID-19 vaccines. Disinformation has targeted the products themselves, the researchers and organizations who develop them, the healthcare professionals and organizations who administer them, and the policy-makers who have supported their development and advised their use. Countries where citizens had higher levels of trust in society and government appear to have mobilized more effectively against the virus, as measured by slower virus spread and lower mortality rates.
Studies of people's beliefs about the amount of disinformation and misinformation in the news media suggest that distrust of traditional news media tends to be associated with reliance on alternate information sources such as social media. Structural support for press freedoms, a stronger independent press, and evidence of the credibility and honesty of the press can help to restore trust in traditional media as a provider of independent, honest, and transparent information.