
Propaganda in the digital age: How cyber influence operations erode trust


Survey the world of cyber influence operations, where nation states distribute propaganda designed to erode the trustworthy information that democracy requires to flourish.

Today’s foreign influence operations use new methods and technologies, making campaigns designed to erode trust more efficient and more effective.

Nation states are increasingly using sophisticated influence operations to distribute propaganda and impact public opinion both domestically and internationally. These campaigns erode trust, increase polarization, and threaten democratic processes.

Skilled Advanced Persistent Manipulator actors are combining traditional media with the internet and social media, vastly increasing the scope, scale, and efficiency of their campaigns and the outsized impact they have on the global information ecosystem.

Synthetic media is becoming more prevalent due to the proliferation of tools that can easily create and disseminate highly realistic artificial images, videos, and audio. Digital provenance technology, which certifies the origin of media assets, holds promise for combating misuse.
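
As a simple illustration of the idea behind digital provenance, the sketch below verifies that a publisher's signature covers the exact bytes of a media file. It assumes a detached Ed25519 signature and a publisher key obtained out of band; these are simplifying assumptions for illustration only, whereas production standards such as C2PA instead embed a signed manifest inside the asset itself.

```python
# Minimal provenance-verification sketch: check that a trusted publisher's
# Ed25519 signature covers the exact bytes of a media asset.
# Assumptions (illustrative, not a real standard): the publisher signed the
# SHA-256 digest of the asset bytes, and we already trust the public key.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def verify_asset_origin(asset_path: str, signature: bytes,
                        publisher_key_bytes: bytes) -> bool:
    with open(asset_path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    try:
        # Raises InvalidSignature if the asset was altered after signing
        # or the signature was not produced by this publisher's key.
        Ed25519PublicKey.from_public_bytes(publisher_key_bytes).verify(
            signature, digest)
        return True
    except InvalidSignature:
        return False
```

If even a single byte of the asset changes after signing, verification fails; this is what would let downstream platforms and consumers distinguish an original asset from a manipulated copy.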

Protecting against cyber influence operations requires a holistic approach. Microsoft is building on its already mature cyber threat intelligence infrastructure to combat them. Our strategy is to detect, disrupt, defend, and deter propaganda campaigns by foreign aggressors.

900% year-over-year increase in proliferation of deepfakes since 2019.1

Cyber influence operations are becoming increasingly sophisticated as technology evolves at pace. We are seeing an overlap and expansion of the tools used in traditional cyberattacks being applied to cyber influence operations. Additionally, we have observed increased coordination and amplification among nation states.

Authoritarian regimes around the world have been working together to pollute the information ecosystem to their mutual advantage. By amplifying one another, state-operated media outlets create an ecosystem in which negative coverage of democracies—or positive coverage of allies—produced by one state media outlet is reinforced by others.

To add to the challenge, private sector technology entities might unwittingly enable these campaigns. Enablers can include companies that register internet domains, host websites, promote material on social media and search sites, channel traffic, and help pay for these exercises through digital advertising.

Organizations must be aware of the tools and methods employed by authoritarian regimes for cyber influence operations so they can detect and then prevent the spread of campaigns.

There is also a growing need to help consumers develop a more sophisticated ability to identify foreign influence operations and limit engagement with their narratives or content.

Increased coordination and information sharing across government, the private sector, and civil society is needed to increase transparency and to expose and disrupt these influence campaigns.

  • Foreign cyber influence operations introduce propaganda, sometimes including false narratives, into the public domain on the internet and, at times, via real-world events or provocations. False narratives that lie unnoticed on the internet can make subsequent references seem more credible.

    Example 1

    Early in the pandemic, fringe conspiracy websites affiliated with Russian and Iranian state media published an interview with a law professor who suggested that COVID-19 was a bioweapon created by the United States.

     

    Example 2

    On March 7, Russia pre-positioned a narrative through a filing with the United Nations (UN) claiming that a maternity hospital in Mariupol, Ukraine, had been emptied and was being used as a military site.

  • A coordinated campaign is launched to propagate narratives through government-backed and influenced media outlets and social media channels.

    Example 1
    In February 2020, PressTV, an outlet sponsored by the Iranian government, published an English-language story on the interview, and Russian state media outlets and Chinese government accounts soon began commenting on the piece.

     

    Example 2
    On March 9, Russia bombed the hospital. When news of the bombing broke, Russia’s UN representative tweeted that coverage of the bombing was “fake news” and cited Russia’s earlier claims about the hospital’s alleged use as a military site.
  • Finally, nation state-controlled media and proxies amplify narratives inside targeted audiences. Often, unwitting tech enablers extend the narratives’ reach.

    Example 1
    Russia Today (RT), a state-owned outlet, also published at least one story promoting statements from Iranian officials claiming COVID-19 might be a “product of ‘US biological attack’ aimed at Iran and China” and pushed out social media posts suggesting as much.

     

    Example 2
    Russia then pushed this narrative broadly across Russian-controlled websites for two weeks following the attack on the hospital. The amplification of these stories online gives Russia the ability to deflect blame on the international stage and avoid accountability.

Synthetic media

We are entering a golden era for AI-enabled media creation and manipulation. Microsoft analysts note this is driven by two key trends: the proliferation of easy-to-use tools and services for artificially creating highly realistic synthetic images, videos, audio, and text, and the ability to quickly disseminate content optimized for specific audiences.

The field of synthetic text and media is advancing rapidly, and we are approaching the point at which anyone can create a synthetic video of anyone saying or doing anything.

Deepfakes: Synthetic text and media techniques

These techniques can be used to attempt blackmail of an individual, company, or institution, or to place individuals in embarrassing locations or situations. Such advanced AI-based techniques are not yet widely used in cyber influence campaigns today, but we expect the problem to grow as the tools become easier to use and more widely available. Common techniques include the following (a simple forensic sketch follows the list):

  • Replacing a face in a video with another (face swapping)
  • Using a video to animate a still image or another video (facial reenactment)
  • A family of techniques for generating photorealistic imagery (generative adversarial networks, or GANs)
  • Creating rich imagery from text descriptions (text-to-image models)
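
To make detection concrete, below is a minimal sketch of one classical image forensic technique, error level analysis (ELA), which recompresses a JPEG and inspects where the compression residue differs, a possible hint of splicing. This is an illustrative heuristic only, not a method described in this report; modern synthetic media generally defeats it and calls for learned detectors and provenance checks instead.

```python
# Error level analysis (ELA) sketch: recompress a JPEG at a known quality
# and amplify the pixel-wise difference. Regions edited after the original
# compression often recompress differently and show up brighter.
# Illustrative heuristic only; not robust against modern synthetic media.
import io

from PIL import Image, ImageChops

def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    original = Image.open(path).convert("RGB")

    # Re-save at a fixed JPEG quality, then reload the recompressed copy.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    recompressed = Image.open(buffer)

    # Absolute per-pixel difference, scaled so faint residue is visible.
    diff = ImageChops.difference(original, recompressed)
    max_channel_diff = max(hi for _, hi in diff.getextrema()) or 1
    return diff.point(lambda px: min(255, int(px * 255.0 / max_channel_diff)))

if __name__ == "__main__":
    # "suspect.jpg" is a placeholder input path.
    error_level_analysis("suspect.jpg").save("suspect_ela.png")
```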

The use of information operations to cause harm or expand influence is not new. However, the speed with which information can spread, and our inability to quickly sort fact from fiction, mean the impact and harm caused by fakes and other synthetically generated malicious media can be much greater.

There are several categories of harm that we consider: market manipulation, payment fraud, vishing, impersonations, brand damage, reputational damage, and botnets. Many of these categories have widely reported real-world examples, which could undermine our ability to separate fact from fiction.

A longer-term and more insidious threat is to our understanding of what is true if we can no longer trust what we see and hear. Because of this, any compromising image, audio, or video of a public or private figure can be dismissed as fake—an outcome known as The Liar’s Dividend.2

Efforts are underway across industry, government, and academia to develop better ways to detect and mitigate synthetic media and restore trust. There are several promising paths forward, as well as barriers that warrant consideration.

Microsoft’s strategy framework is aimed at helping cross-sectoral stakeholders detect, disrupt, defend, and deter against propaganda—particularly campaigns by foreign aggressors.

Detect
As with cyber defense, the first step in countering foreign cyber influence operations is developing the capacity to detect them. No single company or organization can make the needed progress alone. New, broader collaboration across the tech sector will be crucial, with progress in analyzing and reporting cyber influence operations relying heavily on civil society, including academic institutions and nonprofit organizations.
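
As an illustration of what detection tooling can look for, the sketch below flags near-identical messages posted by different accounts, one common marker of coordinated amplification. The post format, threshold, and matching scheme are illustrative assumptions, and flagged pairs are candidates for human review, not proof of coordination.

```python
# Coordination-signal sketch: near-identical text posted by many distinct
# accounts can indicate coordinated amplification. We compare posts with
# Jaccard similarity over word 3-grams; all thresholds are illustrative.
import re
from itertools import combinations

def shingles(text: str, n: int = 3) -> set:
    # Normalize case and punctuation, then take overlapping word n-grams.
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {tuple(words[i:i + n]) for i in range(max(1, len(words) - n + 1))}

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a or b) else 0.0

def flag_coordination(posts: list, threshold: float = 0.7) -> list:
    """Flag pairs of posts from different accounts with heavily overlapping
    text: candidates for human review, not proof of coordination."""
    flagged = []
    for p, q in combinations(posts, 2):
        if p["account"] != q["account"] and \
                jaccard(shingles(p["text"]), shingles(q["text"])) >= threshold:
            flagged.append((p["account"], q["account"]))
    return flagged

posts = [
    {"account": "a1", "text": "The hospital was a military site; reports are fake."},
    {"account": "a2", "text": "the hospital was a military site reports are FAKE"},
    {"account": "a3", "text": "Weather tomorrow looks sunny across the region"},
]
print(flag_coordination(posts))  # [('a1', 'a2')]
```

In practice, analysts combine many such signals, such as posting times, account creation patterns, and shared infrastructure, before attributing a campaign.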
Defend
The second strategic pillar is to shore up democratic defenses, a longstanding priority in need of investment and innovation. It should take account of the challenges technology has created for democratic societies—especially the disruption of journalism and local news—and the opportunities technology has created to defend democratic societies more effectively.

This requires ongoing investment and innovation that must reflect the local needs of different countries and continents. These issues are not easy, and they require multistakeholder approaches, which Microsoft and other tech companies are increasingly supporting.

Disrupt
In recent years, Microsoft’s Digital Crimes Unit (DCU) has refined tactics and developed tools to disrupt cyber threats ranging from ransomware to botnets and nation state attacks. We have learned many critical lessons, starting with the role of active disruption in countering a broad range of cyberattacks.

As we think about countering cyber influence operations, disruption might play an even more important role, and the best approach to disruption is becoming clearer. The most effective antidote to broad deception is transparency. That is why Microsoft increased its capacity to detect and disrupt nation state influence operations by acquiring Miburo Solutions, a leading cyber threat analysis and research company specializing in the detection of and response to foreign cyber influence operations. By combining Miburo’s analysts with its own threat context analysts, Microsoft formed the Digital Threat Analysis Center (DTAC). DTAC analyzes and reports on nation state threats, including both cyberattacks and influence operations, combining information and threat intelligence with geopolitical analysis to provide insights and inform effective response and protections.

Deter
Finally, we cannot expect nations to change behavior if there is no accountability for violating international rules. Enforcing such accountability is uniquely a governmental responsibility. Yet increasingly, multistakeholder action is playing an important role in strengthening and extending international norms.

More than 30 online platforms, advertisers, and publishers—including Microsoft—signed on to the European Commission’s recently updated Code of Practice on Disinformation, agreeing to strengthened commitments to tackle this growing challenge. Like the recent Paris Call, Christchurch Call, and Declaration on the Future of the Internet, multilateral and multistakeholder action can bring together the governments and public of democratic nations. Governments can then build on these norms and laws to advance the accountability the world’s democracies need and deserve.

Through rapid radical transparency, democratic governments and societies can effectively blunt influence campaigns by attributing the source of nation state attacks, informing the public, and building trust in institutions.

Source: Microsoft Digital Defense Report, November 2022

Related articles

Special Report: Ukraine

Microsoft shares insights into cyberattacks against Ukraine, highlighting details of the attacks and context around the scope, scale, and methods of Russia-based nation state attackers.

Cyber Signals: Issue 1

Identity is the new battleground. Gain insights into evolving cyberthreats and what steps to take to better protect your organization.

Defending Ukraine: Early Lessons from the Cyber War

The latest findings from our ongoing threat intelligence efforts in the war between Russia and Ukraine, and a series of conclusions from its first four months, reinforce the need for ongoing and new investments in technology, data, and partnerships to support governments, companies, NGOs, and universities.
