Writing on the eve of the outbreak of the Second World War in Europe, British diplomat and academic E.H. Carr described three forms of power wielded in the international system: military power, economic power, and power over opinion. “Power over opinion,” he argued, “is not less essential for political purposes than military and economic power, and has always been closely associated with them.”3 His conception of power was suited to an analog world in which mass media was the dominant means of disseminating information. Nearly a century later, the information landscape has changed mightily. The information processing and computer reasoning imagined in cyberpunk science fiction appear ever closer to reality. Unfortunately, the ubiquity of computing and connectivity has also done enormous damage, enabling the dissemination of false information for political purposes in the world’s democracies, including the U.S.
Key Insights
Advances in computing and information technology are a dynamic component of interactions in the international system, although firms in the field may be unwilling to accept that reality.
Democratic norms and institutions are increasingly under threat from influence campaigns by malignant foreign actors using Internet platforms to alter beliefs of individuals in open societies.
Information influence rests on the pillars of computer hardware, software, and digital content, and it will likely be revolutionized by advances in artificial intelligence.
Employment of power over opinion across borders, embodied in propagandist broadcasts and diplomatic cultural programs from the Bolshoi to Ellington, was a hallmark of the Cold War era.6 Before the Internet’s explosive period of growth, information was scarce, and it could be expensive. Controls on information, from censorship to forceful coercion, were the province of governments. From the end of the Second World War until the fall of the Berlin Wall, the world’s information ecosystem could be seen as an argument between two poles: east and west; capitalist and communist; democratic and dictatorial. And with the end of the 1980s, the divide disappeared.
Three decades after the collapse of the Soviet Union, deep political fissures are reappearing on the global map, while information and computing matters rise to the highest levels of importance. The digital ecosystem that surrounds humanity is a forum for competing narratives. In a time of profound computational innovation, nations pursue power over opinion and grapple with how to employ information power, a capability with manifold components. Information influence, delivered by ever more sophisticated computing technologies, is directed at altering opinion and belief. Understanding this power over opinion is key to any nation or bloc of nations seeking greater global prominence.1 Computing has a critical role in the future of geopolitical power, even if that role is not always seen from within the industry.
What Is Information Power?
Any definition of information power, or power over opinion, draws largely on the understanding of narratives: the stories that attract attention and alter belief.19 In international relations, power is typically defined as the ability of a country or other international actor to influence the behavior, actions, or interests of other countries or actors in ways that advance its own objectives.13 Leaders attempt to influence not only the public at home but also publics abroad, as well as governments and elites. When that influence crosses international boundaries, information operations or public diplomacy are undertaken. For example, the State Department under Hillary Clinton promoted Internet Freedom, essentially an extension of U.S. free-speech values, globally. The initiative largely collapsed, however, when Edward Snowden fled first to Hong Kong and then to Russia with heaps of data detailing the intelligence operations of his employer, the National Security Agency. For those reading the leaked intelligence documents, Internet freedom became hypocrisy overnight.5
A profound irony of the Snowden Affair was that for all the damaging information leaked about the U.S., nothing particularly harmful appeared regarding either Russia or China. Of course, China and its allies emplace controls to block suspect or subversive information. They attempt to hermetically seal their societies from outside information that may undermine the political control of their authoritarian regimes.10 For decades, China has attempted to control information access while at the same time reaping the economic benefits of a robust computing and information sector. Where once China was seen as the destination for low-margin technology assembly, companies higher up the information food chain have emerged, perhaps most significant among them TikTok (owned by ByteDance), which serves up video content to more than 1 billion users, as many as 150 million of them in the U.S. Critics of the company worry that its recommendation algorithm may deliver micro-videos able to influence users of the platform outside China. Meanwhile, TikTok’s domestic variant, Douyin, is censored by the Chinese government.
Information power is employed by many actors. The Microsoft Threat Analysis Center produces guidance on information operations undertaken against the U.S. and other democracies by China, Russia, North Korea, and Iran.18 Due to its massive constellation of offerings, from the Windows operating system to the LinkedIn social network and the GitHub source code repository, Microsoft possesses the data to make connections between cybersecurity campaigns, often undertaken by foreign intelligence services, and influence operations designed to alter public beliefs. Profits of more than $20 billion per quarter over the last year underwrite its capacity to identify and publicize the influence operations aimed at subverting the democratic process. Not every information technology firm can afford to provide such services, and some clearly see no point in doing so.
How Computing Alters Information Power
Carr’s conception of power over opinion was influenced by the mass media of his time. As a journalist, he no doubt saw how newspapers could mobilize public opinion. Soon after the outbreak of the Second World War, Carr joined Eric Blair, a former colonial policeman, in producing reporting for the BBC’s programs to its worldwide audience. Blair would publish his first best-seller, Animal Farm, a few weeks before the war’s end, under his pen name, George Orwell. As propagandists, they created broadcasts to inform or influence mass audiences from Calgary to Calcutta. While Orwell’s final work, 1984, offered a warning on the capacity of dictatorial regimes to surveil and censor, computing technology—ubiquitous, mobile, and ever connected—may well exceed his wildest expectations for ideological compliance.22 The contemporary infrastructure of information power is constructed upon information and communications technology (ICT) of ever-increasing sophistication and complexity, delivering massive flows of digitized information in text, images, and sound tailored to each person choosing to connect to it.
Advanced computing hardware is one building block of information power. The capacity to innovate technologically, for instance in semiconductor design and manufacture or in fiber-optic networking, advances the capacity to distribute data at ever-increasing volumes. A second component of information power is software and information services. Although it is not wise to lump Microsoft Office and Instagram together as generic forms of software, it remains useful to divide ICT into hardware and software. Knowing how to manipulate software, whether by computer hacking or by pushing posts on social media sites, is what earns this area a place in a model of information power. A third category is narratives, or content, which are digitized and available online. This last area is best described as an enormous, global competition for human attention. Building the world’s best microprocessor or smartphone isn’t necessarily information power, but knowing how to use those technologies with different forms of media to influence others most certainly is.
A potential fourth component of information power is artificial intelligence (AI), which requires advanced computing, sophisticated algorithms instantiated in software, and massive pools of data. Capturing the commanding heights of AI technologies and applications is a goal for both the U.S. and China. This has important ramifications for economic growth in the coming decades, especially for those nations that become the world’s exporters of the technology. In addition, AI leadership may well create unrivaled military capabilities of the sort we are just beginning to see in the Russo-Ukrainian War and Israel’s conflicts with Iran and its proxies. From drone swarms to hypersonic offensive and defensive missiles, AI innovation will change international conflict.14
While technology is needed to deliver information power, there is an art to its employment. Practitioners must create messages that will influence audiences, altering opinion. Finding the technical means to deliver those messages is straightforward. Hardware and software can be acquired. Witting or unwitting third parties can be hired to deliver the narratives. To add authenticity, cyberattacks can purloin information that may be repurposed and manipulated to convince audiences of a particular idea. The key for the influencer is operating the systems that amplify messages. On social media platforms, enhanced virality, or the rapid spread of certain pieces of information, can be engineered by computational bots or bought outright, as sketched below. Still, there is an art to the messages themselves.
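To make the amplification mechanics concrete, consider a minimal sketch, in Python, of engineered virality. It assumes a toy popularity-biased feed in which each share exposes a fixed audience and the probability of resharing rises with a message’s running share count; every parameter is an illustrative assumption, not a measured platform value.

```python
# Toy model of engineered virality: bots fake early shares, and a
# popularity-biased feed then pushes the message to more real users.
# All numbers here are illustrative assumptions, not platform data.

def expected_reach(seed_shares: int, steps: int = 8,
                   audience_per_share: int = 50,
                   base_p: float = 0.01,     # baseline reshare probability
                   boost: float = 0.00005,   # ranking boost per prior share
                   cap: int = 1_000_000) -> int:
    """Expected total shares after a cascade of `steps` rounds."""
    total = float(seed_shares)
    current = float(seed_shares)
    for _ in range(steps):
        # Feed ranking favors already-popular items, raising reshare odds.
        p = min(0.3, base_p + boost * total)
        new = min(cap - total, current * audience_per_share * p)
        total += new
        current = new
        if total >= cap:
            break
    return int(total)

print("organic (10 real seed shares):   ", expected_reach(10))
print("botted (1,000 fake seed shares): ", expected_reach(1_000))
# Organic spread is subcritical and fizzles out near its starting size;
# the bot-inflated start tips the cascade supercritical and the message
# saturates the audience cap.
```

The point of the sketch is the threshold effect: faked early popularity, not message quality, is what tips the cascade from fizzle to flood.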
The Political-Social-Cognitive Complex of Information
Information power is exerted on the largely unregulated platforms for sharing user-generated content, such as YouTube, Facebook, TikTok, and others. Nearly two decades ago, this shift to a more open, participatory Internet was celebrated by Time magazine when it named You, that is, the contributors to social media, its 2006 Person of the Year. Feted then, social media is increasingly viewed with suspicion or disdain. The platforms gobble up time, with typical users spending more than two hours per day and American teens spending nearly five.9 The new celebrities of the medium are labeled influencers, but influence on these platforms is not limited to popular culture. Politics and the mechanisms of political influence are interwoven into social media.
What is being created, unfortunately, is a form of political unreality, where the absurd and ridiculous abound. After Hurricane Helene struck the southeastern U.S., Congressperson Marjorie Taylor Greene leveled an accusation that the catastrophic storm had been created and steered into a region where voter support for her party was ostensibly high. “Yes, they can control the weather,” she announced on the X social media platform regarding the disaster centered on Asheville, a city in which the 2020 vote was evenly split across party lines in national and statewide races. Other statements by politicians during the 2024 election season indicated a wholesale departure from facts for an alternate form of reality. Even worse, foreign propaganda is appearing regularly in policy discussions in the U.S. Capitol. Michael McCaul, former Republican chair of the House Foreign Affairs Committee, lamented the efficacy of amplified external narratives, admitting, “Russian propaganda has made its way into the United States, unfortunately, and it’s infected a good chunk of my party’s base.”15 As Daniel Patrick Moynihan, the late U.S. senator from New York, once wrote, “Everyone is entitled to his own opinion, but not his own facts.” Not anymore, it appears.
Political psychologist Margaret Hermann attributes this problem of post-truth to the creation of media complexes serving the supporters of each of the major political parties, primarily on the cable news networks that grew up in the 1990s.12 These allowed Americans to tune in for news catering to their political worldview. Participatory social media, coupled with the development of the iPhone and its imitators, meant political news was arriving not just in living rooms but anywhere, anytime. Novelist Gary Shteyngart shrewdly observed this trend of mobile devices and social media becoming an addictive lure, a state of digital posthumanism.24 Norbert Wiener, who wrote on cybernetics, the science of communications between machines and living creatures, argued that “the brain, under normal circumstances, is not the complete analogue of the computing machine, but rather the analogue of a single run on such a machine.”26 Computers and people are obviously different, but when they are fused together in always-on, Internet-connected mobile devices, the two become cybernetically integrated.
In international politics, cybernetic political saturation has become an enormous vulnerability. What the revolution in social computing brought to the power over opinion is the specificity with which not only masses but also individuals can be targeted with messages, repeated often, that may change what an individual believes. Meta’s Facebook holds an incredible amount of demographic data that can place information in desired user feeds. Want to tell Black female voters in Detroit or Philadelphia to stay home on election day? Meta can deliver those advertisements, refined by machine-learning models that evaluate the most effective prior engagements with its users, who number in the billions. The company is not likely to be persuaded to end the practice, despite only a small (albeit growing) share of its revenues coming from political ads. Politics is good business because news is a major draw to its platforms. Yet Meta wants to hold no responsibility for what is posted on them.
The 2016 U.S. national elections were the moment when computational information power entered wide public awareness. Operations undertaken by Russia against Hillary Clinton’s presidential campaign taught us an enormous amount about how mastery of computing, for both espionage and propaganda, could impact the functioning of democracy. By then, Russia under Putin was increasingly expansionist but constrained in options. Limited in both military and economic power, the country’s leadership chose an information strategy to sow discord in Western democracies. If Russia could not rise to meet the military and economic power of the West, it could certainly try to innovate in information power. The U.K.’s referendum on exiting the EU, or Brexit, was the pilot for Russian information operations against Western countries. Operations to influence the U.S. election were more sophisticated and involved three main activities: illegally gaining access to systems relevant to election campaigns, stealing information from compromised computers, and repackaging and leaking purloined information “to interfere with the 2016 U.S. presidential election.”8 There is an explanation for why this activity influenced political processes in democratic countries, and it connects politics and computing with cognitive psychology.
Hot Thinking and the Exercise of Democracy
We have theoretical answers for how information influence works and how people react to the messages they see and hear in online posts, videos, and podcasts. The first comes from Daniel Kahneman, who explained that the human mind divides its time between rapid generalization and quick reaction on the one hand and deeper thinking and reflection on the other.16 The goal of the computational propagandist is to keep audiences in the fast, or hot, thinking of rapid reaction rather than the slow state in which people may consider larger ideas and interconnect them. In a saturated environment, the propagandist’s target is not given the time to think through what they hear or see. There is another relevant psychological concept for our über-connected society. While economists and political scientists often subscribe to the idea of human beings making rational choices based on the information they possess, Herbert Simon challenged that idea.25 Rationality, he argued, is bounded, limited by circumstance. This, coupled with an environment that repeatedly pushes people into the mode of thinking fast rather than taking the time for reasoned decisions, begins to explain where problems emerge in what defense theoreticians call cognitive security.2
Kept on edge by omnipresent computing technologies, society grows vulnerable to injections of specious information. As individuals select an information bubble of political or narrative bias and gradually seal themselves inside, an echo-chamber effect amplifying the unreal takes hold. Claims that viewers of a 1970s Walter Cronkite newscast would have found out of bounds may not seem so radical to someone informed by modern-day podcasts and Reddit. And then there is the issue of what an individual or group may consider to be true. Particularly nefarious here is the illusory truth effect, a condition in which repetition of a statement reinforces its perceived truthfulness.7
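The repetition dynamic can be captured in a back-of-the-envelope model. The Python sketch below assumes, purely for illustration, that perceived truth rises with the logarithm of exposures, a shape broadly consistent with diminishing-but-persistent gains from repetition; the baseline plausibility and gain values are invented, not taken from the cited research.

```python
# Illustrative-only model of the illusory truth effect: the perceived
# truth of a claim rises with the log of repeated exposures. The baseline
# plausibility (0.30) and gain (0.08) are assumed values for the sketch.
import math

def perceived_truth(baseline: float, exposures: int,
                    gain: float = 0.08) -> float:
    """Perceived truth in [0, 1] after `exposures` repetitions."""
    return min(1.0, baseline + gain * math.log1p(exposures))

for n in (0, 1, 5, 20, 100):
    print(f"{n:>3} exposures -> perceived truth {perceived_truth(0.30, n):.2f}")
# 0 -> 0.30, 1 -> 0.36, 5 -> 0.44, 20 -> 0.54, 100 -> 0.67: the claim never
# becomes certain, but steady repetition moves it well past where it began.
```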
In the world’s democracies, attention is largely focused on misinformation and disinformation. The children’s game of Telephone illustrates that being misinformed is natural. News media organizations get facts wrong regularly and correct those mistakes by retraction. Disinformation, on the other hand, is intentional. We have a simpler word for it: lying. Yet disinformation is not mere lies but rather a melding of truth and untruth in a convincing format designed to alter opinion. False narratives reside amid the noise of other information, and their creators seek to amplify those messages to the greatest degree possible. It is no wonder that AI systems, trained on massive pools of online text, are accused of being prone to hallucination.
Ways Forward for Computing and Society on Information Influence
Before Facebook and the iPhone existed, political scientist and senior defense official Joseph Nye assessed that “[t]he paradox of American power in the 21st century is that the largest power since Rome cannot achieve its objectives unilaterally in a global information age.”20 Addressing global issues, whether countering terrorism or confronting climate change, requires international commitment. In democratic societies, such efforts depend on broad consensus. However, the intense polarization between political factions in the world’s democracies continues to worsen, making consensus building ever harder.
Contending with information operations by dictatorial regimes, including China, Russia, and Iran, is fairly simple in concept: find a method by which to quiet or limit narratives that are false and created for malicious purposes. Unfortunately, approaches such as fact-checking, debunking, and social-media-platform interventions on false information will not work when the producers of false narratives and their adherents have no interest in accepting corrections. The owners of the major U.S.-based platforms for narrative amplification, Mark Zuckerberg and Elon Musk, have largely abandoned content-moderation policies and practices. Musk has closely associated himself with post-truth political actors, while Zuckerberg is reputed to have soured on politics altogether (although not the associated advertising revenues).23
Google, whose founders have maintained a low profile on political issues, advocates “prebunking” disinformation by “forewarning people and equipping them to spot and refute misleading arguments.”11 This attempt to inoculate the public against disinformation through YouTube videos is noble enough, but the company’s efforts are largely geared toward elections in Europe, not the U.S., and may be driven by its ongoing antitrust litigation there, which has produced some 8.5 billion euros in fines over the last decade. For the moment, prebunking appears to be little more than a feel-good project, although some efforts, such as those undertaken during the 2018 U.S. elections to isolate Russian disinformation actors, show promise.
A more practical strategy for building a society able to spot and dispense with false narratives and malign information influence campaigns is media literacy: the ability to access, analyze, and evaluate information. It involves understanding who constructs messages, recognizing bias, evaluating the credibility of sources, and understanding the impact of narratives on individuals and society.17 This is a critical skill for humanity, not just for its most educated. In Finland, a global leader in public education, the process begins in elementary school and continues through university programs. This should come as no surprise for a country that neighbors Russia, the world’s preeminent offensive information power of the moment.
Unfortunately, any media-literacy curriculum for public education will require a generation or more to implement in the U.S. and would likely be greeted with considerable hostility by opponents embracing wrongheaded notions of free speech. A more immediate and effective course is likely to be found in the courts. When Fox News personality Tucker Carlson made false claims against voting-machine manufacturer Dominion Voting Systems, Dominion sued Fox; Carlson was fired, and Fox reached a $787-million settlement with Dominion. Alex Jones, operator of the now-defunct InfoWars media firm, received punishing judgments totaling $1.5 billion when the families of the children and school employees killed in the Sandy Hook Elementary School mass shooting won civil cases against him in 2022 and 2023. As in so many other cases, it may be necessary to litigate over false information, or even to use other legal actions to sanction those employing it, both foreign and domestic.
Traditional news media can also work to break the connection between their outlets and social media platforms. They can enact technical measures that make sharing stories on social media more difficult and erect paywalls, but doing so risks further shrinking the advertising revenues their websites draw. A more effective policy may be Australia’s News Media Bargaining Code, instituted in 2021, which requires social media platforms to negotiate with news publishers on payment for their stories, imposing binding arbitration if necessary. Since coming into force, the code has produced 30 agreements between news outlets and platforms, with payments totaling roughly 140 million Australian dollars. While many countries may not fight for the protection and preservation of their traditional news organizations, strategies that do so may benefit societal well-being.
Information Power’s AI Future
The World Economic Forum, the Geneva-based NGO, states that “misinformation and disinformation is the most severe short-term risk the world faces.”27 With trust in information on the shakiest of foundations, global society is rushing headlong into the mass application of AI anyway. Trained on massive quantities of Internet-hosted text, AI models appear educated more by volume than by accuracy. Certainly, the large language model (LLM) variants of AI remain flawed in their ability to generate error-free responses to prompts. This may change in time, but the likelihood of AI soon serving as an omnipotent, truth-saying oracle is not great.
It can and will be used, however, to enable information-influence operations. Indeed, it already is. AI is employed to drive automated bots that serve up narrative elements to those the machine determines are most likely to be influenced by them. It can drive traffic, “astroturfing” what appears to be popular interest and subverting social media algorithms to deliver favored information. AI can ingest massive volumes of user data to conduct psychographic profiling of targeted individuals and populations, producing optimal delivery of influence narratives. Fascinatingly, MIT and Cornell researchers argue that dialog with AI may reduce belief in conspiracy theories as well.4
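As a hedged sketch of the profiling step described above, the following Python code matches candidate narratives to a user’s inferred interest vector by cosine similarity and delivers the best fit. The interest dimensions, profile, and narratives are all invented toy data; a real system would learn such vectors from engagement histories at vastly larger scale.

```python
# Toy psychographic targeting: score how well each candidate narrative
# matches a user's interest profile, then deliver the highest-scoring one.
# The profile and narrative vectors below are invented for illustration.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Toy interest dimensions: [economy, immigration, health, environment]
user_profile = [0.9, 0.1, 0.4, 0.2]   # inferred from engagement history
narratives = {
    "jobs_scare":    [1.0, 0.2, 0.1, 0.0],
    "border_panic":  [0.1, 1.0, 0.0, 0.0],
    "vaccine_doubt": [0.0, 0.1, 1.0, 0.0],
}

best = max(narratives, key=lambda m: cosine(user_profile, narratives[m]))
print(f"deliver: {best}")  # -> jobs_scare, the closest match to the profile
```

The mechanism is the same one that routes ordinary advertising; what changes in an influence operation is only the payload.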
Beyond beliefs, there is the unreality that AI can manufacture. Deep-faked photographs and videos increasingly appear online, and while fakes may be detected by humans or machines today, the quality of those fakes is improving. Working on the issue of AI and trust, political scientist Mark Raymond asks, “What happens when we can’t believe video anymore?” A world in which not a single frame of a newscast could be trusted might shatter global society as we know it.21 Nonetheless, the competition between the U.S. and China for AI supremacy is widely considered in strategy circles a contest able to deliver unrivaled power to the winner on the global stage. This is not necessarily something Sand Hill Road ponders, but the Pentagon certainly does, which says something about the future of information power.
Acknowledgments
The author thanks Wm. Arthur Conklin, Dan Engster, Margaret Hermann, Martti Lehto, Mark Raymond, Josef Schroefl, Dan Wallach, Michael Webb, and Zachary Zwald for their input and feedback on this article.
This work is licensed under a Creative Commons Attribution 4.0 International license. © Copyright 2025 held by the owner/author(s).