Propaganda: From Disinformation and Influence to Operations and Information Warfare

✍ Lukasz Olejnik


Publisher: CRC Press
Year: 2024
Language: English
Pages: 285
Edition: 1
Category: Library


✩ Synopsis


The book is a modern primer on propaganda, covering aspects such as disinformation, trolls, bots, information influence, psychological operations, information operations, and information warfare. Propaganda: From Disinformation and Influence to Operations and Information Warfare offers a contemporary model for thinking about the subject.

The first two decades of the 21st century have brought qualitative and quantitative changes in technology and society, and the subject of information influence needs to be re-examined. Now is the time.

The book explains the origins of propaganda as a word and a phenomenon―where it came from and how it has changed over the centuries. It also covers modern methods, including artificial intelligence (AI) and advertising technologies, and treats the legal, political, diplomatic, and military dimensions of the subject in depth.

The book is recommended for security and cybersecurity professionals (both technical and non-technical), government officials, politicians, corporate executives, academics, and students of the technical and social sciences. Anyone with an interest in the subject will find it a rewarding read.

✩ Table of Contents


Cover
Half Title
Endorsement Page
Title Page
Copyright Page
Table of Contents
Preface
Author
Chapter 1: The concept of propaganda—the evolution of meaning
1.1 The problem from the beginning
1.2 We do not valuate
1.3 Nature of change in the information space
1.4 It’s the “p” word
1.4.1 Propaganda in Antiquity
1.5 The power of words
1.5.1 Pamphlets versus spreading the word
1.6 The origin of “propaganda” is due to the Catholic Church
1.6.1 Revolutionary propaganda—the French Revolution
1.6.2 Caricature, guillotining, heads on a stick
1.6.3 The effects of propaganda—can future events be predicted?
1.6.4 Lenin and social media propaganda
1.6.5 Propaganda at the United Nations
1.6.6 Propagation of ideas? We decode further
1.6.7 What does Cardinal Richelieu teach about propaganda, and what do Napoleon and De Gaulle teach?
1.6.8 Attention paid to the rumors
1.7 Why do we treat propaganda as a bad thing?
1.8 “Propaganda” closer to us
1.8.1 Engaged poetry
1.9 “Propaganda” as social campaigning and educating?
1.9.1 An approach to defining propaganda
1.9.2 Commercial information campaigns
1.9.3 Does 2 + 2 = 5?
1.9.4 Who has determined that we think that 2 + 2 = 4?
1.10 Emerging terminology
1.11 Social or perhaps scientific problems
1.12 Countering propaganda?
1.12.1 Censorship
1.13 Informational use of anti-science
1.14 How do we understand information security, cybersecurity?
1.15 Principles of operation and the role of technology
1.15.1 Satellite imagery
1.15.2 Censorship of satellite imaging
1.16 Cultural censorship, or commercial censorship?
1.17 A brief summary and let’s move on
Notes
Chapter 2: Theoretical background: Information environments, and propaganda as modulation of information space
2.1 Propaganda
2.1.1 “Academic” (political) definition
2.1.2 Military definition
2.1.3 Nature of the propaganda content
2.1.4 Computational propaganda
2.2 Misleading, false information
2.2.1 Why does it work? One explanation
2.2.2 So what to do?
2.3 Disinformation
2.4 Actors, producers of propaganda, false information, disinformation
2.5 Censorship
2.5.1 Content moderation and its limitations
2.6 Public relations, advertising, public affairs
2.6.1 Public relations
2.6.2 Public affairs
2.6.3 Public affairs in the military sense
2.7 APM and FIMI—manipulation and foreign interference
2.7.1 Advanced and Persistent Manipulator (APM)
2.7.2 Foreign information manipulation and interference (FIMI)
2.8 Psychological operations (PSYOP)
2.9 Information operations (IO)
2.9.1 Information warfare
2.10 Information environment
2.10.1 Information advantage
2.10.1.1 What is information?
2.10.2 Example of an operation to discredit a local dairy
2.11 Information space
2.12 Weather forecast versus reliability
2.12.1 The mechanics of public debate
2.13 Overton’s window
2.14 Micro-targeting
2.14.1 The question of content
2.15 Psychology of the crowd
2.15.1 Effects of psychological bias
2.15.2 When do we accept a point of view as true?
2.15.3 Case study: missile hitting a hospital
2.16 Trolls, trolling—more on the phenomenon
2.16.1 Hypothetical situation when trolls take over the country
2.16.2 Don’t feed the trolls
2.17 Social groups
2.17.1 Radical effects—thresholds for action
2.17.2 Crossing the thresholds of radical action
2.18 How many people does it take to bring about a coup and revolution in a State? Subversion of the State by information methods
2.19 Fight against disinformation and propaganda
2.19.1 Reactive action
2.19.2 Proactive action
2.19.3 Social engineering
Notes
Chapter 3: Technological methods of digital propaganda and information operations
3.1 Dual-use and dual-purpose methods
3.2 Some say disinformation is nothing new
3.2.1 Look, here is the novelty
3.2.2 Okay, but so what?
3.3 Bots and trolls
3.3.1 Bots
3.3.1.1 Recipe for creating a bot farm, botnet
3.3.2 So now we have a functioning botnet—final remarks
3.3.3 Trolls
3.3.3.1 Arena of action—social media
3.3.4 Amplification
3.3.5 Algorithmic amplification and gaming of recommender systems
3.3.6 Influencers, public figures, politicians
3.4 Deepfake, generative content
3.4.1 Scenario of using deepfake as part of the war
3.4.2 Micro-targeting scenario of political and social groups by sending generative content (deepfake)
3.4.3 Advantages of using generative and deepfake technologies
3.5 Signature of false and disinformation content
3.6 Content creation methods and techniques
3.6.1 More on artificial intelligence, large language models
3.6.2 LLM problems, LLM bot detection
3.7 Risk of exposure to false messages and harmful information content
3.7.1 Remember the continuous influence effect
3.8 Reaching out to the public
3.9 Technical issues—standards, notifications
3.10 Content distribution channel—ad targeting
3.11 Self-sabotage of technology to transmit information
3.12 Grouping social media communications
3.13 Techniques used in propaganda operations
3.14 Persuasion and manipulation techniques
3.14.1 Presenting irrelevant data
3.14.2 Obfuscation, intentional vagueness, confusion
3.14.3 Black and white vision (fallacy)
3.14.4 Whataboutism (relativism)
3.14.5 Use of loaded language
3.14.6 Stereotyping, labeling
3.14.7 Misrepresentation of someone’s position (Straw Man)
3.14.8 Appeal to “authority”
3.14.9 Exaggeration or minimization
3.14.10 Flag-waving (identification with the group)
3.14.11 Slogans
3.14.12 Thought-terminating cliché
3.14.13 Doubt
3.14.14 Appeal to fear or prejudice
3.14.15 Bandwagon (and jumping on one)
3.14.16 Repetition
3.14.17 Reductio ad Hitlerum—guilt by association with something or someone evil
3.14.18 Oversimplification
3.14.19 Summary
3.15 Foreign interference and information manipulation
3.15.1 Plan strategy
3.15.2 Plan objectives
3.15.3 Developing people
3.15.4 Developing the network
3.15.5 Micro-targeting
3.15.6 Developing content
3.15.7 Channel selection
3.15.8 Pumping
3.15.9 Exposure
3.15.10 Physical activities
3.15.11 Persistence
3.15.12 Measuring efficiency
3.15.13 Summary
3.15.14 Returning to FIMI
3.16 Summary
Notes
Chapter 4: Commercial propaganda and PR
4.1 Persuasion and audience outreach
4.1.1 Real-time auctions
4.2 What came first—advertising or propaganda?
4.2.1 Propaganda methods in World War I—the birth of modern public relations
4.2.2 Calibrating the right level of fear
4.2.3 Negative messages use evolutionary adaptation
4.3 The seed of information sown can have long-term effects and is difficult to remedy
4.3.1 Challenges of straightening falsehoods
4.4 False business-commercial information
4.4.1 Cigarette smoking, gender equality, heroin, coke, pot
4.4.2 Abuse, opinion manipulation, celebrities, and influencers
4.4.3 False product reviews and ratings
4.4.4 Influencer manipulation and manipulation of employees or subcontractors
4.5 Dark patterns—digital subliminal manipulation
4.6 Examples of manipulation of commercial activities, impact on stock market, company listings
4.7 Summary
Notes
Chapter 5: Norms, rules, international law—legality of propaganda and disinformation
5.1 Deepfake and war crimes
5.1.1 Deepfake and regulation
5.1.1.1 Artificial Intelligence Act and Digital Services Act versus deepfake
5.1.1.2 Deepfake in China
5.1.2 Deepfake, war propaganda, war
5.2 Types of propaganda
5.2.1 Propaganda in war is legal—except in the case of perfidy
5.3 War-mongering propaganda is prohibited
5.4 Regulation of propaganda
5.5 Polish-German propaganda conflict and regulations mitigating it
5.6 Propaganda and radio broadcasting
5.6.1 Satellite broadcasting versus information interference
5.7 States’ obligation to use propaganda
5.7.1 “Hate radio” in Rwanda and “Der Stürmer” in Germany
5.8 Censorship
5.8.1 UN Security Council and the right of information blockade against a State
5.8.2 The State itself can introduce information blockades and censorship at home
5.9 Rights against censorship
5.9.1 Why did I mention all this?
5.10 EU law and disinformation
5.11 Summary
Notes
Chapter 6: Political and state propaganda
6.1 Basic distinction again: PR versus propaganda, information pluralism
6.2 Multi-channel information efforts
6.2.1 Disinformation and propaganda are not primarily an issue of news reporting or journalism
6.2.2 Ability to impose an agenda—topics that are discussed
6.2.2.1 Shaping the messages about global warming
6.2.2.2 Shaping the message about the ozone hole and Y2K
6.3 Fading state control of the information environment?
6.3.1 Amorphism of the information environment and its formation in a democratic society and the rule of law
6.3.2 Narratives
6.3.3 The beginning of modernity—the radio
6.3.4 For consideration—Hitler won the election despite being barred from using radio
6.3.5 “Hate radio,” Balkans, information blockades
6.3.6 Contemporary stuff—social media and Internet from satellite
6.4 Audience segments, information bubbles
6.4.1 Customer groups, segments
6.4.2 Information bubbles insulate from information, or maybe they don’t exist
6.4.3 Does pluralism combined with social media lead to polarization?
6.4.4 Polarization and violence
6.4.5 Breaking through information isolation bubbles
6.4.6 Is the influence of digital platforms on policy decisions exaggerated?
6.4.6.1 To have your cake and eat it too?
6.5 Polarization
6.5.1 Natural human predisposition
6.5.2 Why people share fakes and it doesn’t affect their credibility
6.6 Who uses advertising methods in propaganda?
6.6.1 Moves against manipulation and abuse versus the right to freedom of expression
6.6.2 Measurability of Russia’s informational impact
6.6.3 Informational influence of Russia—methods of hacktivism and information
6.7 Source credibility
6.8 Foreign influence operations—miscellaneous
6.8.1 State context of the precedent of a missile falling on the territory of a NATO Member State (2022)
6.8.2 Wired and moral panic
6.8.3 NATO version, Ukraine version—choose wisely
6.8.4 Impact and cyber activities in the context of the missile fall at Przewodów
6.8.5 Source credibility
6.8.6 Operation Infektion
6.8.7 Impact on neighboring States
6.8.8 What about digital platforms—one of the arenas of State information activities?
6.9 Impact on the domestic situation
6.9.1 French upheaval in 2023 and digital coordination
6.9.2 Tank videos
6.10 Election campaigns
6.10.1 What to talk about and are there those wishing to get PR training?
6.10.2 Election silence and its circumvention
6.10.3 Election silence as an element of State information security
6.11 Diplomacy
6.12 Propaganda and international policy
6.12.1 How many years after the war are relations returning to “business as usual”?
6.12.2 Exaggeration in PR and its effects—the case of Iraq
6.12.3 Rumors, gossip, unofficial information
6.12.4 Information on the illness of the leader of the enemy State
6.12.5 Information operations in the service of diplomacy
6.12.6 Public and silent cyber attribution
6.12.7 When opinion shaping fails
6.12.8 Blocking and cutting channels of information influence of the Russian Federation
6.13 Communication control and censorship
6.13.1 Content moderation as a form of censorship
6.14 Diversion and informationally undermining the confidence in a State—subversion
6.15 New technologies as methods of organizing or executing revolutions, riots, upheavals
6.15.1 Just engage 3–5% of the population
6.15.2 Starting a coup in a State—it’s complicated
6.15.3 Arab Spring
6.15.4 Media control and propaganda through artists, creators, writers
6.16 Case study—encryption
6.17 Case study—5G, coronavirus, pandemic
6.18 Case study—Korean pop as a threat to state security?
6.18.1 Can a popular influencer lead to riots in a State?
6.19 Case study—culinary preferences and consumption of insects
6.20 A case study of the activities of the International Committee of the Red Cross (ICRC)
6.20.1 Criticism of ICRC activities in Ukraine (2022)
6.21 Propaganda in elections
6.21.1 Memes in the service of politics and diplomacy
6.21.2 Astroturfing
6.21.3 Identity communication
6.21.4 Reverse use of search engines for political purposes
6.21.5 Technologization of politics, neurotechnology
6.21.6 Political marketing
6.22 Armies of trolls
6.23 Bringing online expression to the streets—diversion
6.23.1 Industrial-scale content creation—deepfake, generative AI
6.24 Political and propaganda issues in biology, geography, and agriculture
6.24.1 Propaganda targeting vaccines
6.24.2 Decisions amid widespread moral panic are difficult
6.24.3 Biological names
6.24.4 Geography, cartography, maps, and geopolitics
6.24.5 Agricultural propaganda, potato beetle attacks
6.25 Political buzzwords, neutron bomb, and conclusion
Notes
Chapter 7: Propaganda and military affairs, war propaganda, and information warfare
7.1 Information warfare—peacetime, armed conflict, war
7.1.1 Precision
7.1.2 Attack and propaganda
7.1.2.1 The ugly word “hybrid”—or PMESII-PT
7.2 Propaganda before, during, and after the armed conflict
7.2.1 Limited war, communication channels
7.3 Information operations enabled by cyber attacks
7.4 Five fundamental criteria for measuring seriousness of operations
7.5 Preparing society for war
7.5.1 The wisdom and preferences of the people—and Talleyrand
7.5.2 Preparing for war from the top, such as in statements by state leaders
7.5.3 Pro-war PR—why die for the State?
7.5.4 Absurd war scare versus pluralism
7.5.5 The illuminating examples of pandemic or COVID-19 events
7.6 Situation during war—targeting communications to audience groups
7.7 Deception
7.8 PSYOP—psychological operations and activities
7.8.1 PSYOP—from the Mongols to “DAS BOOT SINKT”
7.8.2 Why the Internet was created—the untrue version, even if harmless
7.8.3 Psychological impact of events
7.8.3.1 PSYOP and superstition
7.8.3.2 Psychological effects of weapons used—night bombardment, lack of food
7.8.3.3 PSYOP versus use of nuclear weapons—psychological effects of detonation
7.8.4 Information warfare versus ceasefire or achieving peace
7.8.5 What does it feel like when a “Peacemaker” arrives?
7.8.6 PSYOP and commercial products, gadgets
7.8.7 Good advice from Uncle Sam
7.8.8 PSYOP to defend IT systems against cyberattacks
7.8.9 Harry Potter and Woland versus propaganda
7.9 Response to the information warfare
7.10 Information operations units formation—attack and defense
7.10.1 Continuous conflict
7.10.2 Information dominance
7.11 Strategic communications—StratCom
7.11.1 StratCom and impact
7.11.2 Operation scheme
7.11.3 StratCom levels below
7.12 States and unique powers—censorship and absurdities in newspapers during war
7.12.1 France during World War I
7.12.2 September 1939—Poland, the first victim of World War II
7.12.3 Breakthrough of modern times—satellite imaging
7.13 Content delivery
7.13.1 Leaflet bombs
7.13.2 Unmasking the leaders of hostile States
7.14 Military information operations
7.15 Decoding propaganda before and during the war
7.15.1 The transmitted message must be received
7.16 Atrocity propaganda
7.17 Use of international treaties in propaganda—the case of Russia
7.17.1 Exploiting Biological Weapons Convention
7.17.2 What needs to be done? What has been done?
7.18 War in Ukraine
7.18.1 The beginning—information activities toward Georgia and Ukraine
7.18.1.1 Dehumanization and denial of statehood
7.18.2 Pre-invasion situation (2022)
7.18.2.1 Recordings of traveling depots with weapons visible to anyone
7.18.2.2 U.S. disarms pretext-potential with early warning of what may approach
7.18.2.3 Russian diplomatic ultimatum—NATO to turn back the clock to 1997
7.18.2.4 Television speech
7.18.3 Prelude to the armed conflict—cyber and information operations
7.18.4 Fog of war, information blockade—attention to erroneous conclusions
7.18.4.1 Caution not to believe your own propaganda
7.18.4.2 Caution about premature conclusions
7.18.4.3 Falsehoods about weapon damage—when the math doesn’t add up
7.18.5 Winning narratives, information dominance
7.18.6 Psychology in a conflict situation
7.18.6.1 Diversionary train with Lenin and its consequences
7.18.6.2 Holding a hostile society under occupation is difficult, though some don’t fight back
7.18.7 Russian propaganda—inside and outside Russia
7.18.7.1 Social media view
7.18.7.2 Measure of impact
7.18.7.3 A practical measure?
7.18.8 New technologies in use
7.18.8.1 The need to draw conclusions—they better be correct
7.18.8.2 Moral panic and fears of World War III—a mental drift
7.18.8.3 Information overload? Attention to copy-paste analysis
7.18.8.4 Recipe for defense against false information in an environment of rapid change
7.18.9 Influence via the media
7.18.10 Bot farms
7.18.11 Unconventional activities
7.18.11.1 What to do after receiving a “surrender!” message?
7.18.11.2 Soldier, civilian, unlawful combatant—consequences and risks
7.18.11.3 Throwing Molotovs at the recruitment center—via retirees
7.18.12 Ukraine, PR, information and propaganda activities, and memes
7.18.13 Making fun of Western politicians
7.18.14 Targeting Western States
7.18.15 Influence and social engineering methods
7.19 Summary
7.19.1 OODA
7.19.2 Strategic culture
Notes
Chapter 8: The end is near
Appendix: Author’s postscript
Invisible disability in the world of technology


✩ Similar Volumes


Propaganda and Information Warfare in the Twenty-First Century
✍ Scot Macdonald 📂 Library 📅 2007 🌐 English

This is the first book to analyze how the technology to alter images and rapidly distribute them can be used for propaganda and to support deception operations. In the past, propagandists and those seeking to conduct deception operations used crude methods to alter images of real people and events…

Information Warfare: Principles and Operations
✍ Edward Waltz 📂 Library 📅 1998 🌐 English

Here's a systems engineering-level introduction to the growing field of Information Warfare (IW)―the battlefield where information is both target and weapon. This book provides an overview of rapidly emerging threats to commercial, civil, and military information systems―and shows how these threats…


Disinformation and You: Identify Propaganda and Manipulation
✍ Marie D. Jones 📂 Library 📅 2021 🏛 Visible Ink Press 🌐 English

They provoke you with anger because fear-filled people are easier to manipulate. The tricks, tools, and tactics used to influence you and your loved ones―along with the history of propaganda―explained and explored. We live in an age of disinformation, misinformation, and outright lies…

Optimising Emotions, Incubating Falsehoods
✍ Vian Bakir, Andrew McStay 📂 Library 📅 2023 🏛 Palgrave Macmillan 🌐 English

This open access book deconstructs the core features of online misinformation and disinformation. It finds that the optimisation of emotions for commercial and political gain is a primary cause of false information online. The chapters distil societal harms, evaluate solutions, and consider…