Social media is an
effective way of influencing human society and behavior and shaping public
opinion. A cyber influence operation means using cyber tools and methods
to manipulate public opinion. Today, many countries use cyberspace, and
specifically social media, to manage cyber influence operations as part of
holistic information warfare. Most of these operations are done covertly and,
therefore, identifying them is challenging; moreover, it is not an easy task
to differentiate between legitimate and malicious influence operations. This
paper will describe cyber influence operations, the potential damage they
could cause, and how they are conducted. Furthermore, the paper will
analyze the challenges of identifying such operations and will detail several
indicative parameters with which cyber influence operations can be
identified.

Keywords: Cyber influence, influence operation, social media, social engineering, cyberwarfare

David Tayouri is deputy director of Engineering, the National and Aviation Cyber Programs Directorate, Cyber Division, ELTA Systems Ltd. at the Israel Aerospace Industries (IAI). The author would like to thank Mr. Aaron (Ronnie) Eilat and Mr. Mark Ellins for reviewing this article and for their thoughtful comments.

Cyber, Intelligence, and Security | Volume 4 | No. 1 | March 2020

The digital era has changed the way we communicate. Nowadays, relationships and conversations between people take place through the web and digital communication. Using social media—such as Facebook and Instagram—and social applications—such as WhatsApp and Telegram—we can keep in touch with our friends and family; share posts, messages, pictures, and videos; share our experiences with each other; stay updated on our friends’ statuses; and read their posts. Social media, which is widely used by many people around
the world, is also an effective way of influencing human society and behavior
and shaping public opinion. By sharing a post, tweeting an opinion,
contributing to a discussion in a forum, or sharing a sentimental or political
picture, we can influence others and sometimes convince them of our
opinion. Now imagine that you could participate in hundreds or thousands of
digital conversations—you would have the chance of influencing large
communities. Using cyber tools and methods to manipulate public
opinion is called a cyber influence operation. These operations may have
different purposes: psychological influence, hurting morale, shaping
public awareness, instilling a sense of lost control and of inability to protect
the normative way of life, and more. Since these operations may cause
(psychological) damage, they are also known as disinformation cyberattacks. Today, many countries use cyberspace, and specifically
social media, to manage cyber influence operations as part of holistic
information warfare. Most of these operations are done covertly; even when
an operation is revealed, it is difficult to prove who stands
behind it. Influence operations can be aimed at the general public with
generic statements or can be directed at a specific audience with targeted
messages in order to achieve more effective influence and to control their
responses. An example of a response could be voting for a specific candidate
or party in an election, as was witnessed during the US presidential elections
in 2016. Identifying cyber influence operations is challenging. It
is not an easy task to identify influence and specifically to differentiate
between legitimate and malicious influence operations. Promoting a product or
a decent idea is legitimate, even as an influence operation. Incitement,
promotion of radical or violent acts, and intervention in democratic
elections are examples of malicious influence operations.
Nevertheless, it is important for governments, through defense organizations
and law enforcement agencies, to identify malicious influence operations, in
order to prevent them or, at least, to reduce their damage. Today, there is
no systematic way of identifying cyber influence operations and
differentiating between legitimate and malicious influence operations.
The following
sections describe cyber influence operations and their potential damages, how
cyber influence operations are conducted, and which tactics they use. The
challenges of identifying cyber influence operations are analyzed and several
indicative parameters with which cyber influence operations can be identified
are detailed. The final section presents a case study of a cyber influence
operation. A cyber influence
operation can be defined as focused efforts to understand and engage key
audiences in order to create, strengthen, or preserve conditions favorable
for advancing interests, policies, and objectives, through the use of
coordinated programs, plans, themes, messages, and products.1 To
put it simply, cyber influence operations create communications and
interactions with the aim of influencing target audiences in order to change
their opinion and/or behavior. If the purpose is controlling the responses of
the group members, this is called perception management. A theory similar to perception management, studied mainly
in Russia, is reflexive control.2 Reflexive control is
defined as a means of conveying to a partner or an opponent specially
prepared information to incline him/her to voluntarily make the predetermined
decision desired by the initiator of the action. A “reflex” involves the
specific process of imitating the opponent’s reasoning or the opponent’s
possible behavior, thereby leading the opponent to make an unfavorable decision. In
order to influence a state’s information resources, reflexive control
measures can be used against its decision-making processes. This aim is best
accomplished by formulating certain information or disinformation designed to
affect a specific information resource. If successfully achieved, reflexive
control over the opponent makes it possible to influence their plans, their
view of the situation, and how they would fight. In other words, one side can
impose its will on the other and cause them to make a decision inappropriate to
a given situation. A term close to cyber influence in the military context
is influencing maneuver, which is the process of using (cyber)
operations to get inside an enemy’s decision cycle or even forcing that
decision cycle to direct or indirect actions. It is a broad form of maneuvering intended to gain and maintain information superiority and dominance and to maintain freedom of maneuver in cyberspace.3 Influencing maneuver can be used in direct or indirect operations. A direct example of influencing maneuver could include actions such as compromising command and control systems and subtly manipulating data in order to degrade a commander’s confidence in the systems and to slow down decision cycles. Indirect actions might include feeding compromised and manipulated data to the media in order to force a desirable reaction from an enemy. In this article we will focus on indirect actions.

1 Eric V. Larson et al., Understanding Commanders’ Information Needs for Influence Operations (Santa Monica: RAND Corporation, 2009).

2 Timothy L. Thomas, “Russia’s Reflexive Control Theory and the Military,” Journal of Slavic Military Studies 17, no. 2 (2004): 237–256.

Influence operations have emerged as a major concern
worldwide. They come under different names and in various flavors—fake news,
disinformation, political astroturfing, information attacks, and so forth.
They may arrive as a component of hybrid warfare—in combination with
traditional cyberattacks (use of malware)—and with conventional military
action or covert kinetic attacks.4 An influence operation may have different purposes and
potential effects/damages. In times of peace, the purpose of influence
operations can be promoting desired ideas or leading groups to preferred
directions. An example is a political party that manages a campaign to
convince its constituents to vote for the party. If the same operation is
performed by a foreign country, however, it will of course be deemed
intervention in a sovereign country's domestic affairs. Foreign intervention
could damage the trust that the citizens have in their government, because
they cannot be sure that the same government would be elected without the
foreign intervention. In times of conflict or war, the purpose of influence
operations can be to create anti-government discussions, turn public opinion
against government actions (e.g., actions of war), hurt public morale (e.g.,
creating a feeling of insecurity because of government actions), and so
forth, all with the aim of giving a sense that the government has no control
or ability to protect the normative way of life, which eventually may weaken the country's army on the battlefield. Influence operations can be aimed at the general public or at a specific audience, which can be targeted using online databases or social networks. Influence operations aimed at the general public will include generic statements, which will have minimal influence at the micro level on individuals but can still achieve the desired effect at the macro level. Influence operations aimed at specific audiences will use statements tailored to that audience in order to be more effective.

3 Scott D. Applegate, “The Principle of Maneuver in Cyber Operations,” 2012 4th International Conference on Cyber Conflict (CYCON 2012) (Tallinn: NATO CCD COE Publications, 2012), https://www.ccdcoe.org/publications/2012proceedings/3_3_Applegate_ThePrincipleOfManeuverInCyberOperations.pdf.

4 “Army Researchers Join International Team to Understand, Defeat ‘Disinformation’ Cyberattacks,” ARL Public Affairs, December 5, 2017, https://www.army.mil/article/197316/army_researchers_join_international_team_to_understand_defeat_disinformation_cyberattacks.

How Cyber Influence Operations Are Conducted

The first step in
conducting an effective cyber influence operation is defining the goal: either building up a subject (promoting it, strengthening it, improving public opinion of it) or harming one (attacking opponents, weakening adversaries, creating negative public opinion). The second step is
determining the coverage and audience: a wide audience, targeted groups, or a
small group of influencers; radical or consensus groups; and which gender, age,
race, religion, and so forth will best serve the goal. The third step is
selecting the social networks and forums in which the influence operation
will be conducted and determining the interaction between the selected medium
and other intermediaries. The fourth step is determining the tools for
spreading the messages: fake profiles, bots, or trolls. Fake profiles may
have a better reputation, but they need manual intervention. Bots can be
programmed to reply automatically to defined content, but they may be easily
identified as bots. Trolls are used for aggressive, negative content,
usually when the goal is to attack opponents. The last step is defining the
appropriate messages and publishing them intensively, according to the
defined goal and audience. Figure 1 below depicts the steps of operating
cyber influence operations.
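The five planning steps above can be sketched as a simple data model. This is a hypothetical illustration only: the class, field names, and example values are assumptions introduced here, not taken from the article or from any real operation.

```python
from dataclasses import dataclass, field
from enum import Enum

class Tool(Enum):
    """Step 4: the spreading tools described in the text."""
    FAKE_PROFILE = "fake profile"  # better reputation, but needs manual work
    BOT = "bot"                    # automatic replies, more easily detected
    TROLL = "troll"                # aggressive, negative content

@dataclass
class InfluenceOperationPlan:
    goal: str        # step 1: build up a subject, or harm an opponent
    audience: dict   # step 2: coverage and demographic targeting
    platforms: list  # step 3: social networks and forums to use
    tools: list      # step 4: fake profiles, bots, or trolls
    messages: list = field(default_factory=list)  # step 5: tailored messages

# Hypothetical example instance of such a plan
plan = InfluenceOperationPlan(
    goal="harm: create negative public opinion of the opponent",
    audience={"coverage": "targeted groups", "age": "18-35"},
    platforms=["Facebook", "Twitter"],
    tools=[Tool.BOT, Tool.TROLL],
)
plan.messages.append("message tailored to the defined audience")
```

The point of the sketch is that each step of the process maps to one field, so the "operation" is fully specified before any message is published.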
Figure 1. The Steps of Operating Cyber Influence Operations
5 “Recognizing Propaganda,” Mind Over Media, https://propaganda.mediaeducationlab.com/techniques.
• Responding to audience needs and values by conveying messages, themes, and language that appeal directly—and many times exclusively—to specific and distinct groups within a population. A cyber
influence operator may appeal to people using their racial or ethnic
identities, hobbies, favorite celebrities, beliefs and values, or even
personal aspirations and hopes for the future. Using different social media
profiles, this task becomes easier and more effective, since each profile can
be adjusted to the target audience in order to achieve the best influence
result. Sometimes, the deepest universal human values—the need to
love and be loved, to feel a sense of belonging and a sense of place—are
activated. By creating messages that appeal directly to the needs, hopes, and
fears of specific groups, an influence operation becomes personal and
relevant. When messages are personally relevant, people pay attention and
absorb key information and ideas.

• Attacking opponents by serving as a form of
political and social warfare to identify and vilify opponents. It can call
into question the legitimacy, credibility, accuracy, and even the character
of one's opponents and their ideas. Because people are naturally attracted to
conflict, an influence operation can make strategic use of controversy to get
attention. Attacking opponents also encourages “either-or” or “us-them”
thinking, which suppresses the consideration of more complex information and
ideas. Furthermore, influence operations can also be used to discredit
individuals, destroy their reputation, exclude specific groups of people,
incite hatred, or cultivate indifference.

Challenges of Identifying Cyber Influence Operations

In order to
identify cyber influence operations, first we should identify cyber or social
influence. Therefore, one of the basic challenges is to define what social
influence is and how to measure it within a network. Social influence is
defined as “consciously or subconsciously persuading others from your
thoughts, beliefs or actions.”6 There are three categories in
defining social influence: actors, interactions, and networks. To achieve the largest possible audience, in many cases,
cyber operators approach influencers. There are different indicators for
identifying the potential influencers.

6 D.M. Kahan, “Social Influence, Social Meaning and Deterrence,” Virginia Law Review 83, no. 2 (1997): 349–395.
7 Wouter Vollenbroek, Sjoerd de Vries, Efthymios Constantinides, and Piet Kommers, “Identification of Influence in Social Media Communities,” International Journal of Web Based Communities 10, no. 3 (2014): 280–297.

8 Dave Karpf, “Measuring Influence in the Political Blogosphere: Who’s Winning and How Can We Tell?” Politics and Technology Review (2008): 33–41, http://www.the4dgroup.com/BAI/articles/PoliTechArticle.pdf.
Figure 2. Social Influence Indicators (for an influential actor: active minds, trendsetters, social presence and impact, social activity, charisma, expertise, authority, and number of friends; for an influential interaction: reciprocity and emotional value; for an influential social network: size of the network, density, connectivity, centrality, and group cohesion)

These well-defined indicators can be used to find
influential actors, interactions, and networks, which, in turn, can help us
to better identify social influence. Figure 2 above summarizes the social
influence indicators. After identifying social influence, the next challenge is
differentiating between legitimate and malicious influence operations. Sometimes
the legitimacy of an influence operation is in the eyes of the beholder. Most
people will agree that incitement and promotion of radical or violent acts
constitute malicious influence operations, and that promoting a decent idea
is usually legitimate freedom of speech. But what about political ideas or
statements that are expressed against a country's leadership? Well, it may
depend on the country's values and regime. Let's take a democratic regime, in
which a person can criticize anything and anyone, including the country's
leader. If this were done by an army of bots, which were programmed to
automatically spread statements against the leading party, the legitimacy of
the statements would not be very clear, especially when using bots is
prohibited by most countries. If this army were managed by a foreign actor, it
would probably be considered foreign intervention in a sovereign
democracy. Sometimes, to influence effectively, fake news is used. For
instance, Saudi Arabia and the UAE worked to sway American public opinion and
other Arab countries against Qatar through online and social media campaigns,
by accusing Qatar of supporting terrorism and destabilizing the region, a
charge Doha
rejected, and which eventually appeared to be false. The result of this
campaign was that in June 2017, Saudi Arabia and the UAE led other Arab
countries to cut diplomatic relations with Qatar.9 We can agree
that using fake news is not legitimate and may indicate a malicious influence
operation, but the real challenge is in identifying it. Often, fake news is
published together with authentic news, making it difficult to spot.
Identifying fake pictures is also challenging, with all the advanced picture
editing tools available today. The situation becomes complicated when a
particular post may include some facts, some bogus facts, and some commentary
that naturally is subjective, depending on the writer's values and beliefs.
In social media, such a post receives comments from others, reflecting their
opinions and perspectives, which makes it even harder to identify the false
elements. Another challenge in identifying cyber influence
operations is that the process should be done in near real time. In social
media, news spreads very fast; therefore, by the time a claim is revealed
as false, the damage has often already been done and the operation's goals
advanced. For example, spreading fake or semi-fake news about a
candidate a few days before the elections may change the results. After a cyber
influence operation is identified, we usually want to know who stands behind
it and collect evidence to prove it. The challenge here is that the people or
the group behind the influence operations usually hide their tracks and do
not reveal their true identity, by using bots and fake profiles in social
media, and by concealing their communication parameters (such as their IP address) through anonymizing browsers or proxy servers.

Indicative Parameters for Identifying Cyber Influence Operations

To identify cyber influence operations, the published
content—text, pictures, and videos—in the various social networks should be
monitored and analyzed using operations research and advanced algorithms,
taking into account many parameters.
9 Josh Wood, “How a Diplomatic Crisis among Gulf Nations Led to a Fake News Campaign in the United States,” PRI, July 24, 2018, https://www.pri.org/stories/2018-07-24/how-diplomatic-crisis-among-gulf-nations-led-fake-news-campaign-united-states.

10 Bill Fitzgerald and Kris Shaffer, “Spot a Bot: Identifying Automation and Disinformation on Social Media,” Data for Democracy, June 5, 2017, https://medium.com/data-for-democracy/spot-a-bot-identifying-automation-and-disinformation-on-social-media-2966ad93a203.
11 Hunt Allcott and Matthew Gentzkow, “Social Media and Fake News in the 2016 Election,” Journal of Economic Perspectives 31, no. 2 (Spring 2017): 211–236, https://web.stanford.edu/~gentzkow/research/fakenews.pdf.
Figure 3. Indicative Parameters for Identifying Cyber Influence Operations (among them: publishing of a large number of items on a specific subject, use of avatars, and a sudden change of public opinion)

Case Study: Russian Intervention in the US Elections in 2016

Many cases of cyber influence operations have been reported
over recent years, but one of the best-known cases is the Russian
intervention in the US elections in 2016. Analysis of this case shows that
almost all the parameters mentioned in the previous section could be relevant
for identifying the Russian influence operation in the 2016 US election:

• Russians publishing posts—On October 7, 2016, the Office of the Director of National
Intelligence (ODNI) and the Department of Homeland Security (DHS) jointly
stated that the US intelligence community was confident that the Russian
government directed the hacking of emails in order to interfere with the US
election process.12 Two reports prepared for the Senate
Intelligence Committee by independent researchers reveal that
12 “Joint Statement from the Department of Homeland Security and Office of the Director of National Intelligence on Election Security,” Department of Homeland Security, October 7, 2016, https://www.dhs.gov/news/2016/10/07/joint-statement-department-homeland-security-and-office-director-national.
13 Alex Ward, “4 Main Takeaways from New Reports on Russia’s 2016 Election Interference,” Vox, December 17, 2018, https://www.vox.com/world/2018/12/17/18144523/russia-senate-report-african-american-ira-clinton-instagram.

14 “Assessing Russian Activities and Intentions in Recent US Elections,” Office of the Director of National Intelligence, January 6, 2017, https://www.dni.gov/files/documents/ICA_2017_01.pdf.

15 “Joint Statement from the Department of Homeland Security and Office of the Director of National Intelligence on Election Security,” Department of Homeland Security, October 7, 2016, https://www.dhs.gov/news/2016/10/07/joint-statement-department-homeland-security-and-office-director-national.
16 Scott Shane and Vindu Goel, “Fake Russian Facebook Accounts Bought $100,000 in Political Ads,” New York Times, September 6, 2017, https://www.nytimes.com/2017/09/06/technology/facebook-russian-political-ads.html.

17 Ellen Nakashima, Karoun Demirjian, and Philip Rucker, “Top US Intelligence Official: Russia Meddled in Election by Hacking, Spreading of Propaganda,” Washington Post, January 5, 2017, https://www.washingtonpost.com/world/national-security/top-us-cyber-officials-russia-poses-a-major-threat-to-the-countrys-infrastructure-and-networks/2017/01/05/36a60b42-d34c-11e6-9cb0-54ab630851e8_story.html.
18 “Joint Statement from the Department of Homeland Security and Office of the Director of National Intelligence on Election Security,” Department of Homeland Security, October 7, 2016, https://www.dhs.gov/news/2016/10/07/joint-statement-department-homeland-security-and-office-director-national.
19 Ellen Nakashima, Karoun Demirjian, and Philip Rucker, “Top US Intelligence Official: Russia Meddled in Election by Hacking, Spreading of Propaganda,” Washington Post, January 5, 2017, https://www.washingtonpost.com/world/national-security/top-us-cyber-officials-russia-poses-a-major-threat-to-the-countrys-infrastructure-and-networks/2017/01/05/36a60b42-d34c-11e6-9cb0-54ab630851e8_story.html.

20 Alex Ward, “4 Main Takeaways from New Reports on Russia’s 2016 Election Interference,” Vox, December 17, 2018, https://www.vox.com/world/2018/12/17/18144523/russia-senate-report-african-american-ira-clinton-instagram.

21 Alex Ward, “4 Main Takeaways from New Reports on Russia’s 2016 Election Interference,” Vox, December 17, 2018, https://www.vox.com/world/2018/12/17/18144523/russia-senate-report-african-american-ira-clinton-instagram.

22 Philip N. Howard, Bharath Ganesh, and Dimitra Liotsiou, “The IRA, Social Media and Political Polarization in the United States, 2012-2018,” Computational Propaganda Research Project, 2018, https://int.nyt.com/data/documenthelper/534-oxford-russia-internet-research-agency/c6588b4a7b940c551c38/optimized/full.pdf.
23 Mark Clayton, “Ukraine Election Narrowly Avoided ‘Wanton Destruction’ from Hackers,” Christian Science Monitor, June 17, 2014, https://www.csmonitor.com/World/Passcode/2014/0617/Ukraine-election-narrowly-avoided-wanton-destruction-from-hackers.

24 Joe Watts, “Labour MP Claims It’s ‘Highly Probable’ Russia Interfered with Brexit Referendum,” Independent, December 13, 2016, https://www.independent.co.uk/news/uk/politics/russian-interference-brexit-highly-probable-referendum-hacking-putin-a7472706.html.

25 Pierre Haski, “Patterns of Disinformation in the 2017 French Presidential Election,” Bakamo, 2017, https://www.bakamosocial.com/frenchelection/.

26 Jay Solomon and Aftan Snyder, “Lessons for Brands from the Anti-Nike-Kaepernick Social Effort,” PRNEWS, February 22, 2019, https://www.prnewsonline.com/social+media-Nike-Kaepernick-APCO-bots-Twitter.
forth. This cyber situation includes direct cyber events (attempted cyberattacks, actual cyberattacks, and damage), but it should also include indirect cyber actions, such as cyber influence operations conducted by other countries. These operations should be considered covert wars and handled accordingly, including by allocating resources to identify and thwart them. Recommended further work includes determining additional indicative parameters, automating the influence operation identification process, and suggesting ways to defend against these operations.
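As a minimal illustration of the automation recommended above, a few of the indicative parameters discussed in this article (a large number of items on a specific subject, duplicated content, a very young account) could feed a simple per-account scoring heuristic. This is a sketch only: the data fields, weights, and thresholds below are invented for illustration and would need tuning and validation against real data.

```python
from collections import Counter

def influence_op_score(posts, account_age_days):
    """Score an account on three illustrative indicative parameters:
    focus on a single subject, near-duplicate content, and account youth.
    Returns a value in [0, 1]; weights and thresholds are assumptions."""
    if not posts:
        return 0.0
    # Share of posts devoted to the single most-posted subject
    subjects = Counter(p["subject"] for p in posts)
    topic_focus = subjects.most_common(1)[0][1] / len(posts)
    # Share of posts whose text duplicates another post's text
    texts = Counter(p["text"] for p in posts)
    duplication = sum(c for c in texts.values() if c > 1) / len(posts)
    # Very young accounts posting heavily are treated as more suspicious
    youth = 1.0 if account_age_days < 30 else 0.0
    return round(0.4 * topic_focus + 0.4 * duplication + 0.2 * youth, 2)

# Hypothetical account: 8 identical election posts plus 2 sports posts
posts = [{"subject": "election", "text": "Vote X!"} for _ in range(8)]
posts += [{"subject": "sports", "text": "nice game"} for _ in range(2)]
print(influence_op_score(posts, account_age_days=10))
```

A real system would combine many more parameters, run in near real time, and treat the score only as a trigger for human review, since, as argued above, legitimacy often depends on context rather than on any single signal.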