1. MEDIA AND INFORMATION LITERACY (MIL)
Types of Media (Part 3)
Understanding Propaganda
in the Digital World
Mr. Arniel Ping
St. Stephen’s High School
Manila, Philippines
MIL PPT 8
Revised: October 6, 2017
2. Learners will be able to…
• define propaganda (SSHS);
• discuss the characteristics of propaganda materials (SSHS);
• explain the need to be aware of and understand propaganda materials (SSHS);
Learning Competencies
3. Learners will be able to…
• identify propaganda materials that are spread on social media (SSHS); and
• evaluate the dangers of propaganda to a democratic country (SSHS).
Learning Competencies
4. I- Media
A- Propaganda
1. Definitions
2. Importance of Understanding
Propaganda
3. How to Spot Propaganda
4. Dangers of Propaganda
TOPIC OUTLINE
5. I- Media
(continuation)
B. Performance Task (Individual Work):
1. Written Work: Reaction Paper
2. Research: Propaganda Materials on Facebook
TOPIC OUTLINE
6. HOW SOCIAL MEDIA FOR PROPAGANDA IS BIG BUSINESS
YouTube Channel: ABS-CBN News. Uploaded October 6, 2016. Retrieved October 28, 2016 from https://www.youtube.com/watch?v=5J0uQwY5zIQ
CURRENT EVENTS
7. • ideas or statements that are often false or exaggerated and that are spread in order to help a cause, a political leader, a government, etc.
(http://www.merriam-webster.com/dictionary/propaganda)
Photo Credit: https://mamimomikuto.wordpress.com/page/16/
WHAT IS PROPAGANDA?
8. • information, especially of a biased or misleading nature, used to promote a political cause or point of view
(https://en.oxforddictionaries.com/definition/propaganda)
WHAT IS PROPAGANDA?
9. • Propaganda manipulates you and diverts you from logical analysis of issues.
• Propaganda hides the truth.
• By understanding propaganda, you will be able to protect yourself from deceitful tactics.
WHY BE AWARE OF PROPAGANDA?
10. Propaganda…
• attacks a person instead of a principle or issue
• suppresses evidence that does not support its cause
HOW TO SPOT PROPAGANDA
11. Propaganda…
• relies on emotion instead of logical evidence
• introduces irrelevant or unproved evidence
HOW TO SPOT PROPAGANDA
13. Propaganda materials are…
• now in digital form
• posted, shared, liked, commented on, and debated on social media, particularly Facebook
PROPAGANDA IN A DIGITAL WORLD
15. • No to propaganda on Facebook!
• Yes to debating issues using logical arguments.
• Yes to respecting each other despite disagreements over issues.
Photo Credit: http://www.loripuente.com/wp-content/uploads/2011/01/agreement.jpg
16. • No to propaganda on Facebook!
• Yes to logical analysis of issues concerning our beloved country, the Philippines!
Photo Credit: https://restlesswanderwoman.files.wordpress.com/2015/11/flag-map_of_the_philippines.png
17. “Propaganda War: Weaponizing the Internet” by Maria Ressa
Link: http://www.rappler.com/nation/148007-propaganda-war-weaponizing-internet
CLASS ACTIVITY:
READ, ANALYZE, AND REACT!
19. • Write a reaction paper about the news article “Propaganda War: Weaponizing the Internet” by Ms. Maria Ressa.
• Your reaction paper must have a title, an introduction, a body paragraph (1 main argument with 2 examples), and a conclusion.
• It must be at least 12 sentences long.
• Refer to the rubric for criteria and grading.
Written Output
20. INTRODUCTION (Background / Define the Problem)
• Very Good (5 points): Well-developed introductory paragraph contains detailed background information and a clear explanation or definition of the problem.
• Good (4 points): Introductory paragraph contains some background information and states the problem, but does not explain it using details.
• Poor (2 points): Introduction does not adequately explain the background of the problem. The problem is stated but lacks detail.
• Needs Improvement (0 points): Introduction was not presented.
Rubric
21. BODY PARAGRAPHS (Main Argument)
• Very Good (5 points): Argument is well developed with supporting examples.
• Good (4 points): Argument was presented with some supporting examples.
• Poor (2 points): Argument was presented but lacks examples.
• Needs Improvement (0 points): Argument was not presented.
CONCLUSION
• Very Good (5 points): Conclusion was presented. Opinions and suggestions for change are logical and well thought out.
• Good (4 points): Conclusion was presented, and some suggestions for change are evident.
• Poor (2 points): Conclusion was presented, but no suggestions for change and/or opinions are included.
• Needs Improvement (0 points): Conclusion was not presented.
Rubric (cont.)
24. The Oxford Internet Institute presents Professor Philip Howard's Inaugural Lecture:
“Is Social Media Killing Democracy? Computational Propaganda, Algorithms, Automation and Public Life”
Link: https://www.youtube.com/watch?v=J1kXdA61AQY
What is your reaction after watching the news? Why?
What is propaganda?
Information, especially false information, that a government or organization spreads in order to influence people’s opinions and beliefs (http://www.macmillandictionary.com/us/dictionary/american/propaganda)
Why does propaganda exist?
Fallacies are used in propaganda.
How is propaganda in digital form different from its traditional forms, such as printed materials?
Why do you think social media has become a platform to spread propaganda?
Optional: You may ask your students to upload their essays on a blog.
Source: https://www.oii.ox.ac.uk/research/projects/computational-propaganda/
Computational Propaganda
This project will focus on how bots, algorithms and other forms of automation are used by political actors in countries around the world.
Overview:
Since 2012, we have been investigating the use of algorithms, automation and computational propaganda in public life. Political bots are manipulating public opinion over major social networking applications. This project enables a new team of social and information scientists to investigate the impact of automated scripts, commonly called bots, on social media. We study both the bot scripts and the people making such bots, and then work with computer scientists to improve the way we catch and stop such bots. Experience suggests that political bots are most likely to appear during an international crisis, and are usually designed to promote the interests of a government in trouble. Political actors have used bots to manipulate conversations, demobilize opposition, and generate false support on popular sites such as Twitter and Facebook in the U.S., as well as Sina Weibo in China.
The first stage of this research is international fieldwork with the political consultants and computer experts who are commissioned to make bots. Second, we are building an original database of political incidents involving bots. Finally, we are using this knowledge to make better tools for detecting political bots when they appear. We are doing “real-time” social and information science, actively disseminating our findings to journalists, industry, and foreign policy experts. By developing a network of experts in political bot detection and an original data set, the researchers will not only gain a better understanding of how bots are manipulating social networks but also advance the conversation in the social sciences, computer sciences, and industry about the size of the problem and the possible solutions.