Media Bias Analysis

  • Open Access
  • First Online: 06 October 2022

  • Felix Hamborg

This chapter provides the first interdisciplinary literature review on media bias analysis, contrasting manual and automated analysis approaches. Decades of research in political science and other social sciences have resulted in comprehensive models to describe media bias and effective methods to analyze it. In contrast, in computer science, computational linguistics, and related fields, media bias is a relatively young research topic. Despite many approaches being technically very advanced, we find that the automated approaches could often yield more substantial results by drawing on knowledge from social science research on the topic.


2.1 Introduction

The Internet has increased the degree of self-determination in how people gather knowledge, shape their own views, and engage with topics of societal relevance [ 249 ]. Unrestricted access to unbiased information is crucial for forming a well-balanced understanding of current events. For many individuals, news articles are the primary source to attain such information. News articles thus play a central role in shaping personal and public opinion. Furthermore, news consumers rate news articles as having the highest quality and trustworthiness compared to other media formats, such as TV or radio broadcasts, or, more recently, social media [ 61 , 249 , 365 ]. However, media coverage often exhibits an internal bias, reflected in news articles and commonly referred to as media bias . Factors influencing this bias can include ownership or source of income of the media outlet or a specific political or ideological stance of the outlet and its audience [ 363 ].

The literature identifies numerous ways in which media coverage can manifest bias. For instance, journalists select events, sources , and from these sources the information they want to publish in a news article. This initial selection process introduces bias to the resulting news story. Journalists can also affect the reader’s perception of a topic through word choice , e.g., if the author uses a word with a positive or a negative connotation to refer to an entity [ 116 ], or by varying the credibility ascribed to the source [ 14 , 99 , 266 ]. Finally, the placement and size of an article within a newspaper or on a website determine how much attention the article will receive [ 37 ].

The impact of media bias, especially when implemented intentionally (see the review of bias definitions in Sect. 2.2.1 ), on shaping public opinion has been studied by numerous scholars [ 24 ]. Historically, major outlets exerted a strong influence on public opinion, e.g., in elections [ 219 , 237 , 259 ], or the social acceptance of tobacco consumption [ 9 , 362 ]. The influence of media corporations has increased significantly in the past decades. In Germany, for example, only five corporations control more than half of the media [ 189 ], and in the USA, only six corporations control 90% [ 40 , 318 ]. This naturally increases the risk of media coverage being intentionally biased [ 82 , 342 ]. Also on social media , which typically reflects a broader range of opinions, people may still be subject to media bias [ 10 , 15 , 111 ], despite social media being characterized by more direct and frequent interaction between users, and hence presumably more exposure to different perspectives. Some argue that social media users are more likely to actively or passively isolate themselves in a “filter bubble” or “echo chamber” [ 352 ], i.e., only be surrounded by news and opinions close to their own. However, this isolation is not necessarily as absolute as often assumed, e.g., Barberá et al. [ 17 ] found noticeable isolation for political issues but not for others, such as reporting on accidents and disasters. Recent technological developments are another reason for topical isolation of social media consumers, which might lead to a general decrease in the diversity of news consumption. For instance, Facebook, the world’s largest social network with more than three billion users [ 85 ], introduced Trending Topics in 2014, a news overview feature. There, users can discover current events by exclusively relying on Facebook. 
However, the consumption of news from only a single distributor amplifies the previously mentioned level of influence further: only a single company controls what is shown to news consumers.

The automated identification of media bias and the analysis of news articles in general have recently gained attention in computer science. Popular examples are news aggregators, such as Google News, which give news readers a quick overview of a broad news landscape. Yet, established systems currently provide no support for showing the different perspectives contained in articles reporting on the same news event. Thus, most news aggregators ultimately tend to facilitate media bias [ 39 , 375 ]. Recent research efforts aim to fill this gap and reduce the effects of such biases. However, these approaches suffer from practical limitations, such as being fine-tuned to only one news category or relying heavily on user input [ 252 , 253 , 276 ]. As we show in this chapter, an important reason why technically superior computer science methods perform comparably poorly at automatically identifying instances of media bias is that such approaches currently tend not to make full use of the knowledge and expertise on this topic from the social sciences.

This chapter is motivated by the question of how computer science approaches can contribute to identifying media bias and mitigating the negative bias effects by ultimately making available a more balanced coverage of events and societal issues to news consumers. We address this question by comparing and contrasting established research on the topic of media bias in the social sciences with technical approaches from computer science. This comparative review thus also serves as a guide for computer scientists to better benefit from already more established media bias research in the social sciences. Similarly, social scientists seeking to apply current automated approaches to their own media bias research will also benefit from this review.

The remainder of this chapter is structured as follows. In Sect. 2.2 , we introduce the term media bias, highlight the effects of slanted news coverage, provide an understanding of how bias arises during the production of news, and introduce the most important approaches from the social sciences to analyze media bias. Then, each of the subsections in Sect. 2.3 focuses on a specific form of media bias, describes studies from the social sciences that analyze this form, and discusses methods from computer science that have been used or could be used to identify the specified form of bias automatically. In Sect. 2.4 , we discuss the reliability and generalizability of the manual approaches from the social sciences and point out key issues to be considered when evaluating interdisciplinary research on media bias. Section 2.5 summarizes the key findings of our literature review. Section 2.6 demonstrates the key findings and research gap using a practical example. Lastly, Sect. 2.7 summarizes the findings of the chapter in the context of this thesis.

2.2 Media Bias

This section gives an overview of definitions of media bias as used in social science research on the topic or as employed by automated approaches (Sect. 2.2.1 ). Afterward, we describe the effects of biased news coverage (Sect. 2.2.2 ), develop a conceptual understanding of how media bias arises in the process of news production (Sect. 2.2.3 ), and briefly introduce the most important approaches from the social sciences to analyze bias in the media (Sect. 2.2.4 ).

2.2.1 Definitions

The study of biased news coverage has a long tradition in the social sciences going back at least to the 1950s [ 253 ]. In the classical definition of Williams, media bias must both be intentional, i.e., reflect a conscious act or choice, and be sustained, i.e., represent a systematic tendency rather than an isolated incident [ 382 ]. This definition sets the media bias that we consider apart from other sources of unintentional bias in news coverage. Sources of unintentional bias include the influence of news values [ 141 ] throughout the production of news [ 276 ] and later the news consumption by readers with different backgrounds [ 266 ]. Examples for news values include the geographic vicinity of a newsworthy event to the location of the news outlet and consumers or the effects of the general visibility or societal relevance of a specific topic [ 229 ].

Many other definitions of media bias and its specific forms exist, each depending on the particular context and research questions studied. Mullainathan and Shleifer define two high-level types of media bias concerned with the intention of news outlets when writing articles: ideology and spin [ 327 ]. Ideological bias is present if an outlet biases articles to promote a specific opinion on a topic. Spin bias is present if the outlet attempts to create a memorable story. Another definition of media bias that is commonly used distinguishes between three types: coverage , gatekeeping , and statement (cf. [ 64 ]). Coverage bias is concerned with the visibility of topics or entities, such as a person or country, in media coverage. Gatekeeping bias, also called selection bias or agenda bias, relates to which stories media outlets select or reject for reporting. Statement bias, also called presentation bias, is concerned with how articles choose to report on concepts. For example, in the US elections, a well-observed bias arises from editorial slant [ 75 ], in which the editorial position on a given presidential candidate affects the quantity and tone of a newspaper’s coverage. Further forms and definitions of media bias can be found in the discussion by D’Alessio and Allen [ 64 ].

Even more definitions of media bias are found when considering research on automated bias analysis. Automated approaches tackle media bias, for example, as “subtle differences” [ 210 ], “differences of coverage” [ 278 ], “diverse opinions” [ 251 ], or “topic diversity” [ 252 ]. In sum, these definitions are rather superficial and vague, especially when compared to social science research.

To closely resemble how bias is analyzed in the social sciences, we follow in this literature review the traditional definition by Williams as mentioned previously [ 382 ]. To allow for an extensive overview of media bias literature, we also include studies that are not strictly concerned with intentional biases only. To address the different objectives of social science research on media bias and our thesis, we later provide a task-specific definition of media bias that we use in the methodology chapters of our thesis (Chap. 3 ). Specifically, classical research on media bias in the social sciences is concerned with investigating bias as systematic tendencies or patterns in news coverage on more extended time frames, e.g., to measure the influence of (biased) coverage on society or policy decisions. In contrast, our research question is concerned with biases in current coverage, e.g., to inform news consumers about such biases. Thus, to enable timely bias communication to news consumers, we explicitly allow for biases that may or may not have tendencies on larger time frames.

2.2.2 Effects of Biased News Consumption

Media bias has a strong impact on both individual and public perception of news events and thus impacts political decisions [ 24 , 69 , 97 , 100 , 159 , 166 , 399 ]. Despite the rise of social media, news articles published by well-established media outlets remain the primary source of information on current events (cf. [ 61 , 249 , 365 ]). Thus, if the reporting of a news outlet is biased, readers are prone to adopting similarly biased views. Today, the effects of biased coverage are amplified by social media, in which readers tend to “follow” only the news that conforms with their established views and beliefs [ 92 , 117 , 250 , 254 , 351 ]. On social media, news readers encounter an “echo chamber,” where their internal biases are only reinforced. Furthermore, most news readers only consult a small subset of available news outlets [ 261 , 262 ], as a result of information overload, language barriers, or their specific interests or habits.

Nearly all news consumers are affected by media bias [ 72 , 190 , 194 , 237 , 259 ], which may, for example, influence voters and, in turn, election outcomes [ 71 , 75 , 196 , 237 , 259 ]. Another effect of media bias is the polarization of public opinion [ 352 ], which complicates agreement on contentious topics. These negative effects have led some researchers to believe that media bias challenges the pillars of our democracy [ 166 , 399 ]: if media outlets influence public opinion, is the observed public opinion really the "true" public opinion? For instance, a 2003 survey showed significant differences in the presentation of information across US television channels [ 190 ]. Fox News viewers were the most misinformed about the Iraq War: over 40% of viewers believed that weapons of mass destruction had actually been found in Iraq, which was the reason the US government used to justify the war.

According to social science research, the three key ways in which media bias affects the perception of news are priming , agenda setting , and framing [ 75 , 314 ]. Priming theory states that how news consumers tend to evaluate a topic is influenced by their (prior) perception of the specific issues that were portrayed in news on that topic. Agenda setting refers to the ability of news publishers to influence which topics are considered relevant by selectively reporting on topics of their choosing. News consumers’ evaluation of topics is furthermore based on the perspectives portrayed in news articles, which are also known as frames [ 79 ]. Journalists use framing to present a topic from their perspective to “promote a particular interpretation” [ 80 ].

We illustrate the effect of framing using an example provided by Kahneman and Tversky [ 166 ]: Assume a scenario in which a population of 600 people is endangered by an outbreak of a virus. In a first survey, Kahneman and Tversky asked participants which option they would choose:

A: 200 people will be saved.

B: 33% chance that 600 people will be saved; 66% chance that no one will be saved.

In the first survey, 72% of the participants chose A, and 26% chose B. Afterward, a second survey was conducted that objectively represents the exact same choices, but here the options to choose from were framed in terms of likely deaths rather than lives saved.

C: 400 people will die.

D: 33% chance that no one will die; 66% chance that 600 people will die.

In this case, the preference of participants was reversed. 22% of the participants chose C, and 72% chose D. The results of the survey thus demonstrated that framing alone, that is, the way in which information is presented, has the ability to draw attention to either the negative or the positive aspects of an issue [ 166 ].
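The equivalence of the two framings can be checked arithmetically. The following sketch computes the expected number of people saved under each option, using the probabilities exactly as stated in the surveys; it shows that the sure option and the gamble are (nearly) equivalent under both framings, so only the presentation differs:

```python
def expected_saved(outcomes):
    """outcomes: list of (probability, people_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

population = 600

# First survey, framed in terms of lives saved
option_a = expected_saved([(1.0, 200)])
option_b = expected_saved([(0.33, population), (0.66, 0)])

# Second survey, framed in terms of deaths (400 die -> 200 saved, etc.)
option_c = expected_saved([(1.0, population - 400)])
option_d = expected_saved([(0.33, population), (0.66, 0)])

print(option_a, option_b)  # the sure option and the gamble are nearly equivalent
print(option_c, option_d)  # the deaths framing yields the same values
```

With the stated probabilities, the gamble's expected value (0.33 × 600 = 198) is almost identical to the sure option (200) in both surveys, yet participants' preferences reversed.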

In summary, the effects of media bias are manifold and especially dangerous when individuals are unaware of the occurrence of bias. The recent concentration of the majority of mass media in the hands of a few corporations amplifies the potential impact of media bias of individual news outlets even further.

2.2.3 Understanding Media Bias

Understanding not only the various forms of media bias but also at which stage in the news production process they can arise [ 276 ] is beneficial for devising methods and systems that help reduce the impact of media bias on readers. We focus on a specific conceptualization of the news production process, depicted in Fig. 2.1 , which models how media outlets turn events into news stories and how readers then consume the stories (cf. [ 14 , 69 , 146 , 147 , 276 , 277 ]). The stages in the process map to the forms of bias described by Baker, Graham, and Kaminsky [ 14 ]. Since each stage of the process is distinctly defined, we find this conceptualization of the news production process and the included bias forms to be the most comprehensive model of media bias for the purpose of devising future research in computer science. In the following paragraphs, we demonstrate with examples the different forms of media bias within the news production and consumption process. In Sect. 2.3 , we discuss each form in more detail. Note that while the process focuses on news articles, most of our discussion in Sect. 2.3 can be adapted to other media types, such as social media, blogs, or transcripts of newscasts.

Fig. 2.1 The news production process is a model explaining how forms of bias emerge during the process of turning an event (happening in reality) into a news item (which is then perceived by news consumers). The orange part at the top represents internal and external factors that influence the production of a news item and its slants. The green parts at the bottom represent bias forms that can emerge during the three phases of the news production process. The "consumer context" label (far right) additionally shows factors influencing the perception of the described news event that are not related to media bias. Adapted from [ 276 ]

Various parties can directly or indirectly, intentionally or structurally influence the news production process (refer to the motives underlying media bias shown in the orange rectangle in Fig. 2.1 ). News producers have their own political and ideological views [ 59 ]. These views extend through all levels of a news company, e.g., news outlets and their journalists typically have a slant toward a certain political direction [ 117 ]. Journalists might also introduce bias in a story if the change is supportive of their career [ 19 ]. In addition to these internal forces, external factors may also influence the news production cycle. News stories are often tailored for a current target audience of the news outlet [ 98 , 117 , 220 ], e.g., because readers switch to other news outlets if their current news source too often contradicts their own beliefs and views [ 92 , 98 , 250 , 254 , 351 ]. News producers may tailor news stories for their advertisers and owners , e.g., they might not report on a negative event involving one of their main advertisers or partnered companies [ 69 , 103 , 220 ]. Similarly, producers may bias news in favor of governments since they rely on them as a source of information [ 25 , 65 , 146 ].

In addition to these external factors, business reasons can also affect the resulting news story, e.g., investigative journalism is more expensive than copy-editing prepared press releases. Ultimately, most news producers are profit-oriented companies that may not claim the provision of bias-free information to their news consumers as their main goal [ 281 ]; in fact, news consumers expect commentators to take positions on important issues and filter important from unimportant information (cf. [ 31 , 81 ]).

All these factors influence the news production process at various stages (gray). In the first stage, gathering , journalists select facts from all the news events that happened. This stage starts with the selection of events, also called story selection. Naturally, not all events are relevant to a news outlet's target audience, and sensational stories may yield more sales [ 117 ]. Next, journalists need to select sources , e.g., press releases, other news articles, or studies, to be used when writing an article. Ultimately, the journalists must decide which information from the sources to include in and which to exclude from the article to be written. This step is called commission or omission and likewise affects which perspective is taken on the event.

In the next phase, writing , journalists may use different writing styles to bias news. For instance, two forms defined in the production process are labeling (e.g., a person is labeled positively, "independent politician," whereas for the other party, no label or a negative label is used) and word choice (e.g., how the article refers to an entity, such as "coalition forces" vs. "invading forces").

The last stage, editing , is concerned with the presentation style of the story. This includes, for instance, the placement of the story and the size allocation (e.g., a large cover story receives more attention than a brief comment on page 3), the picture selection (e.g., usage of emotional pictures or their size influences attention and perception of an event), and the picture explanation (i.e., placing the picture in context using a caption).

Lastly, spin bias is a form of media bias that represents the overall bias of a news article. An article’s spin is essentially a combination of all previously mentioned forms of bias and other minor forms (see Sect. 2.3.8 ).

Summary of the News Production Process

In summary, the resulting news story has potentially been subject to various sources of media bias at different stages of the story's genesis before it is finally consumed by the reader. The consumer context , in turn, affects how readers actually perceive the described information (cf. [ 16 , 348 ]). The perception of any event will differ depending on the readers' background knowledge , their preexisting attitude toward the described event (sometimes called hostile media perception ) [ 367 ], their social status (how readers are affected by the event), their country (e.g., news reporting negatively about a reader's country might lead to rejection of the discussed topic), and a range of other factors. Note, however, that "consumer context" is not a form of media bias and thus will be excluded from analysis in the remainder of this chapter.

Other Bias Models

Other models of how media bias arises exist, but their components can effectively be mapped to the news production and consumption process detailed previously. For instance, Entman defines a communication process that essentially mirrors the steps shown in Fig. 2.1 : (1) Communicators make intentional or unintentional decisions about the content of a text. (2) The text inherently contains different forms of media bias. (3) Receivers, i.e., news readers, draw conclusions based on the information and style presented in the text (which may or may not reflect the text's perspective). (4) Receivers of a social group are additionally subject to culture , i.e., a common set of perspectives [ 79 ].

Table 2.1 gives an overview of the previously described forms of media bias, where the “medium” column shows the medium that is the source of the specific form of bias and the column “target object” shows the items within the target medium that are affected.

2.2.4 Approaches in the Social Sciences to Analyze Media Bias

Researchers from the social sciences primarily conduct so-called content analyses to identify and quantify media bias in news coverage [ 64 ] or to, more generally, study patterns in communication. First, we briefly describe the concept and workflow of content analysis. Next, we describe the concept of frame analysis , which is a specialized form of content analysis commonly used to study the presence of frames in news coverage [ 368 ]. Lastly, we introduce meta-analysis , in which researchers combine the findings from other studies and analyze general patterns across these studies [ 155 ].

2.2.4.1 Content Analysis

Content analysis quantifies media bias by identifying and characterizing its instances within news texts. In a content analysis, researchers first define one or more analysis questions or hypotheses. Researchers then gather the relevant news data, and coders (also called annotators) systematically read the news texts, annotating parts of the texts that indicate instances of media bias relevant to the analysis being performed. Afterward, the researchers use the annotated findings to accept or reject their hypotheses [ 228 , 267 ].

In a deductive content analysis, researchers devise a codebook before coders read and annotate the texts [ 68 , 227 ]. The codebook contains definitions, detailed rules, and examples of what should be annotated and in which way. Sometimes, researchers reuse existing codebooks, e.g., Papacharissi and de Fatima Oliveira [ 274 ] used annotation definitions from a previous study by Cappella and Jamieson [ 44 ] to create their codebook, and then they performed a deductive content analysis comparing news coverage on terrorism in the USA and the UK.

In an inductive content analysis, coders read the texts without specified instructions on how to code the text, only knowing the research question [ 117 ]. Since statistically sound conclusions can only be derived from the results of deductive content analyses [ 260 ], researchers conduct inductive content analyses mainly in early phases of their research, e.g., to verify the line of research or to find patterns in the data and devise a codebook [ 260 , 368 ].

Usually, creating and refining the codebook is a time-intensive process, during which multiple analyses or tests using different iterations of a codebook are performed. A common criterion that must be satisfied before the final deductive analysis can be conducted is to achieve a sufficiently high inter-coder reliability (ICR) or inter-rater reliability (IRR) [ 195 ]. ICR, also called inter-coder agreement, inter-annotator reliability, or inter-annotator agreement, represents how often individual coders annotate the same parts of the documents with the same codes from the codebook. IRR represents this kind of agreement as well, but in a labeling task, e.g., with a fixed set of labels to choose from, rather than (also) annotating a phrase in a text. In some cases, these terms and tasks may overlap. In the remainder of this thesis, we will generally use the term ICR for annotation tasks where phrases have to be selected (and labeled), such as in a content analysis. We will use the term IRR for labeling tasks, e.g., where only one or more labels have to be selected but the phrase is given, such as in sentiment classification.

Social scientists distinguish between two types of content analyses: quantitative and qualitative [ 366 ]. A qualitative analysis seeks to find “all” instances of media bias, including subtle instances that require human interpretation of the text. In a quantitative analysis, researchers in the social sciences determine the frequency of specific words or phrases (usually as specified in a codebook). Additionally, researchers may subsume specific sets of words to represent so-called salient topics, roughly resembling frames (cf. [ 63 ]). Quantitative content analyses may also measure other, non-textual features of news articles, such as the number of articles published by a news outlet on a certain event or the size and placement of a story in a printed newspaper. These measurements are also called volumetric measurements [ 64 ].
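The word-frequency measurements used in quantitative content analyses can be approximated in a few lines. The sketch below counts, for each code of a mini-codebook, how often its indicator terms occur in a set of articles; the codes, terms, and article texts are illustrative assumptions, not taken from any real codebook:

```python
import re
from collections import Counter

# Hypothetical mini-codebook: each code maps to a set of indicator terms
codebook = {
    "economy": {"unemployment", "inflation", "taxes"},
    "security": {"crime", "police", "terrorism"},
}

def code_frequencies(articles, codebook):
    """Count, per code, how often its indicator terms occur in the articles."""
    counts = Counter()
    for text in articles:
        tokens = re.findall(r"[a-z]+", text.lower())
        for code, terms in codebook.items():
            counts[code] += sum(token in terms for token in tokens)
    return counts

articles = [
    "Inflation rose again; taxes may follow.",
    "Police reported a drop in crime this year.",
]
print(code_frequencies(articles, codebook))  # economy: 2, security: 2
```

Volumetric measurements, such as article counts per outlet and event, could be aggregated with the same Counter-based pattern.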

Thus far, the majority of studies on media bias performed in the social sciences conduct qualitative content analyses because the findings tend to be more comprehensive. Quantitative analyses can be performed faster and can be partially automated, but are more likely to miss subtle forms of bias [ 316 ]. We discuss both qualitative and quantitative analyses for the individual bias forms in Sect. 2.3 .

Content analysis software , also called computer-assisted qualitative data analysis software (CAQDAS) , supports analysts in performing content analyses [ 215 ]. Most tools support the manual annotation of findings in the analyzed news data or in other types of reports, such as police reports [ 267 ]. To reduce the large amount of text that needs to be reviewed, the software helps users find relevant text passages, e.g., by finding documents or text segments containing the words specified in the codebook or in a keyword list [ 336 ], so that coders must review fewer texts manually. In addition, most software helps users find patterns in the documents, e.g., by analyzing the frequencies of terms, topics, or word co-occurrences [ 215 ].

2.2.4.2 Frame Analysis

Frame analysis (also called framing analysis) investigates how readers perceive the information in a news article [ 79 ]. This is done by broadly asking two questions: (1) What information is conveyed in the article? (2) How is that information conveyed? Both questions together define a frame . As described in Sect. 2.2.2 , a frame is a selection of and emphasis on specific parts of an event.

To empirically determine the frames in news articles or other news texts, frame analysis is typically concerned with one or more of four dimensions [ 271 ]: syntactical, script, thematic, and rhetorical. The syntactical dimension includes patterns in the arrangement of words and, more broadly, information, e.g., descending order of salience in a story. The script dimension refers to characteristics similar to those of a story, i.e., a news article may have an introduction, climax, and end. The thematic dimension refers to which information is mentioned in a news text, e.g., which “facts,” events, or sources are mentioned or quoted to strengthen the text’s argument. Lastly, the rhetorical dimension entails how such information is presented, e.g., the word choice. Using these dimensions, researchers can systematically analyze and quantify the viewpoints of news texts.

Not all frame analyses focus on the text of news articles. For instance, DellaVigna and Kaplan [ 71 ] analyzed the gradual adoption of the Fox News cable channel between 1996 and 2000 to show that Fox News had a "significant impact" [ 71 ] on the presidential elections. Essentially, the study analyzed whether a district had already adopted the Fox News channel and what the election result there was. The results revealed that the Republican Party had an increased vote share in those towns that had adopted Fox News.

2.2.4.3 Meta-Analysis

In a meta-analysis , researchers combine the results of multiple studies to derive further findings from them [ 155 ]. For example, in the analysis of event selection bias, a common question is which factors influence whether media organizations will choose to report on an event or not. McCarthy, McPhail, and Smith [ 229 ] performed a meta-analysis of the results of prior work suggesting that the main factors for media to report on a demonstration are the demonstration size and the previous media attention on the demonstration’s topic.
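The statistical core of combining study results can be sketched briefly. A common approach is fixed-effect inverse-variance weighting, where each study's reported effect size is weighted by the inverse of its variance, so that more precise studies contribute more to the pooled estimate. The study values below are invented for illustration:

```python
def pooled_effect(studies):
    """Fixed-effect meta-analysis via inverse-variance weighting.

    studies: list of (effect_size, variance) pairs.
    Returns (pooled effect, variance of the pooled effect).
    """
    weights = [1.0 / var for _, var in studies]
    pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
    return pooled, 1.0 / sum(weights)

# Three hypothetical studies: (effect size, variance)
studies = [(0.30, 0.01), (0.50, 0.04), (0.40, 0.02)]
effect, variance = pooled_effect(studies)
print(round(effect, 3), round(variance, 4))
```

The first study has the smallest variance and therefore dominates the pooled estimate, pulling it toward 0.30; random-effects models, which additionally account for between-study heterogeneity, are a common refinement.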

2.2.5 Summary

News coverage has a strong impact on public opinion, i.e., what people think about ( agenda setting ), the context in which news is perceived ( priming ), or how topics are communicated ( framing ). Researchers from the social sciences have extensively studied such forms of media bias, i.e., the intentional, non-objective coverage of news events. The research has resulted in a broad literature on different forms and possible sources of media bias and their impact on (political) communication or opinion formation. In tandem, various well-established research methodologies, such as content analysis, frame analysis, and meta-analysis, have emerged in the social sciences.

The three forms of analysis discussed in Sect. 2.2.4 require significant manual effort and expertise [ 276 ], since they require human interpretation of the texts and cannot be fully automated. For example, a quantitative content analysis might (semi-)automatically count words that have previously been manually defined in a codebook, but it would be unable to read "between the lines," which is why such methods continue to be considered less comprehensive than a qualitative analysis. However, recent methodological progress in natural language processing (NLP) in computer science promises to alleviate many of these concerns.

In the remainder of this chapter, we discuss the different forms of media bias defined by the news production and consumption process. The process we have laid out in detail previously is in our view the most suitable conceptual framework to map analysis workflows from the social sciences to computer science and thus helps us to discuss where and how computer scientists can make unique contributions to the study of media bias.

2.3 Manual and Automated Approaches to Identify Media Bias

This section is structured into nine subsections discussing all forms of media bias depicted in Table 2.1 . In each subsection, we first introduce the form of bias and then provide an overview of the studies and techniques from the social sciences used to analyze it. Subsequently, we describe methods and systems that have been proposed by computer science researchers to identify or analyze that form of media bias. Since media bias analysis is a rather young topic in computer science, often few or no methods have been designed specifically for a given form of media bias; in such cases, we describe the methods best suited to study it. Each subsection concludes with a summary of the main findings, highlighting where and how computer science research can make a unique contribution to the study of media bias.

2.3.1 Event Selection

Of the countless events happening each day, only a small fraction can make it into the news. Event selection is a necessary task, yet it is also the first step that biases news coverage. The analysis of this form of media bias requires both an event-specific and a long-term observation of multiple news outlets. The main questions guiding such an analysis are whether an outlet’s coverage shows topical patterns, i.e., whether some topics are covered more or less frequently in one outlet than in another, and which factors influence whether an outlet reports on an event.

To analyze event selection bias, at least two datasets are required. The first dataset consists of news articles from one or more outlets; the second is used as a ground truth or baseline, which ideally contains “all” events relevant to the analysis question. For the baseline dataset, researchers from the social sciences typically rely on sources that are considered to be the most objective, such as police reports [ 119 ]. After linking events across the datasets, a comparison enables researchers to deduce factors that influence whether a specific news outlet reports on a given event. For instance, several studies compare demonstrations mentioned in police reports with news coverage of those demonstrations [ 228 , 229 , 267 ]. During the manual content analyses, the researchers extracted the type of event, i.e., whether it was a rally, march, or protest, the issue the demonstration was about, and the number of participants. Two studies found that the number of participants and the issue of the event, e.g., protests against the legislative body [ 267 ], had a high impact on the frequency of news coverage [ 119 ].

Meta-analyses have also been used to analyze event selection bias, mainly by summarizing findings from other studies. For instance, D’Alessio and Allen found that the main factors influencing media reporting on demonstration are the demonstration size and the previous media attention on the demonstration’s topic [ 64 ].

To our knowledge, only a few automated approaches have been proposed that specifically aim to analyze event selection bias. Unlike studies in the social sciences, none of them compares news coverage with a baseline that is considered objective; instead, they compare the coverage of multiple outlets or other online news sources [ 34 , 307 ]. In the following, we first describe these approaches in more detail, and then we describe current methods and systems that could support the analysis of this form of bias.

Bourgeois, Rappaz, and Aberer [ 34 ] span a matrix over news sources and events extracted from GDELT [ 201 ], where the value of each cell in the matrix describes whether the source (row) reported on the event (column) [ 215 ]. They use matrix factorization (MF) to extract “latent factors” that influence whether a source reports on an event. The main factors found were the affiliation, ownership, and geographic proximity of sources. Saez-Trumper, Castillo, and Lalmas [ 307 ] analyze relations between news sources and events. By analyzing the overlap between news sources’ content, they find, for example, that news agencies, such as AP, publish the most non-exclusive content—i.e., if a news agency reports on an event, other news sources will likely also report on it—and that news agencies are more likely to report on international events than other sources. Media type was also a relevant event selection factor. For example, magazine-type media, such as The Economist , are more likely to publish on events with high prominence, i.e., events that receive a lot of attention in the media.
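The core idea of such a matrix factorization can be sketched in a few lines of Python. The toy source-by-event matrix, the number of latent factors, and all learning settings below are illustrative assumptions of ours, not the configuration used by Bourgeois, Rappaz, and Aberer:

```python
import random

def factorize(M, k=2, steps=2000, lr=0.01, reg=0.02):
    """Decompose a sources-x-events 0/1 matrix into k latent factors
    via stochastic gradient descent (toy, illustrative settings)."""
    random.seed(0)
    n, m = len(M), len(M[0])
    P = [[random.uniform(0.1, 0.9) for _ in range(k)] for _ in range(n)]  # source factors
    Q = [[random.uniform(0.1, 0.9) for _ in range(k)] for _ in range(m)]  # event factors
    for _ in range(steps):
        for i in range(n):
            for j in range(m):
                err = M[i][j] - sum(P[i][f] * Q[j][f] for f in range(k))
                for f in range(k):
                    pif, qjf = P[i][f], Q[j][f]
                    P[i][f] += lr * (err * qjf - reg * pif)
                    Q[j][f] += lr * (err * pif - reg * qjf)
    return P, Q

# Hypothetical matrix: rows = sources, columns = events (1 = source covered event).
M = [[1, 1, 0, 0],
     [1, 1, 0, 0],
     [0, 0, 1, 1]]
P, Q = factorize(M)
recon = [[sum(P[i][f] * Q[j][f] for f in range(2)) for j in range(4)]
         for i in range(3)]
```

After training, high reconstructed values indicate source-event pairs that fit the latent structure; in the approach described above, the learned factors were inspected to reveal properties such as affiliation and ownership.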

Similar to the manual analyses performed in the social sciences, automated approaches need to (1) find or use articles relevant to the question being analyzed (we describe relevant techniques later in this subsection; see the paragraphs on news aggregation), (2) link articles to baseline data or other articles, and (3) compute statistics on the linked data.

In task (2), we have to distinguish whether one wants to compare articles to a baseline (or, in technical terms, across different media) or to other articles. Linking events from different media, e.g., news articles and tweets on the same events, has recently gained attention in computer science [ 307 , 361 ]. However, to our knowledge, there are currently no generic methods to extract the required information from police reports or other non-media databases, since the information that needs to be extracted depends on the particular question studied, and the information structure and format differ greatly between these documents; e.g., police reports from different countries or states usually do not share common formats (cf. [ 206 , 231 ]).

To link news articles reporting on the same event, various techniques can be used. Event detection extracts events from text documents. Since news articles are usually concerned with events, event detection is commonly used in news-related analyses. For instance, in order to group related articles, i.e., those reporting on the same event [ 164 ], one needs to first find events described in these articles. Topic modeling extracts semantic concepts, or topics, from a set of text documents where topics are typically extracted as lists of weighted terms. A commonly employed implementation is latent Dirichlet allocation (LDA) [ 30 ], which is, for instance, used in the Europe Media Monitor (EMM) news aggregator [ 26 ].

Related articles can also be grouped with the help of document clustering methods, such as affinity propagation [ 91 ] or hierarchical agglomerative clustering (HAC) [ 226 ]. HAC, for example, computes pairwise document similarity on text features using measures such as the cosine distance on TF-IDF vectors [ 308 ] or word embeddings [ 197 ]. This way, HAC creates a hierarchy of the most similar documents and document groups [ 222 ]. HAC has been used successfully in several research projects [ 232 , 276 ]. Other methods to group related articles exploit news-specific characteristics, such as the five journalistic W questions (5Ws). The 5Ws describe the main event of a news article, i.e., who did what, when, where, and why. A few works additionally extract the how question [ 321 ], i.e., how something happened or was done (5W1H extraction or question answering). Journalists usually answer the 5W questions within the first few sentences of a news article [ 52 ]. Once phrases answering the 5W question are extracted, articles can be grouped by comparing their 5W phrases. We propose a method for 5W1H extraction in Chap. 4 .
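The pairwise-similarity step underlying such clustering can be illustrated with a minimal TF-IDF and cosine-similarity sketch; the tokenization and example headlines are our own toy assumptions, not taken from any of the cited systems:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute TF-IDF vectors for tokenized documents (toy version)."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        vecs.append({t: tf[t] * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity of two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented example headlines: two on the same event, one unrelated.
articles = [
    "ukraine talks resume in geneva".split(),
    "geneva hosts new ukraine peace talks".split(),
    "oil prices fall after opec meeting".split(),
]
vecs = tfidf_vectors(articles)
sim_related = cosine(vecs[0], vecs[1])
sim_unrelated = cosine(vecs[0], vecs[2])
```

HAC would repeatedly merge the most similar pairs found this way into a cluster hierarchy; here, the related pair scores clearly above the unrelated one.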

News aggregation Footnote 1 is one of the most popular approaches to enable users to get an overview of the large amount of news that is published nowadays. Established news aggregators, such as Google News and Yahoo News, show related articles by different outlets reporting on the same event. Hence, the approach is well suited to reveal instances of bias by event selection, e.g., if one outlet does not report on an important event. News aggregators rely on methods from computer science, particularly methods from natural language processing (NLP). The analysis pipeline of most news aggregators aims to find the most important news topics and present them in a compressed form to users. It typically involves the following tasks [ 84 , 128 ]:

Data gathering , i.e., crawling articles from news websites.

Article extraction from website data, which is typically HTML or RSS.

Grouping , i.e., finding and grouping related articles reporting on the same topic or event.

Summarization of related articles.

Visualization , e.g., presenting the most important topics to users.

For the first two tasks, data gathering and article extraction, established and reliable methods exist, e.g., in the form of web crawling frameworks [ 246 ]. Articles can be extracted with naive approaches, such as website-specific wrappers [ 270 ], or more generic methods based on content heuristics [ 185 ]. Combined approaches perform both crawling and extracting and offer other functionality tailored to news analysis. In Sect. 3.5 , we propose news-please , a web crawler and extractor for news articles, which extracts information from all news articles on a website, given only the root URL of the news outlet to be crawled.

The objective of grouping is to identify topics and group articles on the same topic, e.g., using LDA or other topic modeling techniques, as described previously. Articles are then summarized using methods such as simple TF-IDF-based scores or more complex approaches considering redundancy and order of appearance [ 294 ]. By performing the five tasks of the news aggregation pipeline in an automated fashion, news aggregators can cope with the large amount of information produced by news outlets every day.

However, no established news aggregator reveals the event selection bias of news outlets to its users. By incorporating this functionality, news aggregators could, for short-term or event-oriented analysis, show which publishers did not publish an article on a selected event. For long-term or trend-oriented analysis, news aggregators could visualize a news outlet’s coverage frequency of specific topics, e.g., to show whether the issues of a specific politician or party, or an oil company’s accident, are promoted or demoted.

In addition to traditional news aggregators, which show topics and related topics in a list, recent news aggregators use different analysis approaches and visualizations to promote differences in news coverage caused by biased event selection. Matrix-based news aggregation (MNA) is an approach we devised earlier that follows the analysis workflow of established news aggregators while organizing and visualizing articles into rows and columns of a two-dimensional matrix [ 128 , 129 ]. The exemplary matrix depicted in Fig. 2.2 reveals what is primarily stated by media in one country (rows) about another country (columns). For instance, the cell of the publisher country Russia and the mentioned country Ukraine, denoted with RU-UA, contains all articles that have been published in Russia and mention Ukraine. Each cell shows the title of the most representative article, determined through a TF-IDF-based summarization score among all cell articles [ 128 ]. Users either select rows and columns from a list of given configurations for common use cases, e.g., to analyze only major Western countries, or define their own rows and columns from which the matrix shall be generated.

Fig. 2.2 News overview to enable comparative news analysis in matrix-based news aggregation. The color of each cell refers to its main topic. Source [ 135 ]

To analyze event selection bias, users can use MNA to explore main topics in different countries as in Fig. 2.2 or span the matrix over publishers and topics in a country.
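As a rough illustration of how a cell’s most representative article might be selected, the sketch below scores each article by its cosine similarity to the cell’s TF-IDF centroid. This is a hypothetical reading of a TF-IDF-based summarization score; the exact scoring used in MNA may differ, and the example headlines are invented:

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors for tokenized documents (toy version)."""
    n, df = len(docs), Counter()
    for d in docs:
        df.update(set(d))
    return [{t: c * math.log(1 + n / df[t]) for t, c in Counter(d).items()}
            for d in docs]

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def representative(cell):
    """Index of the article closest to the centroid of its cell."""
    vecs = tfidf(cell)
    centroid = Counter()
    for v in vecs:
        centroid.update(v)                 # sum the term weights
    return max(range(len(cell)), key=lambda i: cosine(vecs[i], centroid))

# Invented cell content: two on-topic headlines and one off-topic headline.
cell = ["sanctions hit russian banks".split(),
        "russian banks face new sanctions".split(),
        "zoo announces panda birth".split()]
idx = representative(cell)
```

The off-topic article scores lowest against the centroid, so one of the two on-topic headlines is chosen as the cell’s title.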

Research in the social sciences concerned with bias by event selection requires significant effort due to the time-consuming manual linking of events from news articles to a second “baseline” dataset. Many established studies use event data from a source that is considered “objective,” for example, police reports (cf. [ 6 , 231 , 267 ]). However, the automated extraction of relevant information from such non-news sources requires the development and maintenance of specialized tools for each of the sources. Reasons for the increased extraction effort include the diversity or unavailability of such sources, e.g., police reports are structured differently in different countries or may not be published at all. Linking events from different sources in an automated fashion poses another challenge because of the different ways in which the same event may be described by each of the sources. This places a limit on the possible contributions of automated approaches for comparison across sources or articles.

In our view, however, the automated analysis of events within news articles is a very promising line of inquiry for computer science research. Sophisticated tools can already gather and extract relevant data from online news sources. Methods to link events in news articles are already available or are the subject of active research [ 26 , 30 , 164 , 222 , 232 , 276 , 308 ]. In Sect. 4.2 , we propose a method that extracts phrases describing the journalistic properties of an article’s main event, i.e., who did what, when, where, why, and how. Of course, news articles must originate from a carefully selected set of news publishers, which represent not only mainstream media but also alternative and independent publishers, such as Wikinews. Footnote 2 Finally, revealing differences in the selection of top news stories between publishers, or even between the mass media of different countries, has shown promising results [ 128 ] and could eventually be integrated into regular news consumption through news aggregators, demonstrating the potential of computer science approaches to make a unique contribution to the study of event selection.

2.3.2 Source Selection

Journalists must decide on the trustworthiness of information sources and the timeliness of information for a selected event. While source selection is a necessary task to avoid information overload, it may lead to biased coverage, e.g., if journalists mainly consult sources supporting one perspective when writing the article. The choice of sources used by a journalist or an outlet as a whole can reveal patterns of media bias. However, journalistic writing standards do not require journalists to list their sources [ 371 ], which makes the identification of original sources difficult or even impossible. One can only find hints in an article, such as the use of quotes, references to studies, phrases such as “according to [name of other news outlet]” [ 116 ], or the dateline, which indicates whether and from which press agency the article was copy-edited. One can also analyze whether the content and the argumentation structure match those of an earlier article [ 68 ].

The effects of source selection bias are similar to the effects of commission and omission (Sect. 2.3.3 ), because using only sources supporting one side of the event when writing an article (source selection) is similar to omitting all information supporting the other side (omission). Because many studies in the social sciences are concerned with the effects of media bias, e.g., [ 24 , 69 , 72 , 98 , 100 , 159 , 166 , 190 , 194 , 237 , 259 , 399 ], and the effects of these three bias forms are similar, bias by source selection and bias by commission and omission are often analyzed together.

Few analyses in the social sciences aim to find the selected sources to derive insights on the source selection bias of an article or an outlet. However, there are notable exceptions, for example, one study counts how often news outlets and politicians cite phrases originating in think tanks and other political organizations. The researchers had previously assigned the organizations to a political spectrum [ 117 ]. The frequencies of specific phrases used in articles, such as “We are initiating this boycott, because we believe that it is racist to fly the Confederate Flag on the state capitol” [ 117 ], which originated in the civil rights organization NAACP, are then aggregated to estimate the bias of news outlets. In another study of media content, Papacharissi and Oliveira annotate indications of source selection in news articles, such as whether an article refers to a study conducted by the government or independent scientists [ 274 ]. One of their key findings is that UK news outlets often referred to other news articles, whereas US news outlets did that less often but referred to governments, opinions, and analyses.

On social media , people can be subject to their own source selection bias, as discussed in Sect. 2.1 . For instance, on Facebook, people tend to be friends with like-minded people, e.g., those who share similar beliefs or political orientations [ 15 ]. People who use social media platforms as their primary news source are subject to selection bias not only by the operating company [ 82 , 85 ] but also by their friends [ 15 ].

To our knowledge, there are currently no approaches in computer science that aim to specifically identify bias by source selection. One exception is NewsDeps, an exploratory approach for determining the content dependencies between news articles [ 139 ]. Our approach employs simple methods from plagiarism detection (PD) described afterward to identify which parts of a news article stem from previously published news articles.

However, several automated techniques are well suited to address this form of bias. Plagiarism detection (PD) is a field in computer science with the broad aim of identifying instances of unauthorized information reuse in documents. Methods from PD may be used to identify the potential sources of information for a given article beyond identifying actual “news plagiarism” (cf. [ 179 ]). While there are some approaches focused on detecting instances of plagiarism in news, e.g., using simple text-matching methods to find 1:1 duplicates [ 309 ], research on news plagiarism is not as active as research on academic plagiarism. This is most likely a consequence of the fact that authorized copy-editing is a fundamental component in the production of news. Another relevant field that we describe in this section is semantic textual similarity (STS), which measures the semantic equivalence of two (usually short) texts [ 5 ].

The vast majority of plagiarism detection techniques analyzes text [ 89 , 235 ] and thus could also be adapted and subsequently applied to news texts. Current methods can reliably detect copy and paste plagiarism, the most common form of plagiarism [ 89 , 405 ]. Ranking methods use, for instance, TF-IDF and other information retrieval techniques to estimate the relevance of other documents as plagiarism candidates [ 149 ]. Fingerprinting methods generate hashes of phrases or documents. Documents with similar hashes indicate plagiarism candidates [ 149 , 324 ]. Hybrid approaches assess documents’ similarity using diverse features [ 236 ].
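A minimal fingerprinting sketch illustrates the idea: hashing word k-grams (“shingling”) and comparing the resulting fingerprint sets with the Jaccard coefficient flags a slightly copy-edited sentence as a near-duplicate. The example sentences and parameter choices are our own assumptions:

```python
import hashlib

def shingles(text, k=5):
    """Fingerprint a text as a set of hashed word k-grams (shingling)."""
    words = text.lower().split()
    grams = [" ".join(words[i:i + k]) for i in range(len(words) - k + 1)]
    return {int(hashlib.md5(g.encode()).hexdigest()[:8], 16) for g in grams}

def jaccard(a, b):
    """Overlap of two fingerprint sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Invented example: a press agency release, a copy-edited variant, and an
# unrelated sentence.
agency = "The ministry said on Monday that talks would resume next week in Geneva"
copy_edited = "The ministry said on Monday that talks would resume next week in Vienna"
unrelated = "Oil prices fell sharply after the cartel announced higher output quotas"

sim_copy = jaccard(shingles(agency), shingles(copy_edited))
sim_unrel = jaccard(shingles(agency), shingles(unrelated))
```

Only the shingles touching the edited word differ, so the copy-edited pair scores high while the unrelated pair scores zero; production systems differ mainly in how fingerprints are selected and indexed.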

Today’s plagiarism detection methods already provide most of the functionality needed to identify the potential sources of news articles. Copy-edited articles are often shortened or slightly modified and, in some cases, are a 1:1 duplicate of a press agency release. These types of slight modifications can be reliably detected with ranking or fingerprinting methods (cf. [ 235 , 309 ]). Current methods only continue to struggle with heavily paraphrased texts [ 235 ], but research is also extending to non-textual data types, such as link analysis [ 107 ], an approach that can be applied to online news texts as well. Another text-independent approach to plagiarism detection is citation-based plagiarism detection, whose algorithms achieve good results by comparing patterns of citations between two scientific documents [ 105 ]. Due to their text independence, these algorithms also allow a cross-lingual detection of information reuse [ 105 ]. News articles typically do not contain citations, but the patterns of quotes, hyperlinks, or other named entities can serve as a suitable marker to measure the semantic similarity of news articles (cf. [ 107 , 117 , 203 ]). Some articles also contain explicit referral phrases, such as “according to The New York Times .” The dateline of an article can also state whether and from where an article was copy-edited [ 140 ]. Text search and rule-based methods can be used to identify referral phrases and to extract the resources being referenced. In our view, future research should focus on identifying the span of information that was taken from the referred resource (see also Sect. 2.3.3 ).

Semantic textual similarity (STS) methods measure the semantic equivalence of two (usually short) texts [ 5 ]. STS methods use basic measures, such as n-gram overlap, WordNet node-to-node distance, and syntax features, e.g., compare whether the predicate is the same in two sentences [ 312 ]. More recent methods combine various techniques and use deep learning networks, achieving a Pearson correlation of their STS results to human coders of 0.78 [ 306 ]. Recently, these methods have also focused on cross-lingual STS [ 5 ] and use, for example, machine translation before employing regular mono-lingual STS methods [ 36 ]. Machine translation has proven useful also for other cross-lingual tasks, such as event analysis [ 368 ].

Graph analysis is concerned with the analysis of relations between nodes in a graph. The relation between news articles can be used to construct a dependency graph. Spitz and Gertz analyzed how information propagates in online news coverage using hyperlinks linking to other websites [ 333 ]. They identified four types of hyperlinks: navigational (menu structure to navigate the website), advertisement , references (links within the article pointing to semantically related sites), and internal links (further articles published by the same news outlet). They only used reference links to build a network, since the other link types contain too many unrelated sites (internal) or irrelevant information (advertisement and navigational). One finding by Spitz and Gertz is that networks of news articles can be analyzed with methods of citation network analysis. Another method extracts quotes attributed to individuals in news articles to follow how information propagates over time in a news landscape [ 203 ]. One finding is that quotes undergo variation over time but remain recognizable with automated methods [ 203 ].

In our view, computer science research could therefore provide promising solutions to long-standing technical problems in the systematic study of source selection by combining methods from PD and graph analysis. If two articles are strongly similar, the later published article will most likely contain information reused from the earlier published one. This is a typical case in news coverage, e.g., many news outlets copy-edit articles from press agencies or other major news outlets [ 358 ]. Using PD, such as fingerprinting and pattern-based analysis as previously described, to measure the likelihood of information reuse between all possible pairs of articles in a set of related articles implicitly constructs a directed dependency graph. The nodes represent single articles, the directed edges represent the flow of information reuse, and the weights of the edges represent the degree of information reuse. The graph can be analyzed with the help of methods from graph analysis, e.g., to estimate the importance or slant of news outlets or to identify clusters of articles or outlets that frame an event in a similar manner (cf. [ 333 ]). For instance, the more news outlets reuse information from a specific outlet, the higher we can rate that outlet’s importance. The detection of semantic (near-)duplicates would also help lower the number of articles that researchers from the social sciences need to manually investigate when analyzing other forms of media bias in content analyses.
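Assuming pairwise reuse scores are available (e.g., from a fingerprinting step), building and exploiting such a dependency graph reduces to a few lines. All article identifiers, outlets, scores, and the threshold below are hypothetical:

```python
from collections import defaultdict

# Hypothetical articles with outlet and publication order, plus hypothetical
# reuse-likelihood scores between article pairs (e.g., from fingerprinting).
articles = {
    "a1": {"outlet": "AgencyWire", "time": 1},
    "a2": {"outlet": "DailyA",     "time": 2},
    "a3": {"outlet": "DailyB",     "time": 3},
    "a4": {"outlet": "DailyB",     "time": 2},
}
reuse_score = {("a1", "a2"): 0.9, ("a1", "a3"): 0.8, ("a4", "a3"): 0.1}

THRESHOLD = 0.5
edges = defaultdict(list)  # earlier article -> [(later article, weight), ...]
for (u, v), score in reuse_score.items():
    # Direct the edge from the earlier to the later publication.
    if score >= THRESHOLD and articles[u]["time"] < articles[v]["time"]:
        edges[u].append((v, score))

# Simple importance estimate: total weight of outgoing reuse edges per outlet.
importance = defaultdict(float)
for u, outs in edges.items():
    for v, score in outs:
        importance[articles[u]["outlet"]] += score
```

In this toy graph, the agency-like source accumulates all the reuse weight, matching the intuition that outlets whose content is widely reused rank as more important.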

In summary, the analysis of bias by source selection is challenging, since the sources of information are mostly not documented in news articles. Hence, both in the social sciences and in computer science research, only few studies have analyzed this form of bias. Notable exceptions are the studies discussed previously that analyzed quotes used by politicians originating from think tanks. Methods from computer science can in principle provide the required techniques for the (semi-)automated analysis of this form of bias and thus make a very valuable contribution. The methods, most importantly those from plagiarism detection research, could be (and partially already have been [ 309 ]) adapted and extended from academic plagiarism detection and other domains, where reliable methods already exist.

2.3.3 Commission and Omission of Information

Analyses of bias by commission and omission compare the information contained in a news article with that in other news articles or sources, such as police reports and other official reports. The “implementation” and effects of commission and omission overlap with those of source selection, i.e., when information supporting or opposing a perspective is either included in or left out of an article. Analyses in the social sciences aim to determine which frames the information included in such articles supports. For instance, frame analyses typically compare the frequencies of frame-attributing phrases in a set of news articles [ 98 , 120 ]. More generally, content analysis compares which facts are presented in news articles and other sources [ 326 ]. In the following, we describe exemplary studies of each of the two forms.

A frame analysis by Gentzkow and Shapiro quantified phrases that may sway readers to one or the other side of a political issue [ 98 ]. For this analysis, the researchers first examined which phrases were used significantly more often by politicians of one party over another and vice versa. Afterward, they counted the occurrence of phrases in news outlets to estimate the outlet’s bias toward one side of the political spectrum. The results of the study showed that news producers have economic motives to bias their coverage toward the ideological views of their readers. Similarly, another method, briefly mentioned in Sect. 2.3.2 , counts how often US congressmen use the phrases coined by think tanks, which the researchers previously associated with political parties [ 117 ]. One finding is that Fox News coverage was significantly slanted toward the US Republican Party.

A content analysis conducted by Smith et al. [ 326 ] investigated whether the aims of protesters corresponded to the way in which the news reported on demonstrations. One of their key hypotheses was that news outlets will tend to favor the positions of the government over the positions of protesters. In the analysis, Smith et al. extracted relevant textual information from news articles, transcripts of TV broadcasts, and police reports. They then asked analysts to annotate the data and could statistically confirm the previously mentioned hypothesis.

Bias by commission and omission has not specifically been addressed by automated approaches despite the existence of various methods that we consider beneficial for the analysis of both forms of bias in a (semi-)automated manner. Researchers from the social sciences are already using text search to find relevant documents and phrases within documents [ 336 ]. However, search terms need to be constructed manually, and the final analysis still requires a human interpretation of the text to answer coding tasks, such as “assess the spin of the coverage of the event” [ 326 ]. Another challenge is that content analyses comparing news articles with other sources require the development of scrapers and information extractors tailored specifically to these sources. Footnote 3 To our knowledge, there are no established or publicly available generic extractors for commonly used sources such as police reports.

An approach that partially addresses commission and omission of information is aspect-level browsing as implemented in the news aggregator NewsCube [ 276 ]. Park et al. [ 276 ] define an “aspect” as the semantic proposition of a news topic. The aspect-level browsing enables users to view different perspectives on political topics. The approach follows the news aggregation workflow described in Sect. 2.3.1 , but with a novel grouping phase: NewsCube extracts aspects from each article using keywords and syntactic rules and weighs these aspects according to their position in the article (motivated by the inverted pyramid concept: the earlier the information appears in the article, the more important it is [ 52 ]). Afterward, NewsCube performs HAC to group related articles. The visualization is similar to the topic list shown in established news aggregators, but additionally shows different aspects of a selected topic. A user study found that users of NewsCube became aware of the different perspectives and subsequently read more articles containing perspective-attributing aspects. However, the approach cannot reliably assess the diversity of the aspects. NewsCube shows all aspects, even though many of them are similar, which decreases the efficiency of using the visualization to get an overview of the different perspectives in news coverage. Word and phrase embeddings might be used to recognize the similarity of aspects (cf. [ 197 , 319 ]). The visualization also does not highlight which information is subject to commission and omission bias, i.e., what information is contained in one article and left out in another article.

Methods from plagiarism detection (see Sect. 2.3.2 ) open a promising research direction for the automated detection of commission and omission of information in news. More than 80% of related news articles add no new information and only reuse information contained in previously published articles [ 358 ]. Comparing the presented facts of one article with the facts presented in previously published articles would help identify commission and omission of information. Methods from PD can detect and visualize which segments of a text may have been taken from other texts [ 105 ]. The relatedness of bias by source selection and bias by commission and omission suggests that an analysis workflow may ideally integrate methods from PD to address both issues (also see Sect. 2.3.2 ).

Centering resonance analysis (CRA) aims to find how influential terms are within a text by constructing a graph with each node representing a term that is contained in the noun phrases (NP) of a given text [ 60 ]. Two nodes are connected if their terms are in the same NP or boundary terms of two adjacent NPs. The idea of the approach is that the more edges a node has, the more influential its term is to the text’s meaning. To compare two documents, methods from graph analysis can be used to analyze both CRA graphs (Sect. 2.3.2 gives a brief introduction to methods from graph analysis). Researchers from the social sciences have successfully employed CRA to extract influential words from articles and then manually compare the information contained in the articles [ 274 ]. Recent advancements toward computational extraction and representation of the “meaning” of words and phrases, especially word embeddings [ 197 ], may serve as another way to (semi-)automatically compare the contents of multiple news articles.
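A toy version of the CRA graph construction can be sketched as follows; the noun phrases are assumed to be given (e.g., by an NP chunker), and degree centrality serves as a simple proxy for a term’s influence:

```python
from collections import defaultdict
from itertools import combinations

def cra_influence(noun_phrases):
    """Toy CRA: link terms within each noun phrase and the boundary terms
    of adjacent noun phrases; a term's degree approximates its influence."""
    adj = defaultdict(set)
    for np in noun_phrases:
        for a, b in combinations(set(np), 2):   # link terms within an NP
            adj[a].add(b)
            adj[b].add(a)
        for t in np:                            # keep singleton-NP terms
            adj.setdefault(t, set())
    for prev, nxt in zip(noun_phrases, noun_phrases[1:]):
        a, b = prev[-1], nxt[0]                 # link boundary terms
        if a != b:
            adj[a].add(b)
            adj[b].add(a)
    return {t: len(neighbors) for t, neighbors in adj.items()}

# Invented noun phrases from a hypothetical article.
nps = [["president"], ["new", "trade", "deal"], ["deal", "critics"],
       ["trade", "unions"]]
influence = cra_influence(nps)
```

Here “trade” ends up with the highest degree, i.e., it would be ranked as the most influential term; real CRA additionally uses betweenness-style centrality and word resonance across documents.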

To conclude, studies in the social sciences researching bias by commission and omission have always compared the analyzed articles with other news articles and/or non-media sources, such as police reports. No approaches from computer science research specifically aim to identify this form of bias. However, automated methods, specifically PD, CRA, graph analysis, and more recently also word embeddings, are promising candidates to address this form of bias, opening new avenues for unique contributions of well-established computer science methodology in this area. CRA, for instance, has already been employed by researchers from the social sciences to compare the information contained in two articles.

2.3.4 Word Choice and Labeling

When referring to a semantic concept, such as an entity, a geographic position, or an activity, authors can label the concept and choose from various words to refer to it (cf. [ 86 ]). Instances of bias by labeling and word choice frame the referred concept differently, e.g., simply positively or negatively, or they highlight a specific perspective, e.g., economical or cultural (see Sect. 2.2.2 for a background on framing). Examples include “immigrant” or “economic migrant” and “Robert and John got in a fight” and “Robert attacked John.” The effects of this form of bias range from concept level, e.g., a specific politician is shown to be incompetent, to article level, e.g., the overall tone of the article features emotional or factual words [ 263 , 274 ].

Content analyses and framing analyses are used in the social sciences to identify bias by labeling and word choice within news articles. Similar to the approaches discussed in previous sections, the manual coding task is once again time-consuming, since annotating news articles requires careful human interpretation. The analyses are typically either topic-oriented or person-oriented. For instance, Papacharissi and Oliveira used CRA to extract influential words (see Sect. 2.3.3 ). They investigated labeling and word choice in the coverage of different news outlets on topics related to terrorism [ 274 ]. They found that The New York Times used a more dramatic tone, e.g., its articles dehumanized terrorists by not ascribing any motive to terrorist attacks and used metaphors such as “David and Goliath” [ 274 ]. The Washington Post used a less dramatic tone, and both the Financial Times and The Guardian focused their news articles on factual reporting. Another study analyzed whether articles portrayed Bill Clinton, the US president at that time, positively, neutrally, or negatively [ 263 ].

The automated analysis of labeling and word choice in news texts is challenging due to limitations of current NLP methods [ 128 ], which cannot reliably interpret the frame induced by labeling and word choice, due to the frame’s dependency on the context of the words in the text [ 266 ]. Few automated methods from computer science have been proposed to identify bias induced by labeling and word choice. Grefenstette et al. devised a system that investigates the frequency of affective words close to words defined by the user, for example, names of politicians [ 116 ]. They find that the automatically derived polarity scores of named entities are in line with the publicly assumed slant of analyzed news outlets, e.g., George Bush, the Republican US president at that time, was mentioned more positively in the conservative The Washington Times compared to other news outlets.
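The core idea of scoring affective words near entity mentions can be sketched as follows; the tiny lexicons and the scoring formula are illustrative stand-ins, not the actual resources used by Grefenstette et al.:

```python
POSITIVE = {"praised", "strong", "successful"}  # toy affect lexicons; real
NEGATIVE = {"criticized", "weak", "failed"}     # systems use large dictionaries

def polarity_near(text, target, window=4):
    """Count affective words within `window` tokens of each mention of the
    target entity and return a polarity score in [-1, 1]."""
    tokens = text.lower().split()
    pos = neg = 0
    for i, tok in enumerate(tokens):
        if tok == target:
            context = tokens[max(0, i - window): i + window + 1]
            pos += sum(1 for w in context if w in POSITIVE)
            neg += sum(1 for w in context if w in NEGATIVE)
    total = pos + neg
    return (pos - neg) / total if total else 0.0

print(polarity_near("critics criticized bush while allies praised bush", "bush"))
```

Aggregating such scores over all articles of an outlet would yield the kind of entity-level polarity profile the study compared against publicly assumed outlet slants.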

The most closely related field is sentiment analysis , which aims to extract an author’s attitude toward a semantic concept mentioned in the text [ 272 ]. Current sentiment analysis methods reliably extract the unambiguously stated sentiment [ 272 ]. For example, those methods reliably identify whether customers used “positive,” such as “good” and “durable,” or “negative” words, such as “poor quality,” to review a product [ 272 ]. However, the highly context-dependent, hence more ambiguous sentiment in news coverage described previously in this section remains challenging to detect reliably [ 266 ]. Recently, researchers proposed approaches using affect analysis , e.g., using more dimensions than polarity in sentiment analysis to extract and represent emotions induced by a text, and crowdsourcing , e.g., systems that ask users to rate and annotate phrases that induce bias by labeling and word choice [ 277 ]. We describe these fields in the following paragraphs.

While sentiment analysis presents one promising technique to be used for automating the identification of bias by word choice and labeling, the performance of current sentiment classification on news texts is poor (cf. [ 167 , 266 ]) or even “useless” [ 335 ]. Two reasons why sentiment analysis performs poorly on news texts [ 266 ] are (1) the lack of large-scale gold standard datasets and (2) the high context dependency or implicitness of sentiment-inducing phrases. Large annotated datasets are required to train current sentiment classifiers [ 400 ]. More traditional classifiers use manually [ 153 ] or semi-automatically [ 13 , 110 , 335 ] created dictionaries of positive and negative words to score a sentence’s sentiment. However, to our knowledge, no sentiment dictionary exists that is specifically designed for news texts, and generic dictionaries tend to perform poorly on such texts (cf. [ 16 , 167 , 266 ]). Godbole, Srinivasaiah, and Skiena [ 110 ] used WordNet to automatically expand a small, manually created seed dictionary to a larger dictionary. They used the semantic relations of WordNet to expand the manually added words to closely related words. An evaluation showed that the resulting dictionary had similar quality in sentiment analysis as solely manually created dictionaries. However, the evaluation of entity-related sentiment classification using the dictionary on news websites and blogs lacked a comparison against a ground truth, such as an annotated news dataset. Most importantly, dictionary-based approaches are not sufficient for news texts, since the sentiment of a phrase depends on its context; for example, in economics, a “low market price” may be good for consumers but bad for producers.
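The expansion idea can be illustrated in a few lines of Python. A tiny hand-coded synonym map stands in here for WordNet’s semantic relations; this is a sketch of the general approach, not the original method:

```python
# Toy synonym map; WordNet would provide these semantic relations.
SYNONYMS = {
    "good": ["fine", "solid"],
    "bad": ["poor", "awful"],
    "fine": ["decent"],
}

def expand_seed(seed, depth=2):
    """Expand a seed sentiment word list by following synonym links
    up to `depth` hops, keeping the seed's polarity label for all
    newly added words."""
    expanded = set(seed)
    frontier = set(seed)
    for _ in range(depth):
        frontier = {syn for w in frontier for syn in SYNONYMS.get(w, [])}
        frontier -= expanded
        expanded |= frontier
    return expanded

print(sorted(expand_seed(["good"])))  # ['decent', 'fine', 'good', 'solid']
```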

To avoid the difficulties of interpreting news texts, researchers have proposed approaches to perform sentiment analysis specifically on quotes [ 16 ] or on the comments of readers [ 278 ]. The motivation for analyzing only the sentiment contained in quotes or comments is that phrases stated by someone are far more likely to contain an explicit statement of sentiment or opinion-conveying words. While the analysis of quotes achieved poor results [ 16 ], readers’ comments appeared to contain more explicitly stated opinions, on which regular sentiment analysis methods perform better: a classifier that used the extracted sentiments from the readers’ comments achieved a precision of 0.8 [ 278 ].

Overall, the performance of sentiment analysis on news texts is still rather poor. This is attributable to the fact that, thus far, not much research has focused on improving sentiment analysis when compared to the large number of publications targeting the prime use case of sentiment analysis: product reviews. Currently, no public annotated news dataset for sentiment analysis exists, which is a crucial requirement for driving forward successful, collaborative research on this topic.

A final challenge when applying sentiment analysis to news articles is that the one-dimensional positive-negative scale used by all mature sentiment analysis methods may fall short of representing the complexity of news articles. Some researchers suggested investigating emotions or affects, e.g., induced by headlines [ 341 ] or by entire news articles [ 116 ], where investigating the full text seems to yield better results. Affect analysis aims to find the emotions that a text induces on the contained concepts, e.g., entities or activities, by comparing relevant words from the text, e.g., nearby the investigated concept, with affect dictionaries [ 344 ]. Bhowmick [ 28 ] devised an approach that automatically estimates which emotions a news text induces in its readers using features such as tokens, polarity, and semantic representation of tokens. An ML-based approach by Mishne classifies blog posts into emotion classes using features such as n-grams and semantic orientation to determine the mood of the author when writing the text [ 243 ].

Semantics derived using word embeddings may be used to determine whether words in an article contain a slant, since the most common word embedding models contain biases, particularly gender bias and racial discrimination [ 32 , 42 ]. Bolukbasi et al. describe a method to debias word embeddings [ 156 ]; the dimensions that were removed or changed by this process contain potentially biased words; hence, they may also be used to find biased words in news texts.
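The underlying projection can be illustrated as follows. The three-dimensional toy vectors are made up for the example (real embeddings have hundreds of dimensions), but the computation mirrors the idea of measuring a word’s component along a bias direction such as he−she:

```python
# Toy word vectors (hypothetical, 3-dimensional).
VECTORS = {
    "he":    [1.0, 0.0, 0.2],
    "she":   [-1.0, 0.0, 0.2],
    "nurse": [-0.6, 0.5, 0.1],
    "table": [0.0, 0.9, 0.3],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def bias_score(word):
    """Project a word vector onto the normalized he-she direction; words
    with a large absolute projection are candidates for slanted usage."""
    direction = [h - s for h, s in zip(VECTORS["he"], VECTORS["she"])]
    norm = sum(d * d for d in direction) ** 0.5
    unit = [d / norm for d in direction]
    return dot(VECTORS[word], unit)

print(round(bias_score("nurse"), 2))  # -0.6 (leans toward "she")
print(round(bias_score("table"), 2))  # 0.0  (neutral)
```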

Besides fully automated approaches to identify bias by labeling and word choice, semi-automated approaches incorporate users’ feedback. For instance, NewsCube 2.0 employs crowdsourcing to estimate the bias of articles reporting on a topic. The system allows users to collaboratively annotate bias by labeling and word choice in news articles [ 277 ]. Afterward, NewsCube 2.0 presents contrastive perspectives on the topic to users. In their user study, Park et al. [ 277 ] find that NewsCube 2.0 supports participants in collectively organizing news articles according to their slant. Section 2.3.8 describes AllSides, a news aggregator that employs crowdsourcing, though not to identify bias by labeling and word choice but to identify spin bias, i.e., the overall slant of an article.

The forms of bias by labeling and word choice have been studied extensively in the social sciences using frame analyses and content analyses. However, to date, not much research on both forms has been conducted in computer science. Yet, the previously presented techniques from computer science, such as sentiment analysis and affect analysis, are already capable of achieving reliable results in other domains. Besides, crowdsourcing has already successfully been used to identify instances of such bias.

2.3.5 Placement and Size Allocation

The placement and size allocation of a story indicates the value a news outlet assigns to that story [ 14 , 64 ]. Long-term analyses reveal patterns of bias, e.g., the favoring of specific topics or the avoidance of others. Furthermore, the findings of such an analysis should be combined with frame analysis to give comprehensive insights into the bias of a news outlet, e.g., a news outlet might report disproportionately on one topic, but otherwise, its articles may be well-balanced and objective [ 75 ].

The first manual studies on the placement and size of news articles in the social sciences were already conducted in the 1960s. Researchers measured the size and the number of columns of articles present in newspapers, or the broadcast length in minutes dedicated to a specific topic, to investigate if there were any differences in US presidential election coverage [ 337 , 338 , 339 , 340 ]. These early studies, and also a more recent study conducted in 2000 [ 34 ], found no significant differences in article size between the news outlets analyzed. Fewer studies have focused on the placement of an article, but found that article placement does not reveal patterns of bias for specific news outlets [ 339 , 340 ]. Related factors that have also been considered are the size of headlines and pictures (see also Sect. 2.3.6 for more information on the analysis of pictures), which also showed no significant patterns of bias [ 339 , 340 ].

Bias by article placement and size has not been revisited recently, even though the rise of online news and social media may have introduced significant changes. Traditional printed news articles are a permanent medium, in the sense that once they were printed, their content could not (easily) be altered, especially not for all issues ever printed. However, online news websites are often updated. For example, if a news story is still developing, the news article may be updated every few minutes (cf. [ 59 ]). Such updates of news articles also include the placement and allotted size of previews of articles on the main page and on other navigational pages. To our knowledge, no study has yet systematically analyzed the changes in the size and position of online news articles over time.

Fully automated methods are able to measure placement and size allocation of news articles because both forms can be determined by volumetric measurements (see Sect. 2.2.4 ). Printed newspapers must be digitized first, e.g., using optical character recognition (OCR) and document segmentation techniques [ 160 , 248 ]. Measuring a digitized or online article’s placement and size is a trivial task. Due to the Internet’s inherent structure of linked websites, online news even allows for more advanced and fully automated measurements of news article importance, such as PageRank [ 269 ], which could also be applied within the pages of the publishing news outlet. Most popular news datasets, such as RCV1 [ 205 ], are text-based and do not contain information on the size and placement of a news article. Future research, however, should especially take into consideration the fast pace of online news production as described previously.
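As an illustration, PageRank over a site-internal link graph can be computed with a few lines of power iteration. This is a textbook sketch with a made-up link structure, not an evaluation of any real outlet:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank by power iteration over a link graph
    (page -> list of outgoing links), as could be applied to the internal
    link structure of a news site to estimate a page's prominence."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

# Hypothetical internal links: the front page is linked from every article.
links = {"front": ["a", "b"], "a": ["front"], "b": ["front", "a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # 'front' is the most prominent page
```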

While measuring size and placement automatically is a straightforward task in computer science, only a few specialized systems currently exist that can measure these forms of news bias. Saez-Trumper, Castillo, and Lalmas [ 307 ] devised a system that measures the importance ascribed to a news story by an outlet by counting the total number of words of news articles reporting on the story. To measure the importance ascribed to the story by the outlet’s readers, the system counts the number of tweets linking to these news articles. One finding is that both factors are slightly correlated. NewsCube’s visualization is designed to provide equal size and avoid unfair placement of news articles so as to “not skew users’ visual attention” [ 276 ]. Even though the authors ascribe this issue high importance in their visualization, they do not analyze placement and size in the underlying articles.

Research in the social sciences and in computer science benefits from the increasing accessibility of online news, which allows effective automated analysis of bias by taking into consideration article placement and size. Measuring the placement and size of articles is a trivial and scalable task that can be performed on any number of articles without requiring high manual effort. However, most recent studies in the social sciences have not considered including bias by placement and size in their analysis. The same is true for systems in computer science, which should similarly include the placement and size of articles as an additional dimension of media bias. Since the conclusions drawn from the analysis of traditional, printed articles still need verification for online media, computer science approaches can make a truly unique contribution here.

2.3.6 Picture Selection

Pictures contained in news articles can influence how readers perceive a reported topic [ 304 ]. In particular, readers who wish to get an overview of current events are likely to browse many articles and thus view only each article’s headline and image. The effects of picture selection even go so far as to influence readers’ voting preferences in elections [ 304 ]. Reporters or news agencies sometimes (purposefully) show pictures out of context [ 83 ], e.g., a popular picture in 2015 showed an aggressive refugee with an alleged ISIS flag fighting against police officers. It later turned out that the picture was taken in 2012, before the rise of ISIS, and that the flag was not related to ISIS [ 70 ]; hence, the media had falsely linked the refugee with the terrorist organization.

Researchers from the social sciences have analyzed pictures used in news articles for over 50 years [ 173 ], approximately as long as media bias itself has been studied. Basic studies count the number of pictures and their size to measure the degree of importance ascribed by the news outlet to a particular topic (see also Sect. 2.3.5 for information on bias by size). In this section, we describe the techniques studies use to analyze the semantics of selected images. To our knowledge, all bias-related studies in the social sciences are concerned with political topics. Analyses of picture selection are either person-oriented or topic-oriented .

Person-oriented analyses ask analysts to rate the articles’ pictures showing specific politicians. Typical rating dimensions are [ 169 , 371 ]:

Expression , e.g., smiling vs. frowning

Activity , e.g., shaking hands vs. sitting

Interaction , e.g., cheering crowd vs. alone

Background , e.g., the country’s flags vs. not identifiable

Camera angle , e.g., eye-level shots vs. shots from above

Body posture , e.g., upright vs. bowed torso

Findings are mixed, e.g., a study from 1998 found no significant differences in the selected pictures between the news outlets analyzed, e.g., whether the pictures selected by a specific news outlet were in favor of a specific politician [ 371 ]. Another study from 1988 found that The Washington Post did not exhibit significant picture selection bias but that The Washington Times selected images that were more likely favorable toward Republicans [ 169 ]. A study of German TV broadcasts in 1976 found that one candidate for the German chancellorship, Helmut Schmidt, was significantly more often shown in favorable shots, including better camera angles and reactions of citizens, than the other main candidate, Helmut Kohl [ 171 ].

Topic-oriented analyses do not investigate bias toward persons but toward certain topics. For instance, a recent study on Belgian news coverage analyzed the presence of two frames [ 369 ]: asylum seekers in Belgium are (1) victims that need protection or (2) intruders that disturb Belgian culture and society. Articles supporting the first frame typically chose pictures depicting refugee families with young children in distress or expressing fear. Articles supporting the second frame chose pictures depicting large groups of mostly male asylum seekers. The study found that the victim frame was predominantly adopted in Belgian news coverage, particularly in the French-speaking part of Belgium. The study also revealed a temporal pattern: during Christmas time, the victim frame was even more predominant.

To our knowledge, there are currently no systems or approaches from computer science that analyze media bias through image selection. However, methods in computer vision can measure many of the previously described dimensions. This is especially true since the recent rise of deep learning, where current methods achieve unprecedented classification performance [ 370 ]. Automated methods can identify faces in images, recognize emotions, categorize objects shown in pictures, and even generate captions for a picture. Research has advanced so far in these applications that several companies, such as Facebook, Microsoft, and Google, are using such automated methods in production, e.g., in autonomous cars, or are offering them as a paid service.

In the broad context of bias through image selection, Segalin et al. [ 317 ] trained a convolutional neural network (CNN) on the Psycho-Flickr dataset to estimate the personality traits of the pictures’ authors. To evaluate the classification performance of the system, they compared the CNN’s classifications with self-assessments by the picture authors and with attributed assessments by participants of a study. The results of their evaluation suggest that CNNs are suitable for deriving characteristics that are not even visible in the analyzed pictures.

Picture selection is an important factor in the perception of news. Basic research from psychology has shown that image selection can slant coverage toward one direction, although studies in the social sciences on bias by selection in the past concluded that there were no significant differences in picture selection. Advances in image processing research and the increasing accessibility of online news provide completely new avenues to study the potential effects of picture selection. Computer science approaches can here primarily contribute by enabling the automated analysis of images on a much bigger scale, allowing us to reopen important questions on the effect of picture selection in news coverage and beyond.

2.3.7 Picture Explanation

Captions below images and referrals to the images in the main text provide images with the needed textual context. Images and their captions should be analyzed jointly because text can change a picture’s meaning and vice versa [ 172 , 173 ]. For instance, during Hurricane Katrina in 2005, two similar pictures published in US media showed survivors wading away with food from a grocery store. The only difference was that one picture showed a black man, who “looted” the store, while the other picture depicted a white couple, who “found” food in the store [ 328 ].

Researchers from the social sciences typically perform two types of analyses that are concerned with bias from image captions: jointly analyzing image and caption, or only analyzing the caption, ignoring the image. Only few studies analyze captions and images jointly. For instance, a comparison of images and captions from The Washington Post and The Washington Times found that the captions were not significantly biased [ 169 ]. A frame analysis on the refugee topic in Belgian news coverage also took into consideration image captions. However, the authors focused on the overall impression of the analyzed articles rather than examining any potential bias specifically present in the picture captions [ 369 ].

The vast majority of studies analyze captions without placing them in context with their pictures. Studies and techniques concerned with the text of a caption (but not the picture) are described in the previous sections, especially in the sections for bias by commission and omission (see Sect. 2.3.3 ) and labeling and word choice (see Sect. 2.3.4 ). We found that most studies in the social sciences either analyze image captions as a component of the main text or analyze images but disregard their captions entirely [ 339 , 340 , 371 ]. Likewise, relevant methods from computer science are effectively the same as those concerned with bias by commission and omission (see Sect. 2.3.3 ) and labeling and word choice (see Sect. 2.3.4 ). For the other type of studies, i.e., jointly analyzing images and captions, relevant methods are discussed in Sect. 2.3.6 , i.e., computer vision to analyze the contents of pictures, and additionally in Sections 2.3.3 and 2.3.4 , e.g., sentiment analysis to find biased words in captions.

To our knowledge, no study has examined picture referrals contained in the article’s main text. This is most likely due to the infrequency of picture referrals.

The few analyses on captions suggest that bias by picture explanation is not very common. However, more fundamental studies show the impact of captions on the perception of images and note rather subtle differences in word choice. While many studies analyzed captions as part of the regular text, e.g., analyzing bias by labeling and word choice, research currently lacks specialized analyses that examine captions in conjunction with their images.

2.3.8 Spin: The Vagueness of Media Bias

Bias by spin is closely related to all other forms of media bias and is also the vaguest form. Spin is concerned with the context of presented information. Journalists create the spin of an article on all textual levels, e.g., by supporting a quote with an explanation (phrase level), by highlighting certain parts of the event (paragraph level), or even by concluding the article with a statement that frames all previously presented information differently (article level). The order in which facts are presented to the reader influences what is perceived (e.g., some readers might only read the headline and lead paragraph) and how readers rate the importance of reported information [ 52 ]. Not only the text of an article but all other elements, including pictures, captions, and the presentation of the information, contribute to an article’s overall spin.

In the social sciences, the two primarily used methods to analyze the spin of articles are frame analysis and more generally content analysis. For instance, one finding in the terrorism analysis conducted by Papacharissi and Oliveira (see Sect. 2.3.2 ) was that The New York Times often personified the events in their articles, e.g., by focusing on persons involved in the event and the use of dramatic language [ 274 ].

Some practices in journalism can be seen as countermeasures to mitigate media bias. Press reviews summarize an event by referring to the main statements found in articles by other news outlets. This does not necessarily reveal media bias, because any perspective can be supported by source selection, e.g., if only “reputable” outlets are used. However, press reviews typically broaden a reader’s understanding of an event and might be a starting point for further research. Another practice that supports the mitigation of media bias is opposing commentaries in newspapers, where two authors subjectively elaborate their perspectives on the same topic. Readers will see both perspectives and can make their own decisions regarding the topic.

Social media has given rise to new collaborative approaches to media bias detection. Reddit is a social news aggregator, where users post links or texts regarding current events or other topics and rate or comment on posts by other users. Through the comments on a post, a discussion can emerge that is often controversial and contains the various perspectives of commenters on the topic. Reddit also has a “media bias” thread where contributors share examples of biased articles. Wikinews is a collaborative news producer, where volunteers author and edit articles. Wikinews aims to provide “reliable, unbiased and relevant news […] from a neutral point of view.” However, it suffers from two main issues: first, the mixed quality of the news items, because many authors may participate in producing them, and second, the low number of articles, i.e., only major events are covered in the English version, and other languages have even fewer articles. Thus, Wikinews currently cannot be used as a primary, fully reliable news source. Some approaches employ crowdsourcing to visualize different opinions or statements on politicians or news topics; for example, the German news outlet Spiegel Online frequently asks readers to define their position regarding two pairs of contrary statements that span a two-dimensional map [ 331 ]. Below the map, the news outlet lists excerpts from other outlets that support or contradict the map’s statements.

The automated analysis of spin bias using methods from computer science is perhaps the most challenging of all because its manifestation is the vaguest among the forms of bias discussed. Spin refers to the overall perception of an article. Bias by spin is not, however, just the sum of all other forms but includes further factors, such as the order of information presented in a news article, the article’s tone, and the emphasis on certain facts. Methods we describe in the following are partially also relevant for other forms of bias. For instance, the measurement of an article’s degree of personification in the study on terrorism in news coverage [ 274 ] is supported by the computation of CRA [ 52 ]. What is not automated is the annotation of entities and their association with an issue. Named entity extraction [ 255 , 391 ] could be used to partially automate these previously manually performed tasks.

Other approaches analyze news readers’ input, such as readers’ comments, to identify differences in news coverage. The rationale of these approaches is that readers’ input contains explicitly stated opinions and sentiment on certain topics, which are usually missing from the news article itself. Explicitly stated opinions can reliably be extracted with the help of NLP methods, such as sentiment analysis. For instance, one method analyzes readers’ comments to categorize related articles [ 1 ]. The method measures the similarity of two articles by comparing their reader comments, thereby focusing in each comment on the mentioned entities, the expressed sentiment, and the country of the comment’s author. Another method counts and analyzes Twitter followers of news outlets to estimate the political orientation of the audience of a news outlet [ 111 ]. A seed group of Twitter accounts is manually rated according to their political orientation, e.g., conservative or liberal. This group is automatically expanded using those accounts’ followers. The method then estimates the political orientation of a news outlet’s audience by averaging the political orientation of the outlet’s followers in the expanded group of categorized accounts (cf. [ 98 , 117 , 220 ]).
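The averaging step of the follower-based method can be sketched in a few lines; the accounts and labels below are made up for illustration:

```python
def outlet_orientation(followers, seed_labels):
    """Estimate a news outlet's audience orientation by averaging the
    labels (-1 = liberal, +1 = conservative) of its followers that appear
    in a (seed-expanded) group of rated accounts. Followers without a
    label are ignored."""
    labels = [seed_labels[f] for f in followers if f in seed_labels]
    return sum(labels) / len(labels) if labels else 0.0

seed = {"@alice": -1, "@bob": 1, "@carol": 1}
print(outlet_orientation(["@alice", "@bob", "@carol", "@dave"], seed))
```

A score near +1 or -1 would suggest a politically homogeneous audience, whereas a score near 0 indicates either a mixed audience or insufficient labeled followers.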

The news aggregator AllSides [ 8 ] shows users the most contrastive articles on a topic, e.g., left and right leaning on a political spectrum. The system asks users to rate the spin of news outlets, e.g., after reading articles published by these outlets. To estimate the spin of an outlet, AllSides uses the feedback of users and expert knowledge provided by their staff. NewsCube 2.0 lets (expert) users collaboratively define and rate frames in related articles [ 277 ]. The frames are in turn presented to other users, e.g., a contrast view shows the most contrasting frames of one event. Users can then incrementally improve the quality of coding by refining existing frames.

Another method for news spin identification categorizes news articles on contentious news topics into two (opposing) groups by analyzing quotes and nearby entities [ 275 ]. The rationale of the approach is that articles portraying a similar perspective on a topic have more common quotes, which may support the given perspective, than articles that have different perspectives. The method extracts weighted triples representing who criticizes whom, where the weight depends on the importance of the triple, e.g., estimated by the position within the article (the earlier, the more important). The method measures the similarity of two articles by comparing their triples.
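A simplified sketch of this comparison, assuming the who-criticizes-whom triples have already been extracted (the extraction from quotes and nearby entities, which the original method performs, is omitted):

```python
def triple_weights(triples):
    """triples: list of (critic, target) pairs in article order. Earlier
    triples receive a higher weight (the earlier, the more important)."""
    weights = {}
    for position, triple in enumerate(triples):
        w = 1.0 / (1 + position)
        weights[triple] = max(weights.get(triple, 0.0), w)
    return weights

def article_similarity(a, b):
    """Weighted overlap of two articles' criticism triples; articles that
    share important triples are assumed to portray a similar perspective."""
    shared = set(a) & set(b)
    total = sum(a.values()) + sum(b.values())
    return sum(a[t] + b[t] for t in shared) / total if total else 0.0

art1 = triple_weights([("senator", "bill"), ("union", "bill")])
art2 = triple_weights([("senator", "bill")])
print(article_similarity(art1, art2))  # 0.8
```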

Other methods analyze frequencies and co-occurrences of terms to find frames in related articles and assign each article to one of the frames. For instance, one method clusters articles by measuring the similarity of two documents using the co-occurrences of the two documents’ most frequent terms [ 241 ]. The results of this rather simple method are then used for a manually conducted frame analysis. Hiérarchie uses recursive topic modeling to find topics and subtopics in tweets posted by users on a specific issue [ 325 ]. A radial treemap visualizes the extracted topics and subtopics. In the presented case study, users find and explore different theories on the disappearance of flight MH-370 discussed in tweets.
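The clustering criterion of the first method can be approximated as follows; this sketch uses simple frequency counts, and the published method’s exact similarity measure may differ:

```python
from collections import Counter

def top_terms(text, k=5):
    """The document's k most frequent terms (stop-word filtering omitted)."""
    return {w for w, _ in Counter(text.lower().split()).most_common(k)}

def doc_similarity(text_a, text_b, k=5):
    """Similarity via the overlap of the two documents' most frequent
    terms, usable as a clustering criterion before a manual frame analysis."""
    a, b = top_terms(text_a, k), top_terms(text_b, k)
    return len(a & b) / len(a | b) if a | b else 0.0

d1 = "tax tax reform reform budget vote"
d2 = "tax tax budget budget speech vote"
print(doc_similarity(d1, d2))  # 0.6
```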

Lastly, manually annotated information related to media bias, e.g., the overall spin of articles rated by users of AllSides or articles annotated by social scientists during frame analysis, can in our view serve as a basis when creating training datasets for machine learning . Other data that exploits the wisdom of the crowd might be incorporated as well, e.g., analyzing the Reddit media bias thread. However, one should carefully review the information for its characteristics and inherent biases, especially if crowdsourced.

In our view, the existence of the very concept of spin bias allows drawing two conclusions. First, media bias is a complex concept of skewed news coverage with overlapping and partially contradictory definitions. While many instances of media bias fit into one of the other, more precisely defined forms of media bias defined in the news production and consumption process (see Sect. 2.2.3 ), some instances of bias do not. Likewise, such instances of bias may fit into other models from the social sciences that are concerned with differences in news coverage, such as coverage, gatekeeping, and statement bias (Sect. 2.2.3 briefly discusses other models of media bias), while other instances would not fit into such models. Second, we found that most of the approaches from computer science for identifying, or suitable for identifying, spin bias omit the research that has been conducted in the social sciences. Computer science approaches currently still address media bias as vaguely defined differences in news coverage and therefore stand to profit from prior research in the social sciences. In turn, there are few scalable approaches to the analysis of media bias in the social sciences, which significantly hampers progress in the field. We therefore see a strong prospect for collaborative research on automated approaches to the analysis of media bias across both disciplines.

2.3.9 Summary

Most automated approaches focus on analyzing vaguely defined “biases.” These biases can be technically significant but may often not represent meaningful slants of the news. In contrast, in social science research, media bias is established by observing systematic tendencies in specific bias forms or means. For example, the news production process that we use in our literature review defines nine bias forms.

One reason for the previously mentioned lack of conclusive or meaningful results is that almost no automated approach aims to specifically find such individual bias forms. At the same time, however, we found that suitable automated techniques are available to aid in the analysis of the individual bias forms.

2.4 Reliability, Generalizability, and Evaluation

This section discusses how automated approaches for analyzing media bias should be evaluated. To this end, we first describe how social scientists measure the reliability and generalizability of studies on media bias.

The reliability and generalizability of manual annotation in the social sciences provide the benchmark for any automated approach. Best practices in social science research involve both the careful development and iterative refinement of underlying codebooks and the formal validation of inter-coder reliability. For example, as discussed in Sect. 2.2.4 , a smaller, careful inductive manual annotation aids in constructing the codebook. The main deductive analysis is then performed by a larger pool of coders, whose individual results and agreement on the assignment of codes can be systematically compared. Standard measures of inter-coder reliability, e.g., the widely used Krippendorff’s alpha [ 144 ], provide estimates for the reliability and robustness of the coding. Whether coding rules, and with them the quality of annotations, generalize beyond a specific case is not routinely analyzed because, given the significant effort required for manual annotation, the scope of such studies is usually limited to a specific question or context. Note, however, that the usual setup of a small inductive analysis, conducted on a subset of the data, implies that a codebook generated in this way can generalize to the larger corpus.
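
As an illustration of such a reliability estimate, the following minimal sketch computes Krippendorff's alpha for nominal data via the standard coincidence-matrix formulation; it is not tied to any particular study discussed here and handles only the nominal case.

```python
from collections import Counter
from itertools import permutations


def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    `units` is a list of coding units, each unit being the list of codes
    assigned by the coders (units with fewer than two codes are skipped).
    Returns 1.0 for perfect agreement, about 0.0 for chance-level
    agreement, and negative values for systematic disagreement.
    """
    coincidences = Counter()  # coincidence matrix o_ck over ordered code pairs
    for codes in units:
        m = len(codes)
        if m < 2:
            continue
        for c, k in permutations(codes, 2):
            coincidences[(c, k)] += 1.0 / (m - 1)

    totals = Counter()  # marginal frequency n_c per code
    for (c, _), v in coincidences.items():
        totals[c] += v
    n = sum(totals.values())

    observed = sum(v for (c, k), v in coincidences.items() if c != k)
    expected = sum(
        totals[c] * totals[k] for c in totals for k in totals if c != k
    ) / (n - 1)
    return 1.0 - observed / expected if expected else 1.0
```

For instance, two coders who always agree yield an alpha of 1.0, whereas two coders who always disagree on two balanced categories yield −0.5.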

Computer science approaches for the automated analysis of media bias stand to profit considerably from a broad adoption of their methods by researchers across a wider set of disciplines. The impact and usefulness of automated approaches for substantial cross-disciplinary analyses, however, hinge critically on two central questions. First, how reliable are automated approaches compared to manual methodologies? Specifically, broad adoption of automated approaches in social science applications is only likely if the automated approaches identify at least close to the same instances of bias as manual annotation would.

Depending on which more or less subtle form of bias is analyzed, the results gained through manual annotation might represent a more or less difficult benchmark to beat. Especially in complex cases, manual annotation of individual items may capture subtle instances relevant to the analysis question systematically better than automated approaches. Note that, for example, currently no public annotated news dataset for sentiment analysis exists (see Sect. 3.4 ). The situation is similar for most of the applications reviewed in this chapter, i.e., there is currently a dearth of standard benchmark datasets. Meaningful validation would thus require, as a first step, the careful (and time-intensive) development of such datasets across a range of relevant contexts.

One way to counter the present lack of evaluation datasets is to not rely solely on manual content analysis for annotation. For simple annotation tasks, such as rating the subjective slant of a news picture, crowdsourcing can be a suitable alternative to content analysis. This procedure requires less effort than conducting a full content analysis, including creating a codebook and refining it until the inter-coder reliability (ICR) is sufficiently high (cf. [ 152 ]). One can also use other available data. For example, Recasens, Danescu-Niculescu-Mizil, and Jurafsky [ 297 ] use bias-related revisions from the Wikipedia edit history to retrieve presumably biased single-word phrases. The political slant classification of news articles and outlets crowdsourced by users of web services such as AllSides (see Sect. 2.3.8 ) may serve as another comparison baseline. As stated in Sect. 2.3.8 , before employing crowdsourced information, one should carefully review its characteristics and quality.
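
The idea of mining bias-related revisions can be illustrated in a few lines; this is a simplified stand-in for the cited approach, operating on one invented before/after sentence pair rather than actual Wikipedia revision data.

```python
import difflib


def removed_words(before, after):
    """Words deleted between two revisions of a sentence.

    Words an editor removed in a neutrality-motivated revision are
    candidates for a lexicon of bias-inducing terms.
    """
    a, b = before.split(), after.split()
    matcher = difflib.SequenceMatcher(a=a, b=b)
    removed = []
    for op, a0, a1, b0, b1 in matcher.get_opcodes():
        if op in ("replace", "delete"):
            removed.extend(a[a0:a1])
    return removed


# The deleted intensifier is a candidate bias word.
before = "the senator delivered a disastrous speech"
after = "the senator delivered a speech"
```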

Another way to evaluate the performance of bias identification methods is to manually analyze the automatically extracted instances of media bias, e.g., through crowdsourcing or (typically fewer) specialized coders. However, evaluating the results of an automated approach this way decreases the comparability between approaches, since each approach has to be evaluated manually anew. Generating annotated benchmark datasets, on the other hand, requires greater initial effort, but the results can then be reused to evaluate and compare multiple approaches. Footnote 7

The second central question is how well automated approaches generalize to the study of similar forms of bias in contexts other than those for which they were initially developed. This question pertains to the external validity of the developed approaches, i.e., is their performance dependent on a specific empirical or topical context? Out-of-sample performance could be tested against benchmark datasets not used for the initial evaluation; however, as emphasized before, such datasets must still be developed. Hence, systematically testing the performance of approaches across many contexts is likely infeasible for the near future simply because the cost of generating benchmark datasets is too high. Ultimately, it would be best practice for benchmark studies to establish more generally whether specific characteristics of news are related to the performance of the automated approaches developed.

2.5 Key Findings

News coverage strongly influences public opinion. While slanted news coverage is not harmful per se, systematically biased news coverage can negatively impact the public. Recent trends, such as social bots that automatically write news posts or the centralization of media outlet ownership, have the potential to further amplify the negative effects of biased news coverage. News consumers should be able to view different perspectives of the same news topic [ 252 ]. Unrestricted access to unbiased information is crucial for citizens to form their own views and make informed decisions [ 135 , 250 ], e.g., during elections. Since media bias has been, and continues to be, structurally inherent in news coverage [ 146 , 147 , 276 ], the detection and analysis of media bias is a topic of high societal and policy relevance.

Researchers from the social sciences have studied media bias over the past decades, resulting in a comprehensive set of methodologies, such as content analysis and frame analysis, as well as models to describe media bias. One of these models, the news production process , describes how journalists turn events into news articles. The process defines nine forms of media bias that can occur during the three phases of news production: In the first phase, “gathering of information,” the bias forms are (1) event selection, (2) source selection, and (3) commission and omission of information. In the second phase, “writing,” the bias forms are (4) labeling and word choice. In the third phase, “editing,” the bias forms are (5) story placement, (6) size allocation, (7) picture selection, and (8) picture explanation. Lastly, bias by (9) spin is a form of media bias that represents the overall bias of a news article and essentially combines the other forms of bias, including minor forms not defined specifically by the news production and consumption process.

For each of these forms of media bias, we discussed exemplary approaches applied in the social sciences and described the automated methods from computer science that have been used, or could best be used, to address the particular form of bias. We summarize the findings of our review of the status quo as follows:

Only a few approaches in computer science address the analysis of media bias. The majority of these approaches analyze media bias from the perspective of regular news consumers and neglect both the approaches and the models that have already been developed in the social sciences. In many cases, the underlying models of media bias are too simplistic, and their results, when compared to models and results of research in the social sciences, do not provide additional insights.

The majority of content analyses in the social sciences do not employ state-of-the-art methods for automated text analysis. As a result, the manual content analysis approaches conducted by social scientists require exacting and very time-consuming effort, as well as significant expertise and experience. This severely limits the scope of what social scientists can study and has significantly hampered progress in the field.

Thus, there is, in our view, much potential for interdisciplinary research on media bias among computer scientists and social scientists. Automated approaches are available for each of the nine forms of media bias that we discussed. On the one hand, methodologies and models of media bias in the social sciences can help to make automated approaches more effective. On the other hand, the development of automated methods to identify instances of specific forms of media bias can make content analysis in the social sciences more efficient by automating more tasks.

Media bias analysis is a rather young research topic within computer science, particularly when compared with the social sciences, where the first studies on media bias were published more than 70 years ago [ 172 , 377 ]. Our first finding (F1) is that most of the reviewed computer science approaches treat media bias vaguely and view it only as “differences of [news] coverage” [ 278 ], “diverse opinions” [ 251 ], or “topic diversity” [ 252 ]. The majority of the current approaches neglect the state of the art developed in the social sciences. They do not make use of models describing different forms of media bias or how biased news coverage emerges in the news production and consumption process [ 14 , 276 ] (Sect. 2.2.3 ). Also, approaches in computer science do not employ methods to analyze the specific forms of bias, such as content analysis [ 64 ] and frame analysis [ 368 ] (Sect. 2.2.4 ). Consequently, many approaches in computer science are limited in their capability of identifying instances of media bias. For instance, matrix-based news aggregation (MNA) organizes articles and topics in a matrix to facilitate showing differences in international news topics, but the approach can neither determine whether there are actual differences nor guarantee that such differences are found [ 129 ]. Likewise, Hiérarchie finds subtopics in news posts that may or may not refer to differences caused by media bias [ 325 ]. To overcome the limitations in identifying bias, some approaches, such as NewsCube 2.0 [ 277 ] and AllSides (Sect. 2.3.8 ), outsource the task of identifying media bias to users, e.g., by asking users to manually rate the slant of news articles.

Content analysis and frame analysis both require significant manual effort and expertise (F2). Especially time-intensive are the tasks of systematically screening and subsequently annotating texts. Such tasks can currently only be performed by human coders [ 64 , 368 ]. Currently, in our view, the execution of these tasks cannot be improved significantly by employing automated text analysis methods due to the lack of mature methods capable of identifying specific instances of media bias, which follows from F1. This limitation, however, may be revised once interdisciplinary research has resulted in more advanced automated methods. Other tasks, such as data gathering or searching for relevant documents and phrases, are already supported by basic (semi-)automated methods and tools, such as content analysis software [ 215 ]. However, clearly the full potential of the state of the art in computer science is not yet being exploited. The employed techniques, e.g., keyword-based text matching to find relevant documents [ 336 ] or frequency-based extraction of representative terms to find patterns [ 215 ], are rather simple compared to state-of-the-art methods for text analysis. Few of the reviewed tools used by researchers in the social sciences employ methods proven effective in natural language processing, such as the resolution of coreferences or synonyms or finding related articles using an event-based search approach.

In our view, combining the expertise of the social sciences and computer science results in valuable opportunities for interdisciplinary research (F3). Reliable models of media bias and manual approaches for the detection of media bias can be combined with methods for automated data analysis, in particular, with text analysis and natural language processing approaches. NewsCube [ 276 ], for instance, extracts so-called aspects from news articles, which refer to the frames defined by social scientists [ 159 ]. Users of NewsCube became more aware of the different perspectives contained in news coverage on specific topics than users of Google News. In this chapter, we showed that promising automated methods from computer science are available for all forms of media bias as defined by the news production and consumption process (see Sect. 2.3 ). For instance, studies concerned with bias by source selection or the commission and omission of information investigate how information is reused in news coverage [ 98 , 117 , 120 ]. Similar to these studies, methods from plagiarism detection aim to identify instances of information reuse in a set of documents, and these methods yield reliable results for plagiarism with sufficient textual similarity [ 89 , 179 ]. Finally, recent advancements in text analysis, particularly word embeddings [ 197 ] and deep learning [ 198 ], open a promising area of research on media bias. Thus far, few studies use word embeddings and deep learning to analyze media bias in news coverage. However, these techniques have proven very successful in various related problems (cf. [ 5 , 191 , 306 , 311 ]), which lets us anticipate that the majority of the textual bias forms could be addressed effectively with such approaches.
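
To illustrate how such similarity-based reuse detection works in principle (a generic TF-IDF sketch, not one of the cited plagiarism detection methods), pairwise cosine similarity over a few invented article snippets reveals which articles likely share information:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [
    "Trump was booed by the audience during the debate.",
    "During the debate, the audience booed Trump.",
    "Christie pressed Rubio over his repeated talking points.",
]

# TF-IDF vectors per article; high cosine similarity between a pair
# suggests reused information, while low similarity hints at omission
# or different sourcing.
tfidf = TfidfVectorizer().fit_transform(articles)
sims = cosine_similarity(tfidf)
```

Here, the first two snippets, which reuse the same information, score much higher than the unrelated third snippet.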

We believe that interdisciplinary research on media bias can result in three main benefits. First, automated approaches for analyzing media bias will become more effective and more broadly applicable, since they build on the substantial, theoretical expertise that already exists in the social sciences. Second, content analyses in the social sciences will become more efficient, since more tasks can be automated or supported by automated methods from computer science. Finally, we argue that news consumers will benefit from improved automated methods for identifying media bias, since the methods can be used by news aggregators to detect and visualize the occurrence of potential media bias in real time.

2.6 Practical View on the Research Gap: A Real-World Example

This section practically demonstrates the implications of the literature review’s findings using a real-world example of news coverage and consumption.

Suppose you are reading the news. When viewing the coverage on an event, e.g., in your favorite news aggregator, or a single article reporting on the event, e.g., on the website of your favorite news outlet, you may wonder whether there might be other perspectives on the event. What information are you missing because it is not mentioned in the articles you viewed or read? Mapping these questions to the terminology introduced earlier, the objective in this scenario is to efficiently and effectively get an overview of all the major perspectives present in the media. Efficiency is vital since newsreaders typically have only limited time for informing themselves on current events. While this example entails only one event, newsreaders are interested in multiple events, further limiting the time available for a single event. Effectiveness refers to understanding distinct and meaningful perspectives that help determine whether one already has a comprehensive overview of the coverage or if and which articles may offer alternative interpretations or additional information.

Table 2.2 shows headlines of news articles reporting on the Republican Party debate during the US presidential primaries in New Hampshire hosted by ABC News on February 6, 2016. We selected the articles using the following criteria: they had to primarily report on the event and be published by a popular online US news outlet Footnote 8 on the day of the event or the day after. This way, we retrieved more than 30 articles. Afterward, we conducted an inductive frame analysis (Sect. 2.2.4.2 ) to get a comprehensive overview of the content and perspectives present in the event coverage. For the sake of simplicity in this example, we selected eight articles that represented all major perspectives with only minor differences between the articles. In daily news consumption, the eight articles could, for example, be the results of an online search for coverage on the event or be shown in a news aggregator or another news application. Note that our pre-selection of articles already represents an unrealistic advantage over regular news consumption with respect to the example’s objective because the article set is small and at the same time fully represents the coverage’s substantial frames.

Interactive experiment

Look at the headlines in Table 2.2 . The headlines are taken from news articles that report on a debate during the 2016 presidential primaries. Estimate how many major perspectives there are in the event coverage on the debate. Think of a perspective as a distinct viewpoint on the debate that is the most prominent viewpoint common to one or more articles.

Next, decide for each article which perspective it has on the event.

You can try to increase the “accuracy” of your results by looking at further information, such as the articles’ outlets, their political orientation (Table 2.2 ), or the articles’ full text (Appendix A.1). Please write down your results for each article and compare them with those presented in the following.

Manual Frame Analysis

The previously mentioned frame analysis yielded three frames, Footnote 9 which are shown in the last column (“Frame”) for each article (“ID”) in Table 2.3 . Frame F1 occurs in a single article (ID 2 with political orientation center), which is the only article that was updated consistently during the event to contain up-to-date information. In contrast to the other frames and articles, F1 consists primarily of quotes by the candidates, mostly about themselves. The frame thus portrays most candidates as they portrayed themselves in the debate, i.e., positively. There is not much commentary or assessment by journalists in this frame.

Common to much coverage on the event and thus also common to the two remaining frames is the prominence of three candidates. Chris Christie is portrayed as rather strong, and Marco Rubio as weak, being a target of verbal attacks by Christie and the other candidates. Also common to most articles reporting on the debate is that they prominently or often report on Donald Trump. At the time of the event, he generally received particular media interest, e.g., because he had boycotted the previous debate. As such, Trump is also frequently mentioned in the remaining articles of the set and serves as a distinguishing factor for the two remaining frames. Articles of frame F2 portray Trump rather negatively. Articles of F2 mention, for example, that Trump was “booed” by the audience (0, left), that Trump was accused “of taking advantage of an elderly woman” (3, center), and that “Trump was hit hard by Bush” (6, right). In contrast, articles of frame F3 portray Trump primarily positively, e.g., that “he seemed to do well enough to possibly win” (4, center), that “he was unwaveringly in charge” (7, right), that “Trump was measured and thoughtful” (7, right), and that “it is easy to see the Trump train getting on a roll” (1, left).

We use the results of the manual frame analysis as the ground truth since the technique represents one of the standards in social science research on media bias.

Means for Bias-Sensitive News Consumption

In addition to frame analysis, we tested three means to identify the articles’ perspectives. These means represent practices suitable for daily news consumption as well as automated techniques. Table 2.3 shows the perspectives assigned to individual articles by the approaches. The column “Headline” represents a means applied by many news consumers due to its high efficiency, i.e., determining the content of an article by its headline. Specifically, the column contains the author’s results of the previous interactive experiment, where H1 represents a perspective Footnote 10 that portrays Rubio negatively. Using as much information as available in the headlines, we identified two sub-perspectives of H1 where additionally Christie and Bush are portrayed positively (H1a) and Trump is portrayed positively (H1b). H2 represents an “anti-Trump” perspective, and in perspective H3, all candidates and especially Rubio are portrayed negatively. Following the previous perspective categorization centered on persons, two headlines (articles 1 and 2) could not be assigned to a meaningful perspective. Footnote 11 When comparing these headline-implied perspectives with the frames in the right column, which were deduced by carefully analyzing the articles’ full content, the lack of overall coherence between both directly indicates that headlines do not allow for reliable estimation of an article’s slant.

Using the political orientation of the articles’ outlets to determine the articles’ potential slant is another means [ 8 ] for bias identification (column “Political”). Employing the left-right dichotomy is fast and often also effective when analyzing political discourse, even more so in polarized media landscapes such as in the USA [ 395 ]. However, the lack of coherence between the perspectives implied by the outlets’ political orientation and the frames shown in Table 2.3 highlights that this approach is superficial and its results are inconclusive. While employing the political orientation can increase the visibility of slants, it cannot reliably identify an article’s slant. In the example, there are major differences even across articles that have the same perspective according to this means.

The clustering approach (column “Clustering”), albeit simply using affinity propagation [ 91 ] on word embeddings, Footnote 12 is the only approach to determine the previously mentioned difference of article 2, the only article with frame F1, compared to all others. Otherwise, however, the technique yields inconclusive results, e.g., a large group of articles (C2) containing articles from the entire political spectrum and both remaining frames. The results of this simple approach are representative of automated approaches for bias identification, which analyze bias, for example, as vaguely defined “topic diversity” [ 252 ] or “differences in coverage” [ 278 ], as shown in the literature review. Other technical means may even amplify the newsreaders’ own biases, e.g., Google News, Facebook, and other news aggregators or channels learn from users’ preferences and show primarily those news items that are to the users’ liking or interest. Footnote 13
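
The clustering step can be reproduced in principle as follows. Since the spaCy embeddings of the actual eight articles are not included here, synthetic vectors forming two well-separated groups stand in for the document embeddings; everything about the data is therefore illustrative.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)

# Stand-ins for document embeddings (e.g., averaged word vectors as
# produced by spaCy's en_core_web_lg): two well-separated groups.
group_a = rng.normal(loc=0.0, scale=0.1, size=(5, 50))
group_b = rng.normal(loc=5.0, scale=0.1, size=(5, 50))
X = np.vstack([group_a, group_b])

# Affinity propagation chooses the number of clusters itself from the
# pairwise similarities, which is convenient when the number of
# perspectives in the coverage is unknown.
ap = AffinityPropagation(random_state=0).fit(X)
labels = ap.labels_
```

On real article embeddings, as the example in the text shows, the clusters found this way need not correspond to meaningful frames.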

Of course, the generalizability of this simple example is limited by various factors. For example, the inductive frame analysis was conducted only by one person, likely increasing the degree of subjectivity. In frame analyses, researchers in the social sciences typically rely on the annotations of multiple persons. At least during test phases, the annotations are compared and discussed to avoid subjectivity or achieve a known level of subjectivity that is coherent across the annotations (Sect. 2.2.4 ).

However, the example also highlights two key findings of our literature review. Current means, whether automated or manual, are either unreliable, suffering from superficial methodology and results, or reliable but requiring high manual effort. There is no coherence across the perspectives determined by the three fast approaches compared to the results of the frame analysis. There is not even any coherence when comparing any pair of the fast methods.

If you participated in the interactive experiment, your findings might differ from those shown in Table 2.3 , depending on which information you analyzed. Examining further information than the headlines alone may have yielded a more comprehensive understanding of the news coverage but came at an additional investment of time and effort. This effort increases further in regular news consumption since newsreaders first have to research relevant articles on an event. Ultimately, critical assessment of the news takes too much time to be applied during regular news consumption. However, as automated approaches are unreliable, such manual practices currently represent the only reliable means to analyze media bias.

It is this gap that the thesis at hand aims to address.

2.7 Summary of the Chapter

This chapter reviewed the issue of media bias and gave an interdisciplinary overview of the topic, particularly of methods and tools used to analyze media bias. The comparison of prior work in computer science, political science, and related disciplines revealed differences. Media bias has been studied extensively in the social sciences, whereas it is a relatively young research subject in computer science and other disciplines concerned with devising automated approaches. Consequently, while many automated methods offer effortless, scalable analysis, they yield inconclusive or less substantial results than methods used in the social sciences. Conversely, social science methods are practice-proven and effective but require much effort because researchers have to conduct them manually.

The chapter showed that the work conducted in either of the disciplines could benefit from incorporating knowledge and methods established in the other disciplines. Thus, while this thesis has a focus on computer science methodology, our general research principle is to make use of social science expertise where possible and feasible. Chapter 3 discusses how we can effectively address our research question in the context of the state of the art in computer science and the social sciences.

The paragraphs about news aggregation have been adapted partially from [ 129 ].

https://en.wikinews.org/wiki/ .

In Sect. 3.5 , we propose a system for crawling and extracting news articles.

https://www.reddit.com/ .

https://www.reddit.com/r/MediaBias/ .

The SemEval series [ 5 ] are a representative example from computer science where with high initial effort comprehensive evaluation datasets are created, allowing a quantitative comparison of the performance of multiple approaches afterward.

An outlet was defined as being “popular” if it was contained in the list of “top outlets” shown on https://www.allsides.com/media-bias/media-bias-ratings .

Frame analyses are task-specific, and the resulting frames may depend on the data and analysis question at hand. Due to the articles’ focus on persons involved in the debate, we centered our framing categories on these persons.

We use the term “perspective” to highlight that this classification resulted from applying a practice or technique. In contrast to a frame, a perspective may, however, not fully or meaningfully represent an article’s content and framing.

However, in another categorization scheme, the headlines could be interpreted as a perspective giving an overview of the event.

The embeddings were derived using the largest model “en_core_web_lg” of the natural language processing toolkit spaCy (v3.0). Source: https://spacy.io/usage/v3 .

A typical example highlighting the filter bubble issue occurred when compiling the set of articles used in this example. Google News and Google Search presented the author with articles from only two political orientations, even when using the browser’s privacy mode. This could only be overcome by using search engines that did not adapt search results to their users, such as DuckDuckGo.

Sofiane Abbar et al. “Real-time recommendation of diverse related articles”. In: Proceedings of the 22nd international conference on World Wide Web . ACM. 2013, pp. 1–12. doi : 10.1145/2488388.2488390. url : https://doi.org/10.1145/2488388.2488390 .

Eneko Agirre et al. “SemEval-2016 Task 1: Semantic Textual Similarity, Monolingual and Cross-Lingual Evaluation”. In: Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016) . Stroudsburg, PA, USA: Association for Computational Linguistics, 2016, pp. 497–511. isbn : 978-1-941643-95-2. doi : https://doi.org/10.18653/v1/S16-1081 . url : http://aclweb.org/anthology/S16-1081 .

Phyllis F. Agran, Dawn N. Castillo, and Dianne G. Winn. “Limitations of data compiled from police reports on pediatric pedestrian and bicycle motor vehicle events”. In: Accident Analysis and Prevention 22.4 (1990), pp. 361–370. issn : 00014575. doi : https://doi.org/10.1016/0001-4575(90)90051-L .

AllSides.com. AllSides - balanced news. 2021. url : https://www.allsides.com/unbiased-balanced-news (visited on 02/24/2021).

Amanda Amos and Margaretha Haglund. “From social taboo to “torch of freedom”: the marketing of cigarettes to women”. In: Tobacco control 9.1 (2000), pp. 3–8. doi : 10.1136/tc.9.1.3. url : https://doi.org/10.1136/tc.9.1.3 .

Jisun An et al. “Visualizing media bias through Twitter”. In: Proc. ICWSM SocMedNews Workshop. 2012. url : https://www.aaai.org/ocs/index.php/ICWSM/ICWSM12/paper/view/4775 .

Stefano Baccianella, Andrea Esuli, and Fabrizio Sebastiani. “SentiWordNet 3.0: An Enhanced Lexical Resource for Sentiment Analysis and Opinion Mining.” In: Proceedings of the Seventh International Conference on Language Resources and Evaluation (LREC’10) . Vol. 10. Valletta, Malta: European Language Resources Association (ELRA), 2010, pp. 2200–2204. url : https://www.aclweb.org/anthology/L10-1531/ .

Brent H Baker, Tim Graham, and Steve Kaminsky. How to identify, expose & correct liberal media bias . Alexandria, VA: Media Research Center, 1994. isbn: 978-0962734823.

Eytan Bakshy, Solomon Messing, and Lada A Adamic. “Exposure to ideologically diverse news and opinion on Facebook”. In: Science 348.6239 (2015), pp. 1130–1132. doi : https://doi.org/10.1126/science.aaa1160 . url : https://science.sciencemag.org/content/348/6239/1130 .

Alexandra Balahur et al. “Sentiment analysis in the news”. In: Proceedings of the Seventh International Conference on Language Resources and Evaluation (LREC’10) . Valletta, Malta: European Language Resources Association (ELRA), 2010. url : https://arxiv.org/abs/1309.6202 .

Pablo Barberá et al. “Tweeting From Left to Right”. In: Psychological Science 26.10 (Oct. 2015), pp. 1531–1542. issn : 0956-7976. doi : https://doi.org/10.1177/0956797615594620 . url : http://journals.sagepub.com/doi/10.1177/0956797615594620 .

David P Baron. “Persistent media bias”. In: Journal of Public Economics 90.1 (2006), pp. 1–36. doi : https://doi.org/10.1016/j.jpubeco.2004.10.006 .

Dan Bernhardt, Stefan Krasa, and Mattias Polborn. “Political polarization and the electoral effects of media bias”. In: Journal of Public Economics 92.5-6 (June 2008), pp. 1092–1104. issn : 00472727. doi : https://doi.org/10.1016/j.jpubeco.2008.01.006 . url : https://linkinghub.elsevier.com/retrieve/pii/S0047272708000236 .

Timothy Besley and Andrea Prat. “Handcuffs for the Grabbing Hand? Media Capture and Government Accountability”. In: American Economic Review 96.3 (May 2006), pp. 720–736. issn : 0002-8282. doi : https://doi.org/10.1257/aer.96.3.720 . url : https://pubs.aeaweb.org/doi/10.1257/aer.96.3.720 .

Clive Best et al. Europe Media Monitor - System Description . Tech. rep. December. 2005, pp. 1–57. url : https://publications.europa.eu/flexpaper/common/view.jsp?doc=c0d6bb93-7ec4-496f-b857-b7fe9bc33d19.en.PDF.pdf&format=pdf&page=10 .

Plaban Kumar Bhowmick. “Reader Perspective Emotion Analysis in Text through Ensemble based Multi-Label Classification Framework”. In: Computer and Information Science 2.4 (Oct. 2009), pp. 64–74. issn : 1913-8997. doi : 10.5539/cis.v2n4p64. url : http://www.ccsenet.org/journal/index.php/cis/article/view/3872 .

David M. Blei. “Probabilistic topic models”. In: Communications of the ACM 55.4 (2012), pp. 77–84. doi : https://doi.org/10.1145/2133806.2133826 .

Pablo J. Boczkowski. “The Processes of Adopting Multimedia and Interactivity in Three Online Newsrooms”. In: Journal of Communication 54.2 (June 2004), pp. 197–213. issn : 0021-9916. doi : https://doi.org/10.1093/joc/54.2.197 . url : http://joc.oupjournals.org/cgi/doi/10.1093/joc/54.2.197 .

Tolga Bolukbasi et al. “Man is to computer programmer as woman is to homemaker? Debiasing word embeddings”. In: Advances in Neural Information Processing Systems . 2016, pp. 4349–4357. url : https://arxiv.org/abs/1607.06520 .

Dylan Bourgeois, Jérémie Rappaz, and Karl Aberer. “Selection Bias in News Coverage: Learning it, Fighting it”. In: Companion of the The Web Conference 2018 on The Web Conference 2018 - WWW ’18 . 2018. isbn : 9781450356404. doi : https://doi.org/10.1145/3184558.3188724 .

Tomáš Brychcín and Lukáš Svoboda. “UWB at SemEval-2016 Task 1: Semantic Textual Similarity using Lexical, Syntactic, and Semantic Information”. In: Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016) . 2016. isbn : 9781941643952. doi : https://doi.org/10.18653/v1/S16-1089 .

Hans Jürgen Bucher and Peter Schumacher. “The relevance of attention for selecting news content. An eye-tracking study on attention patterns in the reception of print and online media”. In: Communications 31.3 (2006), pp. 347–368. issn : 03412059. doi : https://doi.org/10.1515/COMMUN.2006.022 .

C Bui. “How online gatekeepers guard our view: News portals’ inclusion and ranking of media and events”. In: Global Media Journal 9.16 (2010), pp. 1–41. url : https://www.globalmediajournal.com/peer-reviewed/how-online-gatekeepers-guard-our-view-news-portals-inclusion-and-ranking-of-media-and-events-35232.html .

Business Insider. These 6 Corporations Control 90% Of The Media In America . 2014. url : http://www.businessinsider.com/these-6-corporations-control-90-of-the-media-in-america-2012-6 (visited on 01/13/2021).

Aylin Caliskan, Joanna J. Bryson, and Arvind Narayanan. “Semantics derived automatically from language corpora contain human-like biases”. In: Science (2017). issn : 10959203. doi : https://doi.org/10.1126/science.aal4230 . arXiv: 1608.07187.

Joseph N. Cappella and Kathleen Hall Jamieson. Spiral of cynicism: The press and the public good . Oxford University Press on Demand, 1997.

Darrell Christian et al. The Associated Press Stylebook and Briefing on Media Law . Basic Books, 2019. isbn : 978-1541699892.

Nicole S. Cohen. “At Work in the Digital Newsroom”. In: Digital Journalism 7.5 (May 2019), pp. 571–591. issn : 2167-0811. doi : https://doi.org/10.1080/21670811.2017.1419821 . url : https://www.tandfonline.com/doi/full/10.1080/21670811.2017.1419821 .

Steven R Corman et al. “Studying Complex Discursive Systems.” In: Human communication research 28.2 (2002), pp. 157–206.

Jackie Crossman. Aussies Turn To Social Media For News Despite Not Trusting It As Much . Nov. 2014. url : https://www.bandt.com.au/aussies-turn-social-media-news-despite-trusting-much/ (visited on 12/11/2020).

Christian S. Czymara and Marijn van Klingeren. “New perspective? Comparing frame occurrence in online and traditional news media reporting on Europe’s “Migration Crisis””. In: Communications (Apr. 2021), pp. 1–27. issn : 1613-4087. doi : https://doi.org/10.1515/commun-2019-0188 . url : https://www.degruyter.com/document/doi/10.1515/commun-2019-0188/html .

Dave D’Alessio and Mike Allen. “Media Bias in Presidential Elections: A Meta-Analysis”. In: Journal of Communication 50.4 (Dec. 2000), pp. 133–156. doi : https://doi.org/10.1111/j.1460-2466.2000.tb02866.x . url : http://doi.wiley.com/10.1111/j.1460-2466.2000.tb02866.x .

Paul D’Angelo and Jim A Kuypers. Doing news framing analysis: Empirical and theoretical perspectives . Routledge, 2010.

Murray S. Davis and Erving Goffman. “Frame Analysis: An Essay on the Organization of Experience.” In: Contemporary Sociology 4.6 (Nov. 1975), p. 599. issn : 00943061. doi : https://doi.org/10.2307/2064021 . url : http://www.jstor.org/stable/2064021?origin=crossref .

Claes H De Vreese. “News framing: Theory and typology”. In: Information design journal and document design 13.1 (2005), pp. 51–62.

Lizzie Dearden. The fake refugee images that are being used to distort public opinion on asylum seekers . Sept. 2015. url : http://www.independent.co.uk/news/world/europe/the-fake-refugee-images-that-are-being-used-to-distort-public-opinion-on-asylum-seekers-10503703.html (visited on 02/18/2020).

Stefano DellaVigna and Ethan Kaplan. The Fox News Effect: Media Bias and Voting. Tech. rep. 3. Cambridge, MA: National Bureau of Economic Research, Apr. 2006, pp. 1187–1234. doi : https://doi.org/10.3386/w12169 . url : http://www.nber.org/papers/w12169.pdf .

P. M. DeMarzo, Dimitri Vayanos, and Jeffrey Zwiebel. “Persuasion Bias, Social Influence, and Unidimensional Opinions”. In: The Quarterly Journal of Economics 118.3 (Aug. 2003), pp. 909-968. issn : 0033-5533. doi : 10.1162/00335530360698469. url : https://doi.org/10.1162/00335530360698469 .

James N Druckman and Michael Parkin. “The impact of media bias: How editorial slant affects voters”. In: Journal of Politics 67.4 (2005), pp. 1030–1049.

Robert M. Entman. “Framing: Toward Clarification of a Fractured Paradigm”. In: Journal of Communication 43.4 (Dec. 1993), pp. 51–58. issn : 0021-9916. doi : https://doi.org/10.1111/j.1460-2466.1993.tb01304.x . url : https://academic.oup.com/joc/article/43/4/51-58/4160153 .

Robert M. Entman. “Framing Bias: Media in the Distribution of Power”. In: Journal of Communication 57.1 (Mar. 2007), pp. 163–173. issn : 00219916. doi : https://doi.org/10.1111/j.1460-2466.2006.00336.x . url : https://academic.oup.com/joc/article/57/1/163-173/4102665 .

Frank Esser. “Editorial Structures and Work Principles in British and German Newsrooms”. In: European Journal of Communication 13.3 (Sept. 1998), pp. 375–405. issn : 0267-3231. doi : https://doi.org/10.1177/0267323198013003004 . url : http://journals.sagepub.com/doi/10.1177/0267323198013003004 .

Frank Esser, Carsten Reinemann, and David Fan. “Spin Doctors in the United States, Great Britain, and Germany: Metacommunication about Media Manipulation”. In: The Harvard International Journal of Press/Politics 6.1 (2001), pp. 16–45.

James Estrin. The Real Story About the Wrong Photos in #BringBackOurGirls . May 2014. url : http://lens.blogs.nytimes.com/2014/05/08/the-real-story-about-the-wrong-photos-in-bringbackourgirls/ (visited on 02/18/2020).

David Kirk Evans, Judith L. Klavans, and Kathleen R. McKeown. “Columbia Newsblaster”. In: Demonstration Papers at HLT-NAACL 2004 on XX - HLT-NAACL ’04 . Morristown, NJ, USA: Association for Computational Linguistics, 2004, pp. 1–4. doi : https://doi.org/10.3115/1614025.1614026 . url : http://portal.acm.org/citation.cfm?doid=1614025.1614026 .

Facebook. Company Info . 2021. url : http://web.archive.org/web/20210210223947/https://about.fb.com/company-info/ (visited on 02/12/2021).

Lukas Feick, Karsten Donnay, and Katherine T. McCabe. “The Subconscious Effect of Subtle Media Bias on Perceptions of Terrorism”. In: American Politics Research 49.3 (May 2021), pp. 313–318. issn : 1532-673X. doi : https://doi.org/10.1177/1532673X20972105 . url : http://journals.sagepub.com/doi/10.1177/1532673X20972105 .

Tomáš Foltýnek, Norman Meuschke, and Bela Gipp. “Academic Plagiarism Detection”. In: ACM Computing Surveys 52.6 (Jan. 2020), pp. 1–42. issn : 0360-0300. doi : https://doi.org/10.1145/3345317 . url : https://dl.acm.org/doi/10.1145/3345317 .

Brendan Frey and Delbert Dueck. “Clustering by Passing Messages Between Data Points”. In: Science 315.5814 (Feb. 2007), pp. 972–976. doi : https://doi.org/10.1126/science.1136800 .

Dieter Frey. “Recent research on selective exposure to information”. In: Advances in experimental social psychology 19 (1986), pp. 41–80. url : https://doi.org/10.1016/S0065-2601(08)60212-9 .

Matthew Gentzkow, Edward Glaeser, and Claudia Goldin. The Rise of the Fourth Estate: How Newspapers Became Informative and Why It Mattered. Tech. rep. Cambridge, MA: National Bureau of Economic Research, Sept. 2004, pp. 187–230. doi : https://doi.org/10.3386/w10791 . url : http://www.nber.org/papers/w10791.pdf .

Matthew Gentzkow and Jesse M Shapiro. “What drives media slant? Evidence from US daily newspapers”. In: Econometrica 78.1 (2010), pp. 35–71. url : https://web.stanford.edu/~gentzkow/research/biasmeas.pdf .

Matthew Gentzkow and Jesse M. Shapiro. “Media Bias and Reputation”. In: Journal of Political Economy 114.2 (Apr. 2006), pp. 280–316. issn : 0022-3808. doi : 10.1086/499414. url : https://doi.org/10.1086/499414 .

Alan S Gerber, Dean Karlan, and Daniel Bergan. “Does the media matter? A field experiment measuring the effect of newspapers on voting behavior and political opinions”. In: American Economic Journal: Applied Economics 1.2 (2009), pp. 35–52.

Martin Gilens and Craig Hertzman. “Corporate ownership and news bias: Newspaper coverage of the 1996 Telecommunications Act”. In: The Journal of Politics 62.02 (2000), pp. 369–386.

Bela Gipp. Citation-based Plagiarism Detection . Wiesbaden: Springer Fachmedien Wiesbaden, 2014. isbn : 978-3-658-06393-1. doi : https://doi.org/10.1007/978-3-658-06394-8 . url : http://link.springer.com/10.1007/978-3-658-06394-8 .

Bela Gipp, Adriana Taylor, and Jöran Beel. “Link Proximity Analysis - Clustering Websites by Examining Link Proximity”. In: 2010, pp. 449–452. doi : https://doi.org/10.1007/978-3-642-15464-5_54 . url : http://link.springer.com/10.1007/978-3-642-15464-5_54 .

Namrata Godbole, Manja Srinivasaiah, and Steven Skiena. “Large-Scale Sentiment Analysis for News and Blogs”. In: Proceedings of the International Conference on Weblogs and Social Media (ICWSM) 7 (2007), pp. 219–222.

Jennifer Golbeck and Derek Hansen. “Computing political preference among twitter followers”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems . ACM. 2011, pp. 1105–1108.

Gregory Grefenstette et al. “Coupling Niche Browsers and Affect Analysis for an Opinion Mining Application”. In: Coupling Approaches, Coupling Media and Coupling Languages for Information Retrieval . Vaucluse, France: Le Centre de Hautes Etudes Internationales D’Informatique Documentaire, 2004, pp. 186–194. url : https://dl.acm.org/doi/abs/10.5555/2816272.2816290 .

Tim Groseclose and Jeffrey Milyo. “A Measure of Media Bias”. In: The Quarterly Journal of Economics 120.4 (Nov. 2005), pp. 1191–1237. issn : 0033-5533. doi : https://doi.org/10.1162/003355305775097542 . url : http://dx.doi.org/10.1162/003355305775097542 .

Jeff Gruenewald, Jesenia Pizarro, and Steven M. Chermak. “Race, gender, and the newsworthiness of homicide incidents”. In: Journal of Criminal Justice 37.3 (May 2009), pp. 262–272. issn: 00472352. doi: https://doi.org/10.1016/j.jcrimjus.2009.04.006 . url : https://linkinghub.elsevier.com/retrieve/pii/S0047235209000440 .

Joachim W H Haes. “September 11 in Germany and the United States: Reporting, reception, and interpretation”. In: Crisis Communications: Lessons from September 11 (2003), pp. 125–132.

Felix Hamborg, Norman Meuschke, and Bela Gipp. “Bias-aware news analysis using matrix-based news aggregation”. In: International Journal on Digital Libraries 21.2 (June 2020), pp. 129–147. issn : 1432–5012. doi : https://doi.org/10.1007/s00799-018-0239-9 . url : http://link.springer.com/10.1007/s00799-018-0239-9 .

Felix Hamborg, Norman Meuschke, and Bela Gipp. “Matrix-Based News Aggregation: Exploring Different News Perspectives”. In: 2017 ACM/IEEE Joint Conference on Digital Libraries (JCDL) . IEEE, June 2017, pp. 1–10. isbn : 978-1-5386-3861-3. doi : https://doi.org/10.1109/JCDL.2017.7991561 . url : http://ieeexplore.ieee.org/document/7991561/ .

Felix Hamborg et al. “Identification and Analysis of Media Bias in News Articles”. In: 15th International Symposium of Information Science (ISI 2017) . Berlin, Germany: Verlag Werner Hülsbusch, 2017, pp. 224–236. isbn : 978-3-86488-117-6.

Felix Hamborg et al. “NewsDeps: Visualizing the Origin of Information in News Articles”. In: Wahrheit und Fake im postfaktisch-digitalen Zeitalter . Ed. by Peter Klimczak and Thomas Zoglauer. Springer Vieweg, 2021, pp. 151–166. isbn : 978-3-658-32957-0. doi : https://doi.org/10.1007/978-3-658-32957-0 .

Mark Hanna. “Keywords in News and Journalism Studies”. In: Journalism Studies 15.1 (Jan. 2014), pp. 118–119. issn : 1461-670X. doi : https://doi.org/10.1080/1461670X.2012.712759 . url : http://www.tandfonline.com/doi/abs/10.1080/1461670X.2012.712759 .

Tony Harcup and Deirdre O’Neill. “What is news? Galtung and Ruge revisited”. In: Journalism Studies 2.2 (2001), pp. 261–280. url : https://www.tandfonline.com/doi/10.1080/14616700118449 .

Andrew F. Hayes and Klaus Krippendorff. “Answering the Call for a Standard Reliability Measure for Coding Data”. In: Communication Methods and Measures 1.1 (Apr. 2007), pp. 77–89. issn : 1931-2458. doi : https://doi.org/10.1080/19312450709336664 . url : http://www.tandfonline.com/doi/abs/10.1080/19312450709336664 .

Edward S Herman. “The propaganda model: A retrospective”. In: Journalism Studies 1.1 (2000), pp. 101–112. doi : https://doi.org/10.1080/146167000361195 .

Edward S Herman and Noam Chomsky. Manufacturing consent: The political economy of the mass media . Random House, 2010.

Timothy C Hoad and Justin Zobel. “Methods for identifying versioned and plagiarized documents”. In: Journal of the American society for information science and technology 54.3 (2003), pp. 203–215.

George Hripcsak. “Agreement, the F-Measure, and Reliability in Information Retrieval”. In: Journal of the American Medical Informatics Association 12.3 (Jan. 2005), pp. 296–298. issn : 1067-5027. doi : https://doi.org/10.1197/jamia.M1733 . url : https://academic.oup.com/jamia/article-lookup/doi/10.1197/jamia.M1733 .

Minqing Hu and Bing Liu. “Mining and summarizing customer reviews”. In: Proceedings of the 2004 ACM SIGKDD international conference on Knowledge discovery and data mining - KDD ’04 . New York, New York, USA: ACM Press, 2004, p. 168. doi : https://doi.org/10.1145/1014052.1014073 . url : http://portal.acm.org/citation.cfm?doid=1014052.1014073 .

John Edward Hunter, Frank L Schmidt, and Gregg B Jackson. Meta-analysis: Cumulating research findings across studies . Vol. 4. Sage Publications, Inc, 1982.

Shanto Iyengar. Is anyone responsible? How television frames political issues . University of Chicago Press, 1994.

Anil K Jain and Sushil Bhattacharjee. “Text segmentation using Gabor filters for automatic document processing”. In: Machine Vision and Applications 5.3 (1992), pp. 169–184. url : https://link.springer.com/article/10.1007/BF02626996 .

Silvia Julinda, Christoph Boden, and Alan Akbik. Extracting a Repository of Events and Event References from News Clusters . Dublin, Ireland, Aug. 2014. doi : https://doi.org/10.3115/v1/W14-4503 . url : https://www.aclweb.org/anthology/W14-4503 .

Daniel Kahneman and Amos Tversky. “Choices, values, and frames.” In: American Psychologist 39.4 (1984), pp. 341–350. issn : 0003-066X. doi : https://doi.org/10.1037/0003-066X.39.4.341 . url : http://content.apa.org/journals/amp/39/4/341 .

Mesut Kaya, Guven Fidan, and Ismail H Toroslu. “Sentiment Analysis of Turkish Political News”. In: 2012 IEEE/WIC/ACM International Conferences on Web Intelligence and Intelligent Agent Technology . IEEE Computer Society. IEEE, Dec. 2012, pp. 174–180. isbn : 978-1-4673-6057-9. doi : https://doi.org/10.1109/WI-IAT.2012.115 . url : http://ieeexplore.ieee.org/document/6511881/ .

Keith Kenney and Chris Simpson. “Was coverage of the 1988 presidential race by Washington’s two major dailies biased?” In: Journalism & Mass Communication Quarterly 70.2 (1993), pp. 345–355. doi : https://doi.org/10.1177/107769909307000210 .

Hans Mathias Kepplinger. “Visual biases in television campaign coverage”. In: Communication Research 9.3 (1982), pp. 432–446. doi : https://doi.org/10.1177/009365082009003005 .

Jean S Kerrick. “News pictures, captions and the point of resolution”. In: Journalism & Mass Communication Quarterly 36.2 (1959), pp. 183–188. doi : https://doi.org/10.1177/107769905903600207 .

Jean S. Kerrick. “The Influence of Captions on Picture Interpretation”. In: Journalism Quarterly 32.2 (June 1955), pp. 177–182. issn : 0022-5533. doi : https://doi.org/10.1177/107769905503200205 . url : http://journals.sagepub.com/doi/10.1177/107769905503200205 .

JongWook Kim, K Selçuk Candan, and Junichi Tatemura. “Efficient overlap and content reuse detection in blogs and online news articles”. In: Proceedings of the 18th international conference on World wide web - WWW ’09 . 0735014. New York, New York, USA: ACM Press, 2009, p. 81. isbn : 9781605584874. doi : https://doi.org/10.1145/1526709.1526721 . url : http://portal.acm.org/citation.cfm?doid=1526709.1526721 .

Christian Kohlschütter, Peter Fankhauser, and Wolfgang Nejdl. “Boilerplate detection using shallow text features”. In: Proceedings of the third ACM international conference on Web search and data mining - WSDM ’10 . New York, New York, USA: ACM Press, 2010, p. 441. isbn : 9781605588896. doi: https://doi.org/10.1145/1718487.1718542 . url : http://portal.acm.org/citation.cfm?doid=1718487.1718542 .

Wolfgang Kreißig. Medienvielfaltsmonitor 2020-I: Anteile der Medienangebote und Medienkonzerne am Meinungsmarkt der Medien in Deutschland . Tech. rep. Munich, Germany: Bayerische Landeszentrale für neue Medien (BLM), 2020. url : https://www.blm.de/files/pdf2/medienvielfaltsmonitor-2020-1.pdf .

Steven Kull, Clay Ramsay, and Evan Lewis. “Misperceptions, the media, and the Iraq war”. In: Political Science Quarterly 118.4 (2003), pp. 569–598. url : https://onlinelibrary.wiley.com/doi/10.1002/j.1538-165X.2003.tb00406.x .

Ankit Kumar et al. “Ask Me Anything: Dynamic Memory Networks for Natural Language Processing”. In: arXiv (2015). arXiv: 1506.07285. url : https://arxiv.org/abs/1506.07285 .

George Lakoff. Women, Fire, and Dangerous Things: What Categories Reveal about the Mind . Chicago, IL: University of Chicago Press, 1987.

J Richard Landis and Gary G Koch. “The Measurement of Observer Agreement for Categorical Data”. In: Biometrics 33.1 (Mar. 1977), p. 159. issn : 0006341X. doi : https://doi.org/10.2307/2529310 . url : https://www.jstor.org/stable/2529310?origin=crossref .

Valentino Larcinese, Riccardo Puglisi, and James M Snyder. “Partisan bias in economic news: Evidence on the agenda-setting behavior of US newspapers”. In: Journal of Public Economics 95.9 (2011), pp. 1178–1189.

Quoc V. Le and Tomas Mikolov. “Distributed Representations of Sentences and Documents”. In: International Conference on Machine Learning - ICML 2014 32 (May 2014). arXiv: 1405.4053. url : http://arxiv.org/abs/1405.4053 .

Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. “Deep learning”. In: Nature 521.7553 (2015), pp. 436–444. issn : 14764687. doi : https://doi.org/10.1038/nature14539 .

Kalev Leetaru and Philip A Schrodt. “GDELT: Global Data on Events, Location and Tone, 1979-2012”. In: Annual Meeting of the International Studies Association (2013), pp. 1–51. url : http://data.gdeltproject.org/documentation/ISA.2013.GDELT.pdf .

Jure Leskovec, Lars Backstrom, and Jon Kleinberg. “Meme-tracking and the dynamics of the news cycle”. In: Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining . ACM. 2009, pp. 497–506. doi : https://doi.org/10.1145/1557019.1557077 .

David D Lewis et al. “RCV1: A New Benchmark Collection for Text Categorization Research”. In: The Journal of Machine Learning Research 5 (2004), pp. 361–397. url : https://dl.acm.org/doi/10.5555/1005332.1005345 .

LexisNexis. LexisNexis Police Reports . 2020. url : http://web.archive.org/web/20200405053436/https://policereports.lexisnexis.com/search/search (visited on 02/12/2020).

Sora Lim, Adam Jatowt, and Masatoshi Yoshikawa. “Towards Bias Inducing Word Detection by Linguistic Cue Analysis in News Articles”. In: DEIM Forum 2018 . 2018, pp. 1–6. url : https://db-event.jpn.org/deim2018/data/papers/275.pdf .

Will Lowe. “Software for content analysis-A Review”. In: Cambridge: Weatherhead Center for International Affairs and the Harvard Identity Project (2002).

Luca Luceri, Silvia Giordano, and Emilio Ferrara. “Detecting Troll Behavior via Inverse Reinforcement Learning: A Case Study of Russian Trolls in the 2016 US Election”. In: Proceedings of the Fourteenth International AAAI Conference on Web and Social Media (ICWSM 2020) . Association for the Advancement of Artificial Intelligence, 2020, pp. 417–427. arXiv: 2001.10570. url : http://arxiv.org/abs/2001.10570 .

Brent MacGregor. Live, direct, and biased? making television news in the satellite age . Arnold, 1997.

Oded Maimon and Lior Rokach. “Introduction to knowledge discovery and data mining”. In: Data mining and knowledge discovery handbook. Springer, 2009, pp. 1–15.

Christopher D. Manning, Prabhakar Raghavan, and Hinrich Schütze. Introduction to Information Retrieval . Cambridge: Cambridge University Press, 2008. isbn : 9780511809071. doi : https://doi.org/10.1017/CBO9780511809071 . url : http://ebooks.cambridge.org/ref/id/CBO9780511809071 .

Jörg Matthes. “What’s in a Frame? A Content Analysis of Media Framing Studies in the World’s Leading Communication Journals, 1990-2005”. In: Journalism & Mass Communication Quarterly 86.2 (June 2009), pp. 349–367. issn : 1077-6990. doi : https://doi.org/10.1177/107769900908600206 . url : http://journals.sagepub.com/doi/10.1177/107769900908600206 .

John McCarthy et al. “Assessing stability in the patterns of selection bias in newspaper coverage of protest during the transition from communism in Belarus”. In: Mobilization: An International Quarterly 13.2 (2008), pp. 127–146.

John D McCarthy, Clark McPhail, and Jackie Smith. “Images of Protest: Dimensions of Selection Bias in Media Coverage of Washington Demonstrations, 1982 and 1991”. In: American Sociological Review 61.3 (June 1996), p. 478. issn : 00031224. doi : https://doi.org/10.2307/2096360 . url : http://www.jstor.org/stable/2096360?origin=crossref .

Margaret J McGregor et al. “Why don’t more women report sexual assault to the police?” In: Canadian Medical Association Journal 162.5 (2000), pp. 659–660.

Kathleen R. McKeown et al. “Tracking and summarizing news on a daily basis with Columbia’s Newsblaster”. In: Proceedings of the second international conference on Human Language Technology Research . 2002, pp. 280–285.

Norman Meuschke and Bela Gipp. “State-of-the-art in detecting academic plagiarism”. In: International Journal for Educational Integrity 9.1 (June 2013), p. 50. issn : 1833-2595. doi : https://doi.org/10.21913/IJEI.v9i1.847 . url : https://ojs.unisa.edu.au/index.php/IJEI/article/view/847 .

Norman Meuschke et al. “HyPlag”. In: The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval . NewYork, NY, USA: ACM, June 2018, pp. 1321–1324. isbn : 9781450356572. doi : https://doi.org/10.1145/3209978.3210177 . url : https://dl.acm.org/doi/10.1145/3209978.3210177 .

Joshua Meyrowitz. No sense of place: The impact of electronic media on social behavior . Oxford University Press, 1986.

M. Mark Miller. “Frame Mapping and Analysis of News Coverage of Contentious Issues”. In: Social Science Computer Review 15.4 (Dec. 1997), pp. 367–378. issn : 0894-4393. doi : https://doi.org/10.1177/089443939701500403 . url : http://journals.sagepub.com/doi/10.1177/089443939701500403 .

Gilad Mishne. “Experiments with mood classification in blog posts”. In: Proceedings of ACM SIGIR 2005 Workshop on Stylistic Analysis of Text for Information Access (2005).

Ryan Mitchell. Web scraping with Python: collecting data from the modern web . O’Reilly Media, Inc., 2015.

Shunji Mori, Hirobumi Nishida, and Hiromitsu Yamada. Optical character recognition . John Wiley & Sons, Inc., 1999.

Karen Mossberger, Caroline J Tolbert, and Ramona S McNeal. Digital citizenship: The Internet, society, and participation . MIT Press, 2007. isbn : 9780262134859. url : https://mitpress.mit.edu/books/digital-citizenship .

Sendhil Mullainathan and Andrei Shleifer. “The market for news”. In: American Economic Review (2005), pp. 1031–1053.

Sean A Munson and Paul Resnick. “Presenting diverse political opinions”. In: Proceedings of the 28th international conference on Human factors in computing systems - CHI ’10 . New York, New York, USA: ACM Press, 2010, p. 1457. isbn : 9781605589299. doi : https://doi.org/10.1145/1753326.1753543 . url : http://portal.acm.org/citation.cfm?doid=1753326.1753543 .

Sean A Munson, Daniel Xiaodan Zhou, and Paul Resnick. “Sidelines: An Algorithm for Increasing Diversity in News and Opinion Aggregators.” In: ICWSM . 2009.

Sean A. Munson, Stephanie Y. Lee, and Paul Resnick. “Encouraging reading of diverse political viewpoints with a browser widget”. In: Proceedings of the 7th International Conference on Weblogs and Social Media, ICWSM 2013 . 2013.

Diana C Mutz. “Facilitating communication across lines of political difference: The role of mass media”. In: American Political Science Association . Vol. 95. 01. Cambridge Univ Press. 2001, pp. 97–114.

David Nadeau and Satoshi Sekine. “A survey of named entity recognition and classification”. In: Lingvisticae Investigationes 30.1 (Aug. 2007), pp. 3–26. issn : 0378-4169. doi : https://doi.org/10.1075/li.30.1.03nad . url : http://www.jbe-platform.com/content/journals/10.1075/li.30.1.03nad .

Joseph Napolitan. The election game and how to win it . Doubleday, 1972.

Kimberly A Neuendorf. The content analysis guidebook . Sage Publications, 2016. isbn : 9781412979474.

Nic Newman, David A L Levy, and Rasmus Kleis Nielsen. Reuters Institute Digital News Report 2015 . Reuters Institute for the Study of Journalism, 2015. isbn : 978-1907384134.

Nic Newman et al. Reuters Institute Digital News Report 2020 . Reuters Institute for the Study of Journalism, 2020.

David Niven. Tilt? The search for media bias . Praeger, 2002. isbn : 978-0275975777.

Daniela Oelke, Benno Geißelmann, and Daniel A Keim. “Visual Analysis of Explicit Opinion and News Bias in German Soccer Articles”. In: EuroVis Workshop on Visual Analytics . Vienna, Austria, 2012. doi : https://doi.org/10.2312/PE/EuroVAST/EuroVA12/049-053 .

Pamela E. Oliver and Gregory M. Maney. “Political Processes and Local Newspaper Coverage of Protest Events: From Selection Bias to Triadic Interactions”. In: American Journal of Sociology 106.2 (Sept. 2000), pp. 463–505. issn : 0002-9602. doi : https://doi.org/10.1086/316964 . url : http://www.journals.uchicago.edu/doi/10.1086/316964 .

Lawrence Page et al. The PageRank citation ranking: bringing order to the web . Tech. rep. 1999.

Georgios Paliouras et al. “PNS: A Personalized News Aggregator on the Web”. In: Intelligent interactive systems in knowledge-based environments . Ed. by George A. Tsihrintzis and Maria Virvou. Berlin, Germany: Springer, 2008, pp. 175–197. isbn : 978-3-540-77471-6. doi : https://doi.org/10.1007/978-3-540-77471-6_10 . url : http://link.springer.com/10.1007/978-3-540-77471-6_10 .

Zhongdang Pan and Gerald Kosicki. “Framing analysis: An approach to news discourse”. In: Political Communication 10.1 (1993), pp. 55–75. issn : 1058-4609. doi : https://doi.org/10.1080/10584609.1993.9962963 . url : http://www.tandfonline.com/doi/abs/10.1080/10584609.1993.9962963 .

Bo Pang and Lillian Lee. “Opinion mining and sentiment analysis”. In: Foundations and trends in information retrieval 2.1-2 (2008), pp. 1–135. doi: https://doi.org/10.1561/1500000011 .

Zizi Papacharissi and Maria de Fatima Oliveira. “News Frames Terrorism: A Comparative Analysis of Frames Employed in Terrorism Coverage in U.S. and U.K. Newspapers”. In: The International Journal of Press/Politics 13.1 (Jan. 2008), pp. 52–74. issn : 1940-1612. doi : https://doi.org/10.1177/1940161207312676 . url: http://journals.sagepub.com/doi/10.1177/1940161207312676 .

Souneil Park, KyungSoon Lee, and Junehwa Song. “Contrasting Opposing Views of News Articles on Contentious Issues”. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies . Portland, Oregon, USA: Association for Computational Linguistics, 2011, pp. 340–349. url : https://www.aclweb.org/anthology/P11-1035 .

Souneil Park et al. “NewsCube”. In: Proceedings of the 27th international conference on Human factors in computing systems - CHI 09 . New York, New York, USA: ACM Press, 2009, p. 443. isbn : 9781605582467. doi : https://doi.org/10.1145/1518701.1518772 . url : http://dl.acm.org/citation.cfm?doid=1518701.1518772 .

Souneil Park et al. “NewsCube 2.0: An Exploratory Design of a Social News Website for Media Bias Mitigation”. In: Workshop on Social Recommender Systems . 2011.

Souneil Park et al. “The politics of comments”. In: Proceedings of the ACM 2011 conference on Computer supported cooperative work - CSCW ’11 . ACM. New York, New York, USA: ACM Press, 2011, p. 113. isbn : 9781450305563. doi : https://doi.org/10.1145/1958824.1958842 . url : http://portal.acm.org/citation.cfm?doid=1958824.1958842 .

Richard Paul and Linda Elder. The Thinker’s Guide for Conscientious Citizens on how to Detect Media Bias & Propaganda in National and World News . Foundation for Critical Thinking, 2004.

Dragomir R Radev et al. “Centroid-based summarization of multiple documents”. In: Information Processing & Management 40.6 (2004), pp. 919–938.

Marta Recasens, Cristian Danescu-Niculescu-Mizil, and Dan Jurafsky. “Linguistic Models for Analyzing and Detecting Biased Language”. In: Proceedings of the 51st Annual Meeting on Association for Computational Linguistics . Sofia, BG: Association for Computational Linguistics, 2013, pp. 1650–1659. isbn : 9781937284503. url : https://www.aclweb.org/anthology/P13-1162.pdf .

Shawn W Rosenberg et al. “The image and the vote: The effect of candidate presentation on voter preference”. In: American Journal of Political Science 30.1 (1986), pp. 108–127. doi : https://doi.org/10.2307/2111296 .

Barbara Rychalska et al. “Samsung Poland NLP Team at SemEval-2016 Task 1: Necessity for diversity; combining recursive autoencoders, WordNet and ensemble methods to measure semantic similarity.” In: Proceedings of the 10th International Workshop on Semantic Evaluation (SemEval-2016) . Stroudsburg, PA, USA: Association for Computational Linguistics, 2016, pp. 602–608. isbn : 9781941643952. doi : https://doi.org/10.18653/v1/S16-1091 . url : http://aclweb.org/anthology/S16-1091 .

Diego Saez-Trumper, Carlos Castillo, and Mounia Lalmas. “Social media news communities”. In: Proceedings of the 22nd ACM international conference on Conference on information & knowledge management - CIKM ’13 . New York, New York, USA: ACM Press, 2013, pp. 1679–1684. isbn : 9781450322638. doi : https://doi.org/10.1145/2505515.2505623 . url : http://dl.acm.org/citation.cfm?doid=2505515.2505623 .

Gerard Salton and Christopher Buckley. “Term-weighting approaches in automatic text retrieval”. In: Information processing and management 24.5 (1988), pp. 513–523. doi : https://doi.org/10.1016/0306-4573(88)90021-0 .

Mark Sanderson. “Duplicate detection in the Reuters collection”. In: Technical Report TR-1997-5, Department of Computing Science, University of Glasgow, UK (1997).

Cicero Nogueira dos Santos and Maira Gatti. “Deep Convolutional Neural Networks for Sentiment Analysis of Short Texts”. In: Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers . 2014, pp. 69–78. url : https://www.aclweb.org/anthology/C14-1008 .

Frane Šarić et al. “Takelab: Systems for Measuring Semantic Text Similarity”. In: Proceedings of the First Joint Conference on Lexical and Computational Semantics - Volume 1: Proceedings of the main conference and the shared task, and Volume 2: Proceedings of the Sixth International Workshop on Semantic Evaluation . Association for Computational Linguistics, 2012, pp. 441–448. url : https://www.aclweb.org/anthology/S12-1060 .

Dietram A Scheufele. “Agenda-setting, priming, and framing revisited: Another look at cognitive effects of political communication”. In: Mass Communication & Society 3.2-3 (2000), pp. 297–316. doi : 10.1207/S15327825MCS0323_07. url : https://doi.org/10.1207/S15327825MCS0323_07 .

Margrit Schreier. Qualitative content analysis in practice. SAGE Publications, 2012, pp. 1–280. isbn : 9781849205931.

Crisitina Segalin et al. “The Pictures We Like Are Our Image: Continuous Mapping of Favorite Pictures into Self-Assessed and Attributed Personality Traits”. In: IEEE Transactions on Affective Computing 8.2 (Apr. 2017), pp. 268–285. issn : 1949-3045. doi : https://doi.org/10.1109/TAFFC.2016.2516994 . url : http://ieeexplore.ieee.org/document/7378902/ .

Anup Shah. Media Conglomerates, Mergers, Concentration of Ownership. 2009. url : https://www.globalissues.org/article/159/media-conglomerates-mergers-concentration-of-ownership (visited on 02/19/2021).

Walid Shalaby, Wlodek Zadrozny, and Hongxia Jin. “Beyond word embeddings: learning entity and concept representations from large scale knowledge bases”. In: Information Retrieval Journal (2018), pp. 1–18. doi : https://doi.org/10.1007/s10791-018-9340-3 .

Smriti Sharma et al. “News Event Extraction Using 5W1H Approach & Its Analysis”. In: International Journal of Scientific & Engineering Research 4.5 (2013), pp. 2064–2068. url : https://www..ser.org/onlineResearchPaperViewer.aspx?News-Event-Extraction-Using-5W1HApproach-Its-Analysis.pdf.

Narayanan Shivakumar and Hector Garcia-Molina. “SCAM: A Copy Detection Mechanism for Digital Documents”. In: In Proceedings of the Second Annual Conference on the Theory and Practice of Digital Libraries . 1995. url : http://ilpubs.stanford.edu:8090/95/ .

Alison Smith, Timothy Hawes, and Meredith Myers. “Hiérarchie: Interactive Visualization for Hierarchical Topic Models”. In: Proceedings of the Workshop on Interactive Language Learning, Visualization, and Interfaces . Association for Computational Linguistics, 2014, pp. 71–78. isbn : 9781941643150. doi : https://doi.org/10.3115/v1/W14-3111 .

Jackie Smith et al. “From Protest to Agenda Building: Description Bias in Media Coverage of Protest Events in Washington, D.C.” In: Social Forces 79.4 (2001), pp. 1397–1423. url : https://www.jstor.org/stable/2675477 .

Norman Solomon. “Media Bias”. In: New Political Science 24.2 (June 2002), pp. 293–297. issn : 0739-3148. doi : https://doi.org/10.1080/073931402200145252 . url : http://www.tandfonline.com/doi/abs/10.1080/073931402200145252 .

Samuel R Sommers et al. “Race and media coverage of Hurricane Katrina: Analysis, implications, and future research questions”. In: Analyses of Social Issues and Public Policy 6.1 (2006), pp. 39–55. doi : https://doi.org/10.1111/j.1530-2415.2006.00103.x .

Spiegel Online. Übertreibt Horst Seehofer seine Attacken? Das sagen die Medien. 2016. url : http://www.spiegel.de/politik/deutschland/uebertreibthorst-seehofer-seine-attacken-das-sagen-die-medien-a-1076867.html (visited on 02/15/2021).

Andreas Spitz and Michael Gertz. “Breaking the News: Extracting the Sparse Citation Network Backbone of Online News Articles”. In: Proceedings of the 2015 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining 2015 . ACM. 2015, pp. 274–279. doi : https://doi.org/10.1145/2808797.2809380 .

Ralf Steinberger et al. “Large-scale news entity sentiment analysis”. In: RANLP 2017 - Recent Advances in Natural Language Processing Meet Deep Learning . Incoma Ltd. Shoumen, Bulgaria, Nov. 2017, pp. 707–715. isbn: 9789544520496. doi : https://doi.org/10.26615/978-954-452-049-6_091 . url : http://www.acl-bg.org/proceedings/2017/RANLP%202017/pdf/RANLP091.pdf .

Steve Stemler. “An overview of content analysis”. In: Practical assessment, research & evaluation 7.17 (2001), pp. 137–146.

Guido H Stempel. “The prestige press meets the third-party challenge”. In: Journalism & Mass Communication Quarterly 46.4 (1969), pp. 699–706. doi : https://doi.org/10.1177/107769906904600402 .

Guido H Stempel and John W Windhauser. “The prestige press revisited: coverage of the 1980 presidential campaign”. In: Journalism and Mass Communication Quarterly 61.1 (1984), p. 49. doi : https://doi.org/10.1177/107769908406100107 .

James Glen Stovall. “Coverage of 1984 presidential campaign”. In: Journalism and Mass Communication Quarterly 65.2 (1988), p. 443. doi : https://doi.org/10.1177/107769908806500227 .

James Glen Stovall. “The third-party challenge of 1980: News coverage of the presidential candidates”. In: Journalism and Mass Communication Quarterly 62.2 (1985), p. 266. doi : https://doi.org/10.1177/107769908506200206 .

Carlo Strapparava and Rada Mihalcea. “Semeval-2007 task 14: Affective text”. In: Proceedings of the 4th International Workshop on Semantic Evaluations . Association for Computational Linguistics. Prague, Czech Republic, 2007, pp. 70–74. url : https://www.aclweb.org/anthology/S07-1013/ .

Joseph D Straubhaar. Media Now: Communication Media in Information Age. Thomson Learning, 2000.

Pero Subasic and Alison Huettner. “Affect analysis of text using fuzzy semantic typing”. In: IEEE Transactions on Fuzzy Systems 9.4 (2001), pp. 483–496. issn : 10636706. doi : https://doi.org/10.1109/91.940962 . url : http://ieeexplore.ieee.org/document/940962/ .

S Shyam Sundar. “Exploring receivers’ criteria for perception of print and online news”. In: Journalism & Mass Communication Quarterly 76.2 (1999), pp. 373–386. doi : https://doi.org/10.1177/107769909907600213 .

Cass R Sunstein. Echo Chambers: Bush v. Gore, Impeachment, and Beyond . Princeton University Press, 2001.

Cass R Sunstein. “The law of group polarization”. In: Journal of political philosophy 10.2 (2002), pp. 175–195. url : https://papers.ssrn.com/sol3/papers.cfm?abstract_id=199668 .

The Media Insight Project. The Personal News Cycle: How Americans Get Their News. Tech. rep. 2014. url : https://www.americanpressinstitute.org/publications/reports/survey-research/personal-news-cycle/ .

Manos Tsagkias, Maarten De Rijke, and Wouter Weerkamp. “Linking online news and social media”. In: Proceedings of the fourth ACM international conference on Web search and data mining . ACM. 2011, pp. 565–574. doi : https://doi.org/10.1145/1935826.1935906 .

Larry Tye. The father of spin: Edward L. Bernays and the birth of public relations . Macmillan, 2002.

University of Michigan. News Bias Explored - The art of reading the news. 2014. url : http://umich.edu/~newsbias/ (visited on 02/01/2021).

Christine D Urban. Examining Our Credibility: Perspectives of the Public and the Press . Asne Foundation, 1999, pp. 1–108.

Mojtaba Vaismoradi, Hannele Turunen, and Terese Bondas. “Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study”. In: Nursing & Health Sciences 15.3 (Sept. 2013), pp. 398–405. issn : 14410745. doi : https://doi.org/10.1111/nhs.12048 . url : http://doi.wiley.com/10.1111/nhs.12048 .

Robert P Vallone, Lee Ross, and Mark R Lepper. “The hostile media phenomenon: biased perception and perceptions of media bias in coverage of the Beirut massacre.” In: Journal of personality and social psychology 49.3 (1985), p. 577. doi : https://doi.org/10.1037//0022-3514.49.3.577 .

Baldwin Van Gorp. “Strategies to take subjectivity out of framing analysis”. In: Doing news framing analysis: Empirical and theoretical perspectives (2010), pp. 100–125. url : https://www.taylorfrancis.com/chapters/edit/10.4324/9780203864463-11/strategies-take-subjectivity-framing-analysisbaldwin-van-gorp .

Baldwin Van Gorp. “Where is the frame? Victims and intruders in the Belgian press coverage of the asylum issue”. In: European Journal of Communication 20.4 (2005), pp. 484–507. doi : https://doi.org/10.1177/0267323105058253 .

Athanasios Voulodimos et al. “Deep Learning for Computer Vision: A Brief Review”. In: Computational Intelligence and Neuroscience 2018 (2018), pp. 1–13. issn : 1687-5265. doi : https://doi.org/10.1155/2018/7068349 . url : https://www.hindawi.com/journals/cin/2018/7068349/ .

Paul Waldman and James Devitt. “Newspaper Photographs and the 1996 Presidential Election: The Question of Bias”. In: Journal of Mass Communication 75.2 (1998), pp. 302–311. issn : 10776990. doi : https://doi.org/10.1177/107769909807500206 .

Wayne Wanta, Guy Golan, and Cheolhan Lee. “Agenda setting and international news: Media influence on public perceptions of foreign nations”. In: Journalism & Mass Communication Quarterly 81.2 (2004), pp. 364–377. doi : https://doi.org/10.1177/107769900408100209 .

David Manning White. “The “Gate Keeper”: A Case Study in the Selection of News”. In: Journalism Bulletin 27.4 (1950), pp. 383–390. issn : 0197-2448. doi : https://doi.org/10.1177/107769905002700403 . url : http://journals.sagepub.com/doi/10.1177/107769905002700403 .

Alden Williams. “Unbiased Study of Television News Bias”. In: Journal of Communication 25.4 (Dec. 1975), pp. 190–199. issn : 0021-9916. doi : https://doi.org/10.1111/j.1460-2466.1975.tb00656.x . url : https://academic.oup.com/joc/article/25/4/190-199/4553978 .

Vikas Yadav and Steven Bethard. “A Survey on Recent Advances in Named Entity Recognition from Deep Learning models”. In: Proceedings of the 27th International Conference on Computational Linguistics . Santa Fe, New Mexico, USA: Association for Computational Linguistics, 2018, pp. 2145–2158. url : https://www.aclweb.org/anthology/C18-1182 .

JungHwan Yang et al. “Why Are “Others” So Polarized? Perceived Political Polarization and Media Use in 10 Countries”. In: Journal of Computer-Mediated Communication 21.5 (Sept. 2016), pp. 349–367. issn : 10836101. doi: https://doi.org/10.1111/jcc4.12166 . url : https://academic.oup.com/jcmc/article/21/5/349-367/4161799 .

John Zaller. The nature and origins of mass opinion . Cambridge university press, 1992. doi : https://doi.org/10.1017/CBO9780511818691 .

Biqing Zeng et al. “LCF: A Local Context Focus Mechanism for Aspect-Based Sentiment Classification”. In: Applied Sciences 9.16 (Aug. 2019), pp. 1–22. issn : 2076-3417. doi : https://doi.org/10.3390/app9163389 . url : https://www.mdpi.com/2076-3417/9/16/3389 .

Sven Meyer Zu Eissen and Benno Stein. “Intrinsic plagiarism detection”. In: European Conference on Information Retrieval . Springer, 2006, pp. 565–569. url : https://link.springer.com/chapter/10.1007/11735106_66 .

Author information

Authors and Affiliations

Department of Computer Science, Humboldt University of Berlin, Berlin, Germany

Felix Hamborg

Rights and permissions

Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Copyright information

© 2023 The Author(s)

About this chapter

Hamborg, F. (2023). Media Bias Analysis. In: Revealing Media Bias in News Articles. Springer, Cham. https://doi.org/10.1007/978-3-031-17693-7_2

DOI : https://doi.org/10.1007/978-3-031-17693-7_2

Published : 06 October 2022

Publisher Name : Springer, Cham

Print ISBN : 978-3-031-17692-0

Online ISBN : 978-3-031-17693-7

eBook Packages : Computer Science Computer Science (R0)




How to Spot 16 Types of Media Bias

Journalism is tied to a set of ethical standards and values, including truth and accuracy, fairness and impartiality, and accountability. However, journalism today often strays from objective fact, resulting in biased news and endless examples of media bias.

Media bias isn't necessarily a bad thing. But hidden bias misleads, manipulates and divides us. This is why AllSides provides hundreds of media bias ratings, a balanced newsfeed, the AllSides Media Bias Chart™, and the AllSides Fact Check Bias Chart™.

72 percent of Americans believe traditional news sources report fake news, falsehoods, or content that is purposely misleading. With trust in media declining, media consumers must learn how to spot different types of media bias.

This page outlines 16 types of media bias, along with examples of the different types of bias being used in popular media outlets.

Related: 14 Types of Ideological Bias

16 Types of Media Bias and How to Spot Them

  • Spin
  • Unsubstantiated Claims
  • Opinion Statements Presented as Fact
  • Sensationalism/Emotionalism
  • Mudslinging/Ad Hominem
  • Mind Reading
  • Slant
  • Flawed Logic
  • Bias by Omission
  • Omission of Source Attribution
  • Bias by Story Choice and Placement
  • Subjective Qualifying Adjectives
  • Word Choice
  • Negativity Bias
  • Elite v. Populist Bias

1. Spin

Spin is a type of media bias characterized by vague, dramatic, or sensational language. When journalists put a “spin” on a story, they stray from objective, measurable facts, clouding the reader’s view and preventing them from getting a precise take on what happened.

In the early 20th century, Public Relations and Advertising executives were referred to as “spin doctors.” They would use vague language and make unsupportable claims in order to promote a product, service or idea, downplaying any alternative views in order to make a sale. Increasingly, these tactics are appearing in journalism.

Examples of Spin Words and Phrases:

  • High-stakes
  • Latest in a string of...
  • Turn up the heat
  • Stern talks
  • Facing calls to...
  • Even though
  • Significant

Sometimes the media uses spin words and phrases to imply bad behavior . These words are often used without providing hard facts, direct quotes, or witnessed behavior:

  • Acknowledged
  • Refusing to say
  • Came to light

To stir emotions, reports often include colored, dramatic, or sensational words as a substitute for the word “said.” For example:

  • Frustration
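Spin-word lists like the ones above can be operationalized as a simple lexicon matcher. The sketch below is illustrative only (the lexicon is a small sample drawn from the examples above, and the function name is hypothetical, not an AllSides tool):

```python
# Illustrative lexicon built from the spin-word examples above;
# a real system would use a much larger, curated list.
SPIN_TERMS = [
    "high-stakes",
    "turn up the heat",
    "stern talks",
    "facing calls to",
    "refusing to say",
    "came to light",
]

def find_spin_terms(text):
    """Return lexicon terms found in `text` (case-insensitive substring match)."""
    lowered = text.lower()
    return [term for term in SPIN_TERMS if term in lowered]

print(find_spin_terms("The senator, facing calls to resign, held stern talks."))
# -> ['stern talks', 'facing calls to']
```

A lexicon match of this kind flags candidate phrases for a human reader; it cannot by itself establish that a passage is spin, since context determines whether a term is used neutrally.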

Examples of Spin Media Bias:


“Gloat” means “contemplate or dwell on one's own success or another's misfortune with smugness or malignant pleasure.” Is there evidence in Trump’s tweet to show he is being smug or taking pleasure in the layoffs, or is this a subjective interpretation?

Source article

Business Insider Bias Rating


In this example of spin media bias, the Washington Post uses a variety of dramatic, sensationalist words to spin the story to make Trump appear emotional and unhinged. They also refer to the president's "vanity" without providing supporting evidence.

Washington Post Bias Rating


2. Unsubstantiated Claims

Journalists sometimes make claims in their reporting without including evidence to back them up. This can occur in the headline of an article, or in the body.

Statements that appear to be fact, but do not include specific evidence, are a key indication of this type of media bias.

Sometimes, websites or media outlets publish stories that are totally made up. This is often referred to as a type of fake news .

Examples of Unsubstantiated Claims Media Bias


In this media bias instance, The Daily Wire references a "longstanding pattern," but does not back this up with evidence.

The Daily Wire Bias Rating


In late January 2019, actor Jussie Smollett claimed he was attacked by two men who hurled racial and homophobic slurs. The Hill refers to “the violent attack” without using the word “alleged” or “allegations." The incident was revealed to be a hoax created by Smollett himself.

The Hill Bias Rating


This Washington Post columnist makes a claim about wealth distribution without noting where it came from. Who determined this number and how?

3. Opinion Statements Presented as Fact

Sometimes journalists use subjective language or statements under the guise of reporting objectively. Even when a media outlet presents an article as a factual and objective news piece, it may employ subjective statements or language.

A subjective statement is one that is based on personal opinions, assumptions, beliefs, tastes, preferences, or interpretations. It reflects how the writer views reality, what they presuppose to be the truth. It is a statement colored by their specific perspective or lens and cannot be verified using concrete facts and figures within the article.

There are objective modifiers — “blue,” “old,” “single-handedly,” “statistically,” “domestic” — for which the meaning can be verified. On the other hand, there are subjective modifiers — “suspicious,” “dangerous,” “extreme,” “dismissively,” “apparently” — which are a matter of interpretation.
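The objective/subjective distinction above can be sketched as a toy lexicon lookup. Everything here is illustrative (the word sets are just the examples from the text, and the function name is hypothetical):

```python
# Modifier lexicons taken from the examples in the text (not exhaustive)
OBJECTIVE_MODIFIERS = {"blue", "old", "single-handedly", "statistically", "domestic"}
SUBJECTIVE_MODIFIERS = {"suspicious", "dangerous", "extreme", "dismissively", "apparently"}

def classify_modifier(word):
    """Label a modifier as objective, subjective, or unknown per the lexicons."""
    w = word.lower().strip(".,!?\"'")
    if w in OBJECTIVE_MODIFIERS:
        return "objective"
    if w in SUBJECTIVE_MODIFIERS:
        return "subjective"
    return "unknown"

print(classify_modifier("statistically"))  # objective
print(classify_modifier("Dangerous"))      # subjective
```

In practice, whether a modifier is subjective often depends on context, so a static lexicon is only a first-pass heuristic.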

Interpretation can present the same events as two very different incidents. For instance, a political protest in which people sat down in the middle of a street blocking traffic to draw attention to their cause can be described as “peaceful” and “productive,” or, others may describe it as “aggressive” and “disruptive.”

Examples of Words Signaling Subjective Statements:

  • Good/Better/Best
  • Is considered to be
  • May mean that
  • Bad/Worse/Worst
  • It's likely that

Source: Butte College Critical Thinking Tipsheet

An objective statement, on the other hand, is an observation of observable facts . It is not based on emotions or personal opinion and is based on empirical evidence — what is quantifiable and measurable.

It’s important to note that an objective statement may not actually be true; it is simply one that can be verified as true or false. Consider the following statements:

  • Taipei 101 is the world's tallest building.
  • Five plus four equals ten.
  • There are nine planets in our solar system.

The first statement is true (as of this writing); the other two are false. It is possible to verify the height of buildings and determine that Taipei 101 tops them all, to devise an experiment demonstrating that five plus four does not equal ten, and to use established criteria to determine whether Pluto is a planet.

Editorial reviews by AllSides found that some media outlets blur the line between subjective statements and objective statements, leading to potential confusion for readers, in two key ways that fall under this type of media bias :

  • Including subjective statements in their writing and not attributing them to a source. (see Omission of Source Attribution )
  • Placing opinion or editorial content on the homepage next to hard news, or otherwise not clearly marking opinion content as “opinion.”

Explore logical fallacies that are often used by opinion writers.

Examples of Opinion Statements Presented as Fact


The sub-headline Vox uses is an opinion statement — some people likely believe the lifting of the gas limit will strengthen the coal industry — but Vox included this statement in a piece not labeled “Opinion.”

Vox Bias Rating


In this article about Twitter CEO Elon Musk banning reporters, we can detect that the journalist is providing their personal opinion that Musk is making "arbitrary" decisions through the use of the word "seemingly." Whether or not Musk's decisions are arbitrary is a matter of personal opinion and should be reserved for the opinion pages.

SFGate Rating


In this article about Hillary Clinton’s appearance on "The Late Show With Stephen Colbert," the author makes an assumption about Clinton’s motives and jumps to a subjective conclusion.

Fox News Bias Rating

4. Sensationalism/Emotionalism

Sensationalism is a type of media bias in which information is presented in a way that gives a shock or makes a deep impression. Often it gives readers a false sense of culmination, that all previous reporting has led to this ultimate story.

Sensationalist language is often dramatic, yet vague. It often involves hyperbole — at the expense of accuracy — or warping reality to mislead or provoke a strong reaction in the reader.

In recent years, some media outlets have been criticized for overusing the term “breaking” or “breaking news,” which historically was reserved for stories of deep impact or wide-scale importance.

With this type of media bias, reporters often increase the readability of their pieces using vivid verbs. But many verbs are heavy with implications that can’t be objectively corroborated: “blast,” “slam,” “bury,” “abuse,” “destroy,” “worry.”

Examples of Words and Phrases Used by the Media that Signal Sensationalism and Emotionalism:

  • Embroiled in...
  • Torrent of tweets

Examples of Sensationalism/Emotionalism Media Bias


“Gawk” means to stare or gape stupidly. Does AP’s language treat this event as serious and diplomatic, or as entertainment?

AP Bias Rating


Here, BBC uses sensationalism in the form of hyperbole, as the election is unlikely to involve bloodshed in the literal sense.

BBC Bias Rating


In this piece from the New York Post, the author uses multiple sensationalist phrases and emotional language to dramatize the “Twitter battle."

New York Post Bias Rating

5. Mudslinging/Ad Hominem

Mudslinging is a type of media bias when unfair or insulting things are said about someone in order to damage their reputation. Similarly, ad hominem (Latin for “to the person”) attacks are attacks on a person’s motive or character traits instead of the content of their argument or idea. Ad hominem attacks can be used overtly, or as a way to subtly discredit someone without having to engage with their argument.

Examples of Mudslinging


A Reason editor calls a New York Times columnist a "snowflake" after the columnist emailed a professor and his provost to complain about a tweet calling him a bedbug.

Reason Bias Rating


In March 2019, The Economist ran a piece describing political commentator and author Ben Shapiro as “alt-right.” Readers pointed out that Shapiro is Jewish (the alt-right is largely anti-Semitic) and has condemned the alt-right. The Economist issued a retraction and instead referred to Shapiro as a “radical conservative.”

Source: The Economist Twitter

6. Mind Reading

Mind reading is a type of media bias that occurs in journalism when a writer assumes they know what another person thinks, or thinks that the way they see the world reflects the way the world really is.

Examples of Mind Reading


We can’t objectively measure that Trump hates looking foolish, because we can’t read his mind or know what he is feeling. There is also no evidence provided to demonstrate that Democrats believe they have a winning hand.

CNN Bias Rating


How do we know that Obama doesn’t have passion or sense of purpose? Here, the National Review writer assumes they know what is going on in Obama’s head.

National Review Bias Rating


Vox is upfront about the fact that they are interpreting what Neeson said. Yet this interpretation ran in a piece labeled objective news — not a piece in the Opinion section. Despite being overt about interpreting, by drifting away from what Neeson actually said, Vox is mind reading.

7. Slant

Slant is a type of media bias that describes when journalists tell only part of a story, or when they highlight, focus on, or play up one particular angle or piece of information. It can include cherry-picking information or data to support one side, or ignoring another perspective. Slant prevents readers from getting the full story, and narrows the scope of our understanding.

Examples of Slant


In the above example, Fox News notes that Rep. Alexandria Ocasio-Cortez’s policy proposals have received “intense criticism.” While this is true, it is only one side of the picture, as the Green New Deal was received well by other groups.


Here, Snopes does not indicate or investigate why police made sweeps (did they have evidence criminal activity was occurring in the complex?), nor did Snopes ask police for their justification, giving a one-sided view. In addition, the studies pointed to only show Black Americans are more likely to be arrested for drug possession, not all crimes.

Snopes Bias Rating

8. Flawed Logic

Flawed logic or faulty reasoning is a way to misrepresent people’s opinions or to arrive at conclusions that are not justified by the given evidence. Flawed logic can involve jumping to conclusions or arriving at a conclusion that doesn’t follow from the premise.

Examples of Flawed Logic


Here, the Daily Wire interprets a video to draw conclusions that aren’t clearly supported by the available evidence. The video shows Melania did not extend her hand to shake, but it could be because Clinton was too far away to reach, or perhaps there was no particular reason at all. By jumping to conclusions that this amounted to a “snub” or was the result of “bitterness” instead of limitations of physical reality or some other reason, The Daily Wire is engaging in flawed logic.

9. Bias by Omission

Bias by omission is a type of media bias in which media outlets choose not to cover certain stories, omit information that would support an alternative viewpoint, or omit voices and perspectives on the other side.

Media outlets sometimes omit stories in order to serve a political agenda. Sometimes, a story will only be covered by media outlets on a certain side of the political spectrum. Bias by omission also occurs when a reporter does not interview both sides of a story — for instance, interviewing only supporters of a bill, and not including perspectives against it.

Examples of Media Bias by Omission


In a piece titled, "Hate crimes are rising, regardless of Jussie Smollett's case. Here's why," CNN claims that hate crime incidents rose for three years, but omits information that may lead the reader to different conclusions. According to the FBI’s website, reports of hate crime incidents rose from previous years, but so did the number of agencies reporting, “with approximately 1,000 additional agencies contributing information.” This makes it unclear whether hate crimes are actually on the rise, as the headline claims, or simply appear to be because more agencies are reporting.

10. Omission of Source Attribution

Omission of source attribution is when a journalist does not back up their claims by linking to the source of that information. An informative, balanced article should provide the background or context of a story, including naming sources (publishing “on-the-record” information).

For example, journalists will often mention "baseless claims," "debunked theories," or note someone "incorrectly stated" something without including background information or linking to another article that would reveal how they concluded the statement is false or debunked. Or, reporters will write that “immigration opponents say," "critics say," or “supporters of the bill noted” without identifying who these sources are.

It is sometimes useful or necessary to use anonymous sources, because insider information is only available if the reporter agrees to keep their identity secret. But responsible journalists should be aware and make it clear that they are offering second-hand information on sensitive matters. This fact doesn’t necessarily make the statements false, but it does make them less than reliable.

Examples of Media Bias by Omission of Source Attribution


In this paragraph, The New York Times says Trump "falsely claimed" millions had voted illegally; they link to Trump's tweet, but not to a source of information that would allow the reader to determine Trump's claim is false.

The New York Times Bias Rating


In this paragraph, the Epoch Times repeatedly states "critics say" without attributing the views to anyone specific.

The Epoch Times Bias Rating


In a piece about the Mueller investigation, The New York Times never names the investigators, officials or associates mentioned.

11. Bias by Story Choice and Placement

Story choice, as well as story and viewpoint placement, can reveal media bias by showing which stories or viewpoints the editor finds most important.

Bias by story choice is when a media outlet's bias is revealed by which stories the outlet chooses to cover or to omit. For example, an outlet that chooses to cover the topic of climate change frequently can reveal a different political leaning than an outlet that chooses to cover stories about gun laws. The implication is that the outlet's editors and writers find certain topics more notable, meaningful, or important than others, which can tune us into the outlet's political bias or partisan agenda. Bias by story choice is closely linked to media bias by omission and slant .

Bias by story placement is one type of bias by placement. The stories that a media outlet features "above the fold" or prominently on its homepage and in print show which stories they really want you to read, even if you read nothing else on the site or in the publication. Many people will quickly scan a homepage or read only a headline, so the stories that are featured first can reveal what the editor hopes you take away or keep top of mind from that day.

Bias by viewpoint placement is a related type of bias by placement. This can often be seen in political stories. A balanced piece of journalism will include perspectives from both the left and the right in equal measure. If a story only features viewpoints from left-leaning sources and commentators, or includes them near the top of the story/in the first few paragraphs, and does not include right-leaning viewpoints, or buries them at the end of a story, this is an example of bias by viewpoint.

Examples of Media Bias by Placement


In this screenshot of ThinkProgress' homepage taken at 1 p.m. ET on Sept. 6, 2019, the media outlet chooses to prominently display coverage of LGBT issues and cuts to welfare and schools programs. In the next screenshot of The Epoch Times homepage taken at the same time on the same day, the outlet privileges very different stories.


Taken at the same time on the same day as the screenshot above, The Epoch Times chooses to prominently feature stories about a hurricane, the arrest of illegal immigrants, Hong Kong activists, and the building of the border wall. Notice that ThinkProgress' headline on the border wall focuses on diverting funds from schools and day cares, while the Epoch Times headline focuses on the wall's completion.

12. Subjective Qualifying Adjectives

Journalists can reveal bias when they include subjective, qualifying adjectives in front of specific words or phrases. Qualifying adjectives are words that characterize or attribute specific properties to a noun. When a journalist uses qualifying adjectives, they are suggesting a way for you to think about or interpret the issue, instead of just giving you the facts and letting you make judgements for yourself. This can manipulate your view. Subjective qualifiers are closely related to spin words and phrases , because they obscure the objective truth and insert subjectivity.

For example, a journalist who writes that a politician made a "serious allegation" is interpreting the weight of that allegation for you. An unbiased piece of writing would simply tell you what the allegation is, and allow you to make your own judgement call as to whether it is serious or not.

In opinion pieces, subjective adjectives are okay; they become a problem when they are inserted outside of the opinion pages and into hard news pieces.

Sometimes, the use of an adjective may be warranted, but journalists have to be careful in exercising their judgement. For instance, it may be warranted to call a Supreme Court ruling that overturned a major law a "landmark case." But often, adjectives are included in ways that not everyone may agree with; for instance, people who are in favor of limiting abortion would likely not agree with a journalist who characterizes new laws restricting the act as a "disturbing trend." Therefore, it's important to notice, question, and challenge the subjective qualifiers and adjectives journalists use, and then decide for yourself whether they should be accepted.

Examples of Subjective Qualifying Adjectives

  • disturbing rise
  • serious accusations
  • troubling trend
  • sinister warning
  • awkward flaw
  • extreme law
  • baseless claim
  • debunked theory (this phrase could coincide with bias by omission, if the journalist doesn't include information for you to determine why the theory is false.)
  • critical bill
  • offensive statement
  • harsh rebuke
  • extremist group
  • far-right/far-left organization


HuffPost's headline includes the phrases "sinister warning" and "extremist Republican." It goes on to note the politician's "wild rant" in a "frothy interview" and calls a competing network "far-right." These qualifying adjectives encourage the reader to think a certain way. A more neutral piece would have told the reader what Cawthorn said without telling the reader how to interpret it.

HuffPost bias rating

13. Word Choice

Words and phrases are loaded with political implications. The words or phrases a media outlet uses can reveal their perspective or ideology.

Liberals and conservatives often strongly disagree about the best way to describe hot-button issues. For example, a liberal journalist who favors abortion access may call it "reproductive healthcare," or refer to supporters as "pro-choice." Meanwhile, a conservative journalist would likely not use these terms — to them, this language softens an immoral or unjustifiable act. Instead, they may call people who favor abortion access "pro-abortion" rather than "pro-choice."

Word choice can also reveal how journalists see the very same event very differently. For instance, one journalist may call an incident of civil unrest a "racial justice protest" to focus the readers' attention on the protesters' policy angles and advocacy; meanwhile, another journalist calls it a "riot" to focus readers' attention on looting and property destruction that occurred.

Words and their meanings are often shifting in the political landscape. The very same words and phrases can mean different things to different people. AllSides offers a Red Blue Translator to help readers understand how people on the left and right think and feel differently about the same words and phrases.

Examples of Polarizing Word Choices

  • pro-choice | anti-choice
  • pro-abortion | anti-abortion
  • gun rights | gun control
  • riot | protest
  • illegal immigrants | migrants
  • illegal alien | asylum-seeking migrants
  • woman | birthing person
  • voting rights | voting security
  • sex reassignment surgery | gender-affirming care
  • critical race theory | anti-racist education

Examples of Word Choice Bias


An outlet on the left calls Florida's controversial Parental Rights in Education law the "Don't Say Gay" bill, using language favored by opponents, while an outlet on the right calls the same bill the "FL education bill," signaling a supportive view.

USA Today source article

USA TODAY media bias rating

Fox News source article

Fox News media bias rating

14. Photo Bias

Photos can be used to shape the perception, emotions or takeaway a reader will have regarding a person or event. Sometimes a photo can give a hostile or favorable impression of the subject.

For example, a media outlet may use a photo of an event or rally that was taken at the very beginning of the event to give the impression that attendance was low. Or, they may only publish photos of conflict or a police presence at an event to make it seem violent and chaotic. Reporters may choose an image of a favored politician looking strong, determined or stately during a speech; if they disfavor him, they may choose a photo of him appearing to yell or look troubled during the same speech.

Examples of Photo Bias


Obama appears stern or angry — with his hand raised, brows furrowed, and mouth wide, it looks like maybe he’s yelling. The implication is that the news about the Obamacare ruling is something that would enrage Obama.

The Blaze bias rating


With a tense mouth, shifty eyes and head cocked to one side, Nunes looks guilty. The sensationalism in the headline aids in giving this impression (“neck-deep” in “scandal.”)

Mother Jones bias rating


With his lips pursed and eyes darting to the side, Schiff looks guilty in this photo. The headline stating that he “got caught celebrating” also implies that he was doing something he shouldn’t be doing. Whether or not he was actually celebrating impeachment at this dinner is up for debate, but if you judged Townhall’s article by the photo, you may conclude he was.

Townhall bias rating


With his arms outreached and supporters cheering, Texas Gov. Greg Abbott appears triumphant in this photo. The article explains that a pediatric hospital in Texas announced it will stop performing “gender-confirming therapies” for children, following a directive from Abbott for the state to investigate whether such procedures on kids constituted child abuse. The implication of the headline and photo is that this is a victory.

The Daily Wire bias rating

15. Negativity Bias

Negativity bias refers to a type of bias in which reporters emphasize bad or negative news, or frame events in a negative light.

"If it bleeds, it leads" is a common media adage referring to negativity bias. Stories about death, violence, turmoil, struggle, and hardship tend to get spotlighted in the press, because these types of stories tend to get more attention and elicit more shock, outrage, fear, and cause us to become glued to the news, wanting to hear more.

Examples of Negativity Bias


This story frames labor force participation as a negative thing. However, if labor force participation remained low for a long time, that would also be written up as bad news.

New York Times bias rating

16. Elite v. Populist Bias

Elite bias is when journalists defer to the beliefs, viewpoints, and perspectives of people who are part of society's most prestigious, credentialed institutions — such as academic institutions, government agencies, business executives, or nonprofit organizations. Populist bias, on the other hand, is a bias in which the journalist defers to the perspectives, beliefs, or viewpoints of those who are outside of or dissent from prestigious institutions — such as "man on the street" stories, small business owners, less prestigious institutions, and people who live outside of major urban centers.

Elite/populist bias has a geographic component in the U.S. Because major institutions of power are concentrated in American coastal cities (which tend to vote blue), there can exist conflicting values, perspectives, and ideologies among “coastal elites” and “rural/middle America" (which tends to vote red). The extent to which journalists emphasize the perspectives of urbanites versus people living in small town/rural areas can show elite or populist bias, and thus, political bias.

Examples of Elite v. Populist Bias


Elite Bias: This article emphasizes the guidance and perspectives of major government agencies and professors at elite universities.

NBC News bias rating


Populist Bias: In this opinion piece, journalist Naomi Wolf pushes back against elite government agencies, saying they can't be trusted.

The Epoch Times bias rating

Everyone is biased. It is part of human nature to have perspectives, preferences, and prejudices. But sometimes, bias — especially media bias — can become invisible to us. This is why AllSides provides hundreds of media bias ratings and a media bias chart.

We are all biased toward things that show us in the right. We are biased toward information that confirms our existing beliefs. We are biased toward the people or information that supports us, makes us look good, and affirms our judgements and virtues. And we are biased toward the more moral choice of action — at least, that which seems moral to us.

Journalism as a profession is biased toward vibrant communication, timeliness, and providing audiences with a sense of the current moment — whether or not that sense is politically slanted. Editors are biased toward strong narrative, stunning photographs, pithy quotes, and powerful prose. Every aspiring journalist has encountered media bias — sometimes the hard way. If they stay in the profession, often it will be because they have incorporated the biases of their editor.

But sometimes, bias can manipulate and blind us. It can put important information and perspectives in the shadows and prevent us from getting the whole view. For this reason, every type of media bias can, and occasionally should, be isolated and examined. This is just as true for journalists as it is for their audiences.

Good reporting can shed valuable light on our biases — good and bad. By learning how to spot media bias, how it works, and how it might blind us, we can avoid being fooled by media bias and fake news . We can learn to identify and appreciate different perspectives — and ultimately, come to a more wholesome view.

Julie Mastrine | Director of Marketing and Media Bias Ratings, AllSides

Early Contributors and Editors (2018)

Jeff Nilsson | Saturday Evening Post

Sara Alhariri | Stossel TV

Kristine Sowers | Abridge News



Media Bias Essays


According to David Leonhardt, who wrote a newsletter on the subject for the New York Times with reporter Margaret Sullivan, there are several types of media bias. Too often, journalists confuse centrism with fairness, objectivity, or common-sense truth; but centrism is none of those. It is a point of view, and it can be wrong, just as conservatism or liberalism can be (Leonhardt; Sullivan).

Leonhardt breaks media bias down into six types: centrist bias, affluent bias, bias for the new, the same biases that afflict society, liberal bias, and conservative bias. The first type, centrist bias, holds that political centrism often crowds out and provokes political views on both the left and the right. Margaret Sullivan also calls out a related problem, “both-sidesism”: blaming both parties equally, even when they don’t deserve equal blame (Sullivan).

Second, affluent bias concerns which economic groups the media focuses on, whether wealthy, middle-class, or lower-class individuals. Third, bias for the new confuses newness with importance. Fourth, “the same biases that afflict society” covers reporting, from political coverage to crime coverage, that carries society’s own prejudices, such as sexism and racism, and that tends to keep people interested if the topic catches their eye. Fifth, liberal bias surfaces in debates among journalists; one of the bigger debates Sullivan discusses is hostility toward charter schools, with one party agreeing and the other not. Lastly, with conservative bias, some believe the information given out is exaggerated, misleading, or wrong, coming mainly from outlets like Fox News or talk radio (Sullivan; Leonhardt).

The assumption that news should be objective is the object of considerable debate. Assertions of a conservative or establishment bias in the news often draw on critical theory, which argues that news preserves the hegemony of society’s ruling interests. (Lichter)

Assertions of liberal bias draw on surveys of journalists’ attitudes and content analyses of news coverage. This case has recently been bolstered by economic modeling. However, numerous content analytic studies have failed to find a liberal bias. This has led to efforts to explain public perceptions of liberal bias in terms of cognitive psychology and elite manipulation. Other explanations include structural biases and media negativism. Internet-driven changes in journalism, including an increase in partisan news, may force a rethinking of the entire debate or even render it irrelevant. If media companies all followed each other, despite competing for stories, it might cut out a lot of disagreement, but what is being reported would not always be accurate (Lichter).

Liberalism subscribes to a set of values, while progressivism provides a call to action to achieve those values. It is in the word: progressivism is a belief in progress, and progress requires action.

In broadcast media, the FCC policy of the Fairness Doctrine required broadcast licensees to present controversial issues of public importance and to present such issues in an honest, equal, and balanced manner. The Red Lion case was a key legal precedent in defining the role of the FCC and the enforcement of the Doctrine. The combination of underutilized AM frequencies and the absence of content restrictions led a number of radio programmers and syndicators to produce and broadcast conservative talk shows. Notable examples are Rush Limbaugh, Hugh Hewitt, Michael Medved, Michael Savage, Sean Hannity, and Glenn Beck. These talk shows draw large audiences and have arguably altered the political landscape. Talk radio became a key force in the 2000 and 2004 presidential elections. Some liberal talk radio also emerged, such as Pacifica Radio’s Democracy Now (Watson).

With so much talk show radio, listeners can be shocked into thinking “oh my,” or left wondering whether what they hear is really true, because much of the information is not accurate; this problem could eventually cause people to stop listening or watching. There is also a pro side for the media, since this is in a sense how they make money; but just as money should be earned honestly, information should be broadcast honestly.

Media bias in the United States occurs when the US media systematically skews reporting in a way that crosses the standards of professional journalism. Claims of media bias in the United States include claims of conservative bias, corporate bias, liberal bias, and mainstream bias. A variety of watchdog groups combat this by fact-checking both biased reporting and unfounded claims of bias. A variety of scholarly disciplines study media bias. Many news outlets make no pretense of being unbiased, and give their readers or listeners the news they want, leading to what has been called post-truth politics. The term is used to describe the practice of making false statements about events or people without verifiable evidence. (Siegfried)




Elizabeth Morrissette, Grace McKeon, Alison Louie, Amy Luther, and Alexis Fagen

Media bias could be defined as unjust favoritism toward, and reporting of, certain ideas or standpoints. In the news, social media, and entertainment, such as movies or television, we see media bias in the information these forms of media choose to pay attention to or report (“How to Detect Bias in News Media”, 2012). Consider the difference between Fox News and CNN: because these two broadcasters have very different audiences, they tend to be biased in what they report and how they report it, reflecting Democratic or Republican viewpoints.

Bias, in general, is a prejudice or preconceived notion against a person, group, or thing. Bias leads to stereotyping, which we can see in the way certain things are reported in the news. For example, during Hurricane Katrina, two sets of photos were taken of two people wading through water with bags of food. Both were reported on, but in different ways: the black man was described as “looting” a grocery store, while the white person was described as “finding food for survival.” The reports showed media bias because they made the black man seem like he was doing something wrong, while the white person was just trying to survive (Guarino, 2015).

Commercial media is affected by bias because a corporation can influence what kind of entertainment is being produced. When there is an investment involved or money at stake, companies tend to protect their investment by not touching on topics that could start a controversy (Pavlik, 2018). To understand what biased news is, we must be media literate. Being media literate means accepting that news outlets aren’t completely transparent about the stories they choose to report. Knowing that we can’t believe everything we read or see in the news will allow us as a society to become a more educated audience (Campbell, 2005).

Bias in the News

The news, whether we like it or not, is biased. Some news outlets are biased toward Republicans, while others are biased toward Democrats. It’s important to understand this when watching or reading the news in order to be media literate. This can be tricky because journalists may believe their reporting is written with “fairness and balance,” but most times there is an underlying bias shaped by the news provider the story is being written for (Pavlik and McIntosh, 61). With events happening so rapidly, journalists write quickly and sometimes point fingers without trying to. This is called agenda-setting, which Shirley Biagi defines as the way reporters do not tell people what to think, but do tell them what and whom to talk about (Biagi, 268).

The pressure to put out articles quickly can often affect the story as well. Portraying an event without all the facts and viewpoints can lay out the scene in a way that frames it differently than it may have happened (Biagi, 269). Yet by watching or reading only one portrayal of an event, people will often blindly believe it is true, without seeing or reading other stories that may shine a different light on the subject (Vivian, 4). Media/Impact defines this as the magic bullet theory, the assertion that media messages directly and measurably affect people's behavior (Biagi, 269). The stress of tight deadlines also affects the number of variations of a story. Journalists' push to get stories out quickly leaves little room for deeper consideration of news stories. This is called consensus journalism, the tendency among journalists covering the same topic to report similar articles instead of differing interpretations of the event (Biagi, 268).

To see past media bias in the news, it is important to be media literate. That means looking past any possible framing or biased viewpoints and gathering all the facts to form your own interpretation of a news story. It does not hurt to read both sides of a story before blindly following what someone is saying, taking into consideration whom they might be biased toward.

Stereotypes in the Media

Bias appears not only in the news but also in other entertainment media, such as TV and movies. Beginning in childhood, our perception of the world starts to form, and our own opinions and views are created as we learn to think for ourselves. This process of "thinking for ourselves" is called socialization, and one key agent of socialization is the mass media. Mass media portrays ideas and images that are very influential at a young age. However, the influence the media has on us is not always positive. Entertainment media in particular plays a big role in spreading stereotypes, so much so that they come to seem normal to us (Pavlik and McIntosh, 55).

The stereotypes in entertainment media may be either gender stereotypes or cultural stereotypes. Gender stereotypes reinforce expectations of what each gender is supposed to be like. For example, a female stereotype could be a teenage girl who likes to go shopping, or a stay-at-home mom who cleans the house and does the grocery shopping. Men and women are shown in different ways in commercials, TV, and movies: women are shown as domestic housewives, while men are shown holding high-status jobs and participating in more outdoor activities (Davis, 411). A very common gender stereotype is that women like to shop and are not smart enough to hold a high-status profession such as lawyer or doctor. This stereotype appears in the musical/movie Legally Blonde, whose main character is a woman doubted by her male counterparts; she must prove herself intelligent enough to become a lawyer. Another gender stereotype is that men like to use tools and drive cars. In most tool and car commercials and advertisements, a man is shown using the product. On the other hand, women are almost always seen in commercials for cleaning supplies or products like soaps. This feeds the common stereotype that women are stay-at-home moms who take on duties such as cleaning the house, doing the dishes, and doing the laundry.

Racial stereotyping is also quite common in entertainment media. The mass media helps to reproduce racial stereotypes and spread those ideologies (Abraham, 184). In movies and TV, minority characters are often shown as their respective stereotypes. In one specific example, the media "manifests bias and prejudice in representations of African Americans" (Abraham, 184). African Americans in the media are portrayed in negative ways; in the news, they are often linked to negative issues such as crime, drug use, and poverty (Abraham, 184). Another example of racial stereotyping is Kevin Gnapoor in the popular movie Mean Girls: his character is Indian and happens to be a math enthusiast and member of the Mathletes. Examples like these illustrate how entertainment media relies on stereotypes.

Types of Media Bias

Throughout the media, we see many different types of bias being used: bias by omission, bias by selection of sources, bias by story selection, bias by placement, and bias by labeling. All of these types, in different ways, prevent the consumer from getting all of the information.

  • Bias by omission: Bias by omission occurs when the reporter leaves out one side of an argument, restricting the information the consumer receives. It is most prevalent in political stories (Dugger) and happens when claims from either the liberal or the conservative side are left out. It can appear in a single story or in a continuation of stories over time (Media Bias). One way to avoid this type of bias is to read or view different sources to ensure that you are getting all of the information.
  • Bias by selection of sources: Bias by selection of sources occurs when the author includes multiple sources that all support one side (Baker), or intentionally leaves out sources that are pertinent to the other side of the story (Dugger). This type of bias also uses language such as "experts believe" and "observers say" to make people believe that what they are reading is credible. Expert opinions may appear, but only from one side, creating a barrier between the other side of the story and the consumers (Baker).
  • Bias by story selection: The second type of bias by selection is bias by story selection. This is seen across an entire news organization rather than in a few stories. It occurs when broadcasters choose to include only stories that support the overall beliefs of the organization, ignoring stories that would sway people to the other side (Baker). Typically, the selected stories fully support either the left-wing or the right-wing way of thinking.
  • Bias by placement: Bias by placement is a growing problem today. We see this type of bias more and more because it is easy to apply across the many ways media is now presented, whether on social media or simply online. Placement shows how important a particular story is to the outlet: editors will deliberately give poor placement to stories they consider less important, or that they do not want to be easily accessible, downplaying their importance and leading consumers to think they matter less (Baker).
  • Bias by labeling: Bias by labeling is a more complicated type of bias, most often used to misrepresent politicians. Many reporters tag politicians on one side of an argument with extreme labels while saying nothing about the other side (Media Bias). The labels given can be positive or negative, depending on the side the outlet favors. Some reporters falsely label people as "experts," giving them authority they have not earned and do not deserve (Media Bias). This type of bias can also occur when a reporter fails to properly label a politician, such as not identifying a conservative as a conservative (Dugger). It can be difficult to spot because not all labeling is biased, but when stronger labels are used, it is important to check different sources to see whether the information is correct.

Bias in Entertainment

Bias is an opinion in favor of or against a person, group, or thing compared with another, presented in ways that favor results in line with the author's prejudgments and political or practical commitments (Hammersley & Gomm, 1). Media bias in entertainment is the bias of journalists and news organizations within the mass media regarding the stories and events they report and how they cover them.

There are biases in most entertainment today, including the news, movies, and television. The three most common biases in entertainment are political, racial, and gender biases. Political bias occurs when a political comment is inserted into a movie or TV show in hopes of changing or undermining the viewer's political views (Murillo, 462). Racial bias occurs, for example, when African Americans are portrayed negatively and shown in situations involving crime, drug use, and poverty (Mitchell, 621). Gender biases typically concern the roles people are expected to play and how others view them (Martin, 665). For example, young girls are supposed to like the color pink, princesses, and dolls. Women are usually the ones seen in cleaning commercials and are depicted as "dainty" and "fragile," while men usually appear in more "masculine" types of media, such as those involving cars and tools.

Bias is always present, and it can be found in all outlets of media. Whether in the news, in the entertainment industry, or in the portrayal of stereotypes, bias is all around us. To be media literate, it is important to always be aware of this and to read more than one article, allowing yourself to come to your own conclusions and think for yourself.

Works Cited 

Abraham, Linus, and Osei Appiah. “Framing News Stories: The Role of Visual Imagery in Priming Racial Stereotypes.”  Howard Journal of Communications , vol. 17, no. 3, 2006, pp. 183–203.

Baker, Brent H. “Media Bias.”  Student News Daily , 2017.

Biagi, Shirley. "Changing Messages." Media/Impact: An Introduction to Mass Media, 10th ed., Cengage Learning, 2013, pp. 268-270.

Campbell, Richard, et al.  Media & Culture: an Introduction to Mass Communication . Bedford/St Martins, 2005.

Davis, Shannon N. “Sex Stereotypes In Commercials Targeted Toward Children: A Content Analysis.”  Sociological Spectrum , vol. 23, no. 4, 2003, pp. 407–424.

Dugger, Ashley. "Media Bias and Criticism." http://study.com/academy/lesson/media-bias-criticism-definition-types-examples.html.

Guarino, Mark. “Misleading reports of lawlessness after Katrina worsened crisis, officials say.”   The Guardian , 16 Aug. 2015, http://www.theguardian.com/us-news/2015/aug/16/hurricane-katrina-new-orleans-looting-violence-misleading-reports .

Hammersley, Martyn, and Roger Gomm. Bias in Social Research . Vol. 2, ser. 1, Sociological Research Online, 1997.

“How to Detect Bias in News Media.”  FAIR , 19 Nov. 2012, http://fair.org/take-action-now/media-activism-kit/how-to-detect-bias-in-news-media/ .

Levasseur, David G. “Media Bias.”  Encyclopedia of Political Communication , Lynda Lee Kaid, editor, Sage Publications, 1st edition, 2008. Credo Reference, https://search.credoreference.com/content/entry/sagepolcom/media_bias/0 .

Martin, Patricia Yancey, John R. Reynolds, and Shelley Keith. "Gender Bias and Feminist Consciousness among Judges and Attorneys: A Standpoint Theory Analysis." Signs: Journal of Women in Culture and Society, vol. 27, no. 3, 2002, pp. 665-701.

Mitchell, T. L., R. M. Haw, J. E. Pfeifer, and C. A. Meissner. "Racial Bias in Mock Juror Decision-Making: A Meta-Analytic Review of Defendant Treatment." Law and Human Behavior, vol. 29, no. 6, 2005, pp. 621-637.

Murillo, M. "Political Bias in Policy Convergence: Privatization Choices in Latin America." World Politics, vol. 54, no. 4, 2002, pp. 462-493.

Pavlik, John V., and Shawn McIntosh. "Media Literacy in the Digital Age." Converging Media: A New Introduction to Mass Communication, Oxford University Press, 2017.

Vivian, John. “Media Literacy .”  The Media of Mass Communication , 8th ed., Pearson, 2017, pp. 4–5.

Introduction to Media Studies Copyright © by Elizabeth Morrissette, Grace McKeon, Alison Louie, Amy Luther, and Alexis Fagen is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.


Essays on Media Analysis

What Makes a Good Media Analysis Essay Topic?

When embarking on the quest to find the perfect topic for a media analysis essay, it is crucial to select one that not only captivates but also provides ample opportunities for analysis. Here are some innovative recommendations to fuel your brainstorming process and aid in the selection of an outstanding essay topic:

Brainstorm: Begin by jotting down all the media-related subjects that pique your interest. Explore various forms of media, including television, movies, social media, news articles, and advertising campaigns.

Research potential topics: Once you have a list of potential topics, conduct preliminary research to ensure that there is enough information available to support your analysis. Seek out recent and relevant sources that offer diverse perspectives.

Choose a specific angle: Narrow down your topic by selecting a specific aspect or angle to analyze. Instead of analyzing generic "television shows," for example, you could focus on the portrayal of gender roles in reality TV programs.

Consider significance: Evaluate the significance of your chosen topic. Does it address a current issue or prevalent challenge in society? Opt for subjects that have broader implications and can generate meaningful discussions.

Uniqueness: Strive for a topic that stands out from the ordinary. Avoid overdone subjects and aim for creativity and originality. Look for unique angles or lesser-known media artifacts to analyze.

Personal interest: Lastly, choose a topic that genuinely interests you. A personal interest in the subject matter will make the writing process more enjoyable and result in a more engaging essay.

Remember, a good media analysis essay topic should be specific, relevant, unique, and align with your personal interests. Now, let's embark on an exploration of the best media analysis essay topics that meet these criteria.

The Best Media Analysis Essay Topics

The Influential Role of Social Media in Shaping Body Image Perception Among Teenagers

Analyzing the Portrayal of Mental Health in Popular TV Shows

The Impact of Media on Political Opinion Formation during Election Campaigns

Examining the Representation of Race and Ethnicity in Hollywood Movies

The Power of Advertising: Its Influence on Consumer Behavior and Purchasing Decisions

The Effects of Video Game Content on Aggression and Behavior in Young Adults

The Role of Media in Shaping Public Perception of Climate Change

The Evolution of News Media: From Traditional Outlets to Digital Platforms

Gender Stereotypes in Commercials: Analyzing Their Persistence and Impact

The Influence of Celebrity Endorsements on Brand Loyalty and Consumer Trust

Provocative Questions to Guide Your Media Analysis

To delve deeper into these media analysis essay topics, ponder these ten thought-provoking questions:

How does social media contribute to the objectification of women?

In what ways does mainstream media perpetuate racial stereotypes?

How does the portrayal of violence in video games affect children's behavior?

To what extent do advertising campaigns exploit insecurities to sell products?

How does political bias influence news reporting in mainstream media?

How do reality TV shows shape viewers' perceptions of success and failure?

What role does media play in the normalization of drug and alcohol use?

How do different news outlets cover the same event differently, and why?

In what ways do children's cartoons reinforce gender roles and stereotypes?

How does the representation of LGBTQ+ individuals in media affect societal attitudes?

Inspiring Prompts for Your Media Analysis Essay

Here are five imaginative essay prompts to ignite your creativity in the realm of media analysis:

Analyze the use of symbolism in a specific music video of your choice and examine its impact on the audience's interpretation.

Discuss how a particular news outlet's coverage of a recent event demonstrates media bias and explore its potential consequences.

Examine the marketing strategies employed in a successful viral advertising campaign and assess their effects on brand recognition and consumer behavior.

Compare and contrast the representation of technology and its impact on society in two science fiction films.

Critically analyze the portrayal of marginalized communities in a specific TV series and its influence on societal perceptions.

Frequently Asked Questions about Writing a Media Analysis Essay

  • Q: How should I structure a media analysis essay?

A: A media analysis essay typically follows an introduction, body paragraphs analyzing different aspects, and a conclusion. Ensure that each paragraph focuses on a specific argument or analysis point.

  • Q: Can I incorporate personal opinions in a media analysis essay?

A: While media analysis essays should strive for objectivity, you can include your interpretation and analysis of the media artifacts. However, always support your claims with evidence and examples.

  • Q: How can I find relevant sources for my media analysis essay?

A: Utilize academic databases, reputable news outlets, scholarly articles, books, and credible online sources to gather relevant information and support your analysis.

  • Q: Should I include a thesis statement in my media analysis essay?

A: Yes, a clear and concise thesis statement is essential in a media analysis essay. It should convey your main argument or analysis focus.

  • Q: Can I analyze media artifacts from different time periods in one essay?

A: It is generally recommended to focus on a specific time period or media artifact in each essay. This approach allows for a more in-depth analysis and prevents the essay from becoming overly broad.

Analysis of Bruno Mars’s Song "When I Was Your Man"

Analysis of a Fashion Vlog by Jennifer Im


An Analysis of The Film "The Social Network" Through The Six Perspectives of Visual Analysis

Analysis of the Media Influence on the Identities of Young Girls

How the Media Affects the Images of Minority Groups

How the Media Stereotypes Our Society


How Media Images Have an Effect on Everyday Life

Evaluation of Medical Accuracy in Grey's Anatomy

A Study of TMZ Media Practices Using Moral Theories and Concepts

A Critical Review of Grey's Anatomy


The Impact of Media on Teens’ Views on Politics

Media Analysis of Kamala Harris' Involvement in Politics

How Media Has Impacted My Daily Life

They Live and the Impact of Media on Society

Positive and Negative Impact of Today's Media on the Image of Pakistan

Theory of Framing in the Media

The Impact of Visual Advertisements on Body Image

The Role of the Streaming Media Nowadays

How the Media Has Helped the Community to Overcome the Fear of Monsters

Overview of Media Influence on Politics

How the Media Has Helped the Australian Society

The Different Interests of the Media and Its Effects in Their Reports

The Effects of Mass Media on American Values

The Influence of Mass Media on Politics in the UK

How to Write a Media Assignment

Analysis of the Role of Media and Theories of Mass Media

Business Studies: Media Review Project

Role of Media in Conflict Zones: An Analysis of CNN Effect and New Media

Role of Media and National Unity in Pakistan

A Review of Teamwork, an Episode in Allegiance, an American Drama Series

Media analysis refers to the systematic examination and interpretation of media content, including various forms of media such as print, broadcast, and digital media. It involves critically analyzing and evaluating the messages, themes, and techniques employed in media to understand their impact on individuals, society, and culture.

Media analysis uncovers underlying meanings, implicit messages, and societal implications within media texts. It examines narrative structures, visual aesthetics, language use, cultural representations, and ideological biases. Researchers gain insights into meaning construction, power dynamics, and social influences in media. This analysis reveals patterns, trends, and dominant discourses, showing how media shapes public opinion and reflects societal values. By critically examining media content, media analysis deepens understanding of media's role in shaping narratives, public discourse, and socio-political dynamics.

  • Media Texts: Analysis of news articles, television shows, films, advertisements, social media posts, and websites.
  • Representation: Analysis of the representation of individuals, groups, events, and ideas in media. It examines how different social, cultural, and political identities are portrayed and the impact of these representations on shaping perceptions, stereotypes, and biases.
  • Audience Reception: This involves examining audience responses, interpretations, and the influence of media on attitudes, beliefs, and behaviors.
  • Media Institutions: It examines the ownership structures, industry practices, and policies that shape media content and its dissemination.
  • Media Effects: This involves studying the influence of media on public opinion, social behavior, cultural values, and political processes.

Common methods of media analysis include content analysis, semiotic analysis, discourse analysis, audience research, comparative analysis, historical analysis, and critical cultural analysis.

Media analysis essay topics are crucial because they reveal how media shapes public opinion, reflects societal values, and influences cultural norms. By critically examining media content, we can uncover implicit messages, ideological biases, and power dynamics. This understanding helps foster media literacy, enabling individuals to navigate and interpret media more effectively, and promotes informed, critical engagement with the information that shapes our world.





100 Media Analysis Essay Topics & Examples

Welcome to our list of media analysis essay topics! Here, you will find plenty of content analysis topic ideas. Use them to write a critical paper, a literary analysis, or a mass-media related project. As a bonus, we’ve included media analysis example essays!

🔝 Top 10 Media Analysis Topics for 2024

🏆 Best Media Analysis Topic Ideas & Essay Examples
⭐ Interesting Topics to Write About Media Analysis
✅ Simple & Easy Media Analysis Essay Titles
🔥 Content Analysis Topic Ideas

  • Portrayal of Women in Ads
  • Media Bias in Political Reporting
  • Representation and Diversity on TV
  • Social Media’s Impact on Self-Esteem
  • Media Coverage of Humanitarian Crises
  • How Are News on Climate Change Framed?
  • Consequences of Fake News and Misinformation
  • How Gender Roles Are Portrayed in Children's Media
  • Does Violence in Video Games Lead to Aggressive Behavior?
  • The Relationship of Media and Public Opinion in Elections
  • Covering a Pandemic: Critical Media Analysis A lot of work over the past decades has been devoted to the study of media analysis, which has led to the formation of a new area of knowledge, concepts, and categories.
  • Analysis of Media Strategies This is because it uses a reverse marketing strategy which states that the less the advertisement, the higher the pricing and the harder it becomes to find it, the higher the chances that people will […]
  • Media and Injustice: Issues Analysis This paper will highlight relations between the media and injustice, discuss media in its past and current perspective and its possible role in future challenges, with special importance placed on media management […]
  • “Super Bowl LVI Today: Day 1” Media Analysis Hence, it is essential to consider the priorities of the mass communication organization, namely the tone, look and advertising in the show.
  • Sociological Media Analysis: “The Bachelor” and “One Day at a Time” The show is misogynistic, with the male protagonist playing the role of the pursuer and the female protagonist assuming the role of the pursued.
  • Historical Components of Media Analysis In the case of Mumford and McLuhan, Carey observes that the writing and interpretation of media can result in the reconstruction of wider arguments and even the selection of an antagonistic agent.
  • Analysis of Social Media Tools in Business The last item, the detailed analytics of the content and activity, allows for the development of the more efficient business strategy based on the subscribers’ preferences.
  • Media Literacy Research: Analysis of the Issue In the process of research, I have significantly expanded my ability to access and analyze media messages as well as to use the power of information to communicate and make a difference in the world.
  • Media Influences Learning: Analysis The use of media in learning leads to the achievement of positive outcomes if the medium used is interrelated and confounding.
  • Media Analysis: Abuse Over Vaccine Passports The article uses the direct quotations of the restaurant owners, thus making the most of the story based in the first person.
  • Media Analysis: Ageism in Advertisement In addition to the idea of saving communicated in E-trade’s ad, the commercial also seems to convey the hope of work among the old population.
  • Media Analysis of 13 Reasons Why According to the laws of the genre, the atmosphere is intensified, the pace accelerates, and the turns in the plot become more and more abrupt.
  • Media Bias Fact Check: Website Analysis For instance, Fact Check relies on the evidence provided by the person or organization making a claim to substantiate the accuracy of the source.
  • The Media Economics Analysis In addition, the assessment of the economics of media reveals crucial information about the production, distribution, and consumption patterns of the media services and products.
  • Social Media Presence Analysis I think it expresses engagement within my workplace and willingness to learn more to either explore new ideas, be a part of the discussion, and make sure the information I am gathering is accurate and […]
  • The HopeLine: Website and Social Media Analysis The organization’s social media and the site contain a body of knowledge that might be also informative or important to revise for the current employees, for instance types and signs of abuse.
  • Media Analysis: Gideon’s Trumpet As it has been mentioned above, the purpose of the movie was to show that even a criminal has the right to have someone to represent him in the courtroom.
  • Acute Otitis Media Analysis The peak of acute ear infections, which precedes otitis media, is prior to the age of 2 years, and during school entry.
  • Modern Mass Media and Tools for Their Analysis A sender is a person who originates the message, a message is the content that is communicated, a channel is a medium used to transmit it, and a recipient is a person to whom the […]
  • Analysis of Media Representation Patterns In fact, studies show that the DNA of any given human being is ninety-nine percent identical in comparison to the rest of the population, regardless of their origin.
  • News and Media Reliability: Social Analysis At the same time, given the apparent trend to use the Internet as the primary source of news, mobile devices still seem to arouse suspicion among the adult and the older adult population. The most […]
  • Analysis Representations of Britishness in Different Media Texts Although it is clear to me that facts of Britishness exist in all three media sources listed above, I understand that it has different sides and is shown as a mixture of cultural peculiarities, breathtaking […]
  • On Stereotyping in the Media Viewers watch shows regularly and do not understand the content that is biased while the media is able to attract the attention of the audience by way of drama, comedy and action.
  • Media Coverage of Issues Analysis The main arguments that the authors suggest are: Inconsistent use of labels for the alternative plans minimized the likelihood that the public would understand the details of any of them; The conflicts frame narrowed public […]
  • Mass Media Communication: Personal Analysis Finally, when I do the same in the kitchen in the morning, I am occupied with preparing and eating my breakfast; therefore, television serves as a background and I cannot be focused on the information […]
  • Mass Media Law’s Analysis Indeed, the existing regulations show that the specified action is defined as flag desecration can be interpreted as an affront of the citizen of the United States, as well as the disdain for the law.
  • “The New Yorker” and “National Geographic” Media Analysis What finds most interesting about Surowiecki’s article is that he manages to counter the politics of the USA government, whereas, in Alexander’s article, the secret of the buried treasure and the historical events are the […]
  • Media Analysis: Jacob’s Cross In the Jacob’s Cross episode that was watched the following scenes that apply to the social justice theme were observed: This episode begins in the morning by Jacob calling his attorney and some other close […]
  • Social Media Data Analysis For the company storage purposes, information in wikis is stored in a chronological order and may be used to build the company’s knowledge.
  • Fairfax Media Limited Situational Analysis While it has generally taken Fairfax a longer time than expected to identify and adapt to the shift brought about by the rise of technology in market- specifically the internet and social media- the company […]
  • Media Industry News Analysis: Gasland May Take the Oscar To learn more about the world of media, it is better to focus on the news and the main themes of the articles offered to the reader.
  • Fairfax Media Industrial Environmental Analysis When the rights are granted, they come with a cost to the company; there have been challenges to print media from free press media in Australia, and thus Fairfax faces the challenge of handling the situation.
  • Media Analysis: Women and Men in Media Against this background the paper attempts to probe the way in which the press and especially the print journalism help to produce and to reproduce specific ways of knowing the third world.
  • Content Analysis of Two Different Forms of Media Although the first one uses television and the second uses the Internet and the World-Wide-Web to deliver content to consumers, it must be pointed out that these two are rivals and basically have the same […]
  • Analysis of Gender Issues in the Media The message in the advertisement simply showed that women are able to control men by using their bodies in a certain way.
  • The Focus on the Importance of Symbols in Media Analysis
  • Visual Media Analysis for Social Media and Other Online Platforms
  • Research Methodologies for the Media Analysis
  • Communications and Media Analysis
  • Television Media Analysis
  • Media Analysis: Leadership
  • Predicting Stock Market Using Social Media Analysis
  • Media Analysis: Television and New Media
  • Media Analysis and Feminism
  • Television Media Analysis: Authors and Producers
  • How the Media Places Responsibility for the COVID-19 Pandemic: An Australian Media Analysis
  • Media Analysis: Political and Social Bias in the USA
  • Collecting Data in Social Media Analysis
  • The Jurisprudence and Qualitative Media Analysis
  • Media Analysis: Banning Beauty and the Beast in Malaysia
  • Media Analysis and Understanding the Meaning of Islam
  • Symbolic Interactionism and Social Networks: Media Analysis
  • Television Media Analysis: The Cosby Show
  • Marketing and Business Communication: Media Analysis
  • The Difference Between the Quantitative and Qualitative Media Analysis
  • Structuring and Media Analysis
  • Media Analysis: Audiences and Consumers
  • Managing the News and Media Analysis
  • Comparative and Critical Media Analysis
  • Media Analysis: Rose Petal Cottage
  • Video Installation and Media Analysis
  • Philosophical and Social Media Analysis
  • Critical and Interdisciplinary Research in Media Analysis
  • Public Relations and Media Analysis: Semantic and Social Aspects
  • Functionalist Perspective for Media Analysis
  • Media Analysis of Traditional Primary Documents
  • Responsibility for the COVID-19 Pandemic: Media Analysis
  • Qualitative Research Methods in Media Analysis
  • A Visual Analytics System for Television Ratings
  • Food Chain Actors’ Perceptions of and Adaptations to Volatile Markets: Results of a Media Analysis
  • Religion and the Media Analysis
  • Symbolic Interactionist Perspective for Media Analysis
  • Employer Relation: Industrial Conflict Media Analysis
  • Symbolic Interactionist Perspective Media Analysis
  • Media Analysis: Overview of Media Research Methodologies and Audiences
  • Patterns of Emotional Expression in Social Media Posts
  • A Comparative Content Analysis of Television Shows and Gender Representation
  • Environmental Sustainability Messaging in Advertisements
  • Patterns of Persuasive Language in Political Debates
  • News Coverage during COVID-19: Media Framing and Public Perception
  • The Impact of Celebrity Endorsement on Consumer Behavior
  • Analysis of Unrealistic Standards in Video Game Characters
  • How Portrayal of Violence in Movies Leads to Desensitization
  • Diversity of Characters and Themes in Children’s Literature
  • How Fashion Magazines Affect Beauty Ideals
  • Effectiveness of Educational Apps for Children
  • Do Food Advertisements Promote Healthy Nutritional Choices?
  • Representation of LGBTQ+ Characters in TV Series
  • Environmental Messaging in Corporate Social Responsibility Reports
  • Representations and Perspectives on Climate Change
  • TV Show Titles
  • New York Times Topics
  • Radio Paper Topics
  • Propaganda Topics
  • Twitter Topics
  • YouTube Topics
  • Oprah Winfrey Topics
  • Mass Communication Essay Topics

IvyPanda. (2024, March 2). 100 Media Analysis Essay Topics & Examples. https://ivypanda.com/essays/topic/media-analysis-essay-topics/


  • Corpus ID: 270620333

Connecting the Dots in News Analysis: Bridging the Cross-Disciplinary Disparities in Media Bias and Framing

  • Gisela Vallejo, Timothy Baldwin, Lea Frermann
  • Published in NLPCSS 14 September 2023
  • Sociology, Political Science


Drishti IAS


Biased Media is a Real Threat to Indian Democracy

  • 29 Mar 2024
  • 10 min read

Whoever controls the media, controls the mind

― Jim Morrison

Media plays a crucial role in any democratic society by providing information, shaping public opinion, and holding those in power accountable. However, the rise of biased media poses a significant threat to the democratic fabric of India. In recent years, Indian media has come under scrutiny for its biased reporting, sensationalism, and lack of objectivity. 

Media serves as the fourth pillar of democracy, alongside the executive, legislative, and judiciary branches. Its primary function is to inform citizens, facilitate debate, and act as a watchdog over the government and other powerful institutions. In India, a diverse and vibrant media landscape has emerged since independence, comprising print, broadcast, and digital platforms. However, the proliferation of biased media outlets has blurred the lines between news and propaganda, posing a grave danger to democracy.

Biased media outlets in India often prioritize sensationalism over substance, resorting to inflammatory rhetoric and divisive narratives to attract viewership or readership. This sensationalism contributes to the spread of misinformation and the polarization of society along religious, ethnic, and political lines. Moreover, biased reporting can sway public opinion, influence electoral outcomes, and undermine the credibility of democratic institutions.

The phenomenon of biased media in India is exacerbated by various challenges to press freedom, including political pressure, corporate influence, and legal threats. The concentration of media ownership in the hands of a few conglomerates limits the diversity of viewpoints and fosters self-censorship among journalists. These challenges impede the media's ability to fulfill its democratic mandate and hold power to account.

Political pressure on media outlets is a common phenomenon in India, where governments often seek to control the narrative and suppress dissenting voices. Political parties manipulate media outlets through advertising revenue, inducing them to suppress the truth and spread rumours and fake news.

Corporate interests often wield significant influence over media organizations through ownership or advertising revenue. A prime example is the Reliance Group , one of India's largest conglomerates with interests in various sectors, including media. Reliance's ownership of a certain media platform, which controls several news channels and digital media platforms, has raised concerns about editorial independence and bias. Critics argue that Reliance's business interests may influence media coverage to favor its corporate agenda, thereby compromising journalistic integrity.

The consequences of biased media on Indian democracy are far-reaching and multifaceted. It erodes public trust in the media as an impartial source of information, leading to widespread cynicism and apathy towards democratic institutions. It undermines the pluralistic fabric of Indian society by fostering intolerance and bigotry towards marginalized communities. It compromises the integrity of electoral processes by manipulating public opinion and influencing voter behavior. Overall, biased media contributes to the erosion of democratic norms and values, posing a serious threat to the future of Indian democracy.

During the COVID-19 pandemic, misleading stories about the death toll and government responses deepened the crisis. The censorship of critical tweets on Twitter and pro-government channels blaming farmers' protests for oxygen shortages distorted the truth and undermined trust in the media. Such pressures jeopardize journalists' ability to report objectively and hold those in power accountable. Attacks on journalists who expose corruption or criticize political leaders endanger press freedom and democratic functioning.

Sonam Wangchuk, the renowned climate activist and educationalist, recently concluded his 21-day climate fast in Leh, Ladakh. During this period, he sustained himself solely on water and salt, drawing attention to critical issues affecting the region.

Wangchuk's fast was a powerful statement, emphasizing the need to protect Ladakh's fragile ecology and indigenous culture. He emphasized the importance of character and foresight in addressing Ladakh's concerns. The fast garnered support from various socio-political bodies in Ladakh, including the Kargil Democratic Alliance (KDA), whose members joined him in hunger strikes, amplifying their collective voice. Yet major news channels and media houses ignored the incident and did not provide proper coverage.

Moreover, the Sushant Singh Rajput case became a media frenzy, with sensationalism overshadowing more critical matters. The media’s obsession with Sushant Singh Rajput’s death transformed a tragic suicide into a relentless investigation, streamed live day after day.

Instead of focusing on the actual tragedy, the spotlight shifted to an actress portrayed as the evil intriguer and the perfect cinematic vamp.

The arrest of the actress, after a relentless pursuit, was celebrated by those addicted to this media spectacle. The media's gossipy edge often carries deep shades of misogyny, and the separation between private and public life blurred. While the media chased actresses and sensationalized the Rajput case, other crucial issues in the country were sidelined. The Bombay High Court recognized the harm caused by trial by media, which obstructs fair criminal investigations. The media's role should be to inform, not to manipulate public sentiment.

The practice of accepting money from political parties to publish favorable stories or suppress negative ones, often referred to as "paid news," undermines the integrity of journalism and erodes public trust in the media. This phenomenon is particularly prevalent during election campaigns when political parties seek to manipulate public opinion and gain an unfair advantage. One notable example of paid news occurred during the run-up to the 2014 general elections in India. 

Media showed one-sided news about the CAA-NRC and misled minorities, which led to widespread protests in the country. The media played a significant role in shaping public perception of the CAA. Some channels sensationalized the issue, focusing on specific narratives while ignoring broader implications. The trial-by-media approach led to polarization and misinformation. Social media also played a role, with fact-checkers attempting to correct misinformation.

Addressing the issue of biased media requires concerted efforts from multiple stakeholders, including policymakers, media professionals, civil society organizations, and the general public. There is a need for stringent regulations and mechanisms to hold media outlets accountable for ethical breaches and misinformation. Media literacy programs should be implemented to educate citizens about the importance of critical thinking and discerning reliable sources of information. Independent media watchdogs and ombudsmen should be empowered to monitor media content and address complaints from the public. Additionally, promoting diversity and plurality in the media industry through initiatives such as community media and public broadcasting can help counteract the influence of biased media conglomerates.

Biased media poses a grave threat to Indian democracy by undermining the principles of transparency, accountability, and pluralism. Its sensationalism, misinformation, and propaganda have the potential to subvert democratic processes and foster social division. Therefore, it is imperative to address the root causes of biased media and implement reforms to safeguard press freedom and media integrity. Only by upholding the highest standards of journalistic ethics and promoting media pluralism can India realize its democratic aspirations and uphold the rights of its citizens.

Even if you are a minority of one, the truth is the truth. 

—Mahatma Gandhi

