
Revista de Comunicación

Print ISSN 1684-0933 / Online ISSN 2227-1465

Revista de Comunicación vol. 18 no. 2, Piura, July/December 2019

http://dx.doi.org/10.26441/rc18.2-2019-a13 

Articles

Algoritmos y noticias: Redes sociales como editores y distribuidores de noticias

Algorithms and the News: Social Media Platforms as News Publishers and Distributors

Rodrigo Cetina Presuel1 

José Manuel Martínez Sierra2 

1 PhD from the Universidad Complutense de Madrid, Researcher at Harvard Law School, Executive Director of the Real Colegio Complutense at Harvard and Co-Chair of the Law Section of the International Association for Media and Communication Research. USA. rcetinapresuel@law.harvard.edu

2 Jean Monnet ad Personam Chair in European Union Law and Government and Faculty Director of the Real Colegio Complutense at Harvard. He is Co-Chair of the European Union Law and Government Study Group at the Minda de Gunzburg Center for European Studies at Harvard and Professor responsible for the Harvard European Law Association at Harvard Law School. USA. Jose_martinez@harvard.edu

RESUMEN:

Con miles de millones de usuarios, las redes sociales (ej. Facebook) son actores dominantes en un mercado de noticias online altamente concentrado. Ostentan gran poder sobre la información que distribuyen al usuario y sobre las organizaciones e individuos que la producen. Además, utilizan algoritmos para realizar funciones propias de los medios: deciden sobre la importancia de las noticias y cómo deben distribuirse. Sin embargo, no entienden adecuadamente su papel como informadores de la misma manera que lo hacen los medios tradicionales ni parecen querer reconocer que cumplen funciones propias de los editores de noticias, así como las responsabilidades propias de dicho papel. Este trabajo argumenta que es esencial que las redes sociales entiendan y acepten su rol como distribuidores y editores de noticias y resalta las responsabilidades esenciales que deben asumir para satisfacer las necesidades informativas de sus audiencias y proteger el derecho a la información del público.

Palabras clave: redes sociales; algoritmos; distribución de noticias; edición de noticias; responsabilidades y deberes; derecho a la información

ABSTRACT:

With billions of users, social media platforms (e.g. Facebook) are dominant players in a highly-concentrated online news market. They have great power over the distribution of information to their users, and over the organizations and individuals that produce it. Social media platforms use algorithms to perform functions traditionally belonging to news editors: deciding on the importance of news items and how they are disseminated. However, they do not acknowledge the role they play in informing the public as traditional news media always have, and tend to ignore that they also act as publishers of news and bear the responsibilities associated with that role. This paper argues that it is essential for social media platforms to understand and embrace their role as both news publishers and distributors and highlights the essential responsibilities they must undertake so they can satisfy the information needs of their audiences and protect the public's right to information.

Keywords: social media platforms; algorithms; news distribution; news publishing; duties and responsibilities; right to information

Introduction

Over the past decade, we have seen social media platforms invest in initiatives that integrate the distribution of news content into the products they offer. Companies like Facebook and Alphabet (Google's parent company) provide users with news content based on profiles created about them. These profiles are built by collecting user-provided information and by constantly monitoring users' online activities. For the purposes of this work, we will refer collectively to companies that use these methods, and more specifically to Facebook and Google, as social media platforms.

The large amounts of data about users collected by these companies allow them to create sophisticated, though flawed, profiles of those users (Solove 2011; Pariser 2011). This information has proven immensely valuable for advertisers eager to spend their dollars on advertising on social media platforms. It has allowed these companies to become the biggest players in the advertising business and major players in the media market worldwide, with an audience numbering in the billions. As of 2017, Google and Facebook held a combined 63.1% of the online advertising market (eMarketer 2017), and they follow a business model that seems to seek total market dominance (Mason 2015). In this paper, we outline the challenges and explain the negative consequences of this emerging new media concentration.

Regarding twentieth-century mass media organizations, many authors (Nerone 2002; McChesney and Nichols 2002; Bagdikian 2004; Meiklejohn 1948; Barron 1967; Sunstein 1993 & 1994) have worried about the negative effects concentrated control over means of expression can have on the quality of public debate and on democratic life, namely on the ability of citizens to stay informed and on ensuring the presence of a plurality of voices and opinions in the news. These worries should be present when analyzing social media platforms because "the problem of media concentration has not disappeared with the advent of the Internet" (Yemini 2018) and because "the same factors that have produced concentration and undermined diversity in traditional mass media have carried over in substantial measure to cyberspace" (Margarian 2007).

News delivery through social media platforms differs from traditional news distribution in important ways. The use of algorithms enables online platforms to serve tailored content to users. Algorithms can also be used to exclude information and opinions that platforms may have an interest in not circulating, for example information that does not please advertisers or is not engaging for users. Algorithms can likewise give some content preeminence over other content, either by deciding at what time it should appear on a feed or in what order. This raises questions related to "the impact of platform curation algorithms on agency and exposure to diversity" (Dwyer and Martin 2017).

There is evidence that these platforms take freedom of expression and freedom of the press into account (Klonick 2019) when they moderate content posted by their users, but there is also evidence that social media platforms try to reconcile these decisions with their own business goals, and that those goals may be given precedence (Citron 2014; Ghosh and Scott 2018; Klonick 2019).

Since algorithms that automatically make decisions about content can be said to simply "follow orders" from those who programmed and deployed them on social media platforms, it can be argued that what they do is enforce the editorial policy their owners favor. This remains true even though social media platforms do not create content themselves and depend on content created by legacy or digitally native news organizations, individual users, or independent journalists.

For Emily Bell (2016), this means that they behave as news publishers, and it should raise questions about their ethical and legal duties. What goals motivate the decisions social media platforms make over content distributed on their platforms in general, and over news content in particular? What if the drive for profit is the most important factor when making decisions about news distribution? Do social media platforms simply choose which news content to show based on a desire to keep people on their platforms, so they can engage in constant user surveillance to extract data that in turn fuels the targeted ads that have made them dominant players in the market? Does this drive social media companies to privilege content perceived as engaging or entertaining rather than newsworthy or truthful? In turn, does this desire for a share of the audience and advertising revenue drive legacy and digital news organizations to produce content likely to be privileged by social media platforms?

This paper explores two hypotheses. The first is that social media platforms almost completely control the distribution of news online, dominating the online news industry as a whole and making other players, such as print and digital-only news media companies, dependent on them for news distribution, without acknowledging the often negative consequences and effects they are having on the distribution of information online.

The second hypothesis is that social media companies also perform an editorial function through the algorithms deployed on their platforms, and thus should accept their role as news publishers and follow the guidelines that journalists have traditionally followed when performing their editorial function.

This paper also takes stock of the current state of the online news market and argues that social media platforms are indeed performing the role of a publisher and should embrace that role, which in turn requires analyzing what the duties and responsibilities of social media platforms performing that role ought to be. In the conclusions, we argue that social media platforms need to acknowledge and embrace the responsibilities of their role as news distributors and editors because of the great amount of power they now wield over a highly concentrated online news market. We argue that such duties and responsibilities mandate a commitment to freedom of expression and of the press, and to respect for the rights to seek, receive and impart information, above their business models and goals.

Methodology and Justification

This work is a qualitative and descriptive analysis and a reflection on both the current state of the online news market, with social media platforms as its central players, and the responsibilities we believe they should embrace due to the central role they play in keeping the general public informed. In qualitative studies such as this, thinking and interpretation are generally developed through the writing process itself, and writing is itself part of the analytic process typical of qualitative research (Richardson 2000). This reporting can include peer-reviewed literature and documentary sources.

Since covering the totality of the news market would not be possible in a single paper, this work focuses on the online news market and on two specific social media platforms, Facebook and Google, and largely explores the state of the market in the United States and the European Union while leaving out other parts of the world. Also, when talking about news media organizations, this work only analyzes the online presence of so-called legacy news media ("news that appears on websites operated by traditional news organizations", Diel 2017) and digital-only news media outlets.

In Section 2 below, this paper describes the current state of the online news market by explaining how social media platforms have emerged as the dominant force in that market, controlling everything from the advertising business fueled by extracting the personal data of their users to access to the audiences that news media outlets seek to serve. To do so, this paper draws on a review of peer-reviewed academic research and on documentary materials such as accounts in the media, essays, trade publications and reports that allow us to describe the current state of the online news market. This allows us to argue that social media platforms wield great power over news distribution but are not responding to the challenges that being such a dominant force entails.

In Section 3, this paper argues that social media platforms in fact perform the role of publishers, largely through the use of algorithms, automated processes that allow them to control what news content reaches their users, when, and how, while still maintaining a degree of human intervention. To argue that they should be considered, and see themselves, as publishers, the paper draws mostly on a review of peer-reviewed literature and other academic publications that argue that they should be recognized as publishers, and it outlines the challenges that performing this function creates. To support these statements, the section also includes documentary materials (media accounts, trade publications) that describe how this publishing function is performed, largely through the use of algorithms as mentioned above.

Section 4, summarizing what has been established in sections 2 and 3, and drawing from current academic literature on the subject, analyzes what the duties and responsibilities of social media platforms performing the role of dominant news distributors and publishers ought to be.

The fifth and final section outlines our conclusions, including a call for social media platforms to commit to important responsibilities that mandate them to protect and foster freedom of expression and of the press and the respect for the rights to seek, receive and impart information.

Social media platforms as gatekeepers and news distributors.

When commercially available Internet for private use started to become widespread, Internet Service Providers (ISPs) were the gatekeepers. Companies like America Online in the United States provided Internet service but also served as filters for content: they indexed it and managed how users could discover and access it. In those times, while some Internet-savvy users knew how to navigate and discover content on their own, many others depended on a fragmented, curated Internet made available to them by those gatekeepers.

For a while, it looked like, with the disappearance of these kinds of intermediaries and the rise of social media platforms, power over online means of expression was finally in the hands of the many: the users. As Bro and Walberg (2014, 98) explain, this was seen as a model "where traditional news media might be gradually eliminated as the prime intermediary between private citizens and authoritative decision-makers." The fact is that just the opposite seems to be true.

We still depend on gatekeepers today, but now they are social media platforms like Facebook and companies like Alphabet, which owns the Google search engine and associated services as well as another major player in the news distribution landscape: YouTube. This means that, contrary to the optimistic predictions of the past, one group of authoritative decision-makers may have been replaced by an even smaller, and more economically powerful, group of authoritative decision-makers: social media platforms. They are the new, all-powerful gatekeepers, the new governors, as Klonick (2018) has described them.

These new governors, platforms like Google and Facebook, have immense power when it comes to deciding what information, and, especially relevant for this work, what news content reaches users. It is not only a matter of technical capability; it is an issue of numbers. They count their users by the billions; whatever audience any given news organization may have pales in comparison. They follow an online advertising business model that is in turn based on pervasive surveillance of people in order to collect large troves of data and gain insights from it that can be turned into profit (Zuboff 2015). Distributing any type of content that may interest their users means that those users will remain engaged and spend time on their services and that more data will be collected.

News content distribution seems like a natural fit for all of these services, as it has the potential to keep users engaged, just like any other content generated by the users themselves. A combination of both can deliver the engagement and attention that these platforms need from their users.

While social media platforms have the technical capability to decide what content appears on their platforms, and even though they do engage in content moderation, they often tolerate all kinds of expression, beneficial to the public or not, as long as it enables their business model. According to Balkin (2018), this is in turn enabled by intermediary liability legislation. Citron (2014) has done remarkable work documenting just how prevalent hateful speech is on social media platforms. In turn, Kate Klonick (2019) has worked to shed light on how Facebook, to give one example, navigates the moderation of content on the platform, balancing the company's stated mission with the interests of its users and "the company's bottom-line."

The same applies to the sources of news themselves. In general, social media platforms maintain that anybody should be allowed to speak on their platforms, including by spreading the news. Many legacy news organizations with strong reputations, like the New York Times or the Washington Post, disseminate their content through social media platforms and compete with so-called native digital journalism enterprises, some of them big media companies in their own right. Some are reputable and recognized, some perhaps less so. Among those that do not have the best of reputations we can count the ones that produce highly partisan, slanted content and tend to spread questionable information, when not engaging in outright disinformation efforts (Benkler et al 2017). Some of these less prestigious outlets have been tied to foreign government interference in third countries (Sheth and Bertrand 2017) and are often used by politicians to discredit media organizations they dislike (Erlanger 2017). At the very least, they are cynical attempts at money-making (Subramanian 2017). Regardless, all of the above have a space on social media and are afforded the opportunity to disseminate content there.

Information about users allows social media platforms to identify audiences which may be interested in this or that type of content. Figuring out users' interests allows platforms to serve them content that can keep them engaged at all times. While it is often the case that users' newsfeeds on a platform like Facebook contain "very little substantive news intermingled with a lot of other personalized information: commentary, gossip, personal observations, commercial messages and so on" (Yemini 2018, 168), there is data showing that users respond favorably to the availability of actual news through social media and, at least in part, use their social media platforms to seek and consume the news (Yemini 2018).

However, this means that platforms may only be committed to distributing news content as long as there is revenue to be obtained from it, and that they are not necessarily committed to distributing quality news content, just content that can turn a profit for them.

Across social media sites, an increasing number of users rely on social media platforms as sources of news. 67% of US adults reported getting at least some of their news on social media. 32% of YouTube users and 29% of Snapchat users get news on those sites, while a majority of Twitter users (74%) and Facebook users (68%) do so on those platforms (Shearer and Gottfried 2017). The United States has a total population of around 323 million people (US Census Bureau 2016). Since 66% of Americans use Facebook and 68% of its users get news there, roughly 45% of the whole country's population treats this particular social network as a source of news, and about 26% use more than one social network for these purposes (Shearer and Gottfried 2017).

In comparison, and according to data from Eurobarometer's Media Use in the European Union report (2016), 54% of Europeans reported using social media at least once a week and nearly four in ten use them every day or almost every day. Meanwhile, a growing majority of Europeans see social networks as sources of political news (56% agree that "online social networks can get people interested in political affairs" and 56% see them as "a modern way to keep abreast of political affairs"), although 48% of Europeans still agree that political information "from online social networks cannot be trusted" (Media Use in the European Union Report 2016).

While we don't have worldwide data available, Facebook had 2.2 billion active users by the end of 2017 (Statista 2017). If we assume that US and EU trends repeat around the world, then we can also assume that large numbers of people in other parts of the world use social media platforms for news consumption as well.

Social media platforms have managed to create an audience “which consumes more news than before, through a greater number of media platforms, at increasingly various points throughout the day” (Bro and Walberg 2014, 96 citing Newman 2013).

The drive to keep this audience has led social media platforms to attempt to serve users more news content, even entering into partnerships with news-making organizations. Traditional media have also come to understand that they need social media platforms to distribute their content if they want to face challenges like the loss of advertising revenue and the difficulty of replicating such revenue in their digital editions.

The interest of technology companies in news distribution is not new. For example, according to Eli Pariser (2011, 36-37), Mark Zuckerberg, CEO of Facebook, decided early on that one of the goals of his company was to link users to news they were interested in. Eric Schmidt, CEO of Google (now Alphabet), predicted back in 2010 that within 5 or 10 years most news would be consumed on electronic devices capable of presenting news and advertising tailored to users' interests (see Marcos Recio, Sánchez Vigil and Olivera Zaldua 2015). At the same time, online-only news platforms came to the realization that, in order to reach an audience, they must diversify their distribution channels beyond their own websites. They know that having a social media presence is a must, as those platforms are where many users spend most of their time online.

Most news media organizations have sought to have a presence on social media platforms in order to reach a potential audience there. Social media platforms reacted to this and sought to leverage the interest of media companies. For example, in 2014 Facebook started giving more prominence to news content and modified its platform to do so, implementing trending features and hashtags and encouraging news outlets to distribute their own content through their pages (Constine 2018).

Just a year later, both Facebook and Google were entering into direct partnerships with news outlets and offering to handle the distribution of news content through their platforms, letting users access news content through products specifically created for that purpose, such as Facebook's Instant Articles or Google's AMP for mobile. Both products focused on giving access to news content in mere seconds and without the need to leave the social media platforms (Byers 2015; Kafka 2015). This meant that traffic would no longer be directed from Facebook to news organizations' websites. What social media platforms offered in return was a share in the advertising revenue generated by clicks and access to the large audiences of social media platforms. In the case of Facebook, the company offered news outlets the opportunity to reap great rewards "in terms of scale, engagement, and revenue, from being able to serve mobile content that loaded at lightning speeds via Facebook's app" (Brown 2018), in exchange for giving away control of their news content to the platform.

But initiatives to boost the availability of news content on social media have hardly proven to be the bonanza promised to news publishers. For Ingram (2018), this has resulted in the "dismantling of the traditional advertising models largely at the hands of the social networks, which have siphoned away the majority of industry ad revenue" and has left many media companies struggling. Today, the big tech companies that control social media platforms own practically all of the advertising market, the traditional means of subsistence of news media companies: "journalism has collapsed both as a practice and as an industry as advertisement revenue fled to online platforms and a cacophony of new voices asserted their newfound potency, certified by high Google search ranks or millions of Twitter followers" (Vaidhyanathan 2018). It's likely that many saw partnerships with social media platforms as their only choice.

To a large extent, the news media organizations of today depend on social media for survival in two key ways: 1) they need them to reach their audiences, as a large portion of those audiences spends most of their time online; and 2) since social platforms control most of the advertising market today, they need them to survive financially, as they are heavily dependent on Google and Facebook's ad-revenue sharing programs. Through this dependence and market hegemony, social media platforms have positioned themselves as the gatekeepers of most of the content that circulates online. These companies concentrate unprecedented "global media publishing power in… behemoth digital corporations" (Dwyer and Martin 2017, 1085).

All of this means that social media platforms have disproportionate power over news distribution, and thus the traditional critique of media concentration is pertinent here: concentration can reduce the ability of citizens to seek, receive and impart information (Bagdikian 2004), and concentrated control of the means of expression can "adversely affect the quality of public debate" (Yemini 2018, citing Meiklejohn 1948; Sunstein 1992, 1993 & 1994).

Problems with media concentration present in the pre-Internet world are once again present in the online news distribution ecosystem, with "the same economic factors that have produced concentration and undermined diversity in traditional mass media (carrying) over in substantial measure" (Margarian 2017). Today, most content, including the news, reaches users through "highly-concentrated online intermediaries" (Yemini 2018, citing Napoli 2015 and Noam 2016). As Nerone (1995) states, it is desirable to have a pluralistic media landscape that gives voice and exposure to different peoples and different interests. That is key to a well-informed public and is considered a pillar of modern democracy. If the main news providers are only a handful of companies, with Facebook the main provider of news for 1 out of 5 people in the world, what pluralism can we expect? (Halleck 2015).

These companies are driven by a hyper-capitalistic vision of news distribution that gives little consideration to its public service characteristics. Attention tends to focus on enriching investors, with little interest in serving a democratic function (McChesney & Nichols 2002).

On top of this, platforms like Facebook often fail to gauge the consequences of their actions (Hughes 2018).

Here are just some examples:

1) As reported by The Guardian (Lewis 2018), the recommendation algorithms of YouTube (owned by Google) are optimized to drive engagement and, in doing so, tend to promote conspiracy videos. This has allowed "deliberately misleading narratives" to thrive on the platform, and YouTube has become "the main driver" in helping extremist content reach viewers online (Beery 2019).

2) After the 2016 presidential election in the United States, Facebook faced scrutiny about its role in spreading false information and how this affected the outcome of the election (Mac 2018). In response, Facebook modified its algorithms to focus on friends' posts rather than news (Kantrowitz 2018). An unintended consequence was that many legitimate news outlets saw their revenue plummet (Kantrowitz 2018b). While Facebook has publicized its efforts to tackle the spread of false information (Jamieson 2016), it has been accused of not caring enough by journalists who had been hired to help solve the problem (Levin 2017 & 2018).

3) The government of Sri Lanka blocked Facebook and WhatsApp (owned by Facebook) after Facebook ignored requests to take down posts that spread hate speech and called for attacks on the Muslim community in the country (Taub & Fisher 2018).

It is true that, in recent times, social media platforms have started to admit more and more that they do have responsibilities, due mostly to social pressure (see Solon 2016; Mac 2018; Thompson & Vogelstein 2019 or Rosenberg 2019 for a few examples of how unpopular these platforms have become with public opinion). But even if social media companies start taking more responsibility, there are other problems related both to the actual technical capabilities of these new gatekeepers and to their willingness to take responsibility for their actions: while "they have the overwhelming concentration of technical financial and moral power, (this power) is in the hands of people who lack the training, experience, wisdom, trustworthiness, humility and incentives to exercise that power responsibly" (Stephens 2019).

If we rely on a couple of gatekeepers with great power over news distribution, there is great danger in the fact that they control and decide what news will reach users, what content will be given priority, what content will be made less prominent, and what news will never reach users at all.

The problem of how to curtail the concentration of media power in the hands of these companies is an issue for regulators to analyze. There are opinions for (Hughes 2019) and against (Clegg 2019) breaking up these types of companies, and so far the European Union has been much more proactive than the United States on this front, but there are worries that too much interference may provide governments with tools for censorship (Satariano 2019). Authors like Balkin (2018) or Elkin-Koren & Haber (2016) have explored the implications of governments regulating how social media platforms should moderate content and how fear of not complying with the law may simply result in the arbitrary removal of content.

We do believe that regulation is necessary, even inevitable at this point. Even the heads of social media platforms have called for regulators to enact laws to regulate them (Zuckerberg 2019). But while this happens, we also believe there is value in pointing out the issues these companies need to tackle and in suggesting what guidelines they may follow.

It is worth analyzing how social media platforms may also wield significant power over the news from an editorial perspective. The use of sophisticated algorithms does not just give them the capability to distribute content created by third parties, and their large audience does not only give them market dominance; it also gives them considerable influence that can be exercised through editorial actions over such content, whether implemented directly by their employees or via automated processes those employees design and deploy. For this, we now focus on the specific case of Facebook and how it uses its algorithms to deliver the news to its users.

Social media platforms as publishers: Facebook and the role of algorithms

As we saw in the previous section, social media platforms have positioned themselves as the biggest news distributors of our times, measured by size of audience and market share. Progressively, as they took an interest in the news market, these platforms have assumed agenda-setting functions and influence usually ascribed to human editors (Scheufele 2000; Palmer 2000; Schultz 2017; Lee 2009). Human news editors select news stories according to their criteria and values and make decisions about the inclusion and exclusion of materials (DeVito 2017). In social media platforms such as Facebook, these editorial functions are now performed by algorithms, and the criteria of news editors have been supplanted by the criteria of those who create and implement said algorithms. Algorithmic criteria "are used to make decisions about the inclusion and exclusion of material and which aspects of said material to present in an algorithmically driven news feed" (DeVito 2017).

While social media are usually conceived of as networks of connected people, these connections are never direct; they happen through platforms and interfaces designed by third parties (Haim & Graefe 2017). Just as connections between users are not direct, information is not directly transmitted from one user to another or, in the context of this paper, from a news organization to its audience. All information transmitted online is first processed by algorithms. For example, when users log into Facebook, what they see is a "list of updating stories that appear front and center on Facebook home pages (and that) display an algorithmically curated or filtered list of stories selected from a pool of all stories" available (Eslami et al 2015).

When users see content available to them at any given point in time, what the company calls the News Feed, they are not seeing content merely presented in chronological order. The content they see is determined by a series of automated processes, the algorithms, that decide what the user will see at any given moment. This is decided, at least in part, through the collection and analysis of data that aims to determine what the preferences and interests of any given user are. The order in which users see content depends on various factors, including how often they visit Facebook, how often they interact with their friends and other companies present on the platform and how often and to what degree they interact with content that has appeared on their feed in the past (Bakshy, Messing and Adamic 2015). The algorithms make decisions based on profiles created by matching “registration and location details, with viewing and sharing data as well as metadata such as content keywords” (Dwyer and Martin 2017, 1084).

Those algorithms determine, for example, which posts from friends a user will see, as well as what news content from media companies will appear, at what moment, and in what position in the feed. All of this is based on the profile the social network has created of any given user (Luckerson 2015). Its algorithms "curate everyday online content through information prioritization, classification, associating and filtering", usually without users being aware of their existence and use (Eslami et al 2015).

Facebook's News Feed "is programmed to be viral, clicky, upbeat or quarrelsome. That's how its algorithm works, and how it determines what more than a billion people see every day" (Tufekci 2016). Facebook "wants to choose the best content out of several thousand potential stories that could appear in your News Feed each day and put those in the first few dozen slots that you'll actually browse through" (Constine 2016). Facebook's algorithms decide 1) what may be of interest to a particular user, 2) at what point in time they should see this or that content, and 3) in which order the content should appear.

Ultimately, the objective of Facebook's algorithms is to select the most relevant and engaging stories and show them in the News Feed in order to keep users engaged with the platform for the longest amount of time possible. This means Facebook decides what is given the most prominence on its News Feed, what content gets relegated and given less visibility, and what content is taken down from the platform. Algorithms exert various types of control over content, as they play a key role in selecting what information is considered most relevant for us, a crucial characteristic of our participation in public life (McKelvey 2014).
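To make the kind of logic described above concrete, the following sketch illustrates, in a deliberately simplified way, how an engagement-driven feed-ranking rule might work. It is a hypothetical example written for illustration: the field names, weights and scoring formula are our own assumptions, not Facebook's actual code. What it shows is that a rule built only around inferred interests and predicted engagement decides what a user sees first, with nothing in it representing newsworthiness or accuracy.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Story:
    source: str                   # a friend, a news outlet, an advertiser, etc.
    topic: str
    predicted_engagement: float   # model's estimate that the user will click, like or share
    age_hours: float

def affinity(profile: Dict[str, float], story: Story) -> float:
    # Weight a story by how closely it matches the interests inferred for the user.
    return profile.get(story.topic, 0.1)

def rank_feed(profile: Dict[str, float], candidates: List[Story], slots: int = 20) -> List[Story]:
    # score = inferred interest x predicted engagement, decayed by age.
    # Nothing in this rule asks whether a story is newsworthy or accurate.
    def score(story: Story) -> float:
        decay = 1.0 / (1.0 + story.age_hours / 24.0)
        return affinity(profile, story) * story.predicted_engagement * decay
    return sorted(candidates, key=score, reverse=True)[:slots]

# A profile built from tracked behaviour, not from stated information needs.
profile = {"celebrity_gossip": 0.9, "local_politics": 0.2}
feed = rank_feed(profile, [
    Story("tabloid", "celebrity_gossip", predicted_engagement=0.8, age_hours=2),
    Story("city_paper", "local_politics", predicted_engagement=0.3, age_hours=1),
])
print([s.source for s in feed])   # ['tabloid', 'city_paper']

In this toy example the gossip item outranks the fresher local politics story simply because the profile predicts more engagement with it, which is precisely the editorial effect discussed below.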

Simply put, these social media platforms make editorial decisions about when and where content is made available to the user, even if those decisions are not directly implemented by a human but through automated processes designed by humans. Even if carried out through algorithms, this activity parallels the role and decisions of a publisher in a media company or newspaper.

Human decision-making based on a certain set of values is very present in the process. Algorithms may be able to work automatically, and some of them may even have the ability to learn and evolve on their own, but since they are created by people, they are subject to the views, values and even biases and mistakes of those who have programmed them: they are "far from being objective, impartial, reliable and legitimate… (and they) are created for purposes that are often far from neutral: to create value and capital; to nudge behavior and structure preferences in a certain way; and to identify, sort and classify people" (Kitchin 2016, 17-18). They are certainly not exempt from bias, error, unintended consequences or even manipulation (Friedman & Nissenbaum 1996). The values and views of their creators (Mager 2012), efforts to rid them of bias and mistakes, and even pressure exerted by third parties may influence how they are designed. This has consequences for the way information is provided (Gillespie 2014) and for how the public sphere is created (Crawford 2013; Seaver 2014; Ziewitz 2015).

For Silverman (2015), social media platforms wield an occult and diffuse, yet very influential, power. Such power can influence our lives through software, but it is subject to very little scrutiny. The decisions that software is programmed to make are influenced by the desires of vendors and the values of the programmers who created the algorithms. Technological design simply obfuscates the ideology and political agendas of its creators (Morozov 2011; Natheson 2013). Automated processes are human-created and do not eliminate human bias; they merely reproduce it.

When Facebook decided to leave all news content distribution decisions in the hands of algorithms, the results were not quite what the company expected: automated decisions helped spread disinformation rather than curtail it (Dewey 2016). Google has had the same experience. As mentioned before, the algorithms used by YouTube to suggest videos to users tend to recommend disinformation or disturbing and violent content (Swearingen 2018). By error or by design, they seem to privilege certain pieces of information over others, often with undesirable consequences.

In fact, automated processes are just as fallible as the humans who create them. Algorithms can be manipulated, whether by the social platforms themselves or by third parties, and we cannot expect that they will always make correct decisions regarding news content, a task that is in itself difficult for seasoned human journalists.

There is also the very important issue of transparency. The distribution of news through algorithms has great influence over our information flows. As news algorithms supplant "traditional editorial story selection, we have no window into its story curation process that is parallel to our extensive knowledge of the news values that drive traditional editorial curation" (DeVito 2017), because it is often obscure who creates algorithms and what methods they use to create and implement them (Pasquale 2015; Tufekci 2015). Sometimes, according to Napoli (2014), creators don't even fully understand how their algorithms work and are not always able to grasp what consequences they may generate. We often don't have "a clear picture of what the algorithm is, much less what values it is embedding into its story selection process" (DeVito 2017).

Both the news-distribution and publishing capabilities that social platforms have, and the fact that they exercise those capabilities via automated processes that lack transparency, mean that those platforms have great power to steer information flows and shape public opinion by automated means, but that it is not easy to hold them accountable for it. They have a great deal of power to influence democratic life and may impact citizens' fundamental rights, including their freedom of expression, but it is not easy to know how they are doing it.

One could expect companies like Facebook or Google to be aware of the power they wield and of the growing responsibility they have in guaranteeing an adequate flow of information aligned with democratic values. However, while they bear an important degree of responsibility for their power to shape public discourse, they often seek to evade that responsibility, or at the very least do not seem to fully understand it.

Mark Zuckerberg, CEO of Facebook, has long maintained that his company is a neutral platform and should not be considered a media company, but rather a technology company (Rodriguez 2016; Allen 2017) that just happens to carry news content, among other types of content.

He has often taken the position that while his company does distribute information and is responsible for that distribution, its aspiration is not to be a news medium, that its activity should not be considered similar to that of a publisher, and that it therefore should not have to bear the responsibilities of news publishers. That being said, Facebook has not been afraid to define itself as a publisher when it has needed to defend its interests in court (Levin 2018). And while it is true that social platforms may be protected by freedom of expression or of the press just like other media companies (freedom of expression through algorithms has been explored in the context of the American legal tradition by Benjamin (2013) and Wu (2013)), there is a case to be made that they are no different from those traditional media companies in terms of their responsibilities.

However, according to Napoli (2015), when comparing social media platforms' business models with those of traditional media, institutional representations of the public interest are largely absent, beyond any indirect representation through the content produced by traditional media that is in turn distributed through the platforms.

But as Emily Bell (2016) points out, social media platforms such as Facebook must recognize that they make critical decisions about platform access, about journalism, about the capability of citizens to freely express themselves, and about the inclusion or blocking of certain kinds of content, including the news.

Whatever can be said about traditional media companies or the journalistic profession, there has never been a lack of critical voices from inside newsrooms and media organizations. There have always been academics and professionals who keep watch over the journalistic profession and who are aware of the value of a service performed for the benefit of society, one that must not stray too far from its democratic function (Moore et al 2018). As Napoli (2015) points out, traditional media have operated by taking legislation into account in combination with professional codes of conduct based on specific public interest principles.

A sense of duty has, more or less, always existed, or at least an awareness that such responsibilities exist. Certainly, not all media, nor the people who direct them or work for them, show the same degree of responsibility and adherence to professional codes of conduct, but few would ever say that they have no duties, that they do not have to abide by a certain set of rules, or that they do not perform a function that is essential for democracy and must work in the interest of the public.

But with social media platforms, "there are serious media diversity questions associated with the rise of social media platforms as news providers, not in the least who controls the future of traditional news players, who will fund news production, and with what political and social outcomes in mind (but) social media companies do not have the historic association with democratic processes and media accountability of traditional news media" (Dwyer and Martin 2017, 1091-1092).

Whether they want it or not, whether they acknowledge it or not, social media platforms perform news-distribution and news-publishing acts that require a solid understanding of what is expected of any entity that engages in the dissemination of information and news-content to the public.

Social media platforms must acknowledge that if they are able to make editorial decisions just as traditional publishers do, while enjoying the protections and freedoms afforded to the press, they must also accept the duties that come with making those editorial decisions, even via algorithms. They must be aware that this implies recognizing their responsibility to respect their users' freedom of expression, to promote their rights to seek, impart and receive information, and to act accordingly to protect those rights.

By identifying ethical challenges that social media platforms share with traditional news publishers we can also identify the elements that will allow us to draw a blueprint of desirable conduct that social media platforms may follow in their role as publishers.

The journalistic duties of social media platforms as publishers.

The Spanish communication law and policy scholar José María Desantes used to say that the person is the ultimate holder of communication rights and, as such, it is the duty of citizens to keep watch over information professionals and media companies (Desantes 1974). Thus, citizens should always ponder the following questions when trying to determine what they should expect from those professionals and companies in order to safeguard their own freedom of expression, in particular their rights to seek, receive and impart information (Desantes 1974):

  • What interest do these companies serve, by supporting certain stakeholders explicitly, or by being critical of opposing interests?

  • What are their sources of funding?

  • What information needs of consumers are they responding to, and what needs should they respond to?

  • Which consumer needs do they ignore or omit, consciously or unconsciously?

  • What are the global consequences of their actions?

  • Who is responsible, to what extent and with what guarantees?

  • How are messages distributed and used? How can they be used and distributed justly, and how can they be unjustly used and distributed?

  • What is their influence on information and opinion flows?

The first two and the last two questions are particularly pertinent for analyzing the duties of social platforms that perform the editorial functions of news publishers. Social media platforms must think critically about their own actions. They must be aware that the algorithms they use to distribute the news and to implement editorial decisions regarding that news have been created by humans and may be prone to design flaws, to biases and to the agendas of their creators and owners, who imbue them with their values, political opinions and perspectives. The implementation of algorithms for news distribution can have all sorts of consequences: they have the potential to "extend the gap between the political information rich and the political information poor, and between the political left and right" (Thorson & Wells 2016), they can create filter bubbles (Pariser 2011), and they can even increase anxiety and unhappiness in people flooded by news curated by algorithms (Vaidhyanathan 2017).

One way to take responsibility is to address the lack of transparency in how these algorithms actually work (Anderson, Bell, & Shirky 2017; Fung, Graham & Weil 2007). In the context of journalism practice, Deuze (2005) has defined transparency as the ways in which those who practice journalism and those on the outside are able to monitor, check, criticize and even intervene in the news-making process. Currently, the processes social media platforms use to decide how some content is preferred over other content and how it is given value "are not transparent, objective or balanced" (Tufekci 2016). As said before, how algorithms work is not known to the public (Tufekci 2015) and many do not know they influence what we see online (Pasquale 2015).

Transparency serves to explain why certain people choose to take certain actions and sheds light on the motives behind them (Balkin 1999); it can help ensure accountability and build trust (Breton 2006). Social media platforms must make an effort to be more transparent and to better explain to their users what data they collect and how they use it to make decisions about content distribution. They must also be transparent about how algorithms shape what each user experiences online and what information they can access.

In the case of companies like Facebook or Google, it is no stretch to think that what drives the design of algorithms and the functions they perform (what content they prioritize) responds mostly to their commercial interests. They must aspire to be transparent and let users know who is behind the algorithms and the company: who they are, what they think and what their agendas are. It is the role of citizens to be critical of these activities based on this information, and it is their job not to accept the use of data and algorithms as legitimizing the setting of agendas or the imposition of world views by ostensibly scientific means (Petre 2015).

Social media platforms should strive to be as transparent as possible about their means of financing, for example by letting users know that they sell advertising and that ad revenue goals may influence the editorial decisions they make and implement via algorithms. Citizens also need to be made aware that the advertising business model on which social platforms base their enterprise is itself reliant on user surveillance.

Just as transparency in news media is a key ethical principle that journalists should strive to follow, it should become a key principle pursued by social media platforms in their role as editors of news content: “the notion of algorithmic transparency in the news media is an attempt to articulate the mechanisms by which information about algorithms may be made public. Disclosing information about how algorithms drive various computational systems would allow users to determine the values, biases or ideologies in operation in order to understand underlying points of view of a news product” (Diakopoulos & Koliska 2017).

The political and social values that social media platforms promote while performing their role as publishers and distributors of content should be protected by the rights to freedom of expression and of the press. These rights may come under pressure from multiple factors, including influence, or outright pressure, by governments and other powerful interest groups, such as investors. It will not be easy for users to discover who exerts this influence, so it should be the job of social media platforms to educate users about who is influencing the news they get.

An adequate amount of transparency can foster positive attitudes towards these technologies (Kizilcec 2016) and help users understand the nature of those influences, so it is actually in the platforms' best interest. Social media platforms should understand that disclosing which groups, companies and governments influence the way they decide which news to publish and how, whom they distribute it to, and which content their algorithms prioritize is one of their key responsibilities as publishers.

Unfortunately, social networks tend to talk about transparency initiatives but often do not act on them. They sometimes claim they will take action to give users more control and to explain how their data are collected and processed and how algorithms use those data. But those explanations tend to be complex, when not deliberately obscure. Facebook, for example, has been talking about empowering users and simplifying its terms of use and privacy policies for a long time, and yet it has struggled to take meaningful steps in that direction (Lev-Ram 2018).

As Silverman (2015) points out, terms of use and privacy policies are notoriously difficult to navigate and usually do not reveal much about what social media platforms know about us. What information about us they collect and how they process it is still notoriously obscure, despite the many initiatives these platforms have announced over the years (Goldman and Himel 2018). True transparency implies that users are capable of understanding how the system works. This is necessary for people to be able to control these algorithmic tools, so that it is not the tools that end up controlling the people (Pariser 2011).

The consequences of the digital distribution of news being controlled by algorithms we know nothing about, deployed by just a few big companies, bear reflection. Absent any legislation, these powerful private actors could be the ones that "set the basic rules of free expression (and) not nation states" (Balkin 2018) in the foreseeable future.

Such reflection should be made both by the social media platforms and by the media companies that do business with them and enter into agreements to share their content online. Many media businesses are struggling financially. Often, they have no choice but to ally themselves with social media platforms in search of relief, but these enterprises should ponder the cost of such alliances for themselves and for society at large, as they imply ceding a degree of editorial control to social media platforms that is not inconsequential.

At the same time, social media platforms must be aware of the news media landscape they have helped create. News outlets now depend on them financially, and this means that, apart from seeking profit, platforms must be fully aware of their responsibility to, first, ensure those outlets' survival and, second, ensure that their news content reaches the public in the fairest way possible. Social media platforms should also see it as their responsibility to be very careful not to further disrupt the role that journalism plays in a democratic society. It is essential that they understand that they share responsibility for keeping citizens adequately informed so they can make adequate decisions in the social and political dimensions of democratic life. If they plan to continue enjoying the spoils of being the dominant actors in the journalism market, social media platforms must see the promotion of journalism's democratic function as an essential part of their mission. They may be the biggest gatekeepers on the Internet right now, but they could also see themselves as keepers of journalism and democracy, not just as agents of domination and profit.

Desantes (1974) said that a journalist, and by extension a publisher, must try to determine what information needs the media are covering at any given moment and must then attempt to discover what other needs they should cover. This makes it necessary to analyze which messages are being ignored or omitted, intentionally or not, by media professionals and organizations.

There is no doubt that a positive side of social media platforms is that they can reach wide audiences effectively and disseminate the news at little direct monetary cost to users. It is unquestionable that access to news content has increased, at least in terms of available content and the sheer numbers of people who can be reached. Today, many users can access content through mobile phones at very competitive prices for both devices and data plans. The same technology makes it possible for citizens to receive information in quick and effective ways, and there is real potential to foster a diversity of voices involved in the information process.

Unfortunately, it does not look like online platforms can ensure that users receive relevant, truthful and quality information. Sometimes there is reason to believe that they do not really care (Levin 2018). We know that their algorithms foster the creation of filter bubbles (Pariser 2011) and that they enable the spread of disinformation (Benkler et al. 2017) and other content of questionable value to our society (Swearingen 2018).
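To make the filter-bubble mechanism concrete, the following minimal Python sketch simulates an engagement-driven recommender that keeps reinforcing whatever a user already clicks on. The topics, probabilities and weights are assumptions made purely for illustration; they do not describe any real platform's system.

```python
# Illustrative only: a toy simulation of the filter-bubble feedback loop
# described by Pariser (2011). Topics, probabilities and weights are
# assumptions made for this sketch, not any platform's real parameters.
import random

TOPICS = ["politics", "sports", "celebrity", "science", "local"]

def recommend(weights, k=5):
    """Pick k story topics with probability proportional to learned weights."""
    return random.choices(list(weights), weights=list(weights.values()), k=k)

def simulate(rounds=50):
    weights = {t: 1.0 for t in TOPICS}   # start with no personalization
    preferred = "celebrity"              # the user clicks this topic more often
    for _ in range(rounds):
        for topic in recommend(weights):
            clicked = random.random() < (0.8 if topic == preferred else 0.2)
            if clicked:
                weights[topic] += 1.0    # engagement reinforces future exposure
    return weights

if __name__ == "__main__":
    for topic, weight in sorted(simulate().items(), key=lambda kv: -kv[1]):
        print(f"{topic:10s} exposure weight: {weight:5.1f}")
```

Run repeatedly, the preferred topic's weight grows until it crowds out the rest of the feed, which is precisely the narrowing of exposure that the filter-bubble argument describes.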

It is precisely because social media platforms seem to care about generating revenue first and informing the public second (Klonick 2019) that it is doubtful they are implementing algorithms capable of determining the information needs of users and the best way to meet them.

Facebook uses data collected from users to ensure that the right advertisement will reach the right user at the right time. Every time a user visits Facebook, an algorithm chooses among thousands of ads and shows those it deems most relevant to each user's profile.
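The logic can be illustrated with a minimal, hypothetical sketch of relevance-based ad selection. The profile fields, scoring formula and bids below are assumptions made for the example; they are not a description of Facebook's actual ad auction.

```python
# Illustrative only: a minimal sketch of relevance-based ad selection.
# All names, fields and numbers are invented for the example.
from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    target_interests: set
    bid: float  # what the advertiser is willing to pay per impression

def relevance(ad: Ad, user_interests: set) -> float:
    """Overlap between the ad's targeting and the user's recorded interests."""
    if not ad.target_interests:
        return 0.0
    return len(ad.target_interests & user_interests) / len(ad.target_interests)

def select_ads(ads, user_interests, slots=2):
    """Rank ads by expected value (relevance x bid) and fill the available slots."""
    ranked = sorted(ads, key=lambda a: relevance(a, user_interests) * a.bid, reverse=True)
    return ranked[:slots]

if __name__ == "__main__":
    user = {"running", "travel", "news"}
    inventory = [
        Ad("SneakerCo", {"running", "fitness"}, bid=1.20),
        Ad("CruiseLine", {"travel", "luxury"}, bid=0.90),
        Ad("CryptoX", {"finance"}, bid=2.00),
    ]
    for ad in select_ads(inventory, user):
        print(ad.advertiser)
```

The point of the sketch is simply that what a user sees is the product of measured interest and of what advertisers are willing to pay, not of the user's information needs.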

A problem derived from this is that advertisers may choose to show ads to certain users while ignoring others who may not be potential, or desirable, clients. This may lead the media to concentrate their resources on creating news content that responds to the needs of those advertisers rather than of users. What we mean by this is that, just as algorithms seek the ads with the best potential to sell, media companies will only try to create news content that algorithms will “think” will sell better with users. This may in turn lead to the proliferation of narratives that do not reflect the lives or problems of different groups in society and do not respond to their information needs. The result is a press that is not at all pluralistic or representative, of the kind that Bagdikian (2004), McChesney (2002) and Nerone (1995) have warned us about.

On social media platforms, which messages, information and opinions reach users is decided by algorithms that also decide what information is presented most prominently and, as a consequence, which sources are presented as the most important and trustworthy.

Software makes decisions for us, but such software is influenced by the desires of commercial entities and software creators (Silverman 2015), not by the information needs of users. As stated before, unless companies commit to transparency, there is no real way of knowing which interests are influencing what news content is reaching citizens.

Responsibility falls on both media companies and social media platforms, which should accept their role as news content editors and publishers and should, as a consequence, make sure that their algorithms do not do a disservice to journalism. It is important that both are conscious of their responsibilities towards the general public and democracy. Perhaps, as both Pariser (2011) and Silverman (2015) suggest, they should seek closer alliances with other journalistic enterprises that can share with them the ethical criteria they have acquired through experience.

All of this relates to the global consequences of their actions and the influence they have over information flows and over public opinion. Bell (2016) asks what type of information society we aim to create and how we intend to shape it, a question that social media platforms, media companies and society at large should ask and seek answers to.

Based on that answer, citizens should demand that these powerful gatekeepers of the online information ecosystem adhere to solid ethical guidelines and demonstrate their intention to embrace their responsibilities towards the general public.

For Ananny (2016), contemporary information flows are influenced by four dynamics: the work routines of news makers, the rhythm of social media platforms, algorithms, and existing regulation. More and more, which news items are deemed important and deserving of coverage could depend on what algorithms consider important rather than on the judgment of journalists and of those in charge of editorial and publishing duties at media organizations. It would be regrettable if we reached a situation in which journalistic enterprises decided to create content based only on its potential to go viral (Silverman 2015), or to be favored by algorithms so as to obtain the widest distribution on social media according to its potential to generate revenue (Pariser 2011), and not on its informational merits.

Leaving editorial decisions in the hands of automated processes and basing those decisions on supposedly precise data that prioritize revenue potential poses a serious danger to the quality of news content and to quality journalism, especially if human influence and agency become minimal.

Social media platforms and the media will exclude important messages unconsciously if they do not realize that their algorithms and their data are prone to errors, defects and inconsistencies. They will begin to exclude messages consciously if journalists do not behave adequately, seeking only to produce news content that panders to the parameters of algorithms, and if they are willing to pursue their commercial interests to the very end, even when those interests may be aligned, at least at times, with the interests of disinformation actors (Ghosh & Scott 2018).

Social media platforms such as Facebook have built a business around constant surveillance of the user in order to mine data that can give insight into what will capture their attention. This attention is the fuel for a lucrative advertising market. The distribution of quality news content, curated with the goal of informing users and exposing them to all points of view in order to fuel public debate, is not part of the equation. The platforms' business model provides no incentive for pursuing those goals. Whether they realize it or not, they will always end up excluding the messages with the most informational value in favor of content that produces the most revenue and instant gratification. This is because algorithms are deployed to build a house of numbers in which business is always driven by quantity, not quality.
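The contrast between ranking for engagement and ranking for informational value can be sketched in a few lines of hypothetical code. The items, scores and weights are invented for illustration only and do not represent any platform's real ranking system.

```python
# Illustrative only: a toy news-feed ranker that orders items purely by
# predicted engagement, the "house of numbers" logic described above.
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    predicted_clicks: float   # what the platform can measure and monetize
    editorial_quality: float  # what a news editor would weigh, on a 0-1 scale

def rank_by_engagement(items):
    """The revenue-driven ordering: only the click prediction matters."""
    return sorted(items, key=lambda i: i.predicted_clicks, reverse=True)

def rank_with_quality(items, quality_weight=0.5):
    """A blended ordering that also rewards editorial quality."""
    return sorted(
        items,
        key=lambda i: (1 - quality_weight) * i.predicted_clicks
        + quality_weight * i.editorial_quality * 10,
        reverse=True,
    )

feed = [
    Item("Outrage-bait celebrity rumor", predicted_clicks=9.0, editorial_quality=0.1),
    Item("Local budget investigation", predicted_clicks=2.0, editorial_quality=0.9),
    Item("Verified breaking news report", predicted_clicks=5.0, editorial_quality=0.8),
]

print([i.headline for i in rank_by_engagement(feed)])
print([i.headline for i in rank_with_quality(feed)])
```

Under the engagement-only ordering the outrage-bait item leads the feed; as soon as some weight is given to editorial quality, the ordering changes, which is the kind of re-evaluation of priorities argued for here.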

This means that transparency and working to refine their algorithms so that they can better serve the information needs of their constituents may not be enough. Behaving in an ethically responsible way may mean that they need to take a long, hard look at this business model.

Ultimately, if social media platforms are willing to embrace the ethical responsibilities of their role as editors and news distributors, they should demonstrate their commitment to freedom of expression and of the press and respect for the rights to seek, receive and impart information by first understanding that above profit, there must be a duty to satisfy the most important information needs of citizens. Then, they must act on it.

Conclusions

Social media platforms must acknowledge that, whether they want it or not, they play a key role in the way citizens consume news, information and opinion today. They are the gatekeepers of our current times and they wield tremendous power. If they are to continue wielding such power, they should assume the duties and responsibilities that their position imposes upon them.

The freedom of expression and of the press enjoyed by social media platforms should also receive the protection afforded to other media and journalistic enterprises. While this means that they have the right to use their algorithms to achieve whatever legitimate goals they set for themselves, they must understand that, in this particular line of work, they must strive for the public good, and that this entails certain duties given journalism's important role in preserving democracy within society. Social media companies are free to continue relying on computer engineers and programmers to design algorithms, but they should seek the cooperation of people with experience in journalism, so that the ethical values and practices learned through the exercise of that profession can in turn be transmitted to those who create the software that delivers the news to the people.

Social media platforms need to make greater transparency efforts and inform users of the types of information they collect and how they use it, helping people understand the consequences that this data mining can have in their lives and, in particular, the consequences and impact algorithms have on their rights to seek, receive and impart information.

Social media platforms also need to take a long, hard look at their business models and complement their profit-driven goals with others that help ensure that news distribution is based on quality rather than quantity and on accuracy rather than speed. Above all, they should understand that their line of work demands a commitment to the public and a duty to serve its information needs adequately.

Social media platforms must know and embrace the responsibilities of their role as news distributors and publishers, especially given their dominant position in the market and the great power they wield over our means of expression as the all-powerful gatekeepers of a highly concentrated online news market. Their commitment to freedom of expression and of the press and their respect for the rights to seek, receive and impart information must be placed above their business models and should be a key factor in the re-evaluation of those models. If not, nothing will change.

References

Ananny, M. (2016). Networked News Time. Digital Journalism.

Bagdikian, B. H. (2004). The New Media Monopoly (20th ed.). Boston: Beacon Press.

Bakshy, E., Messing, S., & Adamic, L. (2015, May 7). Exposure to ideologically diverse news and opinion on Facebook. Sciencexpress.

Balkin, J. (1999). How Mass Media Simulate Political Transparency. Cultural Values, 3(4).

Balkin, J. (2004). Digital Speech and Democratic Culture: A Theory of Freedom of Expression for the Information Society. NYU Law Review, 79(1).

Balkin, J. (2018). Free Speech is a Triangle. Columbia Law Review, 118, 2011-2055.

Barthel, M., Shearer, E., Gottfried, J., & Mitchell, A. (2015, July 14). The Evolving Role of News on Twitter and Facebook. Retrieved February 8, 2016, from http://www.journalism.org/2015/07/14/the-evolving-role-of-news-on-twitter-and-facebook/

Bell, E. (2016, March 7). Facebook is eating the world. Retrieved March 7, 2016, from Columbia Journalism Review: http://www.cjr.org/analysis/facebook_and_media.php

Benjamin, S. M. (2013). Algorithms and Speech. University of Pennsylvania Law Review, 161, 1495-1533.

Breton, A. (2006). The Economics of Transparency in Politics. Aldershot: Ashgate.

Bro, P., & Wallberg, F. (2014). Digital Gatekeeping: News media versus social media. Digital Journalism, 2(3), 446-454.

Brown, P. (2018, February 2). More than half of Facebook Instant Articles partners may have abandoned it. Retrieved from Columbia Journalism Review: https://www.cjr.org/tow_center/are-facebook-instant-articles-worth-it.php

Bruns, A. (2005). Gatewatching: Collaborative Online News Production. New York: Peter Lang.

Bucher, T. (2016). The algorithmic imaginary: exploring the ordinary affects of Facebook algorithms. Information, Communication & Society.

Byers, D. (2015, May 13). Facebook launches 'Instant Articles'. Retrieved from Politico: https://www.politico.com/blogs/media/2015/05/facebook-launches-instant-articles-207025

Citron, D. K. (2014). Hate Crimes in Cyberspace. Cambridge: Harvard University Press.

Clegg, N. (2019, May 11). Breaking Up Facebook Is Not the Answer. Retrieved from The New York Times: https://www.nytimes.com/2019/05/11/opinion/facebook-nick-clegg-chris-hughes.html

Constine, J. (2016, September 6). How Facebook News Feed Works. Retrieved from TechCrunch: https://techcrunch.com/2016/09/06/ultimate-guide-to-the-news-feed/

Constine, J. (2018, February 3). How Facebook Stole the News Business. Retrieved from TechCrunch: https://techcrunch.com/2018/02/03/facebooks-siren-call/

Cramer, H., Evers, V., Ramlal, S., van Someren, M., Rutledge, L., Stash, N., ... Wielinga, B. (2008). The Effects of Transparency on Trust in and Acceptance of a Content-Based Art Recommender. User Modeling and User-Adapted Interaction, 18(5), 455-496.

Crawford, K. (2013, April 1). The Hidden Biases in Big Data. Retrieved from Harvard Business Review: https://hbr.org/2013/04/the-hidden-biases-in-big-data

Desantes Guanter, J. M. (1974). La Información como Derecho. Madrid: Editora Nacional.

DeVito, M. (2017). From Editors to Algorithms. Digital Journalism, 5(6), 753-773.

Deuze, M. (2005). What is Journalism? Professional Identity and Ideology of Journalists Reconsidered. Journalism, 6(4), 422-464.

Diakopoulos, N., Cass, S., & Romero, J. (2014). Data-Driven Rankings: The Design and Development of the IEEE Top Programming Languages News App. Proceedings, Symposium on Computation and Journalism.

Diakopoulos, N., & Koliska, M. (2017). Algorithmic Transparency in the News Media. Digital Journalism, 5(7), 809-828.

Diel, S. (2017). New Media, Legacy Media and Misperceptions Regarding Sourcing. KOME, An International Journal of Pure Communication Inquiry, 5(1), 104-120.

Dwyer, T., & Martin, F. (2017). Sharing News Online: Social media news analytics and their implications for media pluralism policies. Digital Journalism, 5(8), 1080-1100.

El-arini, K., Paquet, U., Herbrich, R., Van Gael, J., & Aguera y Arcas, B. (2012). Transparent User Models for Personalization. Proceedings, International Conference on Knowledge Discovery and Data Mining, 678-686.

Elkin-Koren, N., & Haber, E. (2016). Governance by Proxy. Brooklyn Law Review, 82.

Erlanger, S. (2017). 'Fake News,' Trump's Obsession, Is Now a Cudgel for Strongmen. Retrieved February 2018, from The New York Times: https://www.nytimes.com/2017/12/12/world/europe/trump-fake-news-dictators.html

eMarketer. (2017). Google and Facebook Tighten Grip on US Digital Ad Market. Retrieved April 2018, from eMarketer.com: https://www.emarketer.com/Article/Google-Facebook-Tighten-Grip-on-US-Digital-Ad-Market/1016494

Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016). First I "like" it, then I hide it: Folk Theories of Social Feeds. ACM.

Espiritusanto, O., & Gonzalo, P. (2011). Periodismo Ciudadano. Evolución Positiva de la Comunicación. Madrid: Ariel.

Faris, R., Roberts, H., Etling, B., Bourassa, N., Zuckerman, E., & Benkler, Y. (2017). Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election. Berkman Klein Center for Internet & Society Research Paper.

Friedman, B., & Nissenbaum, H. (1996). Bias in Computer Systems. ACM Transactions on Information Systems, 14(3), 330-347.

Fung, A., Graham, M., & Weil, D. (2007). Full Disclosure: The Perils and Promise of Transparency. New York: Cambridge University Press.

Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media Technologies: Essays on Communication, Materiality, and Society (pp. 167-194). Cambridge: MIT Press.

Goldman, R., & Himel, A. (2018). Making Ads and Pages More Transparent. Retrieved May 2018, from Facebook Newsroom.

Ghosh, D., & Scott, B. (2018). #DIGITALDECEIT: The Technologies Behind Precision Propaganda on the Internet. Harvard Kennedy School Shorenstein Center on Media, Politics and Public Policy.

Haim, M., & Graefe, A. (2017). Automated News. Digital Journalism, 5(8).

Halleck, T. (2015, January 30). Facebook: One Out of Every Five People On Earth Have An Active Account. Retrieved February 8, 2016, from http://www.ibtimes.com/facebook-one-out-every-five-people-earth-have-active-account-1801240

Hayes, A., Singer, J., & Ceppos, J. (2007). Shifting Roles, Enduring Values: The Credible Journalist in a Digital Age. Journal of Mass Media Ethics, 22(4), 262-279.

Hazard Owen, L. (2015, November 30). "Why not be all the way in?" How publishers are using Facebook Instant Articles. Retrieved from Niemanlab.org: http://www.niemanlab.org/2015/11/why-not-be-all-the-way-in-how-publishers-are-using-facebook-instant-articles/

Hughes, S. (2018). Mark Zuckerberg Says Facebook Made Mistakes on Fake News and Privacy. Retrieved May 2018, from The Wall Street Journal: https://www.wsj.com/articles/mark-zuckerberg-facebook-made-mistakes-on-fake-news-privacy-1523289089

Hughes, C. (2019, May 9). It's Time to Break Up Facebook. Retrieved from The New York Times: https://www.nytimes.com/2019/05/09/opinion/sunday/chris-hughes-facebook-zuckerberg.html

Ingram, M. (2018). The Facebook Armageddon. Retrieved from Columbia Journalism Review: https://www.cjr.org/special_report/facebook-media-buzzfeed.php

Isaac, M. (2019, March 30). Mark Zuckerberg's Call to Regulate Facebook, Explained. Retrieved from The New York Times: https://www.nytimes.com/2019/03/30/technology/mark-zuckerberg-facebook-regulation-explained.html

Jackson, B. F. (2014). Censorship and Freedom of Expression in the Age of Facebook. New Mexico Law Review, 44, 121-167.

Kadri, T., & Klonick, K. (2019). Facebook v. Sullivan: Building Constitutional Law for Online Speech. Southern California Law Review, forthcoming.

Kafka, P. (2015, October 7). Google Unveils 'AMP' -- Its Answer to Facebook's Instant Articles. Retrieved from Vox: https://www.vox.com/2015/10/7/11619288/google-unveils-amp-its-answer-to-facebooks-instant-articles

Kelly, M. L. (2018). Media or Tech Company? Facebook's Profile is Blurry. Retrieved May 2018, from NPR: https://www.npr.org/2018/04/11/601560213/media-or-tech-company-facebooks-profile-is-blurry

Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14-29.

Kizilcec, R. (2016). How Much Information? Effects of Transparency on Trust in an Algorithmic Interface. Proceedings, Conference on Human Factors in Computing Systems (CHI).

Klonick, K. (2018). The New Governors: The People, Rules, and Processes Governing Online Speech. Harvard Law Review, 1598-1670.

Lee, J. (2013). No. 1 Position in Google Gets 33% of Search Traffic. Retrieved April 2018, from Search Engine Watch: https://searchenginewatch.com/sew/study/2276184/no-1-position-in-google-gets-33-of-search-traffic-study

Lee, J. H. (2014). News Values, Media Coverage, and Audience Attention: An Analysis of Direct and Mediated Causal Relationships. Journalism & Mass Communication Quarterly, 86(1), 175-190.

Levin, S. (2018, July 2). Is Facebook a Publisher? In public it says no, but in court it says yes. Retrieved from The Guardian: https://www.theguardian.com/technology/2018/jul/02/facebook-mark-zuckerberg-platform-publisher-lawsuit

Levin, S. (2018, December 13). 'They don't care': Facebook fact-checking in disarray as journalists push to cut ties. Retrieved from The Guardian: https://www.theguardian.com/technology/2018/dec/13/they-dont-care-facebook-fact-checking-in-disarray-as-journalists-push-to-cut-ties

Lev-Ram, M. (2018, April 24). Facebook Tries to Bring More Transparency to Opaque Set of Guidelines. Retrieved from Fortune: http://fortune.com/2018/04/24/facebook-tries-to-bring-more-transparency-to-opaque-set-of-guidelines/

Mac, R. (2018, December 20). Literally Just a Big List of Facebook's 2018 Scandals. Retrieved from BuzzFeed News: https://www.buzzfeednews.com/article/ryanmac/literally-just-a-big-list-of-facebooks-2018-scandals

Magarian, G. (2007). The Jurisprudence of Colliding First Amendment Interests: From the Dead End of Neutrality to the Open Road of Participation-Enhancing Review. Notre Dame Law Review, 83(185).

Mager, A. (2012). Algorithmic Ideology: How Capitalist Society Shapes Search Engines. Information, Communication & Society, 15(5), 769-787.

Malik, M. M., & Pfeffer, J. (2016). A Macroscopic Analysis of News Content in Twitter. Digital Journalism.

Mason, P. (2016). Postcapitalism: A Guide to Our Future. New York: Farrar, Straus and Giroux.

Marcos-Recio, J., Sanchez-Vigil, J., & Olivera Zaldua, M. (2015). Producción Científica sobre Comunicación y Medios en las revistas de Documentación (2000-2014). Revista Española de Documentación Científica, 38(4), 108.

McChesney, R., & Nichols, J. (2002). Our Media, Not Theirs: The Democratic Struggle Against Corporate Media (Rev. ed.). New York: Seven Stories Press.

McChesney, R. (1997). Corporate Media and the Threat to Democracy. New York: Seven Stories Press.

McKelvey, F. (2014). Algorithmic Media Need Democratic Methods: Why Publics Matter. Canadian Journal of Communication, 39, 597-613.

Meyer, P. (2004). The Vanishing Newspaper: Saving Journalism in the Information Age. Columbia, MO: University of Missouri Press.

Meiklejohn, A. (1948). Free Speech and Its Relation to Self-Government. Clark: The Lawbook Exchange.

Moore, R., Murray, M., Farrell, M., & Youm, K. (2018). Media Law and Ethics (5th ed.). New York: Routledge.

Morozov, E. (2011). The Net Delusion: The Dark Side of Internet Freedom. New York: Public Affairs.

Napoli, P. M. (2015). Social media and the public interest: Governance of news platforms in the realm of individual and algorithmic gatekeepers. Telecommunications Policy.

Nathenson, I. S. (2013). Super-Intermediaries, Code, Human Rights. Intercultural Human Rights Law Review, 8, 19-175.

Nerone, J. C. (Ed.). (1995). Last Rights. Urbana and Chicago: University of Illinois Press.

Newman, N. (2013). Reuters Institute Digital News Report 2013. Oxford: Reuters Institute for the Study of Journalism, University of Oxford.

Newman, N., Fletcher, R., & Nielsen, R. K. (2016). Digital News Report 2016. Oxford: Reuters Institute for the Study of Journalism, University of Oxford.

Newton, C. (2019, April 3). How extremism came to thrive on YouTube. Retrieved from The Verge: https://www.theverge.com/interface/2019/4/3/18293293/youtube-extremism-criticism-bloomberg

Palmer, J. (2000). Spinning into Control: News Values and Source Strategies. New York: Leicester University Press.

Pariser, E. (2011). The Filter Bubble: What the Internet is Hiding From You. New York: Penguin Press.

Pasquale, F. (2015). The Black Box Society. Cambridge: Harvard University Press.

Petre, C. (2015). The Traffic Factories: Metrics at Chartbeat, Gawker Media and The New York Times. New York: Tow Center for Digital Journalism.

Pew Research Center. (2016). State of the News Media 2016. Pew Research Center.

Pierson, D., & Lien, T. (2018, January 12). To please users and halt fake news, Facebook makes it harder for real news outlets. Retrieved from Los Angeles Times: https://www.latimes.com/business/la-fi-tn-facebook-shares-20180112-story.html

Richardson, L. (2000). Writing: A method of inquiry. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (2nd ed., pp. 923-948). Thousand Oaks, CA: Sage.

Rosen, J. (2012). The Deciders: The Future of Privacy and Free Speech in the Age of Facebook and Google. Fordham Law Review, 80.

Rosenberg, S. (2019, March 6). Facebook's reputation is sinking fast. Retrieved from Axios: https://www.axios.com/facebook-reputation-drops-axios-harris-poll-0d6c406a-4c2e-463a-af98-1748d3e0ab9a.html

Satariano, A. (2019, May 6). Europe Is Reining In Tech Giants. But Some Say It's Going Too Far. Retrieved from The New York Times: https://www.nytimes.com/2019/05/06/technology/europe-tech-censorship.html

Scheufele, D. (2000). Agenda-Setting, Priming and Framing Revisited: Another Look at Cognitive Effects of Political Communication. Mass Communication & Society, 3(2-3), 297-316.

Schultz, I. (2007). The Journalistic Gut Feeling: Journalistic Doxa, News Habitus and Orthodox News Values. Journalism Practice, 1(2), 190-207.

Seaver, N. (2014). Knowing algorithms. Media in Transition, 1-12.

Shearer, E., & Gottfried, J. (2017). News Use Across Social Media Platforms 2017. Retrieved April 2018, from Pew Research Center Journalism and Media: http://www.journalism.org/2017/09/07/news-use-across-social-media-platforms-2017/

Sheth, S., & Bertrand, N. (2017). Trump Jr.'s Meeting with a Russian Lawyer sheds new light on the extent of election interference. Retrieved February 2018, from Business Insider: http://www.businessinsider.com/evidence-russia-meddled-in-us-election-2017-6

TNS Opinion & Social. (2016). Standard Eurobarometer 86, Autumn 2016: Media Use in the European Union. Brussels: European Commission.

Silverman, J. (2015). Terms of Service: Social Media and the Price of Constant Connection. New York: Harper Perennial.

Singer, J. (2007). Contested Autonomy. Journalism Studies, 8(1), 79-95.

Solove, D. J. (2011). Nothing to Hide: The False Tradeoff between Privacy and Security. New Haven: Yale University Press.

Statista. (2018). Number of monthly active Facebook users worldwide as of 1st quarter 2018 (in millions). Retrieved May 2018, from https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/

Subramanian, S. (2017). Inside the Macedonian Fake-News Complex. Retrieved February 2018, from Wired Magazine: https://www.wired.com/2017/02/veles-macedonia-fake-news/

Sunstein, C. (1992). Free Speech Now. University of Chicago Law Review, 255.

Sunstein, C. (1993). Democracy and the Problem of Free Speech. New York: The Free Press.

Sunstein, C. (1994). A New Deal for Speech. Hastings Communications and Entertainment Law Journal, 17(137).

Swearingen, J. (2018). YouTube's Algorithm Wants You to Watch Conspiracy-Mongering Trash. Retrieved March 2018, from New York Magazine: http://nymag.com/selectall/2018/02/youtubes-recommendation-algorithm-favors-conspiracy-videos.html

Taub, A., & Fisher, M. (2018, April 21). Where Countries Are Tinderboxes and Facebook Is a Match. Retrieved from The New York Times: https://www.nytimes.com/2018/04/21/world/asia/facebook-sri-lanka-riots.html?module=inline

Thompson, N., & Vogelstein, F. (2019, April 16). 15 Months of Fresh Hell Inside Facebook. Retrieved from Wired: https://www.wired.com/story/facebook-mark-zuckerberg-15-months-of-fresh-hell/

Thorson, K., & Wells, C. (2016). Curated Flows: A Framework for Mapping Media Exposure in the Digital Age. Communication Theory, 26(3), 309-328.

Tufekci, Z. (2015). Algorithmic Harms beyond Facebook and Google: Emergent Challenges of Computational Agency. Journal on Telecommunications and High Technology Law, 13, 203.

Tufekci, Z. (2016). The Real Bias Built In at Facebook. Retrieved March 2018, from The New York Times: https://www.nytimes.com/2016/05/19/opinion/the-real-bias-built-in-at-facebook.html

Vaidhyanathan, S. (2018). Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. New York: Oxford University Press.

Yemini, M. (2018). The New Irony of Free Speech. Columbia Science and Technology Law Review, XX, 119-194.

US Census Bureau. (2018). US Population Numbers. Retrieved from Census.gov: https://www.census.gov/topics/population.html

Wu, T. (2013). Machine Speech. University of Pennsylvania Law Review, 161, 1495-1533.

Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30, 75-89.

Zuckerberg, M. (2019, March 30). The Internet Needs New Rules. Let's Start in These Four Areas. Retrieved from The Washington Post: https://www.washingtonpost.com/

Received: January 15, 2019; Accepted: August 10, 2019

This is an open-access article published under a Creative Commons license.