Social media and its intersections with free speech, freedom of information and privacy. An analysis

  1. Francisco Segado-Boj, Universidad Complutense de Madrid, Madrid, Spain (ROR: 02p0gd045)
  2. Jesús Díaz-Campo, Universidad Internacional de La Rioja, Logroño, Spain (ROR: https://ror.org/029gnnp81)

Journal: Icono14

ISSN: 1697-8293

Publication date: 2020

Issue title: Métodos computacionales en Comunicación (Computational Methods in Communication)

Volume: 18

Issue: 1

Pages: 231-255

Type: Article

DOI: 10.7195/RI14.V18I1.1379

Abstract

There is currently a growing debate about the impact of social media on society. The potential negative effects of these media have drawn the interest and caution of scholars. This research focuses on three intersections between social media and fundamental freedoms: freedom of expression, freedom of information, and privacy. First, social networks and their evolution since their emergence at the beginning of the 21st century are analyzed, highlighting their positive aspects. The study then sets out to identify bad practices related to social media and fundamental freedoms. A literature review that highlights these bad practices is presented. This review addresses issues such as arbitrary censorship, limits on freedom of expression, disinformation, diversity of sources, views and perspectives, user content and privacy settings, and data profiling. Finally, some solutions are proposed for each of these issues.

Bibliographic references

  • Angwin, J. (2017). “Facebook’s Secret Censorship Rules Protect White Men from Hate Speech But Not Black Children.” ProPublica. Retrieved from: https://www.propublica.org/article/facebook-hate-speech-censorship-internal-documents-algorithms
  • Bakshy, E., Messing, S., & Adamic, L. A. (2015). “Exposure to ideologically diverse news and opinion on Facebook”. Science, 348 (6239), 1130-1132. Retrieved from: http://science.sciencemag.org/content/348/6239/1130
  • Bozdag, E. (2013). “Bias in algorithmic filtering and personalization”. Ethics and Information Technology, 15 (3), 209-227. Retrieved from: https://link.springer.com/article/10.1007/s10676-013-9321-6
  • Bozdag, E., & van den Hoven, J. (2015). “Breaking the filter bubble: democracy and design”. Ethics and Information Technology, 17 (4), 249-265. Retrieved from: https://link.springer.com/article/10.1007/s10676-015-9380-y
  • Brasseur, A. (2014). Internet and politics: the impact of new information and communication technology on democracy Report | Doc. 13386. Parliamentary Assembly – Council of Europe. Committee on Culture, Science, Education and Media. Retrieved from: http://assembly.coe.int/nw/xml/XRef/Xref-XML2HTML-en.asp?fileid=20329&lang=en
  • Cheever, N. A., & Rokkum, J. (2015). “Internet Credibility and Digital Media Literacy”. The Wiley Handbook of Psychology, Technology, and Society, 56-73. https://doi.org/10.1002/9781118771952.ch3
  • Conroy, N. J., Rubin, V. L., & Chen, Y. (2016). “Automatic deception detection: Methods for finding fake news”. Proceedings of the Association for Information Science and Technology, 52(1), 1-4. Retrieved from: https://dl.acm.org/citation.cfm?id=2857152
  • Ellison, N. B., Steinfield, C., & Lampe, C. (2007). “The benefits of Facebook “friends:” Social capital and college students’ use of online social network sites”. Journal of Computer-Mediated Communication, 12 (4), 1143-1168. https://doi.org/10.1111/j.1083-6101.2007.00367.x
  • Fox, A. K., & Royne, M. B. (2018). “Private information in a social world: assessing consumers’ fear and understanding of social media privacy”. Journal of Marketing Theory and Practice, 26 (1-2), 72-89. https://doi.org/10.1080/10696679.2017.1389242
  • Gil de Zúñiga, H., Jung, N., & Valenzuela, S. (2012). “Social media use for news and individuals' social capital, civic engagement and political participation”. Journal of Computer-Mediated Communication, 17 (3), 319-336. https://doi.org/10.1111/j.1083-6101.2012.01574.x
  • Huckle, S., & White, M. (2017). “Fake news: a technological approach to proving the origins of content, using blockchains”. Big data, 5 (4), 356-371. https://doi.org/10.1089/big.2017.0071
  • Möller, J., Trilling, D., Helberger, N., & van Es, B. (2018). “Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity”. Information, Communication & Society, 21 (7), 959-977. Retrieved from: https://www.tandfonline.com/doi/abs/10.1080/1369118X.2018.1444076
  • Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.
  • Pollach, I. (2007). “What’s Wrong with Online Privacy Policies?”. Communications of the ACM, 50 (9), 103-108. Retrieved from: https://www.cs.stevens.edu/~nicolosi/classes/17fa-cs578/ref3-2.pdf
  • Postanjyan, Z. (2012a), “The protection of freedom of expression and information on the Internet and online media”. Doc. 12874 Parliamentary Assembly – Council of Europe. Committee on Culture, Science, Education and Media. Retrieved from: http://assembly.coe.int/nw/xml/XRef/Xref-DocDetails-en.asp?FileID=13080&lang=en
  • Postanjyan, Z. (2012b). “Addendum to The protection of freedom of expression and information on the Internet and online media report”. Doc. 12874 Add. Parliamentary Assembly – Council of Europe. Committee on Culture, Science, Education and Media. Retrieved from: http://assembly.coe.int/nw/xml/XRef/Xref-DocDetails-en.asp?FileID=18082&lang=en
  • Rihter, A. (2011). “The protection of privacy and personal data on the Internet and online media report”. Doc 12695. Parliamentary Assembly – Council of Europe. Former Committee on Culture, Science and Education. Retrieved from: http://assembly.coe.int/nw/xml/XRef/Xref-DocDetails-en.asp?FileID=13151&lang=en
  • Swart, J., Peters, C., & Broersma, M. (2018). “Sharing and Discussing News in Private Social Media Groups”. Digital Journalism. https://doi.org/10.1080/21670811.2018.1465351
  • Tandoc Jr, E. C., Lim, Z. W., & Ling, R. (2018). “Defining “Fake News”: A typology of scholarly definitions”. Digital Journalism, 6 (2), 137-153. https://doi.org/10.1080/21670811.2017.1360143
  • Tsesis, A. (2017a). “Social Media Accountability for Terrorist Propaganda”. Fordham L. Rev., 86, 605. Retrieved from: https://heinonline.org/HOL/LandingPage?handle=hein.journals/flr86&div=28&id=&page=
  • Tsesis, A. (2017b). “Terrorist speech on social media”. Vand. L. Rev., 70, 651. Retrieved from: https://heinonline.org/HOL/LandingPage?handle=hein.journals/vanlr70&div=17&id=&page=
  • Wang, Y., & Kosinski, M. (2018). “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images”. Journal of Personality and Social Psychology, 114 (2), 246. Retrieved from: https://osf.io/hv28a/download/?format=pdf