Project Statement – Visual Essay

For my visual essay, I want to deepen my knowledge of the inner workings of social media moderation: the restrictions and rules that moderators are hired to enforce, and how they decide whether a flagged piece of content should be removed or blocked from public view. I also want to research the social inequalities that come from moderation.

I started my project by finding out what a social media moderator is and what they do in their job, as this was imperative to my visual essay. New Media Services (2019) defines the role: “A social media moderator is the one handling and managing the activities done in the social media community. Their task is to see and to uphold rules, restrictions, and legalities within the area of management. They are also responsible for monitoring the comments and feedback from social media followers.”

Social media companies like Facebook, Instagram and Twitter have always governed themselves, with little to no government regulation of how their businesses and practices are run. This can be seen in how they moderate the videos, photos and self-expression that their users post to their platforms. Daseler (2019) states: “The First Amendment to the Constitution protects Americans from government censorship. But private companies like Twitter, Facebook, and YouTube have the right to post or prohibit what they please on their platforms, just as scores of other companies have before them.” This raises the question of whether these private companies should be allowed to moderate their users’ content.

There are two sides to social media: one is the sharing of positivity and kindness throughout the world, but it also comes with negatives, as online trolls and terrorist groups are allowed to spread hate and falsified information amongst the general population. The internet is forever expanding and growing exponentially, with more users gaining access to social media and uploading and sharing content of their own. Daseler (2019) observed that Facebook has “200 million monthly users in the United States alone, or about three fifths of the U.S. population. In a single minute, the site receives 500,000 new comments, 293,000 new statuses, and 450,000 new photos. In the same amount of time, 400 hours of video are uploaded to YouTube, and 300,000 tweets are posted to Twitter.” At this scale, content is mistakenly or purposefully judged as negative: the pages and posts of LGBTQI and Black communities are deleted for sharing their concerns and ideas, while white politicians who are blatantly racist or spread misleading information on the same platforms face little to no consequences.

I also want to further my knowledge of how moderators are trained, the guidelines they are obliged to follow, and how this impacts their specific social media pages. In my initial research I discovered a journal article in which moderators of a web forum were given two hours of training to take a proactive approach to reducing risk to adolescents with depression or anxiety. Windler, Clair, Long, Boyle and Radovic (n.d.) report that in their study, “after examining the role of the moderator in a Web-based intervention for adolescents with depression or anxiety, we found that moderating such an intervention was feasible and resulted in no safety concerns. Additionally, moderators exhibited various approaches that may impact user engagement.”

I also researched recent case studies that display social inequalities in the moderation of different social media platforms. The first case study, a BBC News post, concerns transgender people on TikTok being censored, and even having their accounts deleted, because of the content they were producing and sharing with their audience. Criddle (2020) states: “Videos, including some of people discussing their lives, were taken down and others had the sound removed, according to some posters.” This raises questions about the negative and unjust moderation of this community of users. My second case study, from The Guardian, concerns Instagram’s murky ‘shadow bans’ against marginalised communities on the platform. As Joseph (2019) writes, “The vagueness of Instagram’s shadow-banning policy leaves users confused as to what is and isn’t appropriate”, leaving the community unsure about what is classed as inappropriate and taken down. My final case study, an ABC News post, covers the #freethenip movement for gender equality in life and on social media. One of the protesters from the day, quoted by Turnbull (2018), said: “I think the best way to get our equality and power back is with our brains, not our bodies” (Julia Chapple). These case studies furthered my knowledge of the social inequalities that social media companies enforce when moderating their users’ content.

Using images and minimal words to educate and provoke questions from the reader was challenging yet rewarding, as the process required meticulous thought and planning to achieve the desired interpretation. I also learnt that finding good Creative Commons images is quite difficult. I chose the Prezi software to create my visual essay, as I have used it before and was confident in my ability to depict my message with it, and it made it easy to create unique transitions and visuals. I chose simple fonts and colours that contrasted nicely with the images, suiting them without drawing too much attention. For my selected images, I chose high-quality pictures and graphics for the reader’s pleasure. The images and graphics were based around the social inequalities, as well as the positives, of moderation on social media platforms, giving the reader greater insight.

References:

Criddle, C 2020, ‘Transgender users accuse TikTok of censorship’, weblog post, 12 February, viewed 15 June 2020, <https://www.bbc.com/news/technology-51474114>.

Daseler, G 2019, ‘Web of lies: the challenges of free speech in the age of social media’, American Conservative, vol. 18, no. 4, p. 43, viewed 15 June 2020, <https://search-ebscohost-com.ezproxy.uow.edu.au/login.aspx?direct=true&db=f6h&AN=136967534&site=eds-live>.

Joseph, C 2019, ‘Instagram’s murky ‘shadow bans’ just serve to censor marginalised communities’, weblog post, 9 November, viewed 15 June 2020, <https://www.theguardian.com/commentisfree/2019/nov/08/instagram-shadow-bans-marginalised-communities-queer-plus-sized-bodies-sexually-suggestive>.

New Media Services 2019, ‘Social media moderation guide: the importance of moderating contents on social media’, weblog post, 1 April, viewed 15 June 2020, <https://newmediaservices.com.au/2019/04/01/social-media-moderation-guide/>.

Turnbull, S 2018, ‘Topless protesters join Free the Nipple movement for gender equality’, weblog post, 15 August, viewed 15 June 2020, <https://www.abc.net.au/news/2018-08-12/topless-protesters-join-free-the-nipple-movement/10109872>.

Windler, C, Clair, M, Long, C, Boyle, L & Radovic, A n.d., ‘Role of moderators on engagement of adolescents with depression or anxiety in a social media intervention: content analysis of web-based interactions’, JMIR Mental Health, vol. 6, no. 9, viewed 15 June 2020, <https://search-ebscohost-com.ezproxy.uow.edu.au/login.aspx?direct=true&db=edswsc&AN=000488607200001&site=eds-live>.

Visual Essay Idea

Social Media Content Moderators

Question: Are Social Media Content Moderators Necessary?

For my visual essay, I want to explore the inner workings, restrictions and rules that moderators on social media are hired to enforce, and how they decide whether a flagged piece of content should be removed or blocked from public view, and to evaluate the effectiveness of moderators and whether they are necessary.

Social media companies like Facebook, Instagram and Twitter have always governed themselves, with little to no government regulation of how their businesses and practices are run. This can be seen in how they moderate the videos, photos and self-expression that their users post to their platforms. Daseler (2019) states: “The First Amendment to the Constitution protects Americans from government censorship. But private companies like Twitter, Facebook, and YouTube have the right to post or prohibit what they please on their platforms, just as scores of other companies have before them.” This raises the question of whether these private companies should be allowed to moderate their users’ content.

There are two sides to social media: one is the sharing of positivity and kindness throughout the world, but it also comes with negatives, as online trolls and extremist groups are allowed to spread hate and falsified information to the general population. The internet is forever expanding and growing exponentially, with more users gaining access to social media and uploading and sharing content of their own. Daseler (2019) observed that Facebook has “200 million monthly users in the United States alone, or about three fifths of the U.S. population. In a single minute, the site receives 500,000 new comments, 293,000 new statuses, and 450,000 new photos. In the same amount of time, 400 hours of video are uploaded to YouTube, and 300,000 tweets are posted to Twitter.” At this scale, content is mistakenly or purposefully judged as negative: the pages and posts of LGBTQI and Black communities are deleted and their concerns and ideas silenced, while politicians who are blatantly racist or spread misinformation on the same platforms are allowed to do so with little to no consequences.

I also want to further my knowledge of how moderators are trained, the guidelines they are obliged to follow, and how this impacts their specific social media pages. In my initial research I discovered a journal article in which moderators of a web forum were given two hours of training to take a proactive approach to reducing risk to adolescents with depression or anxiety. Windler, Clair, Long, Boyle and Radovic (n.d.) report that in their study, “after examining the role of the moderator in a Web-based intervention for adolescents with depression or anxiety, we found that moderating such an intervention was feasible and resulted in no safety concerns. Additionally, moderators exhibited various approaches that may impact user engagement.”

The first source I decided to include in this visual essay pitch is a journal article from EBSCOhost, Daseler (2019), as my initial intention for this project was to gain a further understanding of how social media works and to find the positives and negatives of using humans to moderate content.

My second key source in my initial research for this project was another journal article, Windler, Clair, Long, Boyle and Radovic (n.d.), which gave me a greater understanding of how moderators are trained to respond to specific content and the comments contained within it, with the goal of reducing teen depression and anxiety on social media.

References

Daseler, G 2019, ‘Web of lies: the challenges of free speech in the age of social media’, American Conservative, vol. 18, no. 4, p. 43, viewed 18 May 2020, <https://search-ebscohost-com.ezproxy.uow.edu.au/login.aspx?direct=true&db=f6h&AN=136967534&site=eds-live>.

Windler, C, Clair, M, Long, C, Boyle, L & Radovic, A n.d., ‘Role of moderators on engagement of adolescents with depression or anxiety in a social media intervention: content analysis of web-based interactions’, JMIR Mental Health, vol. 6, no. 9, viewed 19 May 2020, <https://search-ebscohost-com.ezproxy.uow.edu.au/login.aspx?direct=true&db=edswsc&AN=000488607200001&site=eds-live>.