It has been reported that Facebook recently updated its internal guidelines on how moderators should handle sensitive content, including hate speech, nudity, violence, and self-harm. The new rules have stirred controversy on a global scale, as many regard such content as offensive and harmful.
The new rules came under heated debate when videos of killings remained on Facebook for hours. Facebook also drew heavy criticism when it took down the historic “Napalm Girl” photo.
However, Facebook defends itself by saying it wants to remain objective in matters that fall into gray areas by placing the decision in the hands of reviewers. Facebook’s head of global policy management adds that reviewing online material is challenging but essential, which is why the company is hiring an additional 3,000 people for its moderation team of 4,500 workers.
Facebook has also outlined, in summary form, how its moderation team is expected to screen different threats of violence. Considerable work has gone into the document: moderators must learn the names and faces of more than 600 terrorist leaders and decide when a beheading video is newsworthy rather than celebratory.
The audience and the media have to understand that videos like these are often important to show and will not necessarily provoke people to violence. Some of the rules on abusive and foul language have also been revised, and users now have much more latitude in this domain, a change that has not been widely appreciated.
In support of publishing self-harm videos, the company says it is better to leave them live, as they can alert other people and prompt them to offer rescue or help. With its new guidelines, the company wants to provide emotional and psychological support to its users.
Facebook further explains that it does not know a user’s intent or reason for publishing sensitive content, so it cannot decide on their behalf by moderating or banning their posts outright; other reviewers, however, can influence the decision with their opinions. Similarly, offensive or racist remarks such as “the Irish are stupid” or “blondes are stupid” are removed automatically, even though moderators tend to pass over them at first.
Moderators will also take down serious threats that contain details such as timing, weapons, or venue. The guidelines focus mainly on moderating graphic content. Under the new policies, videos of animal abuse will be allowed in order to raise awareness; however, images and videos of child abuse will still be marked as disturbing.
An in-depth analysis of Facebook’s new moderation guidelines shows that the company wants to provide a platform that truly believes in and practices freedom of speech, while also preventing real-world violence by automatically removing content related to child sexual abuse and terrorism.