Online, censorship has been replaced by a system of content moderation controlled by companies, which rely on both automation and human moderators employed to sift through content. The choice is not between the alleged neutrality of an impersonal machine and the errors and finiteness of human moderation: the two work in tandem.
What does content moderation on social media actually entail? How much artificial intelligence and human labour is involved, and who is responsible for decisions about which content to remove and which complaints to ignore? An insider from social media companies shares five concise insights into how social media giants actually work.
This paper explores what online violence against women is, what can be done to stem and ultimately eliminate it, and whose responsibility it is to do so. It builds upon the issues identified in two research projects: the Due Diligence Project's (DDP) research on state accountability for eliminating violence against women, and the research on corporate and state remedies…