
Freedom of expression, the role of intermediaries, and misogynist hate speech: Security in exchange for rights?

Erika Smith, 30 September 2013
Erika Smith lives in Mexico and is responsible for implementing the national project "End violence: women's rights and safety online", as well as for supporting the annual local and global Take Back the Tech! campaign from 25 November to 10 December.

The Latin American Regional Forum on Internet Governance, held in August, brought together internet experts from government, business, and civil society. As a feminist from Mexico who documents cases of technology-related violence against women, I found the debates on freedom of expression of particular interest. I took the opportunity to interview representatives of the freedom of expression organisation Article 19 – who offered differing perspectives from Brazil and Mexico – as well as other experts on the subject.

The UN Special Rapporteur on freedom of opinion and expression, Frank La Rue, made human rights and the internet a central theme of the forum. He said that his rapporteur reports marked "the use of the internet as one of the key elements for the exercise of freedom of expression… The internet is a controversial issue, it has risks, it is an element that requires some function of responsibility, by states, by families, but it is always primarily a communication tool… If there is a threat to national security or terrorism – in the face of that threat some governments feel they have to balance certain rights – such as privacy and security protection… and many citizens agree with losing some rights to strengthen others." However, he noted that conceding human rights allows for a certain level of authoritarianism, and that any "security guaranteed through the removal of rights destabilizes the system itself."

These and many other similarly valuable observations by La Rue offered a general framework for the debate, but the question put to the rapporteur by a young colleague from Guatemala felt like a challenge to all freedom of expression activists. She asked what to do in the face of internet attacks on women human rights defenders, as well as attacks on women and girls more generally. This is a central concern of the Women's Rights Programme of APC, since the violence that women experience in the streets, in homes, and in society is also faced on the internet – affecting women's freedom of expression as well as their right to live free from violence. When states and legislators take up the matter in the name of "the protection of women and children", they invariably do so with proposals that violate basic guarantees – security in exchange for rights.

La Rue said that prevention is key: "The main policy should be prevention, rather than punishment," noting that states may apply sanctions in erroneous and subjective ways. Prevention is undoubtedly part of the answer, and so I interviewed experts on freedom of expression and digital rights about their responses to the violence that women under attack live with every day.

Antonio Martínez Velázquez, communications and digital content officer for Article 19 in Mexico, was one of the experts consulted; excerpts from the interview follow:

“Our starting point is how we understand the internet”, insists Martínez, noting that Article 19 prefers to turn to technical concepts to explain arguments on regulation. “The network is nothing but a protocol for transporting data from point A to B in the most efficient manner. It is only an agreement, or protocol, that allows data to be transported more efficiently. We cannot confuse it with radio or television or other types of media.”

Whereas content in other media comes from identifiable actors, on the internet it comes through a disaggregated group of actors who are not necessarily connected to the content. There are intermediaries – service providers, services running on the network, governments, civil society – but no one regulates the content…

The content is something intelligible, but what goes through the tubes is data – binary data packets that are not intelligible. And to preserve this, we need to understand that what we see is also liquid information, it is data that flows. There is no privileging of one content or another in terms of the packets that travel. There is no active discrimination to say that this data is important and that other is not. That is net neutrality…

“… The non-responsibility of intermediaries – which was legislated in 1996 and established that none of the network intermediaries had responsibility for what is published on the net – made the internet explode.”

Martínez emphasizes again, as have so many other internet rights advocates, the “three virtues of the network: no one owns it, everyone can use it, and everyone can improve it.”

“We got on that network which, if not horizontal, is at least nodal. Everyone can have a blog, but not all are heard to the same degree due to a variety of factors.”

ES: But what happens when a person's exercise of freedom of expression is impaired, or at least diminished, by massive waves of hate speech on the internet – in many cases misogynist hatred that tries to silence women and, through the extreme violence expressed, even threatens their physical safety?

Martínez notes that “hate speech traditionally does not have a universal connotation” and therefore has not contemplated misogynist hatred. He considered the tactics of the feminist campaign #fbrape to be misguided, because they ignored the important concept of self-regulation. He warned: “You cannot require an intermediary to regulate content because it can have unintended consequences, as was the case with Anita Sarkeesian, a feminist who studies video games. She posted an analysis of the subject on YouTube, and those who disagreed with it categorized it as hate speech and it was removed.”

“The characteristic of self-regulation is that the community itself censures what is wrong, according to an external morality and ethics. In forums like Forochan or Reddit people express themselves without filters and in a true situation of equality… What has been identified, on Reddit, is that people have increasingly stopped feeling offended or stigmatized. Their intention is to share and not to offend. Self-regulation works.”

He noted the tendency towards moralizing legislation, as in the case of the proposed law against cyberbullying in the state of Nuevo León (Mexico), which has now been vetoed. "It had vague wording. It spoke of 'any offence' made by electronic means, without defining the characteristics of 'offence'. It seems that everything offends us. To have less expression for the sake of guaranteeing a subjective thing leads to a lot of censorship and not to more expression. Problems related to freedom of speech are resolved with more speech, not less."

Martínez notes that violence and threats can fit into another category. "When certain elements of credibility and authenticity are met, a threat via Twitter should be treated like a threat that arrived on paper under your front door, and this should activate law enforcement." This is why Article 19 provides a guide to risk protection protocols: "You answer a series of questions to assess your own risk – it lowers your paranoia. Rarely do people question the context of the message, which helps you weigh its magnitude and do a self-assessment of risk."

“We always forget to put governments to the test. Put it to the test legally – if they don’t respond go to the state commission, go to the courts for denial of rights, but put it to the test. We have to demand that the government do its job”.

In a separate interview, Laura Tresca, freedom of expression officer for Article 19 in Brazil, shared her perspective on the situation in her country, emphasizing the issue of due process:

“Article 19 thinks that the removal of content must be via court order. Democratic institutions must be reinforced.”

“There was a case of a woman blogger in a very violent state in Brazil where an anonymous comment on her blog was questioned by the authorities. It was a comment on the budget and salary of an officer, which the blogger approved for posting. She was held responsible. There was no judgement in this case because an agreement was signed in which she agreed not to talk about it again.”

“Sometimes you do not need a judgement for there to be censorship – the judicial process caused self-censorship so as to avoid paying heavy fines.”

This was why Article 19 Brazil did a preliminary investigation on intermediary liability in Brazil, which was published in August.

“An online search was done of legal decisions – not all are published, but we looked at all that we could. We did an analysis of whether they received a legal notice, if they removed content before due process, and if there was communication between the parties to reach an agreement – we looked at this for all decisions. It was an exploratory report using a very broad concept of internet provider.”

ES: What about cases of violence against women that is related to technology?

Tresca explained that in Brazil there is an independent nonprofit called SaferNet. "The same is true with symbolic violence on the internet: it should be taken to the legal system, because they have the power to take action. We recognize that there are emergency cases. In Brazil there is a mechanism that I find interesting, which is SaferNet. Its initial focus was on children, but they now receive complaints of all kinds, and verify cases. They do two things simultaneously – they file a legal complaint, and at the same time inform providers, if they have agreements with terms of service, that the content potentially violates those terms. It is very effective, though clearly not perfect."

“The terms have to follow national laws and be in line with human rights standards. There needs to be advising on the terms of use, so that they do not lead to censorship of content that is dissident, or not as common or acceptable. Sometimes it is a matter of national legislation being in conflict with the human rights framework – but that is not a problem for the provider.”

“We did a study on the dynamics of defamation and the classist ways that concepts such as libel, slander, and contempt are used. We found that the poor are criminalized for contempt, for not following orders from an authority, but if you are at a higher class level they file a civil case for defamation. We know that the same thing happens with gender based violence: if it’s a poor girl, she asked for it, whereas a higher class girl is seen as a victim. Class discrimination is insidious.”

“There should always be due process. There is a tendency to criminalize behaviour that does not follow the principles of human rights. There must be sanctions, but they must be civil.”

“For attacks on physical and psychological integrity, protection is needed. We must take a holistic view of protection. Violence against activists is meant to intimidate and change their behaviour and the things they defend. They have to be able to continue to do their work without intimidation.”

“The right to freedom of expression is not an absolute right. It has its limitations, which are included in international treaties.”

ES: And what about, for example, the attacks on women, girls, and women human rights defenders that were mentioned by the Guatemalan participant?

“We see in many cases that the girl is already exposed, she has already been filmed, and it’s already up on the internet. What can be done? How can there be quick intervention? There must be a mechanism to activate the system – and for the system to provide a simple, encompassing, and effective response.”


Responses to this post

Katcha

I don't think the #fbrape campaigners ignored the concept of self-regulation. They were pointing to cases where brutal images condoning rape or domestic violence ( http://www.womenactionmedia.org/facebookaction/examples/ ) were reported by users as abusive or hate speech content but disregarded by Facebook's moderators, who appended a humour disclaimer – while those same moderators routinely remove images of breastfeeding or of mastectomy scars. In other words, the #fbrape campaign was pointing to prejudicial interpretations of Facebook's own terms and guidelines by its moderators, interpretations that contribute to a culture of violence against women and to gender stereotypes.

I am also not convinced that we can view internet services and social networking platforms as mere intermediaries "without any connection to the content". Companies like Google and Facebook routinely use the content and sell user data to advertisers, and their business is based on content – so for me it is not a straightforward question whether they should also be responsible for it.

Regarding "on Reddit… people have increasingly stopped feeling offended or stigmatized. Their intention is to share and not to offend. Self-regulation works": I am in favour of self-regulation, but when assessing its effectiveness we need to be cautious and also monitor its impact on the rights of women, minorities, and marginalized people (whose voices are heard and whose voices are silenced) – e.g. the recent Reddit case: http://www.genderit.org/feminist-talk/rediff-and-rape-threats-what-rediff-could-have-done-support-kavita-krishnan
