Balancing rights and interests: Best practices to counter online abuse and violence

7 March 2016

Online abuse and gender-based violence against women are understood as being part of gender-based violence. In addition to existing structural inequality and discrimination between genders, disparities in access to, participation in and decision making over the internet and technology development are all factors that play a part in the manifestation of this violence online and through the use of other information and communications technologies (ICTs). As such, online abuse and gender-based violence disproportionately affect women in their online interactions, encompassing acts of gender-based violence such as domestic violence, sexual harassment, sexual violence, and violence against women in times of conflict, that are committed, abetted or aggravated, in part or fully, by the use of ICTs. (Document of the IGF BPF on Online Abuse and Gender-Based Violence Against Women, 2015)

The IGF Best Practice Forum (BPF) on Online Abuse and Gender-Based Violence Against Women panel took place in João Pessoa, Brazil, in November 2015. Voices from many sectors convened to give and receive feedback on this intensive intersessional work process, and in particular on the report developed by the multistakeholder group, going through some of the key highlights and recommendations from the BPF while opening it up at different junctures for inputs and responses. [1]

The BPF on Online Abuse and Gender-Based Violence Against Women aims to be one step towards getting multiregional stakeholders to take proper cognisance of the issue. The BPF's work is community-driven, bottom-up and multistakeholder: anyone could join the online spaces and meetings that took place throughout the nine months leading up to the IGF in Brazil.

Throughout the nine months of intersessional work, the BPF looked at different stakeholders and their roles when it comes to prevention, redress and remedy for online violence against women. The BPF asked all stakeholders to help address the following question: What are effective practices and policies that address, mitigate and/or prevent online abuse and gender-based violence against women?

Read more: What are BPFs?

Getting to the roots

“...information-technology-related violations, abuses and violence against women, including women human rights defenders, such as online harassment, cyberstalking, violation of privacy, censorship and hacking of email accounts, mobile phones and other electronic devices, with a view to discrediting them and/or inciting other violations and abuses against them, are a growing concern and a manifestation of systemic gender-based discrimination, requiring effective responses compliant with human rights.” (UN General Assembly, 2014)

The phrase “this is not about women, everyone suffers abuse online” is a common one. “People in public face abuse, but then there's something specific that women in public positions face which is often targeted towards their sexuality and their gender, and it's of a different flavour and of a different volume,” expressed Jac sm Kee.

Caroline Criado-Perez took part in a successful campaign to retain a female face on one of the Bank of England’s pound notes. As a result she suffered severe online violence, including rape threats and other abuse. Her supporters – including a prominent politician and female journalists – faced similar abuse online.

As mentioned in the BPF document, “while violations of users’ rights online may affect all users in differing ways, incidents of online abuse and gender-based violence rest on existing disparity and discrimination.” Gender-based violence is a “manifestation of historically unequal power relations between women and men, which have led to domination over, and discrimination against, women by men and to the prevention of the full advancement of women.” (Council of Europe, 2014: Convention on Preventing and Combating Violence Against Women and Domestic Violence)

As part of a practice called “Top 10” in at least two peripheral neighbourhoods of São Paulo, profile pictures of girls aged between 12 and 15 are mixed with phrases describing the girls’ alleged sexual behaviour, and the girls are then ranked according to “how slutty they are”. The practice has reportedly led to school dropouts and suicides. InternetLab, an independent research centre that has done extensive research on the practice, believes it to be quite widespread in Brazil. (Mariana Valente, InternetLab, Brazil)

According to Kee, and later echoed by the other panellists, when dealing with online violence against women we should not focus on the surface or the technology, “but really open the root causes – at the end of the day, it's really an issue of discrimination, it's an issue of power and inequality.”

Google representative Hibah Hussain said that they “are really focusing on the fact that a lot of these issues aren't just online issues. They are issues with offline roots. You can't just play whack‑a‑mole with the content online and the problem will go away,” she affirmed.

An audience member intervened to stress: “We need data and we need community. We also need to connect the dots, because when we talk about the woman's body, this says a lot about sexuality and blasphemy, and sexuality is the reason why women's bodies are attacked. We need to connect those things, and we should never forget that there are different non-normative bodies and that the attacks are directed at all of them.”

The BPF document made a noteworthy effort to look specifically at girls and young women; women in rural contexts; religion, culture and morality; women in the public sphere and in technology fields; women of diverse sexualities and gender identities; and women with disabilities, because these factors have a real impact on access to technology as well.

Access to justice and legal remedies

“Where countries consider developing legislative responses to the issue, it is important that relief and redress be prioritised over criminalisation.” (BPF document)

One of the issues that often emerges when discussing violence against women online is the need to balance competing rights and interests. The apparent tension between freedom of expression and the need to address online violence against women is one of the core points of contention.

As stated in the panel by David Kaye, UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, there are legal and state sanctions which are available. “There is quite a great amount of existing law that is simply often not applied in this space because of the idea that this is only expression when, in fact, harassment is an act, not just an expression,” he added.

“More inquiry is needed about the use of technology, such as computers and cell phones, in developing and expanding forms of violence. Evolving and emerging forms of violence need to be named so that they can be recognized and better addressed.” (United Nations Secretary-General's In-depth study on all forms of violence against women, 2006)

Kaye raised two important issues: first, the need to have definitions of the problem that do not over-regulate, because “very often the tools that we would want to use in order to counter harassment will be the same tools that are used to censor.” Second, the issue of who decides, referring to a range of different kinds of actors. “We are talking about social change, the kind of campaigning that APC and others do which is critical in order to shed a light on harassment and to counter it, in an effective kind of campaign‑oriented way,” Kaye commented.

The Working Group on Broadband and Gender of the UN Broadband Commission for Sustainable Development recently released a report addressing online violence against women and girls, intended as a worldwide wake‑up call. The highlights, according to Gary Fowlie, head of the ITU Liaison Office to the United Nations, are pretty straightforward: across the 86 countries surveyed, only 26% of law enforcement agencies are taking appropriate action; women aged 18 to 24 are especially likely to experience sexual harassment; and one in five female internet users lives in a country where online harassment is unlikely to be punished. “Obviously women are reluctant to report their victimisation as a result of fear of social repercussions,” he added.

“We need to look at some basic recommendations, things like sensitisation, preventing cyber violence, and maintaining responsible internet infrastructure through technical solutions, more informed customer care practices, and sanctions. Probably the most controversial part is developing and upholding laws, regulations and governance mechanisms – which exist and are in place, but maybe are not being upheld in the manner that they should be,” said the ITU representative.

When a Muslim woman left her forced marriage, her ex-husband (who is not based in the UK) started setting up fake profiles for her on social media. He not only alleged that she is a prostitute, but also offered her services and shared her personal contact details. She was disowned by her family and received requests for her “services”. While police are involved, little support has been offered because the woman’s husband (the perpetrator) is outside the UK.

Some legislative initiatives like the Istanbul Convention take a very strong approach to addressing the issue of violence against women. The Istanbul Convention, according to Patrick Penninckx, head of the Information Society Department of the Council of Europe, “is not just about criminalization of acts, it's also about what we can do.”

In this sense, Penninckx warns that we need to be careful not to legislate in many different directions, because “then we are opening ourselves to legislation that goes in different directions based on morals, based on history and religion and other aspects,” he affirmed. “First of all we define very clearly what we want to be doing and secondly who has the responsibility to do what – whether that is courts, judges, internet intermediaries – and then to see what positive action to undertake, whether that's stimulating service providers to think about their own role, stimulate ethical thinking, stimulate internal controls, stimulate training, skills training, awareness raising. And it's a complexity of things and that's also what the Istanbul Convention tries to provide,” he added.

One example of this multi-directional legislation can be found in Australia. Narelle Clark, deputy CEO of the Australian Communications Consumer Action Network and director of the Internet Society Australian Chapter, has been working for some time on a project with the Domestic Violence Resource Centre Victoria. This legal centre developed a guide to all of the different laws in place in Australia, across the different sectors involved with online stalking and abuse. Interestingly, they found over 70 pieces of legislation dealing with these problems, under seven main legal jurisdictions. Despite this volume of legislation, very few of the laws show any harmony with one another. Moreover, in spite of this overwhelming legislative body, Clark said that for the most part, what domestic violence workers find when they get down to the police station with a whole stack of evidence is the response: “Look, love, just get off Facebook.”

To help counter this, they put together a guide called smartsafe.org.au, which gathers all the applicable pieces of legislation in Australia on the matter. This was shared with domestic violence workers so they would know what type of evidence to collect and what technical steps to take to assist women suffering online violence.

It is also important that the law enforcement agencies understand this issue, and treat it with the seriousness it deserves, because “when women have come to report such abuse to the law enforcement, they are not taken particularly seriously because it's not understood that an online threat is just as real as a threat in the real world,” stated Franz Marivick, the Organization for Security and Co-operation in Europe Representative on Freedom of the Media. “It's something that governments and authorities need to do: more work on law enforcement, but also prosecution and judiciary to understand the context and to deal with it,” he added.

As a matter of fact, the first area the BPF looked at was that of public sector initiatives. Two things stood out: first, that there is a need to prioritise online VAW as an issue within the public sphere, and second, that there is a need to recognise the forms of harm beyond physical violence. Attention is often placed just on physical violence, but there is a whole range of other kinds of impact, from psychological harm, to impact on mobility, to economic impact, to education, among others.

Marivick also noted that there are existing criminal laws that can deal with this, and there is no real need to introduce new criminal legislation in this area, “because any new legislation can quite easily stifle freedom of expression and freedom of the media and that could be quite problematic.”

Private sector approaches

After breaking up with her boyfriend, a woman found that several naked pictures and a sex tape of her were published without consent on a Facebook group that she shared with co-workers. The photos were also distributed on two porn websites (one based in Argentina and another in the USA). She reported the situation to Facebook, Google and other websites concerned, but did not receive a response. She then made a complaint at the Personal Data Protection Centre of the Buenos Aires Ombudsman’s Office. While the content was finally removed, it is still possible to access some images from other websites that have republished the content from other jurisdictions.

Internet intermediary responsibilities were also under the spotlight in the BPF document, as well as the role of states.

“Should internet intermediaries have more of a responsibility? And how can this be implemented in a way that makes sense?” asked Jac sm Kee, member of the IGF Multistakeholder Advisory Group and manager of the Association for Progressive Communications Women's Rights Programme, during the panel. In this sense, one of the things that came out of the BPF work is the usefulness of looking at the “Protect, Respect and Remedy” Framework as a constructive way to address this issue.

“Even if there are measures and complaints mechanisms, there's a need for greater ease for reporting, not to make it that you have to jump through three million hoops before you can report something, not really understanding the process,” explained Kee. Transparency around reporting mechanisms – not just in terms of how the complaints mechanism works, but how many reports companies get – was another essential question. “How many reports of harassment have you actually got to know? How many have responded? What are the different ways you can respond to this?” questioned Kee.

The Ranking Digital Rights project evaluated 16 of the world’s most powerful internet and telecommunications companies on their disclosed commitments, policies and practices that affect users’ freedom of expression and privacy. Some of the questions raised and assessed in the evaluation include: Is the company clear about what the rules are? Does it communicate them clearly to users? Does the company engage with stakeholders about how to formulate its terms of service in a way that serves and respects users’ rights? Does it formulate its terms of service through engagement with users? Does it conduct human rights impact assessments, and do those assessments cover its enforcement of the terms of service? To what extent are companies transparent about what they are taking down, why they are taking it down, and on whose request? Are there grievance and redress mechanisms in place?

The results showed that there is a lot of work ahead. Lack of transparency, lack of understanding, and lack of enforcement of the companies' terms of service are some of the highlights. As project representative Rebecca MacKinnon stated in the BPF panel, “Only six of the 16 companies we looked at release any information about the amount of content they are removing due to government requests.” Even more concerning, none of the companies released any information about the volume and nature of the content they take down when enforcing their own terms of service.

“In trying to resolve problems that occur on platforms, one of the reactions of policy makers is to hold the platforms legally responsible for the bad, evil, unacceptable behaviour of many of the users of the platform,” stated MacKinnon. But what they have learned from studies on intermediary liability around the world is that “when the law is placing strong legal responsibility on platforms to police content, it always leads to over-censorship.”

“I'm not aware of any case where the restrictions are done in a very sensitive way that only deals with real harassment and don't end up leading to the censorship of activists and take down accounts of people who have a right to be speaking and who are engaging in legitimate speech, and women who are trying to get their message out,” she said.

According to MacKinnon, that is always a danger: the platform feels it has to do something, but doesn't really understand the issue well enough and doesn't have enough staff to look at every single case with enough nuance. “That's the problem with kind of heavy-handed approaches to the law, which also ends up not helping women. The wrong people end up getting censored,” she explained.

“Part of the problem is that the process is really a black box and there's a lack of accountability and not enough stakeholder engagement and assessment of what's going on. I would argue that perhaps that's a way forward that can help find the right balance, as opposed to a ‘heavy-handed, we-will-throw-everybody-in-jail’ kind of approach,” affirmed MacKinnon.

Hibah Hussain, from Google, pointed out in the panel that they have heard “loud and clear” from civil society groups and other actors “that you don't really want an intermediary to make blanket decisions about what qualifies as harassment. You can't have a one-size-fits-all solution.”

Hussain said that on Google's side, they have been building partnerships with some of the groups in the BPF report, which have been working on these issues for a really long time. “We have partnerships with help lines. We have partnerships with the (US) National Network to End Domestic Violence to make sure that technology can facilitate safe spaces and can empower marginalised communities,” she stressed.

In response to this, Jan Moolman, APC Women's Rights Programme project coordinator, intervened from the audience to add that “while it's really great to hear that Google is reaching out to women's organisations, they are mostly in the United States and our work is really showing us that the responsiveness of private sector actors and companies is really not the same in terms of women's experiences of the global South.”

A 16-year-old girl in Pakistan was filmed having sex with an older man and repeatedly blackmailed for sex thereafter. Her family were also subsequently blackmailed for money. The girl’s father said the incidents had happened because the girl had been granted the 'freedom' to attend school, bringing “shame and dishonour” to the family.

The extreme importance of understanding the social context of online violence was also raised by Nighat Dad, founder of the Digital Rights Foundation. In Pakistan, as the number of internet users increases, so do the cases of violence against women, explained Dad. “There is no legal mechanism to report these cases of violence; there is a government authority, but it is really useless at this point in time,” she added.

When women face online harassment, they have no idea where to go and how to report the cases. Due to sociocultural impositions, they cannot go back to their families and ask for help or support. Dad shared a case that illustrates how important it is for the platforms to understand the social context in which online violence takes place.

Recently, two hackers created a number of fake women's profiles, using real women's profile pictures – which can easily be downloaded from Facebook – along with their names, their college names, their addresses and their phone numbers. The women reported these pages to Facebook many times, asking the platform to take them down because the information there was putting their lives at risk; some of the women's family members had got to know about the pages. Facebook would always come back saying that the pages did not violate its community guidelines. Dad herself got in touch with the Facebook policy team and flagged the importance of looking at the social context where that violence occurs. “Maybe some forms of violence online are not really dangerous or risky in one society, but in some societies those are really putting women's lives at risk,” said Dad during the panel. Platforms should understand the social context where their users live, and the local languages, and invest strongly in building their capacities in this sense. “They are making money out of our data, right? We are their product. They need to understand that they are there for the users and respect their privacy and understand the social context,” she concluded.

Safety first: Digital privacy and anonymity

Videos of what purports to be a well-known 12-year-old female actor allegedly masturbating in her room were shared online in June 2015. While the identity of the person(s) who uploaded the videos remains unknown, the videos were shared repeatedly on social media platforms. No action appears to have been taken against the perpetrator(s).

“A significant portion of online abuse and gender-based violence tend to happen using anonymous accounts or accounts with pseudonyms and/or false names, making it difficult to identify perpetrators. On the other hand, anonymity is recognised as a valuable tool for women to be able to exercise their rights online,” states the BPF document. Flagged in the document as a very relevant issue, one of the tensions that often come up when addressing violence against women online is around anonymity and privacy. “As feminist activists, anonymity is absolutely key and it's always the political act of resistance that's in the history of feminist activism. However, is there a point in which your right to anonymity and privacy is forfeited because you have abused it essentially? What is that point?” questioned Kee. And she added: “Anonymity is important for safety and expression, but at the same time anonymity is something that can be used for abuse. So how do we kind of balance between the two?”

In a small village in Mexico, an active parishioner and teacher was accused of cheating on her husband, and their children of being fathered by others, on a Facebook page dedicated to gossip in the community. The accusations damaged her reputation in the community, made some parents unwilling to trust her as a teacher, and also led to her being abused by her husband.

The Latin America and Caribbean Women's Health Network faced systematic cracking of their website immediately following the launch of several campaign activities in September 2013 to decriminalise abortion in the region. This was seen as a serious extension of the harassment and intimidation of women's human rights defenders who worked on the high-risk issue of promoting women's sexual and reproductive health and rights.

As UN Special Rapporteur David Kaye stated: “All tools are subject to abuse, and that's clearly the case with respect to anonymity. My main concern and one of the reasons why I did the report on anonymity, is because anonymity is under threat as a general matter by law enforcement and intelligence agencies.” And Kaye made a key statement: “The absence of any tools of anonymity would be a very, very serious threat to activists and just to ordinary people who are searching for ideas and about their own sexuality, about their heritage, whatever it might be. I really wanted to flip the default, so that the default is anonymity. Anonymity is oftentimes a critical tool for people to enjoy their freedom of expression.”

Narelle Clark, from the Internet Society and the Australian Communications Consumer Action Network, stressed in relation to the Internet Engineering Task Force that “security online is fundamental to the Internet Society's mission. Throughout the creation of internet standards we addressed security at all fundamental levels, and more recently we put in privacy conditions all the way up the technology stack. But that doesn't get to stopping violence against women. These are just some of the tools that we have to help women.”

A radio journalist received numerous threats of violence, rape and murder on social media after presenting a satirical video that questioned the opposition state government's intention to push for Islamic criminal law, or hudud, in Malaysia. The video was removed, and the journalist was probed for so-called “blasphemy”.

Franz Marivick, OSCE representative, said that they have been looking closely at the issue of the safety of female journalists. It is quite clear that female journalists' safety has been affected through online abuse. Attending to the needs of female journalists in this regard is imperative, since safety “is one of the prerequisites of freedom of the media and freedom of expression,” stressed Marivick. This is why the representative has issued a number of recommendations to governments, which are essentially the OSCE's main counterparts in this area. “They should recognise that the threats of online abuse directly attack freedom of the media and freedom of expression, because journalists that are being abused sometimes take themselves off social media,” he explained. “They may choose not to report on certain issues, on certain topics, because of the abuse that they have suffered. So that's leading to censorship.”

Agustina Callegari, from the data protection authority at the Ombudsman's Office in Buenos Aires, Argentina, mentioned the impact of the duration of this kind of violence. “We are focusing on a chronic violence, because when you put any mention online or a comment online against women, it will be online for the long term. It's not only for the moment. As a data protection centre and a privacy centre, we are trying to focus on this issue, because we don't believe it's a problem of the moment, because of the way the information flows on the internet,” she said.

Community access and empowerment

The BPF document offers many examples of community-led initiatives around digital safety, training, awareness-raising campaigns, development of apps, technical solutions and help lines, showing that civil society plays a significant role in confronting online violence against women.

Hibah Hussain commented that “as Google, we recognise that the internet is only going to be this wonderful, robust place if everybody can participate online without fear of harassment, without fear of threats.”

“What we do is rely on users to really flag content for us, making the tools for reporting as easy to use and as intuitive as possible for users around the world,” explained Hussain. “We don't allow content that promotes or condones violence, or that has the primary purpose of inciting hatred based on sexual orientation, gender identity, gender, race, ethnic origin, age, or veteran status.”

However, the lack of mechanisms available in online platforms to enable effective responses to cases of online violence against women was flagged in the BPF document as one of the underlying factors that can contribute to this violence.

According to Hussain, other areas that Google has addressed are digital literacy and safety, making sure that people understand how to use technology in a way that is responsible and enables them to have full control over their online presence.

Kaye added, “We don't want to put the burden on the target of harassment to deal with it; but at the same time, there are tools out there that we can use to shut it off to a certain extent. There needs to be some discussion about the tools we have in order to block, the tools that technologists can offer in order to allow us some control in the face of harassment, which is really a crisis in many places.”

Some of the areas Google is dealing with, according to Hussain, include building scalable systems, understanding context and subtlety, and also identifying good policies that are not over broad. And she noted that they also “have been focusing a lot on counter‑speech, and helping people take advantage of online platforms to get their message out and really push back.”

With regard to counter-speech as a widespread recommendation to tackle online violence, APC's Jan Moolman pointed out that we need to acknowledge that we are not dealing with an equal playing field. This discussion “has to recognise women's existing inequalities and existing exclusions,” she said. “There's a lot of emphasis on freedom of expression in a way that locates it outside of the broader realities of existing inequalities, and it's not only about speech. It's also about online harassment. It's also about the fact that our research shows that for one out of three women, the violence experienced is related to someone that they know. Yes, speech matters, but I think it's one of the areas that we have actually moved quite a bit further along in, and we need to look at other issues as well,” Moolman stressed.

The BPF document flagged women’s unequal participation as decision makers in the development of technology platforms and policies as another important underlying factor that can contribute to online violence against women.

Narelle Clark referred to the Internet Engineering Task Force, which is “where all the technology standards are developed for how the internet works,” she explained. “Like a lot of other professional organisations, we started to adopt standards for behaviour within our own technical development communities. And that runs in parallel with the other safe spaces that a lot of women have developed throughout the whole community of developers. So you have forums like Sisters at IETF and sisters more broadly, where women can come together and build support for each other and be stronger in their development and their sense of their own technical careers.”

The ways forward

As assessed by IGF MAG and APC representative Jac sm Kee during the panel, the Best Practice Forum process was a challenging one, but very constructive in terms of gathering all of the different stakeholder thoughts, work and responses in this area. “Where do we go from here?” she asked, inviting the panellists and audience to answer.

Kee started herself by stressing the need for greater facilitation of conversations within different groups of actors and different initiatives, in order to do more targeted research as well as to develop some possible solutions. “It should be a focus on education, on internet literacy,” she added, so that “people understand that what they are doing online has a real effect on the offline world of other people.” Clark agreed and added that there is a need for “balance between the technology solutions and the education of women, domestic violence workers and law enforcement workers who work with women, to try to help them build their privacy and build their strength online.”

“We need data, we need context, we need intersectionality, when we are discussing solutions for gender issues,” stressed Kee. Rebecca MacKinnon agreed on the need for more data, research and fact-based solutions. “But we need to bring the right communities into conversations with the right people and the companies. Companies often are not quite sure who to reach out to. And there are starting to be some mechanisms and intermediary organisations that help connect people in companies with the right sort of NGOs and communities, in particular in places where the context can be provided and where the real conversation for problem solving can be had.”

Nighat Dad stated that what is needed is not new legislation, but “to use existing legislation and more training of law enforcement agencies and judiciary.”

“It's naive to think that it will go away without effort. It's naive to think that we are in a linear process towards eliminating violence against women. It's a daily effort and a daily struggle from all of us,” concluded the Special Rapporteur on freedom of expression, David Kaye.

Download the document of the IGF BPF on Online Abuse and Gender-Based Violence Against Women here


Footnotes

[1] Participants in the panel that took place at the IGF in November 2015 in Brazil: Jac sm Kee (member of the IGF Multistakeholder Advisory Group and manager of the Association for Progressive Communications Women's Rights Programme), Rebecca MacKinnon (Ranking Digital Rights and Global Voices), Gary Fowlie (head of the ITU Liaison Office to the United Nations), Agustina Callegari (data protection authority at the Ombudsman's Office, Buenos Aires), Narelle Clark (deputy CEO of the Australian Communications Consumer Action Network), Anri van der Spuy (IGF Secretariat), Nighat Dad (founder of the Digital Rights Foundation, Pakistan), Hibah Hussain (Google), Mariana Valente (director of InternetLab, Brazil), Franz Marivick (OSCE Representative on Freedom of the Media), David Kaye (UN Special Rapporteur on freedom of expression), and Patrick Penninckx (head of the Information Society Department, Council of Europe).

 
