Illustration by Neema Iyer for GenderIT.

What is (In)Visible?

(In)Visible is a research project that explores the digital security threats against Muslim Women Human Rights Defenders in the Horn of Africa. The research critically assesses the digital landscape within the Horn of Africa, the policies and laws that govern the space, and the lived experiences of Muslim Women Human Rights Defenders.

(In)Visible also grounds the experiences of the women activists in feminist theories on religion, violence and gender. The name of the paper, ‘(In)Visible’, derives from the concepts and practices that govern surveillance and violence: a person must be visible enough to be monitored and to acknowledge the power that controls them, while adhering to patriarchal scripts on how they should engage publicly to avoid violence. The ‘(In)’ within ‘Visible’ represents resistance from the Women Human Rights Defenders, who can organise and work discreetly to protect themselves and their communities.

The paper also deals with the contestations between mainstream activism, which relies on a person’s visibility to gain support and effect change, and the current realities of Muslim Women Human Rights Defenders, for whom visibility is a double-edged sword.

Finally, (In)Visible clearly maps the policy, funding, advocacy and technology designs and processes that need to improve or change to ensure that Muslim Women Human Rights Defenders are protected.

Why did we embark on this research? 

My (Mardiya’s) work primarily focuses on gendered digital surveillance against Muslim women, queer communities and minoritised people in general. I investigate the implications of technology surveillance, and highlight how gender, technology and religion intersect.

Digital security was an emerging interest, and whenever the concept of surveillance in digital spaces comes up, there is always a digital security response to the issue. Yet, in my framing, surveillance is a gendered social system that is amplified by or facilitated through different types of technologies and infrastructure. This means that technologies, be they CCTV cameras, AI surveillance systems, spyware, platform surveillance or others, exist to carry out an existing disciplinary and structural system against a certain group of people.

(In)Visible, a project funded and supported by Musawah as part of the We Cannot Wait Consortium, provided a rare opportunity (and funding) to look at and document religion, women and technology. There are no reliable outlets for people to engage with such work, given the challenges of connectivity, fear and burnout.

Embarking on this research journey also allowed us to rethink our approach to digital security in relation to surveillance because, to paraphrase Munn (2022), the “properties of algorithms are not internal to the models themselves but rather a product of the social systems within which they are deployed”. In other words, and concerning this work, digital security measures, training and functionalities should be a product of the social systems they are deployed into, and cannot be abstracted from the contexts and realities of the people who need and use them.

We rarely see work on technology that specifically addresses its intersection with gender and religion, nor do we often get the opportunity to focus on African Muslim women. (In)Visible allowed us to highlight online gendered violence while also showing, intersectionally, how the offline systems and norms allow these forms of violence, threats and attacks to exist in the first place, thus grounding our conception and understanding of digital security in the sociological aspects of life.

Curiosity and Research Excitement: How We Set Our Ambitious Methodology 

The initial process was informed by a literature review on the topic within the Greater Horn of Africa, intended to surface the gaps within the space and what we should most likely focus on. While the research was envisioned to be qualitative, our anticipated methodology was to bring together different types of qualitative data collection – which we quickly realised was ambitious.

First, our goal was to map and collect social media data in the form of content targeted at the Muslim Women Human Rights Defenders. Of course, the assumption here was that this content existed within a somewhat normative frame of how violence, extremism and harassment happen. The idea of collecting data on harassment also rested on the anticipation that most of the Human Rights Defenders were visible and active online.

Second, we wanted to conduct focus group discussions (FGDs) to expand on some of the themes from our literature review, and to further understand the experiences of the Human Rights Defenders based on the data we had collected from the social media analysis. This was also based on the assumption that we would be able to easily schedule conversations with 5-8 people per session. However, we did account for the risks involved and the possibility that the human rights defenders would not be easily reachable because of the political climate at the time.

Finally, the FGDs were to be complemented by key informant interviews and semi-qualitative surveys for in-depth narratives and analysis.

Collectively, these methods were meant to give us the comprehensive information that would allow us to identify the threats, the actors and the socio-political systems that allowed those threats to happen, and to develop short- to long-term holistic digital security recommendations.

The Messiness of a Research Journey: Balancing Our Expectations and Realities, and Allowing Learning

Logistical Realities

A key lesson throughout every research project I have conducted is that there will never be enough time. Acknowledging time constraints was important to designing a process that would allow us to gather the information we needed. Hence, our first step was realising that conducting research at the end of the year (December 2021) meant that many people had already clocked out of the year and were burned out. Similarly, the realities of Human Rights Defenders in the Horn of Africa are constantly volatile, which meant that we might not get a high level of response, and that internet connectivity would affect the process of interviews.

This meant that we had to prioritise synchronous key informant interviews, with asynchronous surveys for people who were not able to join the interviews. We also had to use messaging platforms such as WhatsApp for some conversations because Zoom required high bandwidth.

We were also conducting research during a pandemic, when there was a high risk of people catching Covid, which would ultimately affect the data collection. Unfortunately, I (Mardiya) caught Covid while conducting the research, and it was almost impossible to continue working through the excruciating pain.

Before We Consider Security, We Have to Address Access and Digital Literacy

A lot of the problems and questions were framed with the assumption that a certain type of security threat was occurring, and that many Muslim women in the Horn of Africa used social media in a certain way for their advocacy. It is important to note that most of the women we spoke with worked at the grassroots level, and few of them were part of human rights organisations.

A not-so-surprising finding was how many of the activists avoided using social media because they believed it increased the threats against them, threats they could not address since they rarely had any digital creation skills. Others worked in topographies and locations with poor connectivity and expensive data, and with communities who were not usually online. Some activists also had to balance their offline safety against state and community threats because they worked at the intersection of gender and religion.

In the research report we include a section on “bad Muslims”, because the narrative that emerged and was shared by most of the women activists was how they were labelled as a threat to Islam, and to the morality of their community, because they advocated for gender equality. Thus, many of the activists had to negotiate with community leaders, who were often men, to ensure a level of security and receptiveness for their work. Being labelled as “bad Muslims” was not exclusive to their communities; it happened within the broader feminist movement as well. At another level, because they negotiated with patriarchy to a certain extent and did not become visible in a mainstream or normative way, many Muslim activists were also accused of not being radical enough. Both issues intersect at the point of control, where each community expects to fit the women activists within its frame of believer and/or feminist.

A few Rights Defenders who were active online faced racial, gendered and institutionalised harm. For example, some shared that they were called b*tches, subjected to racial slurs for speaking Arabic, and sent unsolicited sexual images in their Direct Messages (DMs).

This also meant that using social and other online media was a burden: it increased the possibility of being targeted online, while the activists had little to no skills that would allow them to put basic security measures in place. The limited use of social media was also financial, because the activists shared that they would flourish with teams whose work is to build their capacity and support them with online advocacy work and strategies.

These issues made documenting the final output interesting because we had to challenge our assumptions in the problem framing, and highlight why much of the work surrounding security needs to be contextual, cannot stand alone and, most importantly, must start from access, financial support and capacity building.

Lessons on Fear: Pushing Boundaries, Re-Conceptualizing Ethics and Trade-Offs

I recall asking Neema Iyer, “How radical are we allowed to get with this work?”

It was a question that lingered from beginning to end. Fear informed how we conceptualised and designed the research. It persisted through our data collection process and enabled us to think through every step to ensure safety, which paid off in the end.

The fear we felt forced us to consider our positionality as part of the research project rather than as outsiders looking in; that is, as the people asking the questions, collecting data, critiquing and pushing boundaries. A lot of the ethical consideration in research centres on the risks the participants face by engaging in work that is deemed ‘controversial’. Little enquiry is done into the risks the researchers are exposed to while investigating certain topics, locations and issues. This research, and the lingering fear on all sides, challenged us to consider the intersection of participants’ and researchers’ risks, and how they differed, in order to design and implement research that centred safety.

Conducting research on gender, religion and technology was our own way of pushing boundaries, but the question of “how radical” was a way of contemplating how much negative attention and risk would be associated with this project. Many people would rather not get involved in research on gender, feminism and religion; it is too risky and carries multiple contestations.

For example, many scholarly works are dedicated to showing that feminism and Islam are incompatible, or that Muslims do not need feminism. Others are dedicated to proving that a person cannot be both Muslim and a feminist, or that rights activists who reject the feminist label engage in some form of betrayal. The research demonstrated that whether Muslim women claim feminism in a mainstream way or reject it, they will still face threats, because these arguments serve as a distraction from the fact that most of these realities are embedded in hegemonies of control.

The goal of the research was to map out the threats that Muslim women activists experience, and the institutions that carry out the threats and violence. This is necessary to design holistic personal security measures and to address systemic platform design and policy issues. However, our lessons from work on gendered violence show that many perpetrators of violence become more violent and offended when they are named. Naming them was therefore the first boundary we planned to push.

The second, likely the most controversial and personally scary, aspect of this work was citing Muslim scholars who have been accused of blasphemy for their theorisation of the Qur’an and the Divine beyond masculinity. However, I believe that to ultimately undo and challenge violence, online and offline, we have to challenge our conceptualisation of a masculine divine entity.

The third aspect was directly showing that technology intersects with religious systems and hegemonies and, as stated previously, is a product of the social systems it is deployed into. As such, if a technology was not created or designed to be used by a certain group in a safe and accessible way, its affordances can be used to harm and exclude them.

Wins or Trade-Offs? Delivering a Paper and Webinar with Closed Access

After concluding the research analysis and report writing phase, Neema and I engaged in a very long and critical discussion on how to disseminate this work, especially when its central argument is that visibility is something many Muslim women activists cannot afford, and that we need to identify alternative routes to advocacy, financial support and solidarity beyond being mainstream and visible in a normative way.

We asked questions such as: ‘Do we engage in a public launch?’, ‘Should we add our names to this research?’, ‘Do we upload the full research online?’, ‘How do we monitor who accesses the work?’, ‘How do we centre care and not further harm ourselves and the Human Rights Defenders who shared their stories?’, ‘How do we host a webinar and reduce attacks and the possibility of Zoom bombing?’, ‘Do we publicly share the webinar recording online?’

Interestingly, these are not questions we would typically centre when launching or publishing other projects. For example, a public launch meant that we risked adversaries weaponising the information against us and against Rights Defenders in the region. Centring an ethics of care meant that, as researchers, the entire journey of the research should not further harm any of the stakeholders involved, even if it meant giving up the potential ‘popularity’ of such critical work.

After various forms of risk analysis conducted to ensure everyone would be safe and benefit from the work, we launched the paper with closed access: people who wanted the full copy were required to fill in a form, and we assessed their information to determine trustworthiness. Our decision to focus on the small community of people who need this work gave us the opportunity to host a discourse-rich round table with activists in Sudan, Uganda, Kenya and Ethiopia. The round table was intentionally designed to centre the people whose lives are represented in the work, and provided a safe space to discuss patriarchy, religion, state surveillance, protest and queerness.

In addition, we learned that good research creates space for growth, learning and chaos. Learning happened when we were challenged by the realities presented and had to rethink our research design and anticipated arguments based on the on-ground data. Meanwhile, growth was our ability to honour the message of (In)Visible: we could not put the work out in a way that contradicted the stories and issues raised throughout the paper.

With that established, we continue to reflect on questions such as: How do Muslim Women Human Rights Defenders reconcile donors’ constant desire for them to publish and post on social media platforms with their own safety? How do the human rights defenders alert Muslim women to the safety circles that can help them when they ‘invisiblize’ themselves? And finally, how can we build safer spaces for critical discussions and discourse for Muslim women activists, like the ones Pollicy hosted as part of this project?

Cited Work

Gabriel, I. (2022). Toward a Theory of Justice for Artificial Intelligence. Daedalus, 151(2), 218–231. https://doi.org/10.1162/daed_a_01911

Munn, L. (2022). The uselessness of AI ethics. AI and Ethics. https://doi.org/10.1007/s43681-022-00209-w
