Today I sat in a workshop in Rio de Janeiro. A workshop in Rio de Janeiro? A capoeira, volleyball or football workshop, you must be thinking. Even though I’m just 25 metres away from a beautiful beach, imagine, I sat in a room in a hotel, full of people with laptops… on their laps.
Such is life in the second Internet Governance Forum. And let me tell you that it’s worth it. One of the 97 workshops unfolding here in Rio was called “Content regulation and the duty of states to protect fundamental rights”, brought to you by the APC’s women’s programme, the APC WNSP for all of you acronym-lovers.
Should states regulate content on the internet? What content is acceptable online, and what is not? These were two of the many questions raised in a debate over whether content should be regulated or left for people to filter on their own terms.
Is harmful content the same as illegal content?
One of the panellists, Matthias Traimer from the intergovernmental Council of Europe, insisted on differentiating between illegal content on the one hand, and harmful content on the other. “Most workshops I’ve attended here at the IGF talk about these different types of content without realising that they make reference to totally different sets of laws and regulations,” the lawyer pointed out.
“Harmful content is a policy challenge,” he added. And who wouldn’t agree with that? How can pornography, violent content or other types of sexual content be filtered or controlled when they are detrimental to vulnerable populations?
Another speaker, Adrianna Veloso, a Brazilian journalist and video maker, played the provocateur, challenging what Traimer was saying. From the “perspective of an end-user,” as she calls herself, she refocused the debate on the concept of freedom of expression. “Drug prohibition here in Brazil,” she argued, “has led to violence and corruption.” Stretching this to pornography, she bluntly insisted, “It has always been there. Internet only accelerates its dissemination, it’s an intermediary,” arguing for the non-regulation and non-filtering of content by anything remotely close to the state.
Interestingly, Veloso completed her comparison of online explicit content with drug availability in the streets of Rio by arguing that content regulation broadens the digital divide that already exists, citing the “top-down” governmental programmes of digital inclusion. “Telecentres, infocentres, hotspots, all these are set up by the government. There, people cannot access pornography,” and this contributes to the fact that many don’t engage with technology, she affirmed. “Of course, there is harmful content out there, but it’s up to the viewers to decide what to do with it.”
There will always be the question of freedom of expression, and you would be hard-pressed to find anyone rejecting this key human rights concept. But shouldn’t the state intervene when some content is harmful? “Yes, it should,” Traimer says. And there is more. “It doesn’t always mean a top-down intervention,” he specifies.
The constitutional and media law professor believes in an active state. “It is not possible to find a uniform concept of morals.” But Traimer believes that by taking a hard look at types of content, it will be feasible to establish a set of criteria that does a good job of defining what harmful content is.
Chat Ramilo, the APC WNSP’s manager, accepted Traimer’s invitation into the world of criteria, mentioning that “content which eroticises or magnifies violence, such as domestic violence, or rape for that matter, should be THE criteria by which we can establish what harmful content is.”
That’s where the words of Malcolm Hutty, head of public affairs at a London internet service provider (ISP), actually complement Traimer’s policy talk. “ISPs should not be the intermediary to decide what content is harmful or not,” he categorically told the by now over-stimulated audience.
“We support self-regulation,” Hutty said, before adding another layer of complication to the discussion. “But, with self-regulation, we need to see that there is a change in transparency.” And that might well be true, since self-regulation is harder to monitor than an open and transparent public or para-public content regulation body.
ICT policy vs self-regulation
At this point, someone from the audience almost jumped around the chairs to comment. And she did get the microphone. “People interested in self- and co-regulation should check out the FSM,” she said, referring to the German association for voluntary self-regulation in online media for which she works. A sticker, a free T-shirt and some more self-promotion with that?
Having clearly put some thought into this, Traimer identified ICT policy as being the best way to achieve fair filtering of harmful content. “Information and communication technology policy is often seen as extra policy, but in fact, it’s a more integrated vision that informs all ministries of a state.”
This all speaks to a soft state intervention that doesn’t require hard law. It speaks to the capacity of a state to think in an integrated manner about where and how harmful content can best be combated. This could be attained through the injection of government funds into content awareness-raising in schools, for instance. At least, that’s how I understand Traimer’s point.
Either way, content regulation is not a classical case of regulation. There might be plenty of ways to regulate, and this is why I think we’ll need a round three, at the next IGF, to really dig into what content self-regulation, co-regulation, state regulation and no-regulation really mean and entail.