The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> TAWFIK JELASSI: (No audio) ‑‑ hate speech, conspiracy theories, cyberbullying, online harassment of professionals, and so on and so forth. That's not public good. That's public harm. At the minimum that's public (?) There is one way, which is to watch the world pass by and stay passive, or to try to do something about it. UNESCO decided to try to do something about it. And people say, why UNESCO? Where are you coming from?
Well, UNESCO has a 30‑year track record on freedom of speech, Freedom of Expression, and the Safety of Journalists. For 30 years we have been awarding the UNESCO World Press Freedom Prize. I think we have a record of defending human rights, and the fundamental human right is freedom of speech, Freedom of Expression. Now, it happens that freedom of speech has moved online. How can we ensure that freedom of speech online is for the public good and not for public harm?
We know of casualties, people who have died because of misinformation, disinformation, and hate speech. Let me give you one statistic from our work and research: 73% of women journalists have been subject to online harassment. 73% of women journalists are subject to online harassment, and 20% of them ended up being physically attacked. This is not just something happening in cyberspace, where you can say, so what? Skip it. Don't look at it. Well, in 20% of the cases it wasn't just online; these women were subject to attacks and physical violence.
Here is another one, said in an open plenary session in Sweden: one in four women journalists ends up quitting her job because of online harassment. Not just being harassed online ‑‑ one in four women journalists in Sweden decided to quit journalism because they couldn't take it any longer. There are many other statistics. Let me go to Brazil; we heard this from a keynote speaker at UNESCO in June: 78% of Brazilians take WhatsApp as their primary source of information. You can say whether it's true, you who are in Brazil. Yes, of course. This is an alarming statistic, because what information is being disseminated on social media, on WhatsApp or TikTok or YouTube or Instagram? Is it verified information? Is it fact-checked information? Is it quality information? A country like Brazil has over 200 million people, and 78% of them use WhatsApp as their source of information. I find this alarming.
You can say it's a fact of life. We like social media. We're on social media every day. We have statistics that the youth spend at least two hours a day on social media. Maybe they don't spend two hours a day studying or doing homework or doing sports or other things, but they are on social media. This is why our media and information literacy programmes are very important, because our work is not only conferences and advocacy; it is also capacity building. We have a major programme called Media and Information Literacy with the goal of making citizens media and information literate, so they know how to use information on these platforms. Hopefully they can develop critical thinking and can distinguish between fake news and verified news and so on and so forth.
I think that addresses why such a conference and why UNESCO. So the last bit on why UNESCO: last year, again in sub‑Saharan Africa, UNESCO helped organize the World Press Freedom Conference. There was a major Declaration, called the Windhoek Declaration because the initial World Press Freedom Conference took place in Windhoek 30 years ago. This is the Declaration in which it was clearly stated that we have to ensure that information is a common public good, including through online means and channels. So we are taking that concept of information as a common public good, and we're trying to address it in the context of digital platforms.
This is done through an inclusive multi‑stakeholder approach: the 193 Member States of UNESCO, Civil Society, NGOs, academia, research institutions, media professionals, but also the technology companies operating the platforms. Without them, it will remain a Declaration, because these are very powerful, very rich technology players who operate these digital platforms. You have to ensure they buy in. They're part of this conference, not only in attendance, but in participation in the consultations leading up to the conference. And it's a major development that they are party to this.
Why? Because they can also benefit from the outcome. What is the outcome? A modern ‑‑ sorry, a global, modern regulatory framework for digital platforms to ensure information as a public good while safeguarding Freedom of Expression online.
Why am I saying this? It's a long sentence ‑‑ too long a sentence. Let me cut it into two halves. If I just say regulating digital platforms for information to be a public good, you may say: are you now in the business of censorship? What do you mean, regulating platforms? I fear censorship. So I add: while safeguarding freedom of speech online. That's the balancing act: how to regulate while ensuring free speech online. But we don't want to see the public hazards. We don't want to see the public harm on the platforms.
I hope you understand why we are doing this. I hope it's clear to you what the target outcome is: this global, modern regulatory framework for digital platforms that will hopefully inform and inspire national regulators to set up their own regulatory policies. But it is global, shared by all, including the tech companies and platforms. They say: how can we respect national regulatory systems? There are 194 of them in the world ‑‑ 194 regulatory frameworks today in the world. There is no single tech company that can fully respect or know what is in each one of the 194 regulatory systems.
(Person speaking online)
>> A global modern regulatory framework. Digital platforms are in.
(Overlapping Speakers).
>> Did I say something wrong? I'm not sure who they are. Just to catch up, because we have other business: I'm saying that digital platforms are global by nature, are transnational by nature. We know this. We need to have a global regulatory framework, not only national regulatory systems.
So this is an important session for us, because we look forward to your questions, your inputs, your feedback. We want this session to be as interactive as possible. I tried to set the stage for it. My colleague Rachel will proceed, and my colleague Andrew will also present what is in draft one of our work so far, and hopefully your input will help us enrich that current text and move it to draft 2, which will be presented in February at the conference. We hope that you will be able to join us in Paris in February of 2023. Thank you.
>> CEDRIC WACHHOLZ: Thank you so much, Assistant Director General, for setting out the way forward in terms of the model framework and also the conference. Now, UNESCO's work takes place, of course, also in countries. Before coming to the moderated discussion, we're lucky to hear briefly about the Social Media 4 Peace project, because under it we have looked in four countries at policies and practices on managing online content. We are lucky to have John Okande from the Nairobi office with us, who can speak about Kenya as an example ‑‑ so also the concrete situation in countries that informs our work that is set by (Off Microphone).
>> JOHN OKANDE: Thank you, Cedric, and also (?) for painting that picture. I remember when we met during the World Press Freedom Conference and we were having sideline discussions on how important it is to have, to use your popular word ‑‑ I don't know how popular it is ‑‑ digital hygiene. And this digital hygiene has to be guided by a set of principles that will hopefully help us create a healthy digital space. We have a short presentation just highlighting what we've done and the still ongoing interventions in the Social Media 4 Peace project, to more or less build on what the ADG has mentioned.
The project has three components, and the overall objective is to strengthen society's resilience to harmful content spread online while protecting Freedom of Expression and enhancing peace narratives on social media platforms. Within the Kenyan context, we've been able to align our interventions with these three components. The first one we carried out through national research on the existing legal mechanisms within the country and the gaps that are there within those frameworks. The second component is piloting new tools to improve the curbing of online harmful content. We were also able to map out and assess the existing practices within public entities, the regulatory frameworks within the country, and what Civil Society groups are doing.
So basically we have research and reports prepared. There's a global summary capturing the Kenyan context, and two very good reports have come out of this.
Next. We've been able to have consultations within the Kenyan context with the conflict communities, where they've basically highlighted what is happening within their different spaces. If most of you remember, Kenya in 2007 went through a tough time where more than 1,000 people were killed in postelection violence. Some of these tools were leveraged to disseminate hate narratives. Even with these hate narratives within the Kenyan context, the national definition of hate speech under the cohesion commission is still a little bit vague. There are cases of hate on digital platforms where, the moment they're taken to court, there is a legal gap: you can't hold someone accountable when that specific definition is not succinctly set out in law.
So, highlights of some of the successes that we've been able to realize: we've been able to engage young people, leveraging UNESCO's normative initiatives on media and information literacy, training them how to identify and flag hate narratives. We've been able to do this with fact-checking organisations and other Civil Society groups. We've also been able to create micro-learning content, developed by young girls, to identify hate. We use that evidence to inform the policy interventions needed and to support national stakeholders in responding to these elements. There's also another front, which is evidence‑based advocacy, where we work together with and support coalitions and networks to flag these issues, hold public institutions responsible, and highlight what is happening.
So those are some of the elements. And just a recent success: we were able to bring together public content moderators working with government agencies to equip them with practical skills on how to fact-check online content in the Swahili language. Because this is one thing we've noticed and highlighted: content moderation practice is mainly in English. They don't understand the national context. They don't understand the language. And the systems are predominantly designed to pick up narratives in English. There's a big gap. Yes.
Last slide. Next, please. The slides are moving so fast. I'll just highlight quickly a few findings. Online hate and disinformation exist, and they are actually affecting human rights online. National legislation to address some of these elements exists within the Kenyan context, but with some degree of inconsistency, for various reasons, with Freedom of Expression. There's also an unequal allocation of social media platforms' resources to content moderation: what we see in the Global South is totally different from what is allocated in the Global North.
Digital platforms are not yet prepared to analyze and classify hate speech and disinformation according to local contexts and languages for obvious reasons.
CSOs are active in monitoring and responding to online harmful content, but there are no strong coalitions to actually cooperate on these interventions. Some challenges: a lack of uniformity ‑‑ I think I highlighted that ‑‑ in a clear definition of hate speech within different contexts; a lack of transparency; and the concentration of powers and decision making. Some of the recommendations coming out of Kenya are, one, there is a need for global standards and solutions that can be localized and contextualized. I'm glad that the ADG highlighted that.
In Kenya there's an ongoing process to form a coalition on content moderation and Freedom of Expression in Kenya. Basically the coalition will help national authorities, and by and large other stakeholders, to come up with codes of conduct and principles that prevent acts of hate speech, and with the necessary ICT tools to identify potentially harmful content.
There's a need for localisation of content moderation practices; we realize local voices are not mainstreamed in these practices. And there is a continued need for digital literacy and fact checking, particularly for public content moderators. We see there's a clear gap within public institutions on how to handle this. I'll end my presentation there. Thank you.
>> CEDRIC WACHHOLZ: Thank you. This has many dimensions; I think we could have spent much more time on it. You spoke about the legal dimensions, the capacity work on media and information literacy, and other challenges ‑‑ resource challenges but also content moderation. We can now turn to the model regulatory framework, and we have the pleasure of having Andrew Puddephatt joining online, I hope. Can we put him on the screen, please? He's a Senior Consultant to UNESCO and the key author of this model framework. He will try to present it in five minutes, and then we will come to the moderated discussion. Thank you.
>> ANDREW PUDDEPHATT: Thanks, Cedric. Can you hear me okay? Just a check.
>> CEDRIC WACHHOLZ: Thanks.
>> ANDREW PUDDEPHATT: Hi, everyone. Sorry I can't be with you. As Cedric said, we're looking now at the global guidance UNESCO is producing, or seeking to produce, that would enable internet companies to realize information as a public good. For UNESCO, that means Freedom of Expression and access to information, while managing what's damaging to human rights. We can't pretend this is an easy task. There are many, many different aspects of internet policy being discussed at the moment: privacy, access, the digital divide, competition policy.
But I think the way content is managed is one of the more controversial ones, and in my experience it attracts the most diverse range of views across the planet.
So building global guidance for regulators, or for governments seeking to regulate, is a challenging task, and one that, as the ADG says, requires many stakeholder conversations to try to arrive at some kind of common approach. This meeting is an important part of that process.
UNESCO has five goals in developing this model. The first is much more transparency from the platforms about how they operate and how they manage content: not just what their policies are, which is one thing, but how they put those policies into operation. The second is that those content management policies are operated in a way that's consistent with human rights; whether they use automated or human processes, and how the automation works, is a critical issue.
Thirdly, ways in which users can understand the provenance of the information they're seeing online: where it comes from, et cetera. And fourthly, some accountability from the companies, so that if they have policies but don't implement them, there are meaningful and timely ways to seek redress for decisions that are inconsistent with the company's own policies.
Fifthly, and I think unusually in terms of regulatory systems, we're recommending some independent oversight and review of the regulatory system itself: not just setting it up and letting it run with the companies, but a periodic independent body to review it, to see if it's reaching the goal of information as a public good.
Now, the framework we've looked at does not involve regulators or governments deciding on individual pieces of content, whether they should be removed or left up. What the framework does is look at the systems and processes companies have to manage that content online. So it's very much looking at systems and processes, which is very similar to the approach adopted by the Digital Services Act in the European Union, which is a systems-and-processes piece of legislation.
Having said that, the framework does not just let the companies decide what they report on, but specifies a number of areas where we think the companies should report. Firstly, transparency requirements about the company's method of operation and the way it deals with and manages content and users online. Secondly, how it shows that its content management policies are consistent with the International Covenant on Civil and Political Rights, how it shows that users can understand the provenance of information, and how it provides users with an effective complaints system. And generally, what are the different techniques used by the company to manage damaging content ‑‑ not just removal; in some cases that's not appropriate. Is it downgraded in search? Is alternative information provided? Is information given about the provenance of the information? Et cetera, et cetera. There are different techniques companies use, and we want some clarity about that.
Other issues on which we are suggesting companies should report are the proposals they have on media literacy; how they protect the integrity of elections, which is a hot topic in areas many of you are familiar with; what mitigation processes they have in place to deal with risks on their platform; and finally, what languages they operate in, how they provide access in different languages for users who want to complain to the platform, and how they provide access to data for researchers.
All of that is wrapped up in the reporting requirements; those are the areas where we think the regulators should require the companies to provide information about their processes. We conclude with a section on the constitution and powers of the regulatory system. We call it a regulatory system in the guidance because there are many, many different forms of regulation around the world. There are dedicated internet regulators, broadcast regulators with an extended remit, and countries with multiple regulators; you may have human rights, data protection, and privacy regulators with overlapping mandates. We call that a system of regulation. We try to identify key principles that should govern that independent regulatory system.
Coming out of that consultation, which has been extensive and which, as the ADG said, will continue through the February conference, there are key questions that have been identified, that have come up time and time again with the people we've consulted. The next stage in this discussion is to focus on those individual questions, get expert interventions, and open it up to wider discussion.
Cedric, this is where I hand it back to you to take some of those key questions.
>> CEDRIC WACHHOLZ: Thank you so much, Andrew. This was content rich: the facts, the rules, how you highlighted that UNESCO's regulatory approach focuses on systems and processes to manage content rather than saying which content should be prohibited, the issues, and the questions.
We come now to discuss these four questions. The guidelines for participation are quite easy: you raise your hand if you're in the room, and you will get a mic so you can ask the question. For those online, raise your hand in the Zoom function. And you can use the hashtag. The first question goes to Eliska Pirkova, who is with Access Now. Your organisation has done extensive work; at this IGF, Access Now is launching a Declaration on content governance and responsibilities in times of crisis. You're well positioned for the first question.
So, what safeguards could we put in place to ensure that the regulatory guidance does not lead to suppression of legitimate expression protected under international human rights law? We would like to project the question on the screen so that you, as an audience, can continue to think about how to respond to it and what your questions and points are. Thank you. The floor is yours.
>> ELISKA PIRKOVA: Thank you very much. It's a very extensive question. First of all, I would like to congratulate UNESCO on this fascinating and super challenging project. I still remember when Access Now published its content governance report, which sought to establish a more global position on issues around Freedom of Expression and platform responsibility. Let me say that it was a very challenging task for us to complete that report. Back then we were pretty much reflecting precisely on this question too: what are the safeguards that we want to see in place to have a human rights‑centric platform governance model that prioritizes the empowerment of users and users' basic rights?
I very much approve of the approach of focusing on systems and processes when regulating platforms. For years we saw short-sighted solutions relying on combatting categories of potentially harmful or illegal content, which pretty much led nowhere: to a lack of transparency and to processes happening outside of any public scrutiny and proper oversight.
So, to start with those safeguards: focusing on the processes, and more on due diligence obligations for online platforms to comply with, is the key from our viewpoint. The recent Declaration that was launched a couple of hours ago, which has a set of principles for platforms ‑‑ or more precisely social media companies ‑‑ on how to develop adequate crisis response mechanisms, very much proceeds from due diligence obligations and contains recommendations on independent audits, risk assessments, and impact assessments, as well as meaningful engagement with Civil Society and trusted partners on the ground that operate in challenging circumstances, including times of crisis.
However, I think the main challenge when we're incorporating those due diligence safeguards will be focusing on the risk assessments and the risks stemming from processes, and breaking them down further: content curation algorithms, content moderation systems, but also online targeting, which is the platform business model that is equally responsible for the amplification of potentially harmful but legal content; delivery techniques; how different systems are being optimized; on what data; and what data should be permissible. We know the abuse of data that is then fed into these algorithms is huge, especially when it comes to categories of sensitive and personal data. That will be the main challenge.
We now have those due diligence obligations on paper, whether in the framework just presented here or in the already mentioned Digital Services Act. A number of technical details ‑‑ how to incorporate those safeguards in a technically feasible way ‑‑ are the next challenge for the expert community to figure out.
One final point when we are discussing these safeguards and how to truly implement in practice the empowerment we would like to see: the question of enforcement. Who will be the regulator? And how are we going to ground the regulatory body, especially if we envision a global framework? Do we ultimately point to some international regulator that should exercise that oversight? This is precisely challenging when it comes to issues such as a data access framework, for instance, or the enforcement of meaningful transparency criteria.
We need those in order to understand the impact of these systems and processes on democracy, electoral integrity, or human rights in general. Of course, if a data access framework is not properly designed or there is no proper oversight in place, it can be easily abused. These are the main challenges that we have to address in order to make those safeguards truly effective against all the potential risks of abuse that exist.
I'll stop there, because that was already an extensive answer. I'm looking forward to more clarifying questions.
>> CEDRIC WACHHOLZ: Thank you, Eliska. That was a rich intervention, particularly on the due diligence aspect. (Too low to hear) I'd like to give the floor to anyone in the audience. The mic is coming.
>> PARTICIPANT: Thank you very much. It's been a useful session so far. I'm here from the UK government where, as some of you might know, we are bringing forward the online safety bill, which contains a number of measures with the overall aim to make the internet ‑‑ to make the UK the safest place in the world to be online.
It is similar to the Digital Services Act in many ways, but given that it's being developed in the UK for a UK audience, it has some slight differences. I want to touch on the online safety bill in response to the question we have here in terms of safeguards that can be put in place. The first thing I would say is that proportionality is key in everything we do in terms of content moderation and the actions that we're asking companies to undertake.
Throughout our online safety bill we want to make sure that there's no undue overreach by governments. We want to make sure that Freedom of Expression is boosted at every given opportunity.
We also have special protections for content of democratic importance and journalistic content, in recognition of the importance of this kind of content in civil and public policy discourse. So we are making sure that those pieces of content are properly protected and not being taken down for any particular reason.
I think there's a point around ensuring redress mechanisms: making sure that companies are accountable and users are able to challenge the decisions being made by companies, for example content moderation decisions. For example, if a user has a piece of content deleted, making sure they are able to petition that platform and have the decision reviewed as necessary.
Very quickly, a couple of points as well that came to mind. As has already been mentioned, the establishment of a truly independent regulator looking after this is one way to make sure that we continue to safeguard human rights and Freedom of Expression. And linked to that, making sure that when we do take actions which are going to restrict ‑‑ which are going to constitute a restriction of some sort ‑‑ for example, if a decision is made to block access to a certain website or service ‑‑ there's proper oversight through the courts and a proper redress mechanism there as well. Sorry, that was a long answer. I'll pass on to someone else now. Thank you.
>> CEDRIC WACHHOLZ: Thank you very much. Number two ‑‑ we have five people wanting to intervene on that question. Present yourselves briefly and stick to one or two minutes; we really want to hear you all. We also want to hear more women at one point. Now we will have a first round ‑‑ (Too low to hear).
>> PARTICIPANT: Thank you. I come from Civil Society; we're working on advocacy. My question is about the current tools for reporting misinformation, fake news, and other content on social media. When we look at the Global North, maybe these tools are used appropriately, but when we look at the Global South, some specific groups misuse these reporting tools and challenge freedom of speech in practice.
So how can we solve this challenge, especially the misuse of reporting tools to undermine freedom of speech? Thank you very much.
>> CEDRIC WACHHOLZ: Thank you. We take very good note of every intervention. We will respond on the online safety bill and on this intervention.
>> PARTICIPANT: I'm from Ethiopia, from the judiciary. It's a good presentation. I think everyone is raising hate speech and the harm it creates for society and for safety. I feel that ‑‑ I think we need to have an international instrument that will help us combat hate speech. I know that combatting and preventing hate speech is a local-context matter, but having an overarching instrument that would get countries to combat hate speech is very critical.
If you don't have such galvanizing conventions, like we have the child rights convention, then every country will take its own approach and ‑‑ even for Freedom of Expression, having conventions I think is very critical. Therefore, what is (?) thank you.
>> PARTICIPANT: I would also like to applaud you for this project. I'm with law, science and technology at the Technical University of Munich, where we also look into these questions. I really follow your approach concerning safeguards. I would like to flip the perspective a little bit and ask: with such a process-driven approach, how do you safeguard the interests of those most vulnerable, who don't have the capacity to oppose, or to use NGOs to express their needs and their questions? The European Union's DSA has a clause on participatory mechanisms, but it's very lofty. I think this is really about being proactive rather than reactive. If I may, a second short question: how do you turn this procedural approach into a safeguard? Because we looked at the transparency reports so far, and also at best practices, and we were shocked that actually there are very few. So it's not clear how to apply other forms of risk governance to this area. And the great danger is, if you don't have this rise to the top and have good processes for it ‑‑ have the companies review their processes, not only report on them, but also improve them ‑‑ it's hard to see how the situation can be improved in a meaningful time span.
So how do you make them improve their processes with your framework? That would be my second question.
>> CEDRIC WACHHOLZ: Thank you so much. We have one in the back. And the point about the most vulnerable ‑‑ thank you for the question.
>> PARTICIPANT: Hi. Thank you very much. My name is Raul. I'm representing ALAI, the Latin American Internet Association, from the private sector. I had the privilege to participate last year in the Windhoek+30 Declaration. I'm very proud of the work we did with that Declaration; I think it's a balanced Declaration. The Declaration also includes remarks on the positive impacts of the internet and its development on the exercise and protection of human rights, especially Freedom of Expression. I think that's in line with what the President of ICANN said this morning in the opening ceremony: not everything about the internet is bad. We have to highlight its positive aspects and contributions to humanity.
And the other thing that's important about Windhoek+30 is that the Declaration contains recommendations and asks for all the stakeholders: for governments, for private companies, and also for the community in general.
With regard to these discussions, we are very committed to the discussion, and we share the concerns that initiated this process. We co‑organized a consultation with Civil Society organisations last week in Latin America. And we received the first draft only ten days ago. So the first comment is that this process is very short up to February. I really expect that the expectation of UNESCO is not to adopt the document in February but to initiate a longer discussion.
In the meeting last week there were a lot of concerns from different parties and different stakeholder groups, including Civil Society.
One of the concerns is that there are no mentions of governments in this document, while in Windhoek+30 there are lots of requirements on governments.
It seems ‑‑ it's clear that regulation is the objective of the document and the objective of the process. My humble contribution is that the objective should probably not be the regulation itself but to protect the rights of the users, improve transparency, and many other things.
For that, we have to develop good practices in a multi‑stakeholder fashion, with commitments from different parties. And, of course, very probably there will be a need for some kind of regulation, but the regulation should not be the objective itself ‑‑ it should be one of the pieces to achieve the objectives that we are trying to achieve.
There are other concerns, and I will stop here because I have many others, but we have to remember that half of the world's population lives under governments that are not democratic, and a good part of the population lives under authoritarian governments. We have to be careful ‑‑ and I think this is a concern shared by many ‑‑ that the global guideline for regulation doesn't become a tool for empowering authoritarian governments to implement more restrictive policies on access in general and on the exercise of rights.
>> CEDRIC WACHHOLZ: Thank you so much for these points and questions. One was directly addressed to UNESCO: this February conference will be a consultation; it is not something where something would be adopted ‑‑ but I will leave that to the ADG. Let me now have a brief response from Eliska and also from Andrew on the questions raised; I will try my best to be brief.
>> ELISKA PIRKOVA: This was indeed the problem before we even started discussing the issue of platform governance or platform liability, and there are a number of, I would say, procedural justice safeguards that can be put in place in order to at least mitigate the level of abuse of notices. Some of them have already been mentioned. This is, by the way, connected to reporting illegal content, so we go a little bit beyond the scope of this panel today. For instance, the right to submit a counter notice, or a statement of reasons ‑‑ on what basis the platform acted, and what precise provisions in the terms of service were actually violated ‑‑ communicated in clear language to users, including those who initially shared that content, can help to mitigate the negative impact of abusive notices. Or simply providing a more precise definition of what is a valid notice, which is something we strongly advocated for in the already mentioned European Union Digital Services Act. We had the idea that there should be a specific system of notice for specific categories of content. There is also the system in Canada, the so‑called notice and notice plus, that you might look into, which also provides more safeguards on how to tackle abusive notices. I'm happy to discuss it further.
Briefly on the UK safety bill: I would like to underline that it specifically singles out the "legal but harmful" category of user-generated content. Yeah. As of today ‑‑ that's fantastic if that's the case. That would be strongly against the essence of the DSA and also against what the UN Special Rapporteur commented on the UK safety bill. I'm not sure if I remember all the comments now, but if you want me to stop here, I can. Thank you.
>> CEDRIC WACHHOLZ: Sorry, we don't have time ‑‑ Anriette will address the next question. She is the Chair of the IGF MAG. She led the Association for Progressive Communications for many years and is now its senior advisor. The question to you, Anriette, is: from your perspective, what provisions are needed to ensure the independence of regulatory systems? Is it unrealistic to think that an independent regulatory system can be established?
>> ANRIETTE ESTERHUYSEN: Thank you very much, Cedric. I also want to congratulate UNESCO for tackling this very difficult task. The short answer to that question is: yes, Cedric, I do think it's unrealistic. In the telecommunications and information sector, we've been struggling with regulatory independence since the 1990s. And what we've learned ‑‑ and Alison is better equipped to speak about this than myself ‑‑ is that independence of a regulator in practice requires much more than having it constituted as an independent regulator.
In fact, some regulators that are not independent might act in a more independent way ‑‑ when I say not independent, I mean that they're supported by a government. Some regulators are independent financially, but they don't act in an independent way. Regulators are constrained in developing countries. They fear litigation by operators. They fear political intervention by governments. We've also learned that often, even if the regulator is independent, if the political environment and the policy environment more broadly with regard to communications doesn't give them a clear mandate that allows them to act independently, they're not independent.
And the moment we start talking about content, we are talking about human rights. How can a regulator that operates as an independent regulator in a country where there isn't political (?) or systemic respect for human rights be expected to play a role in regulating content from a human rights perspective?
So there are so many reasons ‑‑ and while I really commend the idea of a regulatory system, I like it ‑‑ I think to expect regulators, whether they are converged or not... It's also not clear to me whether you want the emerging new regulators: many countries now have information regulators, as a consequence of data protection legislation and freedom of information legislation. They're struggling. They're struggling to be effective data protection regulators. They're struggling to enforce compliance with freedom of information legislation. And now we are going to expect them to play an effective pro-rights role vis-à-vis huge multinational platforms.
So I think it's tough, frankly. And I think the other thing here ‑‑ and this is really one of the difficult things to deal with, and I'll be quick, Cedric ‑‑ is that hate speech is spoken by people, not platforms. If we're going to introduce a regulatory system to combat harmful speech of any kind, it needs to include dealing with those who speak. I must say, in my country, South Africa, we have quite effectively used our Human Rights Commission ‑‑ our national human rights institution ‑‑ and the courts to prosecute people who are perpetrating hate speech online.
I think that has to be part of the equation. Then, when it comes to the amplification by platforms, I think we need to look at principles ‑‑ I think UNESCO is getting there. We need to look at how to target the business models and all the role players; that includes the advertisers that benefit from the amplification of extreme speech. It also involves collaboration with the regulatory institutions in the countries where those companies are registered.
So I think this is a very important project, but the idea of relying on 193 national regulatory institutions and expecting them to bring about what we want to achieve ‑‑ I think we might want to rethink that. Should we come up with the principles and the framework? Absolutely. We need to do that. But let's be very realistic about the feasibility of having that framework complied with at that distributed level.
Let's learn lessons from the 30 years of telecoms liberalisation.
>> CEDRIC WACHHOLZ: Thank you, Anriette, for your critical questions and approach. This is why we're having the consultations: because we want to build on your experiences, and we are also looking for your guidance on which kinds of provisions are needed to translate (Too low to hear). I will now go directly to Alison. We will have the speakers respond to the questions, and afterwards we will take the questions, and Andrew will address them, in terms of time (Too low to hear) for intervention. Alison, you are the Executive Director of (?) Africa. Thanks for joining us. You're a regulatory policy leader and specialist who has worked across 20 African countries.
So the question to you is really about the notion of information as a public good, and about protecting free expression and the right to information while dealing with damaging content. Is that notion of information as a public good easily understood? And what are the implications for regulation (Off Microphone).
>> ALISON GILLWALD: (Off Microphone) Thank you very much for that. I must confess I'm a former broadcast regulator and telecom regulator, so my approach might be a little more rigid, I think.
But going to the question about the public good ‑‑ the use of the term public good and of digital public goods ‑‑ before that, I wanted to respond to your five principles. And just to say I really, really feel there's a critical principle missing there, and that is access. I know you bring it up for specific communities, like the research communities, and for specific access to information. But really, to govern global digital public goods in terms of digital rights as distinct from the underlying infrastructure and citizens' access is to deny the majority of the world's population the right to information. We're simply going to continue to (?) the Internet, which currently they're excluded from. Even the vast majority of those online are not really able to optimize or benefit from the information that is there; as you pointed out, whatever it is (Off Microphone) content is in English.
So these are really important issues, and I think they go to the issue of regulating public goods. I think this is an important issue because it's coming up in different forms in different parts of the world, and it's used outside of that. The problem is obviously that public goods are not just goods that are good for the public. If we turn to the economic regulatory concept of a public good, it allows us to make certain demands and place certain obligations on the people who are publicly or privately delivering these public goods.
So I think it's very useful to return to it. Just to identify it ‑‑ and it's obvious to many people ‑‑ the classical public good was public broadcasting. That's because it's non‑rivalrous and non‑excludable: in the dissemination, in the transmission of those airwaves, nobody else's information or content or utility is reduced in the process.
Digital goods are very much like this ‑‑ non‑exclusionary and non‑rivalrous ‑‑ though many are made exclusionary by the private commercial value that is derived from them in production. They're fundamentally non‑rivalrous. From a regulatory point of view, if broadcasters or telecommunication operators were using public resources such as airwaves, they had to use those in a certain way, because they were public resources.
Likewise, the internet is a public resource. It's a public good. And if people are operating on it in a fashion that is essentially (Too low to hear), there's no reason why they should not be regulated in the public interest like these other goods. The fact that the private realm has expanded at the expense of the public realm over the last two or three decades doesn't mean that you shouldn't be pushing back. There is, of course, in this proposal an attempt to do so. But it's very much on the terms of the big operators and what they do.
Okay. It says: in terms of what you do, you need to do this better and you need better oversight. But it's not saying: this resource is now being allocated purely on a commercial, supply-side valuation; the people with the most money have market access and are now able to play the game on their terms, but they need to moderate their behavior. Whereas, if you look at the kind of public goods regulation we have had, then you would ask: what would a demand-side valuation be, one that recognizes the value of this resource as an input, whether into markets or as an input of information into democracies? This is a critical information conduit for democracy. Therefore, we are going to emphasize the demand-side valuation, and we're going to ensure, for example, in access to the internet, that we create a commons or data lakes, or that we create content and information that people can draw on and use.
While I think the framework makes a start at regulating the systems and the processes, I don't think it really goes far enough at all from a regulatory point of view ‑‑ and I'm talking about an enabling regulatory environment; I would be equally concerned about the censorship-type aspects. I'm looking at the enabling aspects of this, not the compliance aspects ‑‑ and the framework, as we pointed out, actually doesn't exist in many countries.
So the real challenge is that public goods regulation has traditionally happened ‑‑ and with digital goods we're operating ‑‑ at the national level. Those are the national reform models that we've got. The real challenge is how we extend that to global governance, because that's the requirement we have. That's the tricky part. I think that's where we have to not think in terms of the traditional realms between states and markets and citizens and multilateral bodies and that sort of thing. I think we need to think far more creatively about new forms of the public, and about new forms of delivery. We do have experience of private delivery of public goods ‑‑ it happened through regulation. So how, in this environment, which is unenforceable in a way except by neutral (Too low to hear), do we get specific outcomes at a global level? I think this is one of the tricky things. I think there are mechanisms we have to look at that don't only look at national regulation but look at different publics that are informally and formally regulated. A lot of the informal regulation that has happened around advocacy groups is a new form of public regulation and a new form of public (Too low to hear). It's wonderful to look beyond national regulators (Off Microphone) already facing challenges and problems of enforcement.
>> CEDRIC WACHHOLZ: Thank you so much, Alison. It's wonderful to hear all the inputs and questions and points, including the point that with a formal regulator we're not going far enough, but also the (Too low to hear). I'm going to ask Menno Cox to intervene online, please, on the third question. Menno, are you there? Can you hear us? I will introduce you first. You are a Head of Sector at the European Commission's DG CONNECT, where you cover the global aspects of digital services and platforms. Last month the European Commission adopted the act that we have had in our discussions already and whose implications will be felt worldwide.
Based on the experience of developing the DSA, do you see it as setting out a global policy, given how varied political systems around the world are?
>> MENNO COX: Thank you so much for giving me the floor. Can you hear me well?
>> CEDRIC WACHHOLZ: Thank you.
>> MENNO COX: Great. Yes. The question is also really important. I think that's reflected by the various comments we've already heard in this session today, specifically about the hazards in trying to set out such a global policy.
Let me start, though, by saying it absolutely makes sense in my view to promote regulatory excellence on digital services and the way we mitigate the societal harms they create.
Within the EU, as you already mentioned, the Digital Services Act has been law since the 16th of November, and we, as regulators, now focus on enforcing these rules. But they cover, of course, a 27-member-state bloc. This is already quite an effective way of getting platform companies to the table and developing the necessary regulatory expertise, and also, of course, of generating all of the benefits that you get from enforcing the rules, from engaging the community around platforms, and through the various transparency obligations that are included in the Digital Services Act.
In this sense I think it is important to stress that this 27-country set of rules will also benefit the globe directly, because the research that will be done under the Digital Services Act will be public and can inform about problems that we face around the world ‑‑ and we do indeed share a problem statement around the world. Content moderation in minority languages, for example, is also an issue within the EU. Under the Digital Services Act, the EU will also build up the auditing capabilities necessary to audit algorithmic recommender systems ‑‑ to name one of the many elements that will have to be audited but that are especially complex and require specific expertise ‑‑ and that expertise will be built up and available to the rest of the world.
So in this sense, again, the focus is on enforcement domestically, but with direct benefits beyond. I also believe that sharing the expertise that will be built up by the regulator will make sense. This is because we share the same context that's already been mentioned: the global platforms that are active around the world, with user bases in the billions, face issues that we all share. I do believe that the EU platform regulations ‑‑ the Digital Services Act but also the Digital Markets Act ‑‑ in that sense translate universal values. So this is not about selling a certain brand, be it the EU Digital Services Act or the Canadian or UK approach we heard about earlier, but about seeing how we can safeguard our universal values for citizens around the world.
Here, I think one of the key universal values we talk about is no longer having to face a tradeoff between Freedom of Expression, the right to access information, and the safety of our citizens. I believe the Digital Services Act does that in a way that also benefits citizens around the world, which is namely to use regulation as one important piece in a much broader puzzle to empower citizens ‑‑ to empower society. We will hopefully have societal content moderation instead of content moderation by any given actor, be it government, platforms, or other entities.
To mention it also: the DSA has obligations concerning trusted flaggers, auditors, and researchers. It addresses the whole ecosystem around platforms.
In a sense, the essence of the approach is to leverage that ecosystem. Coming to the question of a model regulatory framework for the world: what we perhaps should think about is what we're asking governments around the world to do. It was already mentioned whether it makes sense to ask all 193 governments to impose obligations on platforms, which is how the framework is currently phrased, or whether it is instead about sharing best practices and learning, and at the same time getting platforms to protect citizens better in all jurisdictions where they are active.
But perhaps one role that the framework could play is actually to explain that platform regulation can only be implemented if a minimum societal ecosystem is in place, and that the focus should be on building the required ecosystem of trusted flaggers, journalists, researchers, and also international cooperation structures, which are necessary for any regulation that focuses on processes and procedures and leverages this ecosystem to function. And we know from the 194 examples that were mentioned, tracked by the United Nations colleagues working for the High Commissioner, that there's little positive to say about any of those regulatory attempts being made around the world.
So perhaps it is really about looking first at what is already there, about sharing best practices, about supporting the building blocks that need to be in place, and at the same time focusing on the conversation with platforms ‑‑ as the regulatory framework also tries to do ‑‑ to make sure that the necessary protections and existing mitigation measures are extended to citizens around the world.
In the meantime, what is also very important ‑‑ I think it was mentioned as well already ‑‑ is that we keep a focus on research and possible effective mitigation tools, because we need to acknowledge that resources are limited, that the status quo is indeed a focus on English as the dominant language, that there are many jurisdictions in which global companies are active, and that definitely more has to be done. But we also now have an opportunity to think about the mitigation measures that we desire and that we believe safeguard human rights, including through the use of technology and of content moderators, whose well‑being also has to be taken into the equation. So very much a call for continuing the discussion and this good initiative, and for thinking carefully about who we are addressing and with what obligations or principles. Thank you.
>> CEDRIC WACHHOLZ: So many stimulating inputs, including the thought that there need not be a tradeoff between Freedom of Expression and safety. You also mentioned a lot of challenges, which I think ‑‑ now, if there's a woman who would like to take the floor, she will have priority in the short time frame we have. Otherwise, I will ask Andrew to come back to us. Otherwise, (Too low to hear).
>> PARTICIPANT: I work at the Centre for Communication Governance, which is a resource center based in Delhi. I just have one broad comment that I was curious about when looking at the model framework, which is that I'm not entirely sure how it interfaces with the current safe harbor obligations that platforms would have. What we're asking them to do, in a large sense, is to more intensively moderate content, which I think we can all agree is broadly necessary; that's not a contentious issue. But I think when we're proposing a regulatory framework that heavily impacts how platforms look at content ‑‑ and I understand that it's not meant to be like a directive; the regulator is not meant to have control over the content in this case either ‑‑ I think it's important to engage with what you're expecting platforms to do.
Because the safe harbor framework currently creates an environment that incentivizes compliance, to a certain extent, in order to maintain that status. When we're talking about safeguards, it's important to think about how we're enabling different points of view.
For example, when we're talking about the fact that we need to include counter narratives, we're asking platforms to pick which counter narratives to highlight. How do you come up with a system in which you identify credible flaggers and trusted partners? I think it would be important to engage with how we're expecting platforms to do what they're going to do and how that will intersect with the existing obligations on different platforms.
I'm also echoing the need to be realistic about regulator capacity. Again, we're expecting ‑‑ and I know this is where regulation generally is going overall ‑‑ we're expecting audits, et cetera. We should see how we can reduce that burden, share it across more stakeholders, and have more of a multi‑stakeholder process, so we're not expecting one authority to do all the heavy lifting. Thank you.
>> CEDRIC WACHHOLZ: Thank you for this intervention. There were a lot of additional questions to Andrew, but also important points. Andrew, can you ‑‑
>> ANDREW PUDDEPHATT: Sure. I can't pretend to answer all of the questions, although I've taken detailed notes and I will be reflecting on a number of them. Maybe three or four general points.
Firstly, UNESCO is not a naive organisation in its Secretariat. It understands the world in which it operates and understands the imperatives that are driving the conversation. There is a very substantial number of pieces of legislation around the world in the process of passing through Member States that seek to regulate content online. A number of Member States ask UNESCO: what should we do? How should we do this? We're concerned about some of the things happening on the platforms in our country ‑‑ some of the incitements to violence, whatever ‑‑ how do we deal with it? UNESCO is trying to say: if you intend to regulate, or you want to regulate content, these are the kinds of principles and the kinds of things you should think about when you're introducing legislation, to protect free expression and the right to information and to deal with that damaging content.
The framework or the guiding principles are not going to fix the internet ‑‑ there's no pretense this will fix the internet. This won't deal with all the issues around privacy and data protection, or access and the digital divide, or competition policy. It won't deal with a whole range of issues. The internet is a massive, global phenomenon that requires a whole series of different, hopefully coordinated, policy approaches to fix the different things that need fixing. This is just taking one piece of it, which is the way content online is being managed, or not being managed, at the moment on a number of significant social media and search platforms.
We want to be realistic about what we're trying to do; we're not trying to fix everything. There are things people have raised that won't be in scope. In response to Raul ‑‑ because I know this has come up before ‑‑ the next iteration of the paper will deal with the responsibilities of governments, mostly what governments shouldn't do in terms of intervening in the shaping and delivery of content. It will also recognize the wider societal responsibility.
I think, as someone else said, the internet is not just a set of technical tools but a set of human interactions. Human beings are responsible for the way behavior manifests on the internet. It's amplified and directed and shaped by algorithms, by search, and by a range of other factors, but you can't eliminate the responsibility of all of us as people for the way we behave online ‑‑ human behavior. That's something that will have to be acknowledged.
Finally, I certainly recognize that to achieve global consensus on content regulation in six months is not feasible or possible, and probably not desirable. What this process is, is a contribution to a debate that I imagine will roll on for many more years and will need to broaden out to many more constituencies. Hopefully it will be an informed contribution that will shape the debate at a national and global level on how we deal with content online and how people's human rights are respected, not only in terms of the ability to express ourselves but also in terms of the ability to operate in a safe environment.
>> CEDRIC WACHHOLZ: We're aware that there are a number of comments made online. We will certainly take them into account, because it's difficult to sum everything up now under time pressure. I think it was a very rich discussion. I invite now our Assistant Director General to close our meeting and speak on the way forward.
>> TAWFIK JELASSI: First of all, I would like to thank all the panelists, those online, and you, the audience, for such a rich exchange. The fact that you're still in the room 25 minutes after the scheduled end of this session shows the interest you have; thank you for your engagement. In my mind I have three questions: Why are we doing this? I think I addressed that. What to regulate? And how to make it happen?
I think many, many of your questions were on the what. Andrew was clear about what is outside the scope of this attempt to regulate. Now, how to make it happen. Thank you for your input as a broadcast and telecommunication regulator: the devil is in the details ‑‑ how to make it happen and how to make companies do it worldwide.
This is important for us going forward. This is version one that we presented to you; your inputs will help us enrich it. The next version will be published on the UNESCO website on December 9th. So stay tuned: if you're still interested, connect to the website and see that draft.
And there will be another revised draft after that because, of course, the consultation with experts and third parties will continue in the meantime. I realize the timeline is very short: we announced this last May and the conference is in February, so we didn't give ourselves even ten months to do this gigantic piece of work.
But, again, time is of the essence. Should it take three years ‑‑ what would happen in the meantime? At least we initiate the discussion. We try to start, to move the needle even in a modest way. I believe in saying start big ‑‑ sorry, think big, start small, scale up fast. We had a big vision. We had a big ambition. We are starting small, with what we can achieve in ten months or a bit less. Then maybe we can scale up fast ‑‑ maybe we move fast. Maybe, if our leadership agrees to it, we may consider a UNESCO recommendation on this matter. This is not on the table now; it is not. I'm saying there are many avenues: how can we allow ourselves more time? How can we have deeper consultations, regionally, thematically? How can we come up with something more solid and deeper? For that we need time, for sure. It's not our target for February for now. We'll do what we can, of course, time permitting.
Let me say also that we would like to have you join us for the conference, either online or in Paris at the end of February. Let me remind you of the address for contact: it's [email protected]. [email protected]. And coming back to our IGF forum: this morning there was a session on trust and security. The leading thought of our conference is that we want to build an Internet for Trust ‑‑ that we trust the information on the internet, that we trust the human interactions online that Andrew referred to. I think trust is the central theme here. This is maybe the hope shared with the IGF and some of the issues that we have been discussing here.
Again, thank you all for being here and for your input. We hope we can continue together with this exchange and debate. Thank you.