The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> MODERATOR: I don't think they can hear us so far.
>> DAVID WRIGHT: Yes, we can.
>> MODERATOR: We can start. Nice to have you all here on site and online as well. We are happy to have our workshop about online gender violence with you. And I don't really want to say that much. I will just hand over to David, who is our moderator, and he's with us online. So yeah, let's give it up for David.
>> DAVID WRIGHT: Thank you very much. It's great.
>> MODERATOR: We can't hear David yet.
>> DAVID WRIGHT: Sorry, can everyone online hear me? Okay, everyone ‑‑ thank you very much. It's just the people in the room who can't hear; I think they can't hear anyone else either.
>> MODERATOR: Please wait, we are still trying to get the tech so we can hear you.
>> DAVID WRIGHT: Okay.
If you can organize for the video, the recorded clip to be played from the USB stick, that would be helpful.
It seems they are still working out how to get the audio going. This is the first session after the opening.
>> MODERATOR: Just to give you on Zoom a bit of feedback as well: we still can't hear you, we are still figuring this out. Just wait a second and then we can start.
>> DAVID WRIGHT: For all of those on Zoom, we can have a bit of a Zoom party, ahead of everyone in the room.
>> AUDIENCE MEMBER: For some reason, I cannot rename myself. You see me as DRF Zoom.
>> DAVID WRIGHT: No problem. And Emmanuel, yes, you have the same thing as well. So that's appreciated.
We've got Cindy. Is Kirsten on the call as well?
>> SABRINA VORBAU: She sent an e‑mail this morning, I don't know if you've seen it, that she's ill, won't be able to join.
>> DAVID WRIGHT: Okay.
>> MODERATOR: We can hear you now. You can start right in.
>> DAVID WRIGHT: Kathrin, thank you very much. If I can just confirm as well: are you able, from the room, to play and share the recorded clip from Roberta Metsola?
>> KATHRIN MORASCH: No, we can't play it. It's on a flash drive that doesn't work. Maybe we can try a screen share; if someone can make me a host, I can try and do that.
>> DAVID WRIGHT: If someone can give Kathrin rights to share video, that would be helpful.
>> KATHRIN MORASCH: I'm going to try. Should we start with the video?
>> DAVID WRIGHT: Let me just do an introduction and then we'll come to you, and we'll cross the technical bridge when we get to it. Which will just be in a moment.
So I just wish to extend a warm welcome to everyone online and also to those of you in the room, to what is clearly an important subject.
So my name is David Wright, I am CEO of the UK Safer Internet Centre, and I'm really here representing the European Insafe network, alongside my colleagues Sabrina, Sophia and Deborah. Between us, we will be moderating this session online, with Kathrin moderating from the room as well. We have a stellar lineup for everyone. We will shortly hear from Roberta Metsola, as long as we can get the video clip shared. Then we will hear from Emmanuel, who has joined us online, and then we'll turn to Kathrin, who is in the room with you. Nighat from the Digital Rights Foundation and Cindy Southworth will then give their opening remarks. Each will have five minutes just to set the scene from their particular perspective before we launch into a couple of questions for the panel, which I have.
And then we will open it to everyone online, and indeed questions from the room as well.
That's the program and how we are aiming to run this particular session. So do please think up some of those questions; like I say, it's an amazing panel that we have on this particular subject. In terms of the introduction to the workshop: it is all about online gender-based violence, the good, the bad and the ugly.
Just by way of introduction for me, I'm David Wright, chief executive of a charity in the U.K. One of the things, and this is really my perspective here: in 2015 we launched what is the Revenge Porn Helpline in the U.K. In that context, we support adults who are victims of non-consensual intimate image abuse, and since it started we have disproportionately supported women as opposed to men. Just in terms of some of the numbers associated with that: since 2019 we have seen an escalation in terms of calls.
Cases in 2019 totaled some 1,700. That rose to over 3,200 in 2020, rising again to more than 4,400 in 2021. Last year, 75 percent of the cases we supported did indeed relate to women.
When we do support men, each case contains on average only a couple of images; conversely, when we support women, each case on average involves far more content, so there is a significant gender imbalance. The team responds by getting content taken down, so it is a very practical helpline. And although I would acknowledge that "revenge porn" is a terrible title, it is what people search for and how they find us, and that's why the title endures; non-consensual intimate image abuse is descriptive and precise around the particular subject.
We have gone through the removal of over 300,000 images too. And the threat to share images is equally a significant issue that we deal with from the U.K. perspective. We'll also hear about an amazing platform, which I'm sure Cindy is going to give us some insights into, which launched about a year ago and actually prevents people from sharing your intimate images online. More about that later on.
Again, it is great to see, and perhaps it's taken quite a while, but it is good to see the world awakening to this particular subject as well. Across the last year we have seen some good progress from a policy perspective. We'll hear shortly from Roberta Metsola around strategy and direction to do with policy across the European Union. We have also seen ‑‑ oh, dear, I've got a little bit of a poor connection; hopefully it's okay now. We saw the White House launch a task force around online gender-based violence in June.
We have seen the G7 governments have a real focus on this particular subject as well. So that brings us to the floor to hear from Roberta. Kathrin, if I can turn to you now: are you able to share the video clip, the recording that Roberta has shared with us? Thank you very much.
>> KATHRIN MORASCH: I think we can't hear the sound so far. I'll try my best to get this started. Okay.
>> ROBERTA METSOLA: ... cases of outright, sustained, aggressive online abuse. The more insidious process of name-calling and vilifying has come to characterize how many experience this.
>> KATHRIN MORASCH: I think we can hear her but not see her so far, and I don't know how we can fix this. Maybe you can just listen closely and only see the still picture, so we won't be delayed any more.
>> ROBERTA METSOLA: ... a real modern-day society problem. We cannot ignore either that women and girls are disproportionately victimized online. In 2016, a United Nations study found that gender-based violence is closely linked to power imbalances between women and men.
>> KATHRIN MORASCH: Maybe we can just switch off the microphone and I'll hold my microphone to my laptop, and then maybe it will work better. Is that somehow fine? No? Because we hear everything doubled.
>> DAVID WRIGHT: We can hear it fine online, Kathrin. If it's easier in the room, then please do so.
>> KATHRIN MORASCH: Okay, just start from the beginning again and then we will see what's happening.
>> ROBERTA METSOLA: ... has come to characterize how many experience discourse online. We cannot ignore that there is a real modern-day society problem. We cannot ignore either that women and girls are disproportionately victimized online. In 2016, a United Nations study found that gender-based violence is closely linked to power imbalances between women and men and harmful expressions of masculinities. The scale of the problem is not sufficiently discussed, and therefore victims find it very difficult to obtain support. Remember those stories of rape victims getting the blame because their skirt was too short, adding insult to injury with the suggestion that the rape was the victim's fault? Well, the same thing is happening online. To give you one example, when photos are leaked without consent, people say she shouldn't have gotten naked in front of the camera, putting the blame on the girl without addressing the real issue: a person receiving such private content has no right whatsoever to share it with others. And beyond insult and injury, what happens then is that victims start believing this false narrative. They blame themselves, and worse still, the negative backlash comes from both the victim's male and female peers.
We are seeing case after case of image-based sexual abuse across the EU, particularly among young people, who are tormented because their private photos are shared without their permission, leaving them little or no recourse.
This is unacceptable, this is when online gender violence becomes a much broader societal issue of discrimination and violation of human rights.
The European Parliament has always called for measures to counter violence against women and the LGBTQ community. Prevention, protection and prosecution: those are the three main elements to counter online gender violence. We have welcomed this year's European Commission proposal that will criminalize, among others, cyberviolence including the non-consensual sharing of intimate images, cyberstalking and cyber incitement to violence. We are voting on this important legislation in the coming weeks. Finally, let me express my gratitude to this IGF for the work you are doing to bring gender equality one step further. Thank you for having me.
>> DAVID WRIGHT: Kathrin, thank you very much for sharing that. Hopefully people got to see, or at least hear from, Roberta. I should have done a full introduction: Roberta Metsola is the President of the European Parliament, so she speaks with great authority, and it's wonderful to have her contribution to this workshop. Really, I think, it emphasizes the importance and priority being applied to this.
We are starting to see international organizations waking up to this particular subject as well. So we thank Roberta very much for that contribution and her opening remarks.
I'm going to turn now to Emmanuel. If I can give the floor to you for your opening remarks, again, sort of no more than five minutes, that would be wonderful. Emmanuel, the floor is yours.
>> EMMANUEL NIYIKORA: Thank you, David. Good morning, good afternoon, good evening, everyone.
>> KATHRIN MORASCH: Stop your screen sharing, David.
>> EMMANUEL NIYIKORA: Good morning, good afternoon, good evening, everyone, depending on where you're connecting from. Just to introduce myself, I'm Emmanuel Niyikora; I work on the technical side at the ITU office for West Africa in Dakar, Senegal. Thank you so much for having me. Distinguished ladies and gentlemen, all protocols observed: online gender-based violence has become a driver of gender inequality in internet access. As we all know, we are increasingly communicating online, and COVID-19 has further directed us towards the internet and the use of digital tools.
There's no doubt that the digital space is significant for women's and girls' empowerment. The digital space presents great opportunities for networking. Sadly, the space is not equally safe for women and men.
Unfortunately, the misuse of technology for violence has also become our reality.
In sheer numbers, women remain the primary target of gender-based violence. And digital technologies have actually simplified abusive behaviors such as stalking, by providing abusers with tools to access their targets.
Unlike physical violence, which requires people to be in the same place, technology-facilitated violence can happen across geographic locations, with the abuser able to reach their victims even when they are not in close physical proximity. This leads to a real question: is the internet a safe place for women? Because women are considerably more likely to be victims of repeated and severe forms of harmful actions online and with the help of technology.
Online gender-based violence against women takes forms that tend to silence women and girls and limit their freedom.
This, in turn, extends the gender inequality into all spaces, leaving women with few places to turn to for support virtually and in reality.
The inventor of the World Wide Web stated in 2020 that the web is not working for women and girls. According to statistics from the annual State of the World's Girls report, based on research conducted across 31 countries with over 14,000 girls and young women, 58 percent of girls have experienced online harassment, and 50 percent of girls said they had faced such harassment more than three times.
42 percent of the girls and young women who experienced online harassment reported feeling mental or emotional distress.
Of the girls who were harassed, 47 percent were threatened with physical or sexual violence.
Today the internet is used within a broader environment of systemic, structural inequalities, which frames women's and girls' use of the internet. To bridge the gender digital divide, all stakeholders need to address online gender-based violence according to their areas of responsibility and perspective. Combating online gender-based violence requires a cooperative approach; no single stakeholder can address it alone.
But despite these discouraging trends, there's more proof than ever that online violence against women and girls is preventable. Evidence shows that the single most important driver of policy change is a strong movement and activism to end violence against women and girls, making feminist mobilization essential in the face of backlash against anti-violence efforts.
At the ITU, the Equal Access Coalition, a group working to bridge the gender digital divide, recognizes that while wider internet access brings information, life-enhancing services and opportunities, online gender-based violence and safety concerns around the internet are important barriers to access to technologies and the internet.
For the international day for the elimination of violence against women, through the Equal Access Coalition we have developed a repository of useful resources to address women's and girls' safety concerns and bridge the gender divide. Those can be found online.
Secondly, to contribute to the creation of a society that does not tolerate violence against women, the ITU launched a Child Online Protection initiative back in November as a multistakeholder effort within the Global Cybersecurity Agenda framework. This initiative brings together partners from across all communities to promote child safety in the online world. As part of this initiative, in 2020 the ITU published a set of Child Online Protection Guidelines for four groups: children, parents and educators, industry, and policy makers. In September this year, 2022, in partnership with the national cybersecurity authority of the Kingdom of Saudi Arabia, we launched a set of online courses on child online protection for the following target groups: ages 13 to 18 and 9 to 12. We have also conducted, with partners, face-to-face training on cybersecurity and online safety for youth aged 15 to 25.
We all know about the international efforts to combat online gender-based violence, such as the Convention on Cybercrime and the Istanbul Convention.
Just to end my remarks: at the ITU we take this issue seriously. We have efforts such as the already mentioned repository of resources that can be used to protect against online gender-based violence, we have the Child Online Protection initiative and guidelines, and different trainings and resources to build capacity have been developed. So thank you so much for having me, and back to you, David.
>> DAVID WRIGHT: Emmanuel, thank you very much for that, and for the great work the ITU does in this space too. It's wonderful that we can rightly give you the opportunity to share that amazing work, including the child online protection guidelines and the 2020 revisions that we saw.
So I'm now going to Kathrin; we'll hand it over to you, actually in the room at the IGF, for your opening remarks. I should add, for everyone's benefit, that Kathrin joins us representing the Youth IGF too, but Kathrin, introduce yourself; the floor is indeed yours.
>> KATHRIN MORASCH: Sure, I can do that. You all heard me talking a lot before, but now it's on the subject. David already introduced me; I'm Kathrin, I'm from Germany, and I've been a youth ambassador for some years now, I guess almost ten years, so I'm still working with a lot of great young people, kind of in a mentor role or something.
Yeah, I'm really happy that we can do this workshop and that you are really interested in this theme. It's something which is really annoying to me, because I see online gender-based violence a lot, and it's really frustrating that girls especially are being targeted online just for being young and female. I mean, we have all those gender injustices all around the world, in every part of our lives, but it's especially visible online. And it's true that the discrimination and hate speech is really pushing a lot of young women and girls out of internet spaces, and I guess that's something we want to tackle.
That's why I'm really looking forward to hearing your ideas later on how to do that, and how you create movements and resistance, to show all the people who are harassing young girls online that we won't accept this development. So I'm really interested in seeing and hearing from you. Thanks. Back to David, I guess.
>> DAVID WRIGHT: Kathrin, thank you. Thank you very much.
And so I'm now going to pass the floor over to Nighat. If you're able to make your contribution, Nighat, please also introduce yourself and the extraordinary work that you do. The floor is yours.
>> NIGHAT DAD: Thank you so much, David. I hope everyone can hear me; if my connection is patchy, please know I am speaking from Pakistan, where all of us get unequal access to good internet. But anyway, thanks for organizing this panel, David. This topic has always been very close to my heart; I have been working on online gender-based violence for the last decade. A lot has been said about why this issue is so important, why it's problematic, and how young women and girls around the world have been facing it. It's probably very complicated and tricky in conservative societies: women are already facing a lot of patriarchy in offline spaces, and the same patriarchy has translated into the online space.
So I'll be talking about some of the solutions and how we have dealt with the problem over the years, and we can talk about challenges as well, but for me it's very important to see, looking into different contexts, how we can address this challenge in our own jurisdictions. So in Pakistan, I basically run the Digital Rights Foundation, an organization working on digital rights, and one of our primary areas of work is addressing online gender-based violence. While interacting with young women and girls during different training sessions around online safety, we at the Digital Rights Foundation realized the helplessness that women felt while enduring online harassment, and what made it even more heart-breaking for me and my team was that they felt they had to go through it alone, without any support from their friends and family, because of the taboo attached to having an online presence and especially to sharing your pictures online or reclaiming your agency online.
So we, as an organization, identified that the major problem is the lack of awareness about the remedies and resources available to them, and about what legal solutions they have. We were so burned out listening to them and trying to find those solutions that we decided to take matters into our own hands back in 2016, took the initiative, and established the Cyber Harassment Helpline, which was Pakistan's first, and the region's first, dedicated cyber harassment helpline.
In fact, at the end of this month, basically tomorrow, we will have completed six years at the helpline. In that time frame, in the last six years, we have received more than 13,000 cases from all over Pakistan, and from other countries as well, because the helpline is not only available by phone; we are available through social media and our e-mail as well.
So on average 68 to 70 percent of the cases we receive are from women, while one percent are from trans folks. That's around 9,000 women who have approached the helpline so far. But because most people find out about the helpline through word of mouth, and there's still a stigma attached to talking about occupying online spaces in our part of the region, the actual number of women who face abuse and harassment online in this country is much bigger.
So what are we actually trying to achieve through the helpline? To make women and girls safer online, and to make them more confident and educated about how to protect themselves online: what preventive measures they can take, and what their rights are when those rights are breached in online spaces.
We do this by providing digital security knowledge through trainings, legal advice and mental health support, because when a lot of women face harassment, it takes a toll on their mental health and they usually do not talk about it.
Our aim is to equip them with sufficient knowledge to make an informed decision and, again, through the helpline, we are trying to put into action, where possible, an answer to the question that has been posed to us: how can we collectively fulfil our responsibility to ensure the equal participation of women and girls in online spaces, safely and securely, where their safety is paramount?
Of course, we cannot do this alone, so the important word is "collectively": as we are discussing here, multiple stakeholders can take action, civil society, academia and others, so definitely all stakeholders need to be fully involved and committed to this cause. Civil society helplines cannot do this alone.
So it is important for the tech companies, and I'm glad that Meta is represented here, to talk about the initiatives they have taken, but it's also important for them to take responsibility and keep in mind that what is online violence and abuse for women and girls in one part of the world may not be considered as such in other parts of the world.
Working with several companies, with whom we have escalation channels, they now do understand that there are cultural and societal nuances to consider while developing policies and deciding enforcement standards.
Talking from experience, this is a problem area we are dealing with right now at the helpline: certain images uploaded to the internet with the intention of blackmailing or threatening women in Pakistan do not necessarily fit the definition of intimate images as outlined by companies in the U.S., and the same applies to definitions from Europe or western democracies.
So I think this is something we can also unpack in our conversation later, but I really would like to talk about initiatives like the Revenge Porn Helpline: we are not only working in Pakistan, but, in collaboration with these initiatives, we are trying to support people reaching out to us from those jurisdictions.
The parent company of Pornhub, for instance, has used hashing technology to prevent intimate image abuse and remove videos from their platform. There are initiatives happening and solutions that people are using according to their own context, but I think this is the larger problem, and we need to collectively see how we can address it globally but also contextually in our own jurisdictions. Over to you, David.
>> DAVID WRIGHT: Thank you ever so much for those very powerful words, and so much respect for the work that you and the team do there, and have done, as you say, for six years; 13,000 cases is extraordinary work. We benefit from your insight and your leadership in this context too.
As we move on to the next contribution, we have the opportunity of hearing from Cindy from Meta, in terms of the work Meta does, but also the extraordinary work Cindy has done over the course of her career.
Cindy, if I can hand it over to you: it is early in the U.S., so thank you for getting up so early to join us.
>> CINDY SOUTHWORTH: It's always great to share a panel with you; I miss being in person with fellow activists. I come to this work as the Head of Women's Safety at Meta, but I spent 30 years as an advocate and activist working to end gender-based violence in local shelters, running helplines and working at the local, state, national and international level. So I speak to you as a fellow activist who is now working inside a company; I believe we should have activists everywhere, they're not limited to one space.
And what I want to do just briefly is anchor the work that I do broadly and then talk a little bit about the work we are doing that got mentioned earlier; David is a little bit stuck as the facilitator, so it's hard for him to talk about StopNCII as deeply as he wants to.
In my role on the safety policy team at Meta, I address safety in a myriad of ways, and I come to this having served on the safety advisory board of Facebook, now Meta, for ten years before I joined the company. So I saw firsthand some of the work that was happening, and I wanted to become part of it and continue it. It's fun to actually work on projects that I advised on and am now responsible for. The way we address safety and work to end gender-based violence online and offline is a comprehensive approach. We do it through partnerships with 850 safety organizations around the world, activists and NGOs; half of those are women's safety partners. We do it through policies: we are constantly striving to figure out the right balance to make sure we have one global set of policies while taking into account the cultural and contextual nuance that may differ slightly country to country. Those are our community standards, or policies, and we work heavily on them. Then there are our tools, and that's two different kinds of tools, actually three really. One is user controls, where one of the things we find is that what I may find offensive and what someone else may find offensive can differ based on context. I was talking to a Paralympian swimmer, and she said one of the things people use to attack her is telling her she should be in the kitchen instead of being this incredible athlete. They use kitchen emojis in her comments and on her Instagram page.
Obviously, as a company, as Meta, we can't ban all mentions of kitchens and all uses of food emojis; otherwise every time I post about something I've made or baked, it would be problematic.
But with a keyword filter, somebody who never wants to see mentions of kitchens or food, or food emojis, can hide them. Those tools can be used in really creative ways, because we have creative perpetrators who carry out gender-based violence in really unfortunate ways and get around a lot of our tools.
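[As an aid to readers, here is a minimal sketch of how a user-configured keyword and emoji filter could work in principle. It is purely illustrative and is not Meta's implementation; all names and values in it are invented for the example.]

    # Minimal sketch of a user-configured keyword/emoji filter.
    # Purely illustrative; NOT Meta's actual implementation.

    def is_hidden(comment: str, blocked_terms: set) -> bool:
        """Return True if the comment contains any term the user chose to hide."""
        lowered = comment.lower()
        return any(term.lower() in lowered for term in blocked_terms)

    # Example: a user who never wants kitchen references or cooking emojis.
    blocked = {"kitchen", "🍳"}

    comments = [
        "Amazing swim today!",
        "Get back to the kitchen 🍳",
    ]

    visible = [c for c in comments if not is_hidden(c, blocked)]
    print(visible)  # ['Amazing swim today!']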
Then there's machine learning: we are trying to get better at identifying hate speech in all languages and taking it down before anyone sees it or reports it to us. We report on those metrics in our Community Standards Enforcement Report, and you'll see over time that our handling of harassment is improving and hate speech is improving; our proactive detection is getting better.
In my ideal world, we would never have to wait for someone to report things; obviously, contextually offensive content is harder for us to catch because we don't know all the background. So those are two different types of tools.
The third one I mentioned is the one I'll talk about briefly: tools we create in partnership with NGOs, such as StopNCII.
I'll talk about that in just a moment. The sort of fourth bucket is resources: we have created a lot of safety guides with NGOs around the world. Having been a victim advocate, I know it's hard when you're in crisis to navigate these resources; it's important to have them, but it's hard to know which one to use and when. We are trying to make them more intuitive, surface them at the right point, and do a lot of work with LGBTQ partners, women's safety partners and those types of groups to make them really intuitive and easy to find.
Lastly, our feedback loop. It's groups and conversations like this one today: we recently concluded a series of women's safety round tables around the world where we talked to over 350 women's safety organizations to hear what they like about what we are doing and where they want us to keep striving, tools they'd like to see us implement, and policy changes they'd like to see us consider.
So that feedback loop is vital for us to keep the conversation going. And with that, I briefly wanted to mention StopNCII. You can report on all the platforms; pretty much all the tech companies let you report an intimate image, a video or a photo, and have it removed. But one of the things we have heard over and over from NGOs, nonprofits and survivors is that this doesn't help them rest easy: they're worried they'll lose their job, they're worried their family will see the image, they are really terrified.
So years ago, Meta worked with nonprofits, myself included when I was in the nonprofit sector, to create a preventive approach: a digital fingerprint, a series of letters and numbers, used to prevent an intimate image from being posted on, at the time, Facebook and Instagram. It was completely run by us, by Facebook, and the limitation was that it was really platform dependent. So in December 2021, we partnered with the South West Grid for Learning, with David and his colleagues, and launched StopNCII, working with NGOs around the world, including the Digital Rights Foundation, to make sure that eventually other tech companies would be willing to sign on and these hashes could be used by other companies to prevent these intimate images from being used to harm people. It's a really exciting adventure, and we are hoping to see more and more survivors take advantage of this tool to prevent intimate images from being weaponized. I'm thrilled to join you today. I'll turn it back over to David.
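[For readers unfamiliar with hash-based matching, the sketch below illustrates the general idea described above: the image itself never leaves the person's device, only a fingerprint (hash) is shared, and participating platforms compare the hashes of uploads against the submitted list. This is a simplified, assumption-laden illustration, not the actual StopNCII pipeline; real systems use perceptual hashing so that near-duplicates also match, whereas the cryptographic hash used here only catches exact copies.]

    # Simplified sketch of hash-based matching; NOT the actual StopNCII pipeline.
    # A cryptographic hash is used here for clarity, so only exact copies match;
    # production systems use perceptual hashes to catch near-duplicates too.
    import hashlib

    def fingerprint(image_bytes: bytes) -> str:
        """Produce a fixed-length fingerprint; the image itself is never shared."""
        return hashlib.sha256(image_bytes).hexdigest()

    # On the survivor's device: hash the image locally and submit only the hash.
    submitted_hashes = {fingerprint(b"<private image bytes>")}

    # On a participating platform: hash each upload and block known fingerprints.
    def should_block(upload_bytes: bytes) -> bool:
        return fingerprint(upload_bytes) in submitted_hashes

    print(should_block(b"<private image bytes>"))  # True: exact copy is blocked
    print(should_block(b"<another image>"))        # False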
>> DAVID WRIGHT: Opening remarks all complete. Like I said at the outset, I'll turn back to the panel now. There are two questions, but I'm going to append one, just to keep the panel on its toes and see how you react to a third question as well. Then we will open the floor, both physically and virtually. So I want to hear questions, queries and challenges from all of you in the room there, but also virtually on Zoom. Kathrin, if you're able, when the time comes, to collect questions from the room, and equally, Deborah, for any questions anyone has online, please do pop those in the chat box and we will get to them. Certainly make a note in the room, and if you can pop the questions you want to ask into the chat as they occur to you, we will get to them in the allotted slot. But I do just have a couple of questions that I'm going to, like I say, put to the panel, really in no particular order and not aimed at anyone in particular. The first question, to which I'd like responses from the panel:
Women and girls have a right to participate online equally and without fear of abuse or harassment. We all have a collective responsibility to resolve this. How do we go about achieving it?
Who wants to respond first to that question?
>> KATHRIN MORASCH: Maybe I can start. I don't know how to raise the hand.
So I think for me there are kind of three pillars through which this could work. As a lawyer, the first thing is always law enforcement, you know: getting the public prosecutors involved and having the criminal side actually enforced, and I think that's something which is working quite well. But I also think we have to prevent this from the beginning, which is something Cindy already talked about. The platforms can help so much by moderating the content, and a lot is in their power when we talk about law enforcement, giving law enforcement the names of the people and the IP addresses and so on; they have to work together, but also on prevention.
The third thing is, I guess, that we have to empower women and girls. Maybe a short question to the people in the room, since I can see the room: who of you has come into contact with this, for example on Instagram or TikTok or wherever, this movement of the perfect woman or the perfect girl?
Someone? No? Okay. So maybe it's a thing just in Portugal and Germany, I don't know.
Okay, I can quickly explain. There are some young women and girls talking about their lives, and what they are showing is how perfect they are in their looks and in what they do in a day: they get up really early in the morning, do yoga, and do all this great stuff throughout the day. I mean, it's great to do that, so don't get me wrong, but I think this tends to make a lot of young girls especially think they have to be like that. A lot of what is presented there is the way that we as young women think we have to be, especially to appeal to men and boys, and I think that's also a really big problem. Then you get all this negative feedback and all the hate and the violence online; those two things go together, and that's a really big problem. So I guess the third thing is to empower all the young women and girls to not give a shit about men and that stuff online, I guess.
>> DAVID WRIGHT: Thank you very much, Kathrin, those three pillars, as you say, are really important.
I know I've been compiling, I say compiling, I've started to draw together some of the various activities that go on across the world, particularly looking at violence against women, in terms of adults. There's quite rightly a lot of protection primarily focused on children, but there's often little provision that affords women the same sorts of entitlements and rights. So those three pillars: law enforcement, which will be my third question, Kathrin, when I get to it, as well as the role of industry, and empowerment and the pressures people feel to conform online, as you rightly describe there.
Does anyone else wish to respond to that particular question? Again, about how we achieve the right to participate equally online without fear of abuse or harassment.
>> NIGHAT DAD: David, I would like to respond to that very quickly. I think over the years we have realized that since it's a complicated problem, the solution has to be multi-fold; there cannot be one single solution to this problem. We really need to encourage local solutions addressing different contexts, but also global solutions at a bigger scale.
So pushing companies, for instance, setting the global agenda around this, governments coming together and addressing it: there's one coalition that has been formed by the U.S., Canada and some other countries addressing online gender-based violence. I think governments, tech companies and different stakeholders are realizing this is an important issue, and I think it's the great work of organizations, civil society, activists and helplines like ourselves, the many people who have been raising this, that has made it a mainstream issue.
Political will is something that is so important. I have seen people talking about this locally, and governments are talking about this all the time, but how many resources they put into this issue is a major question.
Yes, we need law, but at the same time we have seen laws being interpreted in the wrong manner. Making a law is not the only solution; what matters is how you actually implement that law and enable the lawyers and judges to understand how to interpret and enforce it properly.
I think that's a really important thing that we need to focus on: how much countries and Member States, because this is the U.N. IGF, are willing to invest in this issue is a major question for me. Capacity building of judges is very important; the law is there, but then they don't know how to interpret it in a progressive manner. We have living examples of such laws enacted with the aim of protecting young women and girls in the online space, but the same laws have been misused and weaponized against people, against activists and journalists.
We need to be very careful when we are looking into solutions: are we looking at both sides, at how these can be weaponized, and at how much willingness there is from the stakeholders?
I'll respond to your other questions as well later. This political will for me is the most important thing at the moment.
>> CINDY SOUTHWORTH: Can I jump in on that? As someone who ran a nonprofit for 30 years, I just want to say to anyone from a government: if you are speaking about how important these issues are and you are not prioritizing the funding of civil society to address online gender-based violence, you are failing. I desperately worried about how I was going to keep the projects I was running going and keep the incredible staff who worked at the National Network to End Domestic Violence employed to keep doing that work. I know the people on this panel joining us today struggle with how to keep the lights on and keep doing this work. At Meta, we fund NGOs, but governments have to step up and fund them too. You have to fund civil society; civil society does not function if there are not funds coming in to keep them alive and keep them going.
And so I call on tech companies to keep funding them, and I call on foundations and governments to step up. It takes all of us to make this happen. I see a lot of exciting movements right now of online alliances coming together and talking about how we need data. There is phenomenal data, so much research over the years, so much really good, solid research; we know what happens around online and in-person gender-based violence.
So instead of spending that time and money on research, let's spend that time on funding civil society who have proven evidence‑based solutions that just need funding to continue.
And then, if you're okay with it, I have a question in the chat I could pivot to that's on the same note, around sexism and misogyny. Is that okay, David?
>> DAVID WRIGHT: Yes, please, Cindy.
>> CINDY SOUTHWORTH: I couldn't answer it in the chat; it's hard to take down the patriarchy in the chat box, and I'm not that skillful at 6:00 in the morning Eastern time. The question is: what are we doing as a company to deal with sexism and misogyny on the platform?
One of the things that really excited me about the opportunity to join Meta in my role, in addition to all the tools and levers and policies I talked about at the beginning, was how we can use our platform to drive major social change. One of the reasons violence against women and all of the isms exist in the world is that they are allowed; they continue and are perpetuated because they are not interrupted. One of the ways you change social norms is to interrupt them, stop them, and cause people to have an aha moment.
One of the things I'm excited about: we did internal research and saw that when we pop up a message to people who are writing something offensive, saying that what you are typing may be considered offensive and may cause harm, 50 percent of the time when people saw that message they either edited their comment or deleted it.
And that may not seem snazzy from a social change standpoint, but that's interrupting behavior and causing people to stand back, take a moment, take a breath, think, and change their behavior.
My internal social justice warrior is like, okay, we are doing something here, we are causing people to change their behavior, for me that gets me excited, I want to do more of those types of things.
>> DAVID WRIGHT: Cindy, thank you very much; hopefully that's a great response to the question as well. It does actually make me think, and I'm going to go off on a tangent just for a moment. As you say, Cindy, there is a lot of research in this particular area, and it makes me think of a review in the U.K., where the schools inspectorate was commissioned by the government to undertake a review of sexual abuse and sexual harassment in schools.
It followed the publication of Everyone's Invited, where people, particularly children, post anonymous testimonies citing the abuse and harassment they experience, often citing the school or university where it took place. In its two years of operation, it has hosted around 50,000 testimonies describing the abuse or harassment that children, particularly girls, experience, and quite a lot of it online.
That's what the review concluded too. Based on a survey of a thousand children in the U.K., it concluded that children find sexual abuse and sexual harassment, quote, commonplace and normal: 90 percent of girls routinely receive unwanted sexual images, and 74 percent of girls report receiving requests for intimate images.
It is so normal and common that they don't do anything about it; they endure it. It's a cultural issue that we face.
So I think that encapsulates some of the points, Cindy, that you make too: it's about how we change that culture. Emmanuel, if I can come to you.
>> EMMANUEL NIYIKORA: Thank you so much, David. I would also like to contribute to this very complex topic. From the perspective of Africa, online gender-based violence comes on top of the already existing gender-based violence that we see around Africa, which is often not addressed at the moment. Thinking of how this can be done: of course, international organizations and other stakeholders try to do as much as we can, but I would like to stress, as has been mentioned, building the capacity of girls and women to protect themselves. We know there are also cultural issues in our societies, mostly in Africa, where girls and women are already experiencing other forms of gender-based violence that are not being addressed, and then we add the complexity of the online dimension, and we don't see much being done in terms of capacity.
Governments have limited capacity in terms of technology to protect against online gender-based violence; the capacity isn't there.
I think what we are doing is coming up with the guidelines, but also training; I stress training the girls and the women, stressing the importance of privacy and safety online. Young women and girls must be educated in how to protect their privacy and ensure that they develop safety protocols to protect their information and their account access.
They should be capacitated to understand cyber hygiene, a set of practices that organizations and individuals perform to maintain the integrity and safety of users, so that they have the capacity to protect themselves. Then there is the issue of reporting: we know the capacity of some of our governments is limited, and they cannot go after the violence if no one reports it, because a report is needed before anything can be done. So the girls need to be trained, and there should also be some culture change; they need to understand that they need to report these issues so that at least something can be done.
Governments doing enforcement in the African context have so much to deal with, so if no report comes in it's very hard for them to go after those abusers operating online; not because they don't want to, but because they have so much they are dealing with. That brings me again to the aspect of governments establishing and setting up the legal bodies and organizations that will ensure these issues are addressed.
So there's not really one actor that can address this complex issue, especially in the context of Africa, where we are seeing a lot happening and governments have so many priorities. Yes, we talk about it in different conferences and insist on really giving this issue priority, because especially after COVID much has moved online; studying, learning and work are being done online, and that's where the violence against women and girls is taking place.
So there's no one solution for this, but I think we need to emphasize all stakeholders contributing: doing work in terms of capacity building, working with governments to put in place policies and implement the guidelines, and working with countries and other partners together to bring this challenge to the table for discussion. Because in the context of Africa this is, I can say, a new trend; we already have a lot of gender-based violence happening in our societies in the physical world that goes unaddressed and unattended because of, again, cultural perspectives and the capacities of some of our governments, and a lot of intersecting issues that leave even physical gender-based violence unaddressed.
Now we are adding online gender-based violence on top of that, and they don't have the capacity, so working with civil society and other competent organizations is, I think, very important.
But I think steps are being taken, and it matters that we are discussing the issue, because it also affects women from underdeveloped countries and communities who are targets of this gender violence and don't even have the capacity to report it, which is a challenging aspect.
Thank you for bringing this topic onto the agenda of the IGF. I think it's very important to talk about it, and organizations such as the ITU are taking this seriously: so much is being done on protecting children by training educators and parents, and also training focused on girls and young women so that they are capacitated to know how to protect themselves, but also play a part, because we also need to support our governments to address these issues. Thank you so much. It takes a collective approach to go against this very serious online gender-based violence.
>> DAVID WRIGHT: Thank you very much for that; a really important reminder.
We are a few minutes over, and I'm keen to open the floor to give everyone the opportunity: particularly, Kathrin, those in the room, and also, Deborah, any more questions online that we have too. Perhaps, Kathrin, if I turn to you first: are there any particular questions, challenges or comments from the floor in the room there?
>> KATHRIN MORASCH: Just give me a sign if you want to say something.
Oh, okay, a lot of people. Maybe we start at the front. Elizabeth.
>> ELIZABETH: I hope you can hear me; this is Elizabeth Choa. I'm coming from the point of view of a research-oriented nonprofit. I want to come back to the point that there's enough research available, that we don't need more and don't need to open up potentially more technical data. I agree that we probably know the extent of the problem, but data on the extent of the problem will only get us so far in terms of awareness raising, which is important, I agree. When we think about solutions, also from a technical point of view, I think civil society organizations are very interested in co-developing technical solutions, or at least frameworks, which then really rely on being able to access more in-depth research data and platform data.
Yeah, it's less of a question and more of a statement; I hope we can have a discussion on that.
>> DAVID WRIGHT: Thank you so much. Any particular comments or observations?
Any response? Is there anyone from the floor who wishes to respond to that one as well?
>> KATHRIN MORASCH: I don't see anyone here who wants to respond; maybe someone from the panel.
Okay, yeah, right next to Elizabeth.
>> AUDIENCE MEMBER: I hope everyone can hear me. So I listened to you, and I'm looking at it from another perspective: online gender-based violence from the perspective of artificial intelligence. I belong to the school of thought that AI can actually fight these things, so I believe it has the potential to tackle this issue.
We most likely need more data to train models that can adequately detect this violence online. Looking at it as a tool, that's just what I think about developing the models for this, thank you.
>> DAVID WRIGHT: So, the need for more data to train some of the emerging AI models that might be able to identify, and potentially prevent, this as well. A good point, good contribution.
Any other particular questions or comments?
>> KATHRIN MORASCH: Yes, there were more raised hands in the room. Does someone want to respond to that, or shall we go on with the next question? Okay. I saw there were some raised hands in the room; maybe you can just raise your hand quickly again. Yeah, there.
>> AUDIENCE MEMBER: Good afternoon, everyone, my name is Kito and I'm from an organization called Research ICT Africa. Thank you for the conversation; I especially appreciated Emmanuel's comments about our patriarchal society and the very misogynistic society that we live in. But I was also very concerned about the language that we use, and I think that's a very important part of proactive policy making. Throughout this whole conversation we have been talking about women and girls, but the problem is the perpetrators of gender-based violence, and they are predominantly men. When we talk about gender-based violence, whether online or offline, we almost normalize men as perpetrators by not speaking about those stats.
I know it seems harsh; I live in Namibia, and it's easier to say let's empower women. Maybe we should capacitate men not to be violent, because by normalizing male violence we further this problem, so that we women as well think it's normal and there's nothing we can do about it. It's not normal. Kathrin, I have to disagree a bit, unfortunately, about the perfect woman point; I think we should leave women alone. Maybe it's not good that they want to project this image, but for the women who are trying to, I think this is how they survive under a patriarchal, sexist, misogynistic society: they're trying to protect themselves from the people who abuse them. Saying we should empower women and tell them don't be a perfect woman is victim blaming in a way; it's saying you're getting those comments because you're presenting this image.
So, yeah, I just want to say the language we use around violence and perpetrators should change, and we should be talking about where this comes from; I think that will inform the kinds of tools we use to respond to violence. Thank you.
>> KATHRIN MORASCH: You made an interesting point about how we victimize young girls and women. I think that's something that also happens in the offline world: when a woman gets raped, people ask what she was wearing, was it a really short skirt, it's her fault. The same is happening online when people say, yeah, she just shouldn't have sent those pictures, then he couldn't have posted them.
I think it would be interesting if someone from the room or online has ideas on how to address this to men, as you just said: that we don't only work with women and empower them, but also address the perpetrators, or the potential perpetrators, and what they could do differently in the future.
So are there any programs so far? Does anyone of you know something about this? Because I only know that in Germany we are always trying, you know, once people go through law enforcement and it goes to court, to bring them into projects afterwards so they don't start this again.
I think it would be interesting to know more about what can be done before this happens; if there are any such projects, I'm not familiar with any.
>> DAVID WRIGHT: Kathrin, Nighat has a hand up, I presume in response to the comment that was just made.
>> NIGHAT DAD: Yeah, just adding to what they said, I think the language is really a problem. When we do trainings in Pakistan, there's always this emphasis on telling women how to secure themselves online, and in all honesty, women are exhausted; it's a lot of labor on their part, and on our part, to tell them how to be safe online, when the constitution and the democracy and the society should actually be safer for everyone. So why does one gender have to make an extra effort to be safer online than offline? But at the same time, unfortunately, they take on this burden, and we have to take on the burden of telling them, because the environment is really not enabling for them, especially in closed societies, so we tell them what things they can do. I definitely agree: we really need to look into the language we use and also the burden we are putting on women, that while they are being traumatized and facing violence in online and offline spaces, at the same time the burden is on them to be safer, to learn all the preventive measures, and to address the abuse they face. At the same time, I would just like to share a little example: when I started working, I only did trainings with women, because it felt safer to speak with them; and for us, starting this conversation around online violence, where talking about your intimate pictures being leaked or weaponized against you is still a taboo, there is a stigma.
And they were unable to share their stories, even with a group of women. So including men in that space, which they felt was really not safe, where potentially the perpetrator of violence was present, meant they were not able to share their stories. I think it's important to create those spaces for them, to safely share their stories and what they face online. Now, after 12 years, I think we have spaces where men and women both take part, and women are talking about who they are facing this violence from. It's coming from everywhere, but majorly from men. That's the truth. We cannot really deny it.
Then there are men who really are allies and who are there for us. You really need honest allies who champion the cause without hijacking it. It's a tricky subject, but at the same time, if you come as an ally to support us, the narrative, the ownership and the agency of our own story stay with us; try to see how you can make things easier for us, but also champion the cause.
>> DAVID WRIGHT: We can feel your energy and your passion coming down through the internet, and I'm sure that's felt equally in the room as well. Some really important points. It reminds me as well, from a U.K. perspective, as I'm sure I said earlier on, that we spend a lot of time coaching people who reach out for our support because they've contacted the police and the police have asked them why they took the images in the first place. Part of the problem is the response they get from law enforcement, so we spend a lot of time coaching them: no, this is an offence, and this is how you should report it, and indeed this is how the police should receive and process that offence as well. It's time to move on to the Bangladesh remote hub; they've been very patient. So if I can give the floor to you, please ask your question.
>> AUDIENCE MEMBER: First, you'll need to unmute.
>> AUDIENCE MEMBER: Hello.
>> DAVID WRIGHT: That's good, thank you.
>> AUDIENCE MEMBER: One of our women would like to ask a question, please.
>> AUDIENCE MEMBER: Hello. I'm from Bangladesh, my name is Tasnia. I want to say that in Bangladesh, many people don't know what gender-based violence is. It would be easier for us if the content were provided in our own language, basically.
Thank you.
It would be very helpful for us, please do something.
>> NIGHAT DAD: This is a very important question. It is basically about making the content available in regional and local languages, which is what we, as an Oversight Board, have been pushing Meta to do. The question is really coming from the ground, where people are saying: we do not understand, English is not our language; if the content were in Bengali, anyone on the platform would understand how the reporting mechanisms work and what the community guidelines are saying. I think it is basically about how much effort and resources platforms and tech companies are putting into local languages and into making the content available in a way where people actually understand what these companies are saying in their reporting mechanisms and community guidelines.
>> DAVID WRIGHT: Nighat, thank you very much. Cindy, would you care to offer a response? It is perhaps a little unfair, as I personally know how much effort you put in around this subject, but clearly there is always more that can be done around languages. Cindy, any response you wish to offer?
>> CINDY SOUTHWORTH: We are working on the translations with you. StopNCII is currently in Bengali, and it is being translated right now, this week, for Cambodia. It is in approximately 20 languages, and David and his team are adding additional languages as we speak. And then for Meta, I can speak to Facebook.com/safety: the women's safety hub is in over 55 languages and we are continuing to add, because we agree, things need to be in local languages. And that is not just language capacity, it is also cultural context. One of the things we are doing with all the materials for Thursday, December 1, the StopNCII one-year anniversary celebration that David and I mentioned, is having everything reviewed by native speakers, because it is not just whether they are technically accurate: how do you talk about intimate image abuse if you do not work in the NGO sector or understand these concepts? They are really, really complicated concepts.
And a professional translator may not get it just right, so we are having additional people look at these issues, and then we know that the NGOs who bring additional expertise around gender-based violence will look at those materials. But I completely agree, this is something we are always striving to do more of and do better. We get feedback from our partners like yourselves. If you see something and say, whoa, you got it wrong, please let us know, because language is nuanced.
>> DAVID WRIGHT: Absolutely right. It reminds me of the StopNCII NGO network, which, like Cindy said, is nearly 80 organizations that work with and support women, particularly around online harassment and abuse, through StopNCII. If there are any organizations here, online or in the room, that work supporting adults, particularly women, around StopNCII, we would love to hear from you; there is a link on the website. We are trying to build that network all the time. It was folks we heard from in Latin America two weeks ago who reminded us of exactly that point: the importance of the accessibility of that information in local languages. StopNCII does not really mean much in Spanish, for example. From across the European network, we have seen over two decades the importance of language. Kathrin, a further question?
>> KATHRIN MORASCH: One more question in the room. You can go.
>> AUDIENCE MEMBER: Thank you very much, Kathrin. I'm Sophia, and I was also involved in civil society activism for quite some time, which also led to me getting harassed online quite often, in ways that are gendered and racialized. I do agree with your comment, and it was really important you raised it: why focus on women and girls? The programs are important, and it is really important to create safer responses and to empower women, but in the end, what is more effective: focusing on female disadvantage or on the male privilege that is clearly there? I believe it is far more effective to focus on the privilege side, because that is the position where a man does not carry the extra burden; when men go into politics, for example, they do not have to go through all of this online.
Focusing only on victimized girls, from my point of view, is not proactive; it is emotional labor. Everyone really needs to be involved in this conversation. Why shouldn't we educate privileged people about their privilege and prevent perpetrators from actually harassing?
Then I think I couldn't leave the session without mentioning intersectionality. It is essential in this conversation. Online violence is also racialized; being indigenous plays a role, as do having a queer identity, disabilities, neurodiversity and the level of income, especially when looking at law enforcement, and then also statelessness and migration. Such layers are important to bring into this. My question would be: in government capacity, but also in industry, who has the power to actually moderate those conversations, and who has the power to enforce the rules? And a comment regarding Facebook and Meta and the languages: 55 languages is not enough for our world. We have far more, I think around 6,500. I think it is time to really invest in inclusivity there. Thank you.
>> DAVID WRIGHT: Sophia, I think you hit a good spot there, judging by the support and the comments your contribution received.
We only have two minutes remaining. So before we close, it is going to be a little unfair to give the panel only two or three words to finish. Are there any final burning questions anybody has been itching to ask, in the room or indeed online? Now is your moment to ask that all-important question.
>> DEBORAH VASSALLO: We had one question online about the StopNCII tool. The person in the chat asked how difficult it would be to defeat a hash by changing the background of an image or altering it through other means, and Cindy answered that if an image is altered, it may need a new hash. They encourage people to create hashes of all copies of an image, even ones that differ only slightly, since those will create different hashes. That was mainly what was being said in the chat. One particular participant also told us she is doing her Ph.D. on misogyny; if anyone would like to contact her, she has left her contact details. Let me see if there is anything else. I think that is all from the chat.
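To illustrate the point above about altered copies needing their own hashes, here is a minimal sketch. It assumes nothing about StopNCII's actual matching technology, which is not described in this session; it simply uses a standard cryptographic hash to show why even a tiny edit to an image file produces a completely different digest, so each copy has to be hashed separately.

# Minimal sketch, not StopNCII's implementation: it only demonstrates why
# an edited copy of an image yields a completely different hash value,
# so every copy needs to be hashed and submitted separately.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest for the raw bytes of an image file."""
    return hashlib.sha256(image_bytes).hexdigest()

original = b"\x89PNG...raw bytes of the original image..."  # hypothetical image bytes
edited = original + b"\x00"  # even a one-byte change, e.g. a re-saved or cropped copy

print(image_hash(original))
print(image_hash(edited))  # entirely different digest, so it would not match the first hash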
>> DAVID WRIGHT: Thank you, Deborah. That's awfully kind, and thanks also to Kathrin. That leaves me with perhaps a minute or two, so we'll turn to the panel. I'm not going to summarize, because of the excellence of this particular panel; I wish to turn to each of you for perhaps a closing sentence or two, if you wish to summarize. Emmanuel, if I can come to you first.
>> EMMANUEL NIYIKORA: As for me, just to reassure the women and girls connected here: not all men are perpetrators, not all men are committing online violence. There is a big number of men who are not doing this and who are ready to support this movement of combating online gender violence, and to support the women and girls who are being harassed, our sisters, our daughters. So a big number of men are available and ready to support the movement. We know it is an issue, we are there to support you, and we know the challenges are there too and are not discussed enough, with regard to our cultures and our backgrounds. But this is the time now to work together; we cannot let this continue to happen. Thank you.
>> DAVID WRIGHT: Thank you, Emmanuel, you speak on behalf of me as well, in terms of the room. That's another reason I'm not going to summarize; it needs to be summarized by others too. Nighat, a closing sentence, literally a sentence.
>> NIGHAT DAD: Yeah. I think if we start treating women not as sisters and mothers but as human beings, we'll achieve gender equality as enshrined in our Constitutions.
>> DAVID WRIGHT: How to summarize a life's work all in one sentence. Cindy, if I can just come to you too.
>> CINDY SOUTHWORTH: Thank you, everybody, for the important work. It's great to be an ally among many amazing activists. Stay tuned for exciting announcements later in the week.
>> DAVID WRIGHT: Indeed, Cindy, it's been an immense pleasure. Finally, Kathrin, any particular closing comments from you or indeed from the room.
>> KATHRIN MORASCH: Yes, I will try to give my best. We talked a lot about women online, and I think as long as women and girls are online, we are exposed to harassment and violence, especially if, like Sophia said, you are politically outspoken, identify as an LGBTQ+ person, or have a disability; then you can be attacked really easily. I think we should really change that. And I hope that all of you who are in the room, if you want to continue this conversation with any of the other speakers, I can certainly give you their contacts and help you get in touch with them, if you want.
>> DAVID WRIGHT: Kathrin, that would be wonderful, if you can continue that conversation as well. I really thank you for holding the room, and Deborah, thank you for organizing and moderating the online chat too. Also Sabrina and Sophia. On behalf of the network, I couldn't not mention Safer Internet Day, on the 7th of February. For the last ten years we have been engaging and working with the IGF and at the IGF, and we will be back again physically; we look forward to that next year too. It has been my pleasure and privilege to moderate this extraordinary set of people, as Cindy says, of activists working in this particular space. We have a lot of work to do, there is a lot of work to do, but with the kinds of insights I've heard, I feel massively enthused and inspired to take the next steps. So with that, I think I can bring this to a close. Thank you all for your participation, online and in the room, and I wish you all a very good rest of the day. Thank you very much.