The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> NICOLAS FIUMARELLI: Now it's better.
Okay. So we will start. We will have a conversation here about data protection in messaging apps. It's a specific topic, but also a broad one, because all around the world we use messaging apps every day. Together with me we have Joao Moreno Falcao, who works with Intelliway Tecnologia, and we have colleagues from the youth alliance for sustainability. I'm Nicolas Fiumarelli. We also have our online moderator, Osei Manu Kagyah. We have a mix of technical people and lawyers here, so we will have a very good discussion.
Well, I will read a very short description of this session and then we will start with the speakers. During the last decade of growing world connectivity, one of the many impacted groups was the youth: we have a huge number of young people exchanging messages through the Internet. The aim of this session is to have a conversation with youth about messaging apps, especially TikTok, Telegram, and others. We will start with questions for the panelists regarding messaging app usage in different contexts and the infrastructure of these messaging app systems, and then we will address how to ensure the application of the Declaration of the Rights of the Child.
Then at the end, we will have a discussion about law enforcement and other public policies, but also technical matters, so we will address cryptography and many other things. So in this conversation ‑‑ well, just to start ‑‑ oh, yes.
I am presenting here Osei, and he's the online moderator.
>> OSEI MANU KAGYAH: Thank you. Those joining online, good morning and good afternoon, good evening, this will be a highly interactive session and it's not going to be a one‑way talk. We should all get involved. If you have a question or a comment, raise your hand or leave a comment in the chat session and we hope for a fruitful conversation. We hope to advance this conversation.
Thank you, everyone.
>> NICOLAS FIUMARELLI: Thank you, Osei. So the first question to the panelists: what are the main architectures that deal with the content of chats, and what are the current challenges that we need to face? Maybe we can start with Savyo.
>> SAVYO VINICIUS DE MORAIS: Hello. Thank you very much for your presence, whether online or on site.
Well, starting from the point of the more common messenger apps, WhatsApp and Telegram: we have basically two ways to communicate with our friends or colleagues. The most common, at least in WhatsApp, is direct communication, where my message goes directly to my peer. There is no centralization: when both of us are online, my message for Nicolas goes right to his phone; it does not pass through a server.
Telegram has this kind of communication too, but in the most common case for Telegram, the messages are stored and all the communication goes through Telegram's cloud services. So there are two different ways to have this communication, and both of them have implications in terms of cybersecurity.
For example, if someone gets access to the cloud servers of Telegram or if they get access to your phone, we will have different kinds of access to private data and so on.
But in both cases, and also in direct messaging on other platforms such as TikTok, Instagram, Facebook, Twitter, and so on, there is the metadata of the communication: the frequency with which you talk to a contact, how many messages, and what kind of messages ‑‑ photos, videos, text. This is a qualitative analysis of the communication that you have with your peers, with your colleagues. So in this way we can understand how you interact with your friends, and we have information about who your friends are and how you communicate with them.
We can also infer things about you. So this is also important: not only the content of the messages but also the qualitative analysis of this metadata, of this communication. I think that's it for now.
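The kind of metadata inference Savyo describes can be shown with a toy sketch. The contact names and the tiny metadata log below are invented for illustration; the point is that ranking contacts and measuring media habits never requires reading a single message body:

```python
from collections import Counter

# Hypothetical metadata log: (contact, message_type) pairs only --
# no message content anywhere.
metadata = [
    ("alice", "text"), ("alice", "photo"), ("alice", "text"),
    ("bob", "text"),
    ("alice", "video"), ("carol", "text"), ("alice", "text"),
]

def infer_closest_contacts(log, top_n=2):
    """Rank contacts by message frequency; content is never inspected."""
    counts = Counter(contact for contact, _ in log)
    return [contact for contact, _ in counts.most_common(top_n)]

def media_share(log, contact):
    """Fraction of messages with a contact that are photos or videos."""
    kinds = [kind for c, kind in log if c == contact]
    media = sum(1 for kind in kinds if kind in ("photo", "video"))
    return media / len(kinds)

print(infer_closest_contacts(metadata, top_n=1))  # ['alice']
print(media_share(metadata, "alice"))             # 0.4
```

Even this crude ranking reveals who someone talks to most and how (text versus media), which is exactly why metadata deserves protection alongside content.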
>> NICOLAS FIUMARELLI: Thank you, Savyo. Very interesting. Maybe, Joao, you can explain, for the audience that may not know, about encryption in the private chats case and the cybersecurity issues, continuing Savyo's comments.
(No audio).
>> JOAO MORENO FALCAO: ‑‑ who you talked to, like, five years ago at 1 p.m., when you yourself don't remember anything. So how can they have the audio? Okay. It wasn't online; people online weren't hearing.
So when we think about this data that is now on servers, we need to first protect it, and the thing that protects the transmission is cryptography, which is symmetric. All of the software that we use to communicate uses computational cryptography. This means that if the key is broken, or the protocol is broken, or computers increase enough in processing power, we will have a problem: this communication will be wide open. So the thing that protects us in this case is only whether someone has the desire to record the messaging data and store it long enough for the protocol to be broken.
Another thing is cybersecurity threats, because if someone steals your phone or forces you to open it, in a matter of minutes they will have access to all of your information, can download it, and can keep that access indefinitely. So we are putting all of our information behind one door, and if this door is opened, it's over, because this information can be copied and you no longer have control of it.
We are seeing the implications of this in a lot of spaces, when some software or platform is hacked and our private information, our sensitive data, is exposed. And when it's leaked, it's leaked; you cannot do much about it.
So when we look at these messaging apps, we face a massive challenge because if this data is leaked, in the case of apps that, like, store this data in servers, we will really ‑‑ we will have a big problem.
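Joao's record-now, break-later concern can be illustrated with a deliberately weak toy cipher. The XOR scheme and the two-byte key below are invented stand-ins, nothing any real app uses; real symmetric ciphers such as AES have enormously larger key spaces, but the logic is the same: ciphertext recorded today can be attacked for as long as the attacker cares to keep it.

```python
import itertools

# Toy XOR "cipher" with a tiny 2-byte repeating key (for illustration only).
def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

secret_key = b"\x13\x37"
recorded = xor_encrypt(b"meet at noon", secret_key)  # attacker stores this

def brute_force(ciphertext: bytes, known_word: bytes) -> bytes:
    """Years later: try every 2-byte key until the plaintext looks right."""
    for key in itertools.product(range(256), repeat=2):
        guess = xor_encrypt(ciphertext, bytes(key))
        if known_word in guess:
            return guess
    return b""

print(brute_force(recorded, b"noon"))  # prints b'meet at noon'
```

Here the key space is only 65,536 possibilities, so the stored message falls open in a fraction of a second; the worry raised later in the session is that advances like quantum computing could make much larger key spaces attackable in the same spirit.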
>> NICOLAS FIUMARELLI: Very good. Then let's continue: what are the different policies around the globe regarding regulation for data protection purposes? Is there any accountability issue regarding children's messaging, and what about the children themselves?
>> JOSHUA AYAYI: Yes. Let me just say, kindly forgive me if I take a deep breath every once in a while, but I will start from the children. The Declaration of the Rights of the Child gives protection to our children, but how does this translate into practice?
Today, we have children at home who prefer to talk to each other on social media or instant messaging apps rather than face to face. So the very definition of communication as we had it in the past is being redefined by technology, and that gives us a lot of leverage, right? But while we are reaching new heights and it's exciting, there are serious challenges in there too, where someone can triangulate you by the nearest cell tower and by using the messages that you post.
Naturally, when we are speaking to people that we trust, there is a lot of ourselves that we leave there, that the whole world doesn't see, and that is where the danger comes from: the fact that we are not secure in these instant messaging apps, and that once that door is opened, everything is laid bare.
There are, for example, various global policies, and those lead to various country-specific policies around the world. Ghana, for example, passed the Data Protection Act in 2012, which is supposed to guide the collection, distribution, and usage of data, and basically follows that kind of international protocol.
Of course, you also have the Global Digital Compact, which is meant to pave the way for digital rights. You can go to the UN website, the UN Global Digital Compact page, to make your submissions there. So all of these are ways in which we can engage and ensure that our right to privacy is upheld.
In fact, on my way to the conference, for example ‑‑ after getting my ticket and everything, you know, first time at IGF ‑‑ I wanted to post it. And the friend I was traveling with said, no, turn the ticket over. Because apparently it happened to someone: the person posted her ticket online, and by the time she got to check‑in, her flight had been canceled. These are the kinds of dangers we face. For someone who doesn't know about these dangers, how do these policies make sure that we educate them, so they understand the dos and don'ts of messaging, rather than laying ourselves bare?
>> NICOLAS FIUMARELLI: Thank you. So you mentioned the policies, and you mentioned the possibility of education programs for end users to know how the data is stored and how it is communicated. But Savyo, what about the privacy policies of these private-sector messaging apps? What do these privacy policies say about our messages while they are not yet delivered because the other phone is turned off? What happens with the data in all of these scenarios, as far as the privacy policies go?
>> SAVYO VINICIUS DE MORAIS: Thanks for the question, Nicolas. Well, the first point, following a bit of what Joshua just said about accountability, is something interesting I saw in the privacy policy of TikTok: I'm not sure whether you must give your age when you are signing up to the app, but it can infer your age, at least approximately, within ranges like 3 to 5 years or 10 to 12 years. So this is one interesting thing.
There is also another concern, more specific to messaging apps, especially in the case of WhatsApp, where data can be exchanged between WhatsApp and Instagram and Facebook and all the other companies of Meta. So in WhatsApp, for example, we don't need to give our age, our birth date, and so on, but if we cross this data between platforms, we have information specific to WhatsApp, including about kids, being shared with other platforms. And this is not only about kids: more inferences can be made, and more things can be done, with children, teenagers, and also youth. So maybe this is one point where we need more transparency on how this data is being processed: what kind of information is being exchanged between the platforms, between Facebook and WhatsApp, for example.
In this case, I don't see any problem with Telegram in terms of the age of youth, children, and teenagers; but in terms of storing messages, there is no clear information about how long a message stays stored on Telegram's servers. Basically, they say it will be stored for as long as it is important for you. So we can delete it, but there is no automatic deletion, at least for text messages; media is sometimes deleted.
And this is basically the point. In the case of WhatsApp, where all the communication is peer‑to‑peer, phone to phone: when one of the phones is not connected, is offline, the message goes to the WhatsApp cloud servers, and it stays there until the peer is online again, or for up to 30 days. So this is basically the timing of things on the main platforms.
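The store-and-forward behavior Savyo describes can be sketched as a queue with an expiry window. The class and constant below are illustrative only, not WhatsApp's actual implementation; the timestamps are passed in explicitly so the expiry logic is easy to follow:

```python
import time

RETENTION_SECONDS = 30 * 24 * 3600  # a 30-day relay window, as described above

class RelayQueue:
    """Store-and-forward sketch: while the recipient is offline, the
    (already encrypted) message waits on a server; after the retention
    window it is dropped without ever being delivered."""

    def __init__(self):
        self.pending = []  # list of (stored_at, recipient, ciphertext)

    def store(self, recipient, ciphertext, now=None):
        stored_at = now if now is not None else time.time()
        self.pending.append((stored_at, recipient, ciphertext))

    def deliver(self, recipient, now=None):
        now = now if now is not None else time.time()
        delivered, kept = [], []
        for stored_at, rcpt, ciphertext in self.pending:
            if now - stored_at > RETENTION_SECONDS:
                continue  # expired: the server silently discards it
            if rcpt == recipient:
                delivered.append(ciphertext)  # the peer is back online
            else:
                kept.append((stored_at, rcpt, ciphertext))
        self.pending = kept
        return delivered

q = RelayQueue()
q.store("nicolas", b"<encrypted blob>", now=0)
q.store("nicolas", b"<too old>", now=-40 * 24 * 3600)
print(q.deliver("nicolas", now=1000))  # only the fresh message arrives
```

The privacy question the panel raises lives in `self.pending`: for however long a message sits in that queue, it exists on infrastructure the sender and recipient do not control.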
>> NICOLAS FIUMARELLI: Very good. Well, I will ask: do we have questions in Zoom, or raised hands?
>> OSEI MANU KAGYAH: Yes, one question. We would like to hear your perspectives on the various data protection laws; I'm interested in knowing how robust these laws are. And one question from Bo Han: in a situation where enterprises dominate information storage and use, is there any way to promote user rights for data protection? I don't know who is taking this. Joshua?
>> JOSHUA AYAYI: Yes, in most countries we see government come to the aid, and that has worked so far in terms of governments regulating and making sure that private enterprises do not abuse. But then who protects us from the government?
Because there is also that conflict of interest, right? For a government to use anybody's data, for them to say it is a matter of national security: is it really a matter of national security? So the kind of dilemma we find ourselves in now is that, on the one hand, government is able to make sure that private entities do not abuse.
But what if it gets to the point where government wants to abuse? Is it a case of coming to a multi‑stakeholder approach, where you have equal representation from civil society, private for‑profits, government, and end users, and you have them come together to pass these policies and ensure their implementation? I will also say that there should be that representation when it comes to policy making and implementation, to ensure that everyone's voice is heard and everyone's data or information is protected in the right way.
>> NICOLAS FIUMARELLI: Yes, going to in‑room question.
>> AUDIENCE MEMBER: My name is Nicky Kloso, I work for Roblox. I have a question: the panel is sort of about data protection for youth, but no one has mentioned age verification. At a baseline, how do we know who is a youth? How do we do age verification? Because at a base level, most of the companies say that you have to be a certain age. Sometimes it's 13, which is arbitrary; I think that's somewhat US-centered, and in the rest of the world it varies and is more broadly seen as 18. But I think there's been a move from companies being the ones, you know, that should be doing the verifying, to now more thought on the side of the devices and operating systems being the ones that could potentially be a stronger player in the age verification space. Because if we are reliant on national forms of identity ‑‑ for example, in the US, if we have to use a driver's license or a passport ‑‑ we leave behind people who are undocumented, or those who then don't get access to services.
So I'm interested, at a baseline: we start the conversation, but we don't really talk about who is getting online and how we are verifying who these kids are.
And then on the data protection side, I think in the Global North there's a broader conversation about the best interests of the child. We've seen the age‑appropriate design code in the UK; we have the DSA coming in Europe. The US is sort of a mess, but we see it a little bit in the US ‑‑ I will speak for myself ‑‑ and definitely in Australia. How do you see those conversations evolving in the Global South? Are they appropriate in the Global South? How could we think about that in the Global South?
I do think they are in many ways the right conversations, in terms of the best interests of the child and really holding companies accountable for their data protection schemas, and I'm interested in how they propagate.
>> JOAO MORENO FALCAO: Hello. Thank you for your comments; they were very enriching. Yes, this actually really bugs me, because when we think about Brazil, WhatsApp is like the main messaging app, installed by the majority of people who have a cell phone. And even by kids, because, well, their parents use it, other kids use it, and there are the school groups. You are sort of forced to fake the ID to sign up. And in this sense the company says, oh, okay, it's forbidden by us, so we don't need to look for it.
And this is a huge problem. In my university, there is a group that studies the communication flows inside WhatsApp, and they are certain that this dynamic is very harmful for youth, because it's totally unsupervised: you can get hurt really fast, and no one will notice.
And when we talk about end‑to‑end encryption, in this sense it can be harmful to them, and I actually don't know how to deal with this. Because at the same time, we know that these kids need to be there, since it's the place where everybody is, we also have the necessity of guaranteeing that all citizens will not be scanned in the name of the protection of kids. So it's very difficult to deal with.
>> JOSHUA AYAYI: So, for example, in Ghana, we have the Data Protection Act, which stipulates that you have the right to refuse to give your data out, or you can tell them: give me everything you have on me, and delete it.
The question is, how many banks, schools, and institutions are willing to even abide by it? And is it known by everyone, so that if you hit a snag or a challenge, you can go to the police and tell them this person is refusing to give me my data? Even in WhatsApp, if you don't check that box, you are not accessing the service. So it boils down to the kind of service being provided: the take‑it‑or‑leave‑it service. And that is disheartening. Even before we get to the conversation about children, the individual in the Global South is not protected, because most of the services ‑‑ 90% of what the average person can access ‑‑ are take it or leave it.
So you are either giving us your data or you can't access the service, and our conversation needs to go around that. Coming to children: in Ghana, aside from the policies that protect children, it also largely depends on the family saying, okay, I'm buying this device for my child, and I'm putting parental controls on it to make sure they are protected when I am not with them. And it comes back to my earlier submissions: it's a question of education, and how that education is given. Because if you pick an average person and begin to speak technical language, it becomes difficult, and this is something we have been advocating for a long time. How do we make sure that my grandmother in the village, for whom I just bought a smartphone because it is easier to communicate through WhatsApp than to make a phone call ‑‑ how do I make sure that she is protected, or how do I get the information to her that these are the kinds of pictures you can't take, and this is what it all means?
So it's a whole conversation that has to be had. It's quite complex in Africa, but we are getting to the point where we're bringing more Africans into the global policy space to make inputs, and also making sure things happen at the local level. For example, in Ghana you have the local assemblies, who are responsible for implementing the policies that the various ministries make. So how do we ensure that at the local level implementation is carried through, to achieve the highest impact? That's what we are doing in Ghana at the moment.
>> OSEI MANU KAGYAH: Thank you very much. We have some insightful comments. This one is from Nicholas Lennon from Ghana. He says: in data protection, government acts as a check on other stakeholders, but the task of keeping governments themselves in check, including oversight of law enforcement and national security, is also needed to prevent abuse.
And another comment ‑‑ I think he's answering the earlier question ‑‑ he says that in its current form, it's difficult to educate young people on most privacy schemes, especially in the Global South. We may have to redesign these privacy terms into much simpler forms, perhaps using visualization and other means familiar to young people, if we really want them to know about the impact. I couldn't agree more.
So I will pick a question, this one from Ghana: in a number of apps you are required to submit data before you can install them. This is a global issue. How can we change the narrative so that if you don't agree to the policy, you can limit the data you send and still be able to install the app? And a last question, from Bangladesh, watching live: what is the privacy impact of the infrastructure of messaging app systems, how do we ensure the application of the Declaration of the Rights of the Child, and what will be the role of youth in regulatory settings?
I don't know who will take this one.
>> SAVYO VINICIUS DE MORAIS: Thank you for the questions. I will first address the question, I think from Bo Han, about how to say "you are not allowed to collect my data" and keep using the apps. This is technically a problem because, well, if we have a centralized system, it has costs. And somehow ‑‑ and I'm not exactly defending this exploitation of information ‑‑ the companies hosting these kinds of systems have to pay the costs of keeping the app running: the servers, the cloud services, and so on.
So this must be paid somehow. A definitive solution would be something like Web 3.0, a decentralized infrastructure where we could have low cost, or zero direct cost, for any one centralized institution to host this kind of app; the other problem then is making this kind of platform popular. Short of that, at least in my point of view, it's hard not to have any problem: if it comes from a private company, it may have privacy issues; if it's funded by some government, it may have privacy issues as well; and if it comes from civil society, or something like that, it's still hard to have funding to keep things running.
So I think the definitive solution for this issue lies in the future, a long way off, when we can have decentralized messaging apps.
>> OSEI MANU KAGYAH: Thank you very much. Insightful comments and debates are going on in the chat. So we now go to the audience and pick a few questions. I think someone raised a hand. Okay, I'm coming over.
I would like to hear more from the Global South and also more about ‑‑ you have another question?
>> AUDIENCE MEMBER: Hello, everyone, I'm Milana, I'm from Brazil, and I'm a fellow of a Brazilian youth program. Looking from the youth and civil society point of view, what is the role of programs like the IGF in increasing privacy policy knowledge around the world? And how does this information get out from here and arrive in the community? Very simple.
>> OSEI MANU KAGYAH: Let's take another question here.
>> AUDIENCE MEMBER: My name is Tiese, and I represent a civil society organization whose mission is to honor children. Part of our work relates to the digital rights of children and adolescents and the commercial exploitation of their well‑being.
I would like to understand ‑‑ my question to you is how to deal with encryption: is there a place in the middle where we can guarantee both protection and privacy, especially when it comes to the targeting of children? And, taking advantage of the time, I would like to invite everyone to a panel that my organization is holding on Friday at 11:15.
>> OSEI MANU KAGYAH: Thank you very much.
>> JOSHUA AYAYI: Okay. I would like to take the first question. If I caught your question right: how do we make sure that what we discuss over here leaves here to help those outside, right?
Basically? If I got your question right. Okay.
So one of the things I think we need to do more is to be deliberate about it, and being deliberate about it means making sure that, as civil society, we put it in our own action plan, right?
So just yesterday I was speaking to the president of GigaNet for Africa; they convened the Ghana School on Internet Governance, of which I was a fellow in 2021. My organization operates in one of the 16 regions of Ghana, and we are looking at how to bring a form of the Internet governance school to the Volta people. Because most of the time you have many of us traveling to the nation's capital to be part of programs like this, and then when we come back, how do the rest, who are not able to travel, get access? It's about being deliberate and making sure that it's included in our action plans. And that is something we are willing to do.
And if any other organization is here and you are willing to support us in that way ‑‑ with human resources, finance, or just helping us with communication and publicity ‑‑ you are welcome on board.
>> SAVYO VINICIUS DE MORAIS: I would like to address the second question, about the protection of children and teenagers. First of all, I do not have a legal background, but something that comes to my mind is that we have regulation in other kinds of media, like TV and radio, that limits the kind of content that reaches children and teenagers. And I really think we should have something in that sense for the social media platforms too, in terms of collection of data.
Our current approach to regulating data protection is to ask us to control the data that we send, but children do not really know the consequences of the data they are sharing. So it is not enough to say to children, okay, do you know you can just choose not to send this data? Because they are not going to understand.
So maybe one approach that could be taken is some kind of regulation that limits the kind of data collected from children, based on their age. And not only collection: on some platforms, certain data may be harmless, but in another context the same information might be used for bad things.
So, some kind of regulation also on the sharing of information between platforms, still considering the age of the person.
>> OSEI MANU KAGYAH: Thank you very much. A comment from the chat: these companies make their money off data ‑‑ I'm sure he's talking about Big Tech ‑‑ and changing that business model would require you to pay a subscription for the service; without that, efforts to limit data collection will struggle. I would like to go to the audience, but first, from an individual perspective on policy and regulation, my question is: who is a youth?
Who is a youth?
So I will come to that; that's my question. Now, to the audience, if you have any input or questions here.
>> AUDIENCE MEMBER: Thank you very much, I'm Ethan. I'm an alumnus of the ISOC IGF youth ambassadors program, and I also work as a tech lawyer for Access Partnership. Mine is less of a question and more of ‑‑ call it an idea, if you will, that I hope you can filter out and develop. With all the talk around children's rights in terms of privacy, and the idea that the children themselves do not understand what they consent to ‑‑ whether they are playing games or on TikTok or whatever the case may be ‑‑ the parents themselves also often don't understand the implications of certain processing, or the value of giving the data that is required to be on these apps.
There is now a notion growing more and more popular in the legal space: comic contracts. Essentially, it's basic: it's pretty much drawings of what is happening, what the legal obligations and legal rights between the parties are, to make sure that the parties understand. So if I am from whatever country and I don't understand English as well as the party I'm contracting with, with comic contracts there is still some understanding of what we are agreeing to. So I'm trying to bring it all together in the sense of: what is stopping us? How can we develop such ideas, and build from them, to create an environment whereby the regulations or the consent forms that are filled in to access these apps can be presented in a way that is digestible to the users?
What I'm saying is, it doesn't always have to be the traditional way of text that we are used to. Maybe there's a need for flexibility around that. But, yeah, that's essentially my comment, and I'm interested to hear your thoughts on it.
>> OSEI MANU KAGYAH: Thank you very much, Ethan.
Okay.
>> AUDIENCE MEMBER: Hello, I'm Julia, I'm here with the youth delegation of Brazil and I'm also a master's student. I feel like when we discuss data protection and privacy of children online, generally we focus on literacy and consent. Even the age‑appropriate design code sort of focuses on that: on how to make information more easily understandable for children. But I have concerns about whether that is really effective, because, of course, if children are going to be online, they need to have an understanding of what's going on and, to a certain extent, of what the consequences of using their data are.
But I feel like it's unfair to leave that burden to the children or even to their parents, because it's an unbalanced relationship. So how can we go in another direction? I don't have an answer to this question; it keeps coming back to me all the time. How can we have a different approach, one not focused on consent or even on literacy as the way of protecting children, and actually achieve a more effective, healthier relationship with social media for children and youth?
>> OSEI MANU KAGYAH: Insightful. Thank you very much.
>> JOSHUA AYAYI: Yes, so first of all, the comic contracts. I came across one once; I just glanced through it. Now that you mention it, it's worth looking into.
And then with the second one, of course, it's true like you were saying, it's an unbalanced relationship and we're putting a lot of pressure on the parents and their children as well.
I think it comes down to what an institution describes as age-appropriate content. We know recently Ghana was in the news with the Proper Family Bill, and, for example, you have children in other countries who can be freely taught things that in Ghana would not be allowed. So how does an institution handle that? It also comes down to this: say the Ghana government tells YouTube Ghana, you need to put on an extra filter because we don't want a certain kind of content, and that filter costs you $1 million annually.
Now, how does that translate into them making their money? It also means Ghanaian content creators will get a lot less money than their counterparts in other countries who are not behind that filter, because it is only applied to YouTube Ghana. So it's a conversation that has to be started, and it's very complex; we need to look at a lot of dimensions. As someone said in the online chat, if it's about minimizing profits for the private companies, then they are definitely not going to do it. So we need to find a way where they still generate their money and are still able to make the age‑appropriate decisions, country by country.
For example ‑‑ and please don't worry if this is offensive to you ‑‑ in Ghana, people eat cats. Yes, it's a delicacy. We keep them as pets, yes, but we still eat them. In other places, they are strictly pets. So what if I kill my cat and then post, oh, I'm making cat stew, on Facebook? And Facebook flags it as very insensitive content. But in Ghana, it is not insensitive, right?
And something Facebook would allow elsewhere becomes sensitive in Ghana. So it's very complex, and as the discussion moves forward, we need to get a lot more people in to see what we can tolerate and where we put the boundaries: up to a certain point these are country‑specific issues, and country‑specific laws are made to address them; beyond point X, it becomes a global issue, and global laws can deal with that.
>> NICOLAS FIUMARELLI: Okay. Just to follow on from the messaging apps part, without going so deep into content regulation, we would like to address the exchange of messages in messaging apps, and one question for the panelists: are quantum computing achievements a threat to messaging apps? We can start with Joao.
>> JOAO MORENO FALCAO: Well, the quantum computing paradigm, when applied to modern computing, will totally affect messaging, because most of the cryptographic algorithms that we use in our messaging apps are based on protocols that rely on certain assumptions about how computers work nowadays. Applying the quantum computing paradigm, what we will see is that most of these protocols are broken, because with quantum computers certain steps will be much simpler to execute, much simpler to break. Cryptography tries to be at least 30 years ahead, and when we see a leap in the computational power to execute certain actions, certain steps, it will really affect what we have now and what has been collected, because information that was produced in the past will be broken easily in the future.
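[Editor's note: to make the panelist's point concrete, here is a minimal, illustrative sketch, not any app's real protocol, of a Diffie-Hellman key exchange in Python. Its security rests on the classical hardness of the discrete logarithm, which is exactly the kind of assumption Shor's algorithm on a large quantum computer would remove. The toy prime is for demonstration only.]

```python
import secrets

# Toy Diffie-Hellman key agreement. Security rests on the assumption that
# recovering a secret exponent from its public value (the discrete log
# problem) is infeasible on classical hardware -- the assumption Shor's
# algorithm breaks. P is a small Mersenne prime for illustration only;
# real deployments use groups of 2048+ bits, or elliptic curves.
P = 2**127 - 1
G = 3

a = secrets.randbelow(P - 2) + 1  # Alice's secret exponent
b = secrets.randbelow(P - 2) + 1  # Bob's secret exponent
A = pow(G, a, P)                  # public value, sent in the clear
B = pow(G, b, P)                  # public value, sent in the clear

# An eavesdropper who records A and B today, and later runs Shor's
# algorithm to recover a or b, obtains the shared secret retroactively.
shared_alice = pow(B, a, P)
shared_bob = pow(A, b, P)
assert shared_alice == shared_bob
```

Note that the public values A and B travel in the clear, which is why recording traffic today can pay off for an attacker once the underlying hardness assumption falls.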
>> NICOLAS FIUMARELLI: Let me take a little of that. I am the moderator, but I will give my point of view on this, because there is an issue if you have a man in the middle in your home or in your neighborhood, listening to the communications between your mobile phone and the router, for example, if you share it with a friend. They will have access to the encrypted messages and can record them. And if this man in the middle, a thief, collects data that you are transmitting nowadays, then in five years, when quantum computers are globally and publicly available, in the cloud, for example, they will be able to decrypt all of your data.
So all the sensitive data we are transmitting nowadays, like credit cards shared with your family, the password for Netflix, all the things you are using right now, if there is a man in the middle at the ISP level, at your home or in your neighborhood, this information will be public for everyone. I think this is a very interesting issue that is not easy to address. A solution, I think, would be to deploy quantum-resistant algorithms in the messaging apps, but they are not there right now. We know some of the algorithms they use are not public enough, but from some researchers I have heard and read, they say that the algorithms are not quantum resistant.
So we are in a very, very specific problem right now, because, as I say, all of our data that we think is private, that we think is sensitive, the private things we share with our colleagues or friends, these things will be public someday if someone is collecting them now.
Because nowadays, for example, with a common classical encryption algorithm, a classical computer would spend years decrypting one message, but when we have quantum computers, it will be a matter of seconds to decrypt a full video. So I think it's very concerning. What do you think, Savyo, about these quantum computing issues?
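[Editor's note: a caveat worth adding to the speaker's remark, as a gloss rather than his words: the dramatic speedup applies to public-key schemes via Shor's algorithm; against symmetric ciphers a quantum attacker only gets Grover's quadratic speedup, which is why doubling the key length is the usual countermeasure. A back-of-the-envelope sketch:]

```python
# Shor's algorithm breaks RSA and Diffie-Hellman outright, but against a
# symmetric cipher a quantum attacker only gets Grover's quadratic
# speedup: searching a k-bit keyspace costs roughly 2**(k/2) steps
# instead of 2**k. Doubling the key length restores the security margin.
def effective_security_bits(key_bits: int, quantum: bool) -> int:
    """Rough effective brute-force security of a symmetric key."""
    return key_bits // 2 if quantum else key_bits

assert effective_security_bits(128, quantum=False) == 128
assert effective_security_bits(128, quantum=True) == 64   # uncomfortably low
assert effective_security_bits(256, quantum=True) == 128  # why AES-256 is advised
```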
>> SAVYO VINICIUS DE MORAIS: I'm a bit pessimistic about that, not in terms of the encryption, but about how widely available quantum computers are getting. Well, this is still an issue. And in terms of messaging apps, I think this is the smaller problem when it comes to decrypting data protected with classical cryptographic algorithms. Most of the other Internet protocols also rely on classical cryptographic algorithms. So we need to update our HTTPS, our DNSSEC and all the protocols we are relying on today, and this is going to take a big amount of time. With IPv6, we have had 20 years to change all of that.
Well, going out of this chaotic scenario, the point is that we still have just a few quantum-resistant cryptographic algorithms, and they are still not being widely used.
In the specific case of messaging apps, I don't think it's hard to adapt to the new quantum-resistant algorithms, to adopt them in the cryptography schemes. Okay, I think it's working.
So I think it's not going to take a long time, once we have quantum-resistant algorithms we can be confident will work well, and that we do not see happen what happened, for example, with some hashing and other common cryptographic algorithms that became easily breakable under certain assumptions. This is always the battle in cryptography. We are still going to take some time to also have quantum cryptography. It will be easy to change in the messaging apps, but the protocols can be a threat for other problems, not for privacy, but for other situations.
>> JOSHUA AYAYI: Yes, and you remember earlier, I said the whole process of our communication is changing. And, of course, with every new system coming, there are different places where the old system has to adapt. And sometimes in adapting, you need to identify even new players that were formerly not part of the old system, and see if you can create the new system together with them.
And this is essential, especially when we are saying leave no one behind and nothing for us without us.
Then, of course, looking at the history of the Internet, we saw how the Internet was used for education and productivity in the West and in other developed countries, but when it began to reach the Global South, it became more about distraction and entertainment, and today in Ghana, more than 95% of the Ghanaians who are hooked on the Internet spend an average of five hours on social media.
This is something we are trying to change, because just 2% of Ghanaians on the Internet use it for productivity. So if quantum computing is now coming to change how data is stored and how data is accessed, then obviously there is the need to create an inclusive ecosystem of policy, regulation, implementation and enforcement, to make sure that no country is left behind. We do not want another situation where a developed country has all the resources to fight these things in the age of quantum computers and then sells them at an exorbitant price to Global South countries, taking our natural resources and other things.
Of course, that's not what sustainable development is about, and for one, I am happy that we are having this conversation now so it will be more inclusive and deeper as technology progresses.
>> OSEI MANU KAGYAH: Thank you very much, our able panelists. I would like to take more questions from the audience. This is a debate style and we would like to hear more from you. There is a question that came in earlier from online: "I am from Haiti. My question is about the deletion of images, of photos, on social media. I'm always told that when a photo is deleted on social media, Facebook, for example, it is not definitively deleted. I would like to know if in any case we can delete it definitively. If yes, what is the process?"
So we will come to that. And I will come to our audience: if you have any inputs or comments, please do share.
>> AUDIENCE MEMBER: Hello, I'm from here in Ethiopia. First of all, I would like to thank you for giving me this chance. This is an interesting session. I am a researcher, and I want to ask the Ghanaian panelist if they have a filtering mechanism by age group, especially for youth on social media. And then I want to talk about quantum also. There is ‑‑ sorry.
And the other thing: there is quantum-safe cryptography. Actually, it's not a reality yet, but people are trying to use some algorithms to protect even today's encryption mechanisms, so that even if there is an attack where someone takes the encrypted data and pushes it to a quantum computer to decrypt, they only get part of it decrypted.
If they can't observe it, they can't use that data. So by using some salting mechanism we can protect it. And the other thing: for Africa and other third-world countries, the quantum computer is not available. So it is not an issue currently, but we can keep it in focus for the future.
>> OSEI MANU KAGYAH: Thank you.
>> AUDIENCE MEMBER: My name is Oshekra from Nigeria. I'm a program manager and analyst. I want to make a comment concerning the question that was asked by our friend from Haiti regarding the deletion of data on Facebook. One thing most of us here would understand is that when it comes to things like data privacy, no matter how you pull apart the different issues, in some way they are all connected.
So when it comes to the specific issue of your images being deleted on Facebook, for example, most social media and messaging apps will basically say, oh, you consented to this.
But then it is imperative that we get these social media apps to give us the full picture of our data, and I think many of the Big Techs are trying to do that, because they get pressure from groups like the IGF, national groups and civil society. But we have to keep pushing them to show exactly how this data is being processed and how it is being deleted, everything transparent and in a structured manner, so that we can improve the trust people have in social messaging apps.
It's important to note that social messaging apps are very, very essential to the way we live our lives now, and approaching them by bringing out everything that is negative, or in a way that we don't understand, may actually be counterproductive. It's important that as civil society, as governments, as groups like the IGF and the ITU, we get these companies to be as transparent as possible and to follow the necessary data protection acts and laws. In Nigeria we have the NDPR, and we are working on a child online protection act at this time, and we try our best to ensure that every social media provider in Nigeria tells us exactly how our data is being processed and how it is being deleted. And that has to be transparent. Thank you.
>> OSEI MANU KAGYAH: Yes, I couldn't agree more with him. I have always questioned the transparency of Big Tech. And in one sense, I also ask myself, are we overregulating these companies? I will come to that. Any more questions before that? I think we have the question about the filtering mechanism and quantum computing.
>> JOSHUA AYAYI: So for the filter mechanism, Ghana does not have one. I only made an assumption that, supposing it comes to that point and the institution has to bear the cost, then definitely it will pass the cost down to those who are making money off of YouTube, the Ghanaians, and that would then mean that those Ghanaian content creators would have less money.
So where does the government of Ghana even start that conversation, seeing who it's going to affect?
And, of course, if there are fewer and fewer Ghanaian YouTube content creators, then what that means is that the stories of Ghanaians are no longer being told, and then you have less and less content on Ghana. Then, definitely, what we are already seeing, that we don't have many stories of Africans on the Internet, is what would happen. So when it comes to that, you see the different players. Just by having this discussion, we can map out the different stakeholders when it comes to having a filter mechanism, and then involve them in the policy discussions, to co-create a policy that will both uphold the values of Ghanaians and still make it possible for Ghanaian content creators to push Ghanaian stories out there. So that was just an assumption.
But if it happens, these are some of the things we need to look at.
>> NICOLAS FIUMARELLI: So we have 20 minutes to go. We have heard a lot about accountability, education and different things about improving the algorithms on the company side, and now I will share my screen so you will be able to participate in a Mentimeter. Mentimeter is a platform; you will see it in a moment. Let me ‑‑
>> OSEI MANU KAGYAH: As he shares the Mentimeter form, if anyone has a side question or a comment ‑‑ okay.
>> SAVYO VINICIUS DE MORAIS: I would like to add a little thing about the deletion of data in the apps. Basically, the current data protection laws require that if the user deletes their data, it should be deleted. The problem is how to verify whether that data was really deleted or not. So this is the point on accountability. But more specifically, in the cases of WhatsApp and Telegram: in WhatsApp, a message is stored on the server for 30 days, and in Telegram, it stays stored until you delete it, except in the case of secret chats, where, in Telegram, if one of the parties to the chat deletes something, the same data will be deleted on the other side, if I'm not misremembering.
In the case of secret chats, none of the data is stored in the cloud servers, and the secrecy of that media belongs only to the parties to the communication.
>> NICOLAS FIUMARELLI: So I think the participants in Zoom can see the link to the Mentimeter, but you can also use your mobile phone and go to menti.com ‑‑ we see the screen but we have the ‑‑
So you need to go to menti.com and put in the code, which is 58541704.
Sorry about that.
It's at the top of the screen. So there is a question for the audience, online and on site: what are the current challenges we face regarding the secure exchange of messages? So we can have more input.
>> OSEI MANU KAGYAH: Okay. We want to know the current challenges. So you just go to www.menti.com. It's spelled m‑e‑n‑t‑i.com. And use code 58541704.
I will repeat: www.menti.com, spelled m‑e‑n‑t‑i.com, and 58541704.
We have the question: What are the current challenges we face regarding the secure exchange of messages?
Some answers are coming in: transparency of messaging providers; we need to have strong privacy policies, terms and conditions; is there confidentiality of communication? I would like to continue this survey.
>> NICOLAS FIUMARELLI: I just want to say that the idea is to maintain a debate as we answer, so this can be more collaborative. Some comments that are appearing: stored data management; transparency; trustworthiness; they need to have strong DNS and IP. Very, very good comments.
And perhaps one of the panelists wants to comment on some of the things that are showing up in the Mentimeter.
>> OSEI MANU KAGYAH: We want to hear from you.
(No audio).
>> NICOLAS FIUMARELLI: Yes, thank you very much. And remember that this is totally connected to the Global Digital Compact and some of its key points about trust and security and ensuring digital human rights. So thank you so much for being with us. Thank you for your input.
>> OSEI MANU KAGYAH: Sorry. I would like to thank the online audience. It has been an insightful conversation. Let's keep the conversation going. We need to have an open, secure Internet. We need to have our data protected and to use our technology effectively.
And thank you everyone, over here, thank you so much for your time. Have a very pleasant day. Bye.