IGF 2023 – Day 1 – Open Forum #20 Benefits and challenges of the immersive realities

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> IRENE KITSARA: Good morning, everyone.  Welcome to the Open Forum on the Benefits and Challenges of the Immersive Realities.  I will start with the speakers, the onsite panelists, in alphabetical order: Adam Ingle, Global Digital Policy Lead at the Lego Group; Clara Neppel, Senior Director at IEEE; and Patrick Penninckx, Head of the Information Society Department of the Council of Europe, which is the organizer of this event.  In remote participation we have Professor Melodena Stephens, Professor of Innovation and Technology Governance at the Mohammed Bin Rashid School of Government, and Yu Yuan, IEEE SA President.  Welcome. 

My name is Irene Kitsara.  I'm the European Standardization Initiatives Director at IEEE SA.  I have the pleasure of being the lead of the upcoming Council of Europe and IEEE joint report on the Metaverse and its impact on human rights, rule of law, and democracy.  I will be your Moderator today. 

Let me start by asking Patrick and Clara Neppel why the Council of Europe is organizing this session and working on issues related to Emerging Technologies, and what the role of IEEE is in this. 

>> Patrick Penninckx: Thank you so much.  For the Council of Europe it is an imperative to always work at the edge of the developments of technology.

(Off mic)

The developments are compatible with the values of the Council of Europe.  And also in terms of the emerging new technologies, which encompass artificial intelligence and immersive realities, we need to see to what extent this coincides with or reinforces a certain number of challenges for the development of human rights. 

So I have to start again, I guess.  What I was trying to say is that the Council of Europe has always been at the edge when it comes to the development of new technologies.  Whether we look at the Convention on the automatic processing of personal data already 40 years ago, or at the Cybercrime Convention more than 20 years ago, for us it was important to look at the impact that Emerging Technologies, and in this case immersive realities, have on human rights, rule of law and democracy. 

That's why we thought it was important to work in partnership with IEEE on looking into the Metaverse and how the Metaverse would impact those human rights.  That's why we decided to organize this workshop here. 

>> Clara Neppel: Thank you.  Thank you for having us here.  My name is Clara Neppel, I'm the Senior Director of IEEE in Europe.  We're based in Vienna, Austria.  On the flight here, I watched a documentary about a famous architect who said: man creates buildings and buildings create man.

It is the responsibility of the architect to create buildings that make people happy. 

Now we're at a time when we are creating a completely new virtual reality, and we are the architects.  We cannot do it alone as technologists.  We need to create an immersive reality that makes people happy and cares for well‑being and, of course, human rights and society. 

We need to bring in different perspectives; that is what we try to do in this report: the technological side, the ethical side, the social side.  And yes, this is basically a bidirectional dialogue that we need to continue.  Thank you. 

>> IRENE KITSARA: Thank you, Patrick and Clara.  We are hearing the terms Metaverse and immersive realities, and in other sessions we also have related terms such as virtual worlds.  I think it would be good for our discussion to talk a little bit about these terms, as well as the technologies that are making such realities possible for us to experience.

With that, I would like to turn to you, Yu, to provide us with a perspective on this. 

>> Yu Yuan: The term Metaverse was coined in a science fiction novel some 30 years ago.  But during the past decade, this concept itself has been extended quite a bit.  (Audio skipping)

Let me share with you our definition of the Metaverse, which tries to be the most inclusive definition of the Metaverse.

In terms of the Metaverse, we could agree that we are talking about a digital universe.  From the experience perspective, the experiential view, there are three types of Metaverse: a digital alternative, a digital counterpart, or a digital extension of the universe.  These three different types of digital universes are (audio skipping) virtual reality, augmented reality and digital twins.  From that perspective, the Metaverse is a kind of experience in which the outside is perceived as a universe. 

But from another angle, the functional view (audio skipping) (frozen)

How about now, can you hear me? 

>> IRENE KITSARA: Now we can hear you, thank you. 

>> Yu Yuan: The Metaverse, from another angle, the functional view, can be seen as the next version of (audio skipping) the next stage of Digital Transformation.  Let's look at the Metaverse technology landscape.  We can say supporting technologies like computation, storage, communication, networking, data, knowledge and intelligence are all necessary to enable the Metaverse.  Then there are core technologies for the Metaverse.  The first is senses and actions; you can call that XR or spatial computing.

The second category is persistent virtual worlds, which we call persistent computing: how to create virtual maps, virtual scenes, objects and characters, collectively constituting virtual worlds.  The third is digital finance and economy, or consensus computing, which is about digital assets, (?) decentralization and blockchain.

From this technology roadmap or landscape, you can see that AI is an integral part of the Metaverse technology landscape.

With that being said, we can say that the Metaverse is the next big thing.  Why?  Because if we look at the history of digitalization or Digital Transformation, we're actually between two stages.  The current stage, which is already exploding, we call intelligentization, which is the rise of AI, using AI everywhere.  But the next stage (audio skipping), which is upcoming, is the Metaverse.  So we are in between two stages.

I can also add that, as we agree that AI is transforming the forces of production, the Metaverse will redefine production and life.  The Metaverse is the next big thing.  I will stop here. 

>> IRENE KITSARA: Thank you, Yu.  You touched upon the fact that we have different areas of application of the Metaverse.  I would like to turn to Melodena and ask her about some application areas, and then move to Clara Neppel and Adam to talk about the benefits arising from the immersive experience and the ways the Metaverse can also promote human rights, the rule of law and democracy.  Melodena, would you like to start? 

>> Melodena Stephens: Thank you.  When Facebook changed its name to Meta in 2021, the market estimated that the total market size would be 13 trillion U.S. dollars.

That estimate was later revised downward, but I don't think it is a wrong estimate at all.  The first reason is that the Metaverse is also hardware, and you see the doubling of computing power every 18 months.  And you see a lot of geopolitical tension pushing adoption of the Metaverse; you can see it in the 5G and proxy wars going on.

You see the Private Sector's tremendous interest.  In fact, the applied research from the Private Sector is greater than Government investment.  You see this in things like, for example, Microsoft's acquisition of Activision Blizzard for about 69 billion U.S. dollars.  And we see Governments are huge adopters; I will go through that briefly.  But we see a standards war coming, and it is being played by the Private Sector currently.  You see Pokémon GO, an augmented reality game, got 50 million users in 19 days.  That is a huge adoption curve.

You also see a price war happening right now, with Meta's Oculus headset priced at $500 and Apple's headset at $3,500, all in time for Christmas.  Gaming is driving the Metaverse.  There are more than 160 virtual worlds.  Fortnite has half a billion users and generates something like 6 billion U.S. dollars; a lot of the income is micro purchasing.  We can't ignore other players which have huge numbers: for example, Meta with 3.8 billion users; Microsoft with most of the Fortune 500, and they have Microsoft Mesh and Activision, which has 92 million monthly users, and Minecraft, with a significant number of children.  Apple has 1.5 billion users entering into the payment circle, Google has 4.3 billion and Tencent 1.26 billion.  And NVIDIA, the typical hardware provider, is entering the space.  The crossovers are interesting and it is hard to determine the market.

Industry applications are, for example, in digital twins.  We see countries and cities adopting them: the UK has a digital twin strategy, South Korea has one, and we see cities adopting it.  We see manufacturing, with factories adopting and creating digital twins: Siemens and BMW in Germany; utility sectors, such as Sydney Water; petroleum, oil and gas; and 900 cities with Smart City programmes.  The Internet of Things is pushing the adoption of the Metaverse; we have 125 billion connected devices in 2023.

We see Government, which historically has contributed around 40% to GDP, perhaps at the higher end, also entering.  For example, in tourism: during the pandemic, Dubai was present with the World Expo.  They had 24 million physical visitors coming to the site, it was COVID after all, but 125 million virtual visitors.  And this is part of their legacy.  We see KSA, and Finland in Minecraft with a 3D version of Helsinki as a city.  Education is an adopter that is typically pushed by engineering and health; that is where a lot of research is happening now.

There has been a first surgery, but it is more about access to digital records, and some work is happening on customer care.  A lot is on reskilling.  For example, Accenture bought 60,000 Oculus Quest headsets in 2021 for its employees and created the Nth Floor for training and networking.  And retail is getting involved in the Metaverse; typically now it is more about experiences, with brands testing it out.  We have luxury brands like Gucci and Burberry, and fashion brands like H&M and Forever 21.  You name it, they are there.  There is no doubt we will reach 13 trillion.  It is a function of standards, or who will win the standards war, and of what happens with regulation.  I will stop there right now, Irene. 

>> IRENE KITSARA: Thank you, Melodena. 

>> Patrick Penninckx: Well ... (off mic) Is it on?  Yes, it is on.  Okay.  The question that Vint just asked in the high‑level opening remarks was: what is the Internet we want and what is the Internet we deserve?  These are two different questions.  The same goes for the Metaverse.  What is the Metaverse we want and which one do we deserve? 

I think if we want to create a Metaverse that is respectful of human rights, that will enhance freedom of expression, that will be inclusive, that will be accessible, that will foster global connections, we need to put those mileposts and benchmarks in place.

That is why we cannot just let digital development happen.  We have to be able to steer that digital development.

I wouldn't say that we need to steer innovation; I think that is for companies to do.  But we need to put those benchmarks in place to make sure that within the Metaverse there are also innovative educational opportunities, that there is democratic participation, that there is digital rights protection.  At the level of the Council of Europe we very often say: the protection of rights that we have offline, we also need to have online.  The Metaverse is the next step up, with the Internet of Things, connected realities, 5G and quantum computing all interrelating, and certain industries are already very far ahead, as was said earlier on.  For example, testing in the Metaverse how it feels to be underwater.

These are innovations that we need to be able, not to grasp, but at least to ask: what usage do we want them to have in the future?  I could imagine that it not only gives you the feeling of jumping off a cliff into the ocean, which is a fantastic use of the Metaverse, I guess.  But if the Metaverse can be used for waterboarding, that may be a completely different reality. 

So we need transparency.  We need accountability.  We need digital rights protection, and I think the experience already shows that we need to be able to give a certain guidance on that.  We're trying to do that for the technologies that are being developed.  Right now, we're developing a regulation on artificial intelligence, a framework Convention, to deal with this.  We hope to finalize that by the middle of next year.  But the Metaverse is also part of our future work plans.  And the fact that we can work together with IEEE on those kinds of things seems to me essential.  Because, as we said before, it is in this multistakeholder context that we need to be able to discuss this from all angles: whether from the technical community, from the engineer's point of view, from the business point of view, but also from an ethical, Civil Society and academic point of view, and be able to govern all of that. 

I think the benefits are there.  We can work towards the promotion of human rights, rule of law and democratic participation, but it is not going to happen by itself.  We have seen that with the development of the Internet.  The Internet has given us a number of opportunities.  We want it to be open and transparent and flexible and worldwide, but we're increasingly getting a more fragmented world.  And we also know that if we just let things happen and we get the Metaverse we deserve, we may not be getting the Metaverse we want.  And I think that's important to look at from a human rights perspective. 

>> IRENE KITSARA: Clara Neppel and Adam on the benefits? 

>> Clara Neppel: Well, I think we heard quite a lot on the benefits.  I was thinking again, on my flight to Japan, that immersive reality probably already contributed to making this flight, and your flight as well, safer.  Because the pilot was probably trained for hours and hours in immersive realities to master a situation which he hopefully never, or only very rarely, encounters.  This is already an immersive reality that helps us.  And we hear now about generative AI.  Generative AI will also revolutionize design.

We have the car industry, which is already testing out different design options in different immersive realities.  And I think we are moving on now: we heard about the digital twins of cities, and somebody asked us to try to map this to the SDGs.  So I will try.  The obvious one, of course, is SDG 9, industry, innovation and infrastructure.

But if we go to the digital twins of cities and even of the planet, of course, we are also touching on SDG 3, sorry, 13, on climate, and also sustainable cities.

And we are moving to the digital twins of ourselves.  I think that this is where our collaboration with the Council of Europe is going to be essential.  Because there, we are entering a realm that we certainly cannot handle alone when it comes to human rights, democracy and rule of law.

And so, digital twins of ourselves, what does it mean?  It means inclusive health and healthcare, SDG 3.  Education was already mentioned, SDG 4.  But what is very close to my heart is really SDG 17.  That is partnership, partnership for the common goals.  And I think that this is going to be a real game‑changer.  If we're thinking about climate change, we see quite a few measures which are very difficult to implement because citizens don't understand their full impact.  There is a fear.  What does it mean if a solar panel is close to my field or a wind turbine is nearby?  What does it mean if my city will implement new measures in terms of traffic control?  This is something we can try out in virtual reality, and it can really enhance the democratic participation that Patrick talked about.  Thank you. 

>> Adam Ingle: Thank you.  I think the benefits of ‑‑ I'm from the Lego Group, I will focus on what it might mean for children.  And really it has tremendous potential to amplify the things kids care about.  We have undertaken research alongside UNICEF to understand what is child well‑being online and what components and building blocks make children feel like they're in a positive space.

One of them is social connection.  The Metaverse and its interoperability is something that can impact children in a way that is unprecedented.  You have an avatar that is building experiences and has built up a history online.  That conveys a unique sense of yourself to your peers and other kids.

So you can connect in a way that you haven't been able to before.  That is really what kids value. 

You can create in a way that you can't do offline, even with Lego bricks.

You are really able to build the worlds around you.  You have seen the power of Minecraft, Roblox, and what is happening in Fortnite.  These are all early Metaverses.  As technology improves the graphics and layers of services, the potential is huge.  Children learn through creation; that is what we have found.  They can do that in an even better way.

You can also empower kids.  You know, they have a sense of identity, they're online, engaging, building their own lives there.  They really value this kind of sense of empowerment and can often find interactions patronizing, but they have a right of access to the benefits of technology, and the Metaverse is an avenue for that.

They can learn, create, connect, do all of these things.

I know we are getting to the downsides later, but I want to say there is a massive caveat to all of this: these things need to be done in a responsible way, particularly with children.  Social connection, for instance: we have seen the harms that come through an unconsidered approach to those types of things.  So the benefit is there, but it needs to be done right.  Hopefully, that is a good segue. 

>> IRENE KITSARA: Absolutely.  Thank you for that.  We have the challenges in the title of the session, and this is part of what a lot of sessions at the IGF are addressing: concerns that come with Emerging Technologies and applications.  I would like to address this question to all our panelists: what are the challenges that could arise from Emerging Technologies and what is the potential impact they can have on human rights, rule of law and democracy, remembering the organizer of the event.  Let me just give a bit of background on what we have covered in the upcoming report.  On one side, we have looked into the enabling environment that immersive realities and the Metaverse can create for exercising human rights, the rule of law and democracy.

On the other side, we have looked into issues such as data protection, privacy, protection of children and vulnerable populations, access and accessibility issues, inclusion and nondiscrimination, freedom of expression, censorship, the labor environment, and issues related to the rule of law such as territoriality, enforcement, access to justice, and democracy.

Before we all despair, let's start with some of the issues.  I will start with Clara Neppel and then move to Patrick Penninckx, Melodena Stephens and Adam Ingle. 

>> Clara Neppel: Thank you.  I already mentioned that we have very practical examples of virtual reality: autonomous cars being tried out in different scenarios.  Even there, there are certain ethical questions.  A cow on the street might have a completely different value in India than in Europe.  If we have digital twins, or avatars, or digital humans, we're entering a new territory.

Digital humans are now interacting in a seamless, interconnected space.  Who is going to control that space?  Until now, these immersive realities, and also the rules of engagement, have been designed by private actors.  Now, if we have something like a public space, who will decide who may enter that space?  What is acceptable behavior?  And when, or whether, someone should be excluded?  We want the space to be as inclusive as possible.

We see a paradigm shift from the moderation of content that we know from AI and social media to the moderation of behavior and the moderation of space.  What does it mean to be addressed in a virtual space? 

And again, if we are discussing virtual spaces, what is a public infrastructure?  To what extent can people co‑create that infrastructure?  And what does it mean for ownership?  We already see our children in Minecraft creating magnificent cities and so on.  What does it mean if that is incorporated into a private virtual space?  Who has ownership? 

And again, who is dictating the rules?  In the digital space we have open source, with the governance of, you know, who is actually controlling what code is getting into it.  Some time ago we had something like a dictator, somebody dictating which code should be part of that software.  Are we going to have something like this in the digital space?  Hopefully, it will be democratic.

And it will influence our world views.  Because we will have a completely different perception of an environment if we are immersed in it.  Who is going to control what this is going to look like?  What does it mean for the perception of history and the perception of reality as such?

Then, as we heard, there is privacy.  We are entering a new space.  We will have technology that is omnipresent.  We have to get away from the technologies that we see now, the headsets, and think about technologies that are upcoming.  Last week at a Paris fashion show, something called the Humane AI Pin was presented, a very small pin which is there all the time, registering basically everything, recording everything.  It is a digital assistant, a Star Trek‑like assistant.  The question is, what would it mean for this conference if we had such a technology, recording all the time what is happening, who says what to whom and possibly what feelings they have.

So you can imagine the type of information asymmetry that we will have, and all the power of those who can also predict certain alliances, certain power gains in the future.

You can see we have new aspects to existing ethical challenges, like privacy, bias and accountability, and we have some new challenges.  We had Tom Hanks, also last week, telling us that there is a digital Tom Hanks around who is apparently promoting dental care; he has nothing to do with it.  We will have more digital twins that are going to copy our physical features and also our characteristics, the way we talk and feel.  How much do we actually control these digital selves or these digital feelings? 

And are we going to need authentication of content and of digital humans?  Last but not least, I want to conclude with safety.  I think that safety is going to play a completely different role than what we are discussing now in terms of AI.  Maybe some of you have heard the advertisement that the Metaverse is virtual, but the impact is real.  I think that is very true. 

Of course, you will have a very real impact when it comes to healthcare; but if it is not designed well, it has a real impact on the patient.  This makes the need to design it the right way a very important one. 

>> Patrick Penninckx: Now, human rights activists and organizations that stand for human rights are often seen as a little bit alarmist and as not seeing the positive side sufficiently.  But it is also for a human rights organization to be able to point that out.  Let's say the evangelists, if I may call them that, of future developments, including immersive realities, will point at the advantages. 

They also make serious efforts.  I'm not speaking for that business community now, but it is not as if that business community goes about developing things in a completely unethical way.  They put quite a number of resources into place.  Meta was unfortunately not able to participate in this panel discussion, but I know they make a lot of effort to ensure that ethical principles, human rights principles and legal principles are also being respected.  Adam will certainly say something more about this as well.  Because that is their prime concern.  Well, not their prime concern; the prime concern remains doing business, obviously. 

The question is not so much how many ethical principles are being put forward in business; it is to what extent this universe is going to be regulated by private business, or to what extent a democratic society, with the principles that it endorses and tries to promote, has an impact on the development of this new immersive reality. 

None of us here are immersive natives.  I'm an analog native.  Some of us may be digital natives.

(Chuckling)
Not looking at anyone in particular.  But none of us are immersive natives.  We will have to look into a completely new reality of which we do not necessarily yet see the contours.  And in order to be able to see those contours, let us not be naive. 

I'm old enough to have seen the start of the Internet and the positive feelings about democratic governance, participation and the improvement of, let's say, grassroots democracy.  But we now see that that was maybe a little bit naive, and we see that there are a number of things we need to ensure, especially when our societies, instead of growing more democratic, are getting more defensive about human rights.  We're regressing.  We're backsliding.  So let us see what that means.  If some of the information and data that have been collected even until now fall into the wrong hands, I think we are very badly off. 

Now, the Metaverse and the immersive realities also allow for new forms of crime.  They raise new questions with regard to jurisdiction.  Who is going to be judge and party?  Can we be judge and party?  Should we not divide that?  Should the ones who decide on how the developments take place not be separated from those who take decisions with regard to jurisdiction over it? 

Now, we have spoken about privacy.  Clara Neppel mentioned it before.  We're getting into a new dimension of privacy.  Because in order to create an immersive reality, new forms of data, including biometric psychography, are collected and recorded.  These are very intimate, even more so, I would say, than our health data, which are sensitive data.  How are they going to be governed?  If, who was it, Tom Hanks complains about deepfakes, I think in the future we will deal with something which is far more immersive than that. 

I think we're moving towards a situation where, in order to be able to represent yourself through an avatar, you basically have to have a complete picture of yourself, including your experiences, et cetera, to make it more realistic.  We'll be in Turkiye; in 2034, will the IGF take place in an immersive world, Irene?  So these are the things we need to ask.  What are the consequences of that for privacy and digital security?  How do we identify ourselves?  Not only Tom Hanks, but everyone in our room here.  And what about anonymity, can we still be anonymous?  We're outraged about video surveillance, and some countries and some cities are excelling in that.  But what about anonymity and private life? 

At least in the European Convention on Human Rights, privacy is one of the pillars, Article 8.  What about freedom of expression, and what about its counterpart, disinformation and misinformation?  We see, especially now with the ongoing war, how misinformation and disinformation are being used in a 1930s‑like manner, but much more efficiently, to stifle freedom of expression but also to control populations.

Immersive reality can only add an extra layer to that.  I think we need to not be naive in thinking that everyone is nice.  Not everybody is nice.  At the IGF, of course, everyone is nice.

(Chuckling)
But there are other people out there who may not be so nice and have different intentions about how your private information will be used.  Let's also think about inclusivity.  The speeches earlier today were all about how we can connect the next 2.6 billion people to the Internet.  But how are we going to connect the next 8 billion people to the Metaverse?  Who is going to be included?  What are the elements of inclusion?  I see the potential for educational purposes, and so on. 

But in order to benefit from the educational goals, we have to ensure that people can actually participate.  So inclusivity, accessibility: how are we dealing with the Digital Divide?  Not only worldwide, but also within our societies. 

And that is something that has also been shown during the COVID crisis: how the Digital Divide within our countries has been extremely difficult to overcome.

So, governance and accountability.  It is good to be accountable to yourself, but then you can also get away with certain things.  I try to be accountable, but I'm not always so accountable.  Don't tell anyone.  But that's the reality.  If you are judge and party, you can't be totally objective.  We need, in this multistakeholder approach, to come to a common understanding.  I think this IGF also points at that: on the basis of a number of common principles and common values, how do we want to see the next step, not only in Internet Governance and artificial intelligence, but also in terms of immersive realities, and how are we going to position ourselves towards that?  Are we going to be naive in hoping that the next generation will be simple, and will we be defensive or not? 

>> IRENE KITSARA: Thank you.  Let's now move to Melodena.  And being aware of time, I am asking all the speakers onwards to be conscious of that so we leave time for the Q&A.  Melodena. 

>> Melodena Stephens: Yes, I would like to talk about Article 23 of the Universal Declaration of Human Rights, which says everyone has the right to work, to free choice of employment, to just and favorable conditions of work and to protection against unemployment.  The Metaverse is data hungry; it basically consumes your data, as Clara Neppel and Patrick have mentioned.  The consequence is that it will remove jobs.  For the first time, the World Economic Forum in its 2023 report said that AI technologies like the Metaverse will cause a net job loss, not a net job increase.  And that means we will not be prepared.  Because now skills don't matter, your experience doesn't matter, because it is all saved on the Metaverse.

And the cost of not preparing people to find or keep jobs will be something like 11.5 trillion for training, and even more if you look at things like pensions or Social Security.  The bigger worry is that the jobs being created are often low‑paying jobs.  The human being is coming to the bottom of the supply chain, right?  We see this already, because some of the jobs are tagging content or content moderation.

I will give an example.  Roblox has a very active community with 4.25 million developers.  If you want to earn on Roblox and convert its currency, Robux, to actual U.S. dollars, you have to make a minimum amount of money.  Of those 4.25 million developers, only 11,000 qualified.

This has a direct impact on health, which is another universal right, right?  And the impact is on well‑being; the uncertainty of whether I get to keep my job is especially important.

This raises questions on IP.  My learning, experience and skill sets exist because of the years I spent on them and are uniquely mine.  Do I have IP on this?  We see another important thing coming in, which is behavioral addiction to technologies like this.  I mentioned at the beginning that a lot of the Metaverse has been built from gaming, so we try to gamify behavior.  And we know that children, for example, and not just children but adults too, can get addicted to games; gaming disorder was declared a psychiatric disorder by the WHO in 2019.  But as we start putting this into our daily life, in shopping, work and education, at what point will the so‑called magic circle, the boundary between reality and imagination, disappear? 

This is something we aren't actually putting enough research into.  I would like to very briefly bring in the environment; Clara Neppel mentioned that.  The Metaverse is something that requires huge amounts of data and computing power, hence it has a significant carbon footprint.  Take the semiconductor chip embedded in most technology, such as your mobile phone or laptop.  The average chip, when you take all of its components, travels 50,000 kilometers, and chips are embedded in 169 industries.  So we're looking at environmental costs in terms of carbon and in terms of water.  Because chips are not ‑‑

>> IRENE KITSARA: ‑‑ are not recycled.  E‑waste is growing exponentially and less than 17% is recycled.  Something like mercury, we see it in fish across oceans.  It is not contained.

I want to briefly mention one more thing.  Cultural representation is extremely important in the Metaverse.  I think it will be something Nations have to consider: whether it is stereotypes that are represented in the Metaverse, or how you actually do that.  So with that, Adam, over to you. 

>> Adam Ingle: Thanks.  I will keep it brief because a lot of the challenges have been discussed.  I think one thing that has come out, though, and always comes out in these discussions, is how many of the issues aren't unique.  They exist today, we're still grappling with solutions today, and now regulation and legislation are forming a response to these issues.

So I think we'll actually have to wait and see how the issues in Web 2.0, and the regulatory and cultural responses to those issues, play out to see if we start in earnest with the Metaverse on a better playing field. 

When it comes to kids and the challenges they face, from the Lego Group's point of view we want to create a kid‑friendly ecosystem: one with high safety standards, responsible design, and limited avenues for harmful contact, conduct and contract.  And in order to do that, to get a truly immersive ecosystem, we need others to join us and share our standards.  We can create all the great Lego experiences, but the Metaverse is connected and interoperable.  We need to lift our game and have a collective approach to address the harms children will face.

>> IRENE KITSARA: Adam, thank you for leading into the last question.  Again, because of time, I ask the rest of you to keep it brief.  We're at the IGF, so naturally the last question is around the Governance of the Metaverse.

Could you share some key considerations?  The issues we have been hearing about, and the considerations and challenges, are very much, I think, known issues from ongoing or previous discussions related to AI, generative AI, social platforms and gaming.  How can we address some of the challenges we heard?  What could be some of the considerations and elements we should bear in mind while considering the Governance of immersive realities?  Patrick, would you like to start?  Or Melodena?  Melodena, would you like to start? 

>> Melodena Stephens: Sure.  When we look at Governance, I want to quote something from the ITU in 2003.  The IGF committed to the WSIS principles, which speak of a commitment to a people‑centric, inclusive, development‑oriented society.  I wonder if sometimes we put technology before people.  We see a lot at the national level in terms of policies: the OECD reports 800 AI policies, most of them in North America and Europe.  We see a lot of data regulations, 62 countries with 144 data regulations.  Most of it is fragmentation.  The Metaverse will be global and it really requires collaboration across Governments.

The few Governments that have policies on the Metaverse mostly recommend self‑governance.  I think this is because of the adoption curve.  You see South Korea came out with ethical principles.  The Agile Nations, a coalition of several countries, is coming out with a report this week, and again it talks about self‑governance.  China for the first time has actually said you can file trademarks for NFT virtual goods.  And Australia has a white paper on standards.

Self‑governance, because the time to put together an overarching policy would simply be too long.  We need the Private Sector to work with us on that.  There are standards coming out.  Look at something like the Metaverse Standards Forum, which is an association with 2,400 members, most of them from the Private Sector.

One of the challenges I would like to bring up is open source.  The Metaverse builds on top of open source, with a proprietary layer on top, and this creates a problem.  Take, for example, a database of faces.  MegaPixels had a dataset of 4.7 million faces scraped from Flickr; today you could do it from Instagram or YouTube.  80% of that came from these sources, and it was used in 900 research papers.  So open source does have challenges I would like to highlight. 

Another one is Apache software; there is something called Log4j.  This is responsible for logging the 404 error messages you see.  They found a problem in the code that created a vulnerability.  What is interesting is that it is embedded everywhere: Amazon, Apple, Minecraft, all Java systems.  That is 3 billion devices.  We will see that this problem will persist.  It is not really about how much foresight you have but how quickly and transparently we can work together.  If we penalize the Private Sector for being transparent, they will hide it and make the vulnerability worse.

That is something we need to figure out.  We find that there isn't a clear way forward.  For example, Barbados wanted to put an embassy online, but the Vienna Convention talks only about physical embassies.  These are countries with limited resources, and virtual embassies work for them.

I want to highlight one more thing.  Most Governments represented in the Metaverse are represented on top of Private Sector platforms, using something like Decentraland, and The Sandbox is working on that.  This also raises interesting questions, at which point I would like to stop and hand over. 

>> IRENE KITSARA: Thank you, Melodena.  Patrick? 

>> Patrick Penninckx: When you speak of Governance, there are a number of Governance principles that are enshrined in what we have done on data protection and cybercrime, and questions related to responsibility, transparency, explainability, revocability and the right to contest; all of these elements need to be looked at.  Obviously, when we started to work on the new Convention on artificial intelligence, the first thing we did was some kind of feasibility study: look at what ethical principles are already out there and which are applicable, and what legislation is out there that would be applicable to the Metaverse.  And then look at where the gaps are.  Once you have identified the gaps, look at which elements could constitute a future Governance framework.  I think I will leave it there. 

>> IRENE KITSARA: Thank you, Patrick. 

>> Clara Neppel: Thank you.  I think what we hear now more from the Private Sector as well is the need for interoperability of regulation, of regulatory requirements.  One way to achieve this could be through global standards.  It is important to say that standards are there to move from principle to practice; this is the top‑down approach, and it is important.  But we also see a bottom‑up approach: at IEEE we have been working since 2015 on the Ethically Aligned Design initiative, which resulted in a set of standards for value‑based design, which can also be used for the Metaverse, from defining more closely what transparency is to age‑appropriate design.  Adam is part of that.  So I think we need to bring together these top‑down and bottom‑up approaches in order to create a framework which works for everyone.  I think I will stop here because we want questions as well.  Thank you. 

>> IRENE KITSARA: Yes.  I would now like to turn to the audience and see if you have any questions for the panelists.  I hear we also have an online question; maybe we can start with that. 

You can think in the meantime. 

>> With the increasing number of users in the virtual realm, there is a plethora of data, from eye tracking to brain activity and heart rate.  How do you envision the governance and regulation of such intimate data in the Metaverse? 

Furthermore, what steps need to be taken to ensure biometric data remains private and to avoid misuse?

>> IRENE KITSARA: I can address what we have identified in the report; maybe that will give an overview of some of the issues identified by the experts.  So indeed, we will be looking into much more invasive practices of supervision and censoring.  Experts have been looking into the idea of rethinking privacy, rethinking what it means.  There are different defenders of the introduction of so‑called neurorights; in Chile, this is covered in their constitution.

On the other side, there are issues around bystander privacy: not just your own privacy, which you can consent to, but also, for example, that of people who may be in the same room with you and don't know that they are being recorded with you.

So there is indeed a plethora of questions, and there are different views around the Governance of that: whether there may be some self‑regulation or self‑governance principles that could help, or whether we should be looking at a reinterpretation of existing law or the introduction of new law.  Do we have any questions?  Please, the gentleman. 

>> ATTENDEE: Thank you for allowing me to participate in this panel, this Town Hall.  Very interesting.  We are talking about immersive life, technology or maybe existence.  From that perspective, we in Poland, because I'm from Poland, have a different consideration.  The biggest tension is not about freedom of expression or personal data and privacy, but much more, maybe the main tension of the future: freedom of conscience.  I would like to ask you how to deal with this.  Not only as part of fundamental rights; from the technical point of view it is of course challenging, I understand this.  I thought it was worth putting the question on the table.  Thank you. 

>> Patrick Penninckx: I remember chanting in one of the demonstrations in Belgium in the 1980s that thoughts are free.  I don't know if thoughts will still be free.  That's freedom of conscience indeed. 

Of course, I don't know to what extent, once we start looking into the interaction between man and machine, and we see that technology already enhances or has the capacity to influence our behavior, to what extent it will influence our thought processes. 

I think our thought processes are already being influenced by the messages that we receive very directly.  Otherwise, how can you explain that entire populations can be influenced in a certain manner?

When I looked at the Edelman Trust Barometer, I saw that in authoritarian regimes the trust in public media is the highest.  This seems contradictory, but it is also quite revealing of how a regime, whether a private or public entity, can actually influence the way people, maybe not think, but at least act according to what is expected from them.  So freedom of thought and freedom of religion, which are also enshrined in the European Convention on Human Rights, are definitely things at stake and need to be looked at.  Thank you. 

>> Clara Neppel: On this question, and it is my personal view, I think we are talking more about moderation, practically about content moderation, and whether it should be private or public. 

Probably, in order to have a certain balance, we need multistakeholder moderation at some point.  Since we are here at the IGF, I think this should be at the heart of the discussions, because, as Patrick mentioned before, a democratic process cannot happen if you don't have anonymity, first of all.  That is important.

You can't exercise your rights as a citizen.  That is my personal view. 

>> IRENE KITSARA: (Off mic)

Briefly on that: in the report we identify mental privacy, mental autonomy and, practically, a reinterpretation of notions like freedom of expression and what they mean with technologies that have the potential of changing not just our perception of reality but our thought processes and our ... thank you. 

>> ATTENDEE: My name is Michael, I'm the Executive Director of UCLA approximately.  I want to pick up on what you said about content moderation.  As far as I understand it, the tools to moderate content effectively at scale do not exist for these technologies.  So it is fine right now, as long as adoption rates are where they are.  But if these things take off rapidly, there is no actual way to follow the standards that already exist for traditional social media platforms.  This is a legal, policy and technical challenge. 

>> IRENE KITSARA: Yes.  Because we will need to ‑‑ please. 

>> ATTENDEE: I'm Steve Foster from UNICEF.  We also did a short report on the Metaverse and children and some of the rights.  Hopefully, that was useful.

My question is around ‑‑ I'm sorry, maybe it is too big a question for this time, but I'm from South Africa originally.  Your thoughts on how the Metaverse will play out over time?  Not everybody can afford the $500 or $3,500 headset, and not everybody will.  Whether the technologies will scale globally is a question; I think they will.  But it will look different for users in Johannesburg or Cape Town than for, perhaps, some children in New York. 

We have already seen signals of this, of people beginning to talk to cloned characters.  You might be doing it in WhatsApp; it doesn't have to be an immersive environment.  It is going to normalize talking to AI, where you are not sure if it is a person or not.  Any thoughts on how it might play out?  If there isn't time now, we have the next few days; I would love to have a coffee and pick your brains. 

>> Melodena Stephens: Can I answer?  Or was it Adam that will go ahead? 

>> IRENE KITSARA: Please? 

>> Melodena Stephens: I was going to say one thing.  When we look at the Metaverse, generally the standards come from the I.T. or technology Sector.  But we see health coming into it.  It is really important that we don't approach this in silos.  Ministries have to work together.  Health has to sit with the social sector.  If you see an impact on people, communities and societies, you also have to work with technology.  That is missing right now.  Content is being developed for schools, and I don't think there is a psychologist or sociologist involved.  In Adam's company they do that, but for many it is not true.  These technologies will get cheaper and cheaper; it is happening.  The technologies are only viable at scale; that is how they will work.  There is a danger they will be affordable and embedded, and we can't get rid of them.  Think of ChatGPT: everybody is using it and now we want to know how to regulate it.  We're at a wonderful time, a 10‑year window, to have these conversations and come up with the safeguards.  I think these dialogues are so critical.  Thank you. 

>> IRENE KITSARA: Thank you, Melodena.  I think we need to stop here.  But talking about partnerships, I would like to share with you the result of the digital partnership between IEEE and the Council of Europe.  Stay tuned for the upcoming report on the Metaverse and its impact on human rights, the rule of law and democracy, which is expected to be released in early 2024.  Thank you very much, and thanks to our panelists, the organizer and our hosts, of course. 

(Applause)