The following are the outputs of the real-time captioning taken during the Fourteenth Annual Meeting of the Internet Governance Forum (IGF) in Berlin, Germany, from 25 to 29 November 2019. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> PEGGY HICKS: All right. Thank you, everybody, for joining us. My name is Peggy Hicks. I'm at the Office of the High Commissioner for Human Rights, often referred to as the longest title in the room. If anyone can compete with that, let me know afterwards.
We're glad that, with everything packed onto the agenda, you've come to talk to us about this project at the human rights office, which really grows out of many of the types of conversations I've heard about what it means to bring human rights to new technologies and really start applying them in practice. And, in particular, to look at the U.N. guiding principles on business and human rights and ask what they mean. What can we expect of companies?
I was just in another session. For those of you who were there, I apologize for repeating, but one of the big things we hear from companies is: We know the guiding principles, they're the cornerstone, and we're committed to implementing and respecting rights in accordance with them, but they pose challenges when we try to apply them to new technologies like, for example, facial recognition. It's not as straightforward. We don't have prior examples or case studies that can be used in terms of how we do this, the same way we might in the apparel industry or the extractive industry, where there's been a lot of work already done on how the guiding principles can be used in that space. So the B‑Tech Project grew out of the idea that we needed to get more practical and start developing guidance on some of the key questions that private sector companies are facing in trying to fulfill their responsibilities under the U.N. guiding principles on business and human rights.
So I'm going to do a brief introduction of the project, and then I'm going to turn it over to Mark Hodge, who we're incredibly fortunate to have working with us on this project and who is sort of the brains behind many aspects of what we're trying to do.
For those of you who have not had a chance to look at the materials on this, we have created a web portal on our website, www.OHCHR.org, that contains the background on the B‑Tech Project. In June of this year, we introduced a scoping paper on what we wanted to accomplish with the project and solicited comments on that paper over the last few months. Now the final version of that scoping paper is online at the portal.
We really want to thank those who came in with comments in support of that effort. We had all sorts of engagement from state representatives, civil society, and many others. We also have a blog post there that talks about the key trends and takeaways from the public consultation process that we had around that scoping paper.
So while that paper was being finalized, we've done a couple of things. We've had a number of informal consultations with companies and civil society organizations, CGI, where we've talked about the project. We've also done two multistakeholder outreach events, one in South Africa and one in Seoul, South Korea, trying to make sure that this is a project that is really global in the way that we're looking at it. It's not just focused on Silicon Valley and companies in the west.
The Business and Human Rights Forum is unfortunately taking place simultaneously with IGF, so people are shuttling back and forth, but we had a session relating to the project at that forum in Geneva yesterday as well, looking specifically at remedy issues and how we may be able to address them in the context of this project.
So the next step for us is that we're working on some foundational papers that are going to restate the core concepts of the guiding principles as they relate to tech companies, and they will sort of form the basis for the practical work that's being done. Then we move into the phase that we're all very much looking forward to, which is working on case‑based scenarios derived from the companies and participants involved, where we're going to work through in a collaborative way how we answer some of the critical questions that are being faced in applying the guiding principles in the tech sector.
I have other speaking points here that I could go into, but I think it's better to turn it over to Mark. We think there are a lot of reasons why it's important for both companies and governments to support this effort. We're very excited about the fact that it's not simply bilateral. We're working across companies so that we come up with ways of approaching these issues that will be shared, rather than company by company defining these things for themselves without necessarily the full level of input.
And we think it can really be helpful to provide a safe space for companies to work through the challenges they face. We've found within our business and human rights work in general that peer learning ‑‑ you know, we can be at the table as a resource, but often it's the peer learning amongst the companies that is the most fruitful in discussing some of these issues. We're hoping to provide a good space for that to happen.
With that initial introduction to the project, Mark, I will turn it over to you to speak more about the substance here.
>> MARK HODGE: Thank you, Peggy. Good afternoon. I think it just turned afternoon. Gosh, this is a big room. It gets weird when you can hear yourself echo back to yourself. Really pleased to have you all here. Thank you for taking time to be with us.
As Peggy mentioned, my name is Mark Hodge. I've spent the best part of 15 years working in human rights, more recently in the last 10 years or so focusing on the original U.N. "Protect, Respect and Remedy" framework in different sectors.
I have become very interested in and moved by the idea that we can apply the guiding principles in the context of the technology sector and some of the challenges we're facing. While it's a new area and we don't have enough good examples and experiences yet, I see vast lessons that can be learned.
I come with that hope. I also come to this with a very limited or new level of understanding around the technology sector compared to all of you in the room. That's part of the value of having these consultations as well.
Just to reiterate what Peggy said, and to put a finer point on it, we're at a key moment in the project in terms of moving from conceptual framing of where we want to focus into formalizing research partnerships, setting up consultations for next year, and being clear on what our deliverables will be next year as well. So I will make sure that we have time towards the end of our session together to share and talk about what the concrete plans for next year are.
I wanted to split our time here into two parts. One quick part: a bit more substantive context on the project, the logic behind it, and how we're approaching it, so the substance at a general level.
Then most of our time is spent diving into what we call the focus areas of the project. There are four focus areas. As we get to that point, I'm going to need some help from you in the room to understand how much you've already engaged with the substance and the materials we've sent out because that will help me figure out how to manage our time together. We want to make sure that we allow time for you to input into this.
So in terms of the context of the project, and building on what Peggy has said, we are very clearly focused on how we can apply and leverage the ideas, the spirit, and the approaches embedded in the U.N. guiding principles on business and human rights in relation to a whole range of challenges around new technologies.
I will say that we are very acutely aware that we also need to be honest about the limits of where the guiding principles take us, right. The guiding principles are not the mechanism through which we can galvanize states to fully meet all of their state obligations. That's a critical aspect, right, of what happens in this field.
There are also other parts of international human rights law that are connected to and referenced in the guiding principles ‑‑ they form part of the normative basis, but they're mentioned only in passing. So we do want to touch on those. One good example is humanitarian situations and what that may say about technology companies. But this is definitely a starting point.
Consistent with that, we're focused on all three pillars of the guiding principles. Lessons have shown us it can be problematic if you focus too much on the second pillar without thinking about the role of the state and what the state can do in reference to remedy. We touch on all of those areas as well because that's critically important.
I was thinking this morning about how I can articulate our theory of change for this project. I think it's fairly simple: we believe there is value in clarifying the normative expectation, but doing that through pointing to practice ‑‑ pointing to a good example around end use, a good example for companies, or a case where the state and civil society have enabled remedy. Unless we get to that granular level of what this looks like, we end up talking about scaling and incentives and other levers we can use with no real sense of what good practice looks like.
So the first part is: How do we clarify practice, and say this is what good practice looks like and this is the journey we should be on. That's the first part: developing leading practice.
And the second is how the wider system around business sets up incentives to drive scale around this, sets up requirements around what is expected of companies, and how that can get embedded in both regulatory and non‑regulatory approaches. The role of investors ‑‑ we've had a lot of conversation about that in this space.
So with that, in the spirit of opening up, I'm going to pause and see if anyone has any foundational comments or questions at this point around what we're trying to do before I start to talk about our focus areas and more of the substance.
Thank you, and please introduce yourself.
>> AUDIENCE MEMBER: My name is (?) I'm from Sierra Leone. I'm interested in human rights because of my background. I know it's a new frontier. It's sometimes very difficult for the guiding principles that are proposed by (?) to be applied to certain situations, especially in developing countries, when you talk about countries in Africa. They are difficult to apply because we have these multi‑national corporations which, I believe, are, you know, richer, and they use these high technologies. They take them to these indigenous people, and a lot of human rights violations occur during those activities in those areas.
So my question is: How can technology influence the guiding principles, especially AI, artificial intelligence? Is artificial intelligence better than the current technology we're using in order to help curb some of these serious human rights violations that these multi‑national corporations are causing?
>> MARK HODGE: Thank you very much. We'll take a few more comments and questions. Again, try to be brief so we make sure we get on to the substance.
Please introduce yourself.
>> AUDIENCE MEMBER: Hello. My name is (?) from Germany. We're working on human rights in Germany. We have a new law coming up and a new campaign to push for enforcement through legislation. My question, in terms of the scoping, is how technology so far is useful or better or worse. I came across one example from my colleagues working on a program, and they are very concerned that there are no safeguards in place for how the data is used and how people's data is protected. There was a report by the Special Rapporteur on extreme poverty highlighting that digital technologies in social protection systems are used for, say, focusing on fraud and not for making protections better. That's one question.
The second is you're saying you're focusing on the digital sector, but I think there is no such thing as just a digital sector, because we know what Google is involved in. So I think it's not worthwhile to focus on one sector but to look at what the implications are across the board.
>> MARK HODGE: These are good questions. I'm glad we paused.
Anyone from this side of the room? You don't have to be forced to speak.
>> AUDIENCE MEMBER: Thank you so much. My name is (?) I'm from the Republic of Congo. This question is for the human rights office; you didn't elaborate on this. Why did you decide to bring in this project? What has pushed you to that?
Two small questions. One is related to the gender perspective within the project. I am very interested to see what your approach is within that framework.
The last one is related to the protection of human rights defenders working on the business sector. I think with whatever information they provide related to fraud, they become more fragile. Is the program taking care of human rights defenders working in the business sector?
>> AUDIENCE MEMBER: Thank you for this discussion. It's amazing. I'm from Brazil. I'm a member of a high‑level panel. I was wondering if you're considering international humanitarian law in your discussion. There are principles there that are very important, mainly if you're considering (?) machines. Currently, you see a lot of AI being used in the medical domain. Probably in the near future, you can have a number of (?) because of the output of these machines.
>> MARK HODGE: Sorry. I didn't hear you. Did you say are we considering international humanitarian law? Yes. Okay.
I will be brief and let Peggy weigh in.
First of all, this question about the use of technology to address wider business and human rights realities that we face: it's critically important. Within this project, we have to be very careful about the scope of what we're doing, right. I think, given the platform of this project, there will be opportunities to point to other work or key issues, but currently we haven't included that within the direct scope of this project. That's not to say that AI and technology won't help us deal with a whole range of due diligence questions and empowerment questions and power questions within the context of business and human rights. That's sort of an initial honest response. That has not been where we've focused our energy. That's not to say we shouldn't go there, and it's something we'll reflect on as we go forward, but we have not anchored the work there at this point in time.
In terms of scoping ‑‑ have we done a scoping? We have done our own research and engaged with groups looking at where the issues are, whether it's in relation to cloud computing, algorithms, or facial recognition, as Peggy mentioned, or the whole Internet of Things dynamic. We have a good sense of what's out there. It's changing, and we have a large community around us to help us grapple with those questions.
On the question of non‑tech companies, it's a question we've been asked before in this project. We are not saying that ‑‑ so I think you're right that there are certain technology companies that engage with and provide products for different industries, and we'll pay very acute attention to that as we get into the project.
For example, end use doesn't happen in a vacuum ‑‑ it happens in a financial setting or in health care, for example, or in the employment and human resources space. We're alert to those dynamics and the way states use these technologies increasingly. So we're very alert to that, and we want to make sure we anchor into what the primary duties and responsibilities are of those actors that are developing, designing, and deploying the technology, right. So there will be a bit of a blurred line as we go through the project, but that's where we're starting.
I will say one word about the gender aspect, and then I will say one word about international humanitarian law.
So very quickly: this morning I was having a conversation about how one of the biggest challenges of human rights due diligence in this space is engaging stakeholders, and the gender aspect, as well as other stakeholder perspectives, will be very important there. I haven't thought much about the human rights defenders piece; I will let Peggy weigh in.
On international humanitarian law, we have not focused on that, but we have just set up a small piece of research to look on a cross‑cutting basis at how you approach these questions of technology in the context of international humanitarian law, because it clearly provides existing, clear duties and obligations for non‑state actors as well. We're laying out the normative framework around that. We'll also comment on where the issues are that we're seeing, where technology is impinging on or getting close to violations of humanitarian law. It's not in the scoping paper, but it's going to be cross‑cutting. We'll start that piece of research in January.
>> PEGGY HICKS: Just to come in on why we came to this, I think it does go to some of the conversation that we've already had. I mean, we have a business and human rights project already, and we have a new human rights and new technologies project. Coming out of both of those areas, there was a strong push that this is one of the gaps that the office is best suited to fill.
We look for the urgent areas, but there are many, many urgent areas, so it often comes down to where the skill set of our office best fits. We thought it was here. I was happy to hear your point on the effect on human rights defenders and the vulnerability of those working particularly on business, corruption, land rights, and the environment.
We do have a specific focus on threats to human rights defenders, really trying to hone in on those vulnerable groups as well and looking specifically at how the online environment has changed the way human rights defenders both work and are threatened, and what the office can bring to that. That's a particular cross‑cutting piece of work that we're going to expand on in the coming year. So it won't be directly within the B‑Tech Project, but it's certainly a focus of the office.
>> MARK HODGE: Thank you. And that will certainly come up when we get into questions on accountability, the role of defenders and civil society and being part of the ecosystem of accountability and remedy.
Okay. So we have just over half an hour left in the session. What I'm going to propose is that I do a quick overview of the four focus areas instead of doing them one by one and pausing, because we'll run out of time.
I will do a speedy overview of the focus areas and pause. We'll just allow the conversation to go where you guys want to take it, in terms of questions and inputs.
What I will say is that part of the objective for us today is to provide an update on and openness about what we're doing with the project, but also to allow individuals to recognize where in the project they may be able to engage with us ‑‑ which particular focus areas and questions ‑‑ and where you're working and would like to engage further. Please bear that in mind as you hear me go through the focus areas.
How many people in the room have seen the scoping paper that is online? Seen and read, how about that?
(Laughter)
>> MARK HODGE: That's very helpful. I didn't want to repeat what you've read before, so I will take time on each of them.
So we have four focus areas. The first one is focused on human rights risks in relation to business models. The second area is focusing on human rights due diligence and the end use of technology products and services. The third focus area is on accountability and remedy, exploring how questions of remedy play out at a company level, but particularly in the wider ecosystem of remedy, in the context of the realities of new technology. And the fourth area is looking at the smart mix of measures that states could and should deploy while steering the whole governance of these approaches and systems. In that fourth area, we'll look at public procurement, regulation, and trade.
So in terms of the first focus area, business models: the U.N. guiding principles are very clear that businesses should be addressing risks to people that are core to their business. It's not a fringe due diligence activity; it's essential to embed it into the DNA of organizations. The reason why we're pulling this out as a focus area is because business models are determinative ‑‑ core business decisions that then set in motion a whole series of events, processes, systems, decisions, and approaches that can often be hard to mitigate at an operational level, right. So a business model logic will actually drive potential human rights risks, and we want to shine a light on the fact that due diligence should be applied to business models more broadly.
We have set out two or three examples in the scoping paper about the types of business models we may be looking at in the industry.
I want to be clear, as well, that this focus area is not about labeling a business model as bad or good. The guiding principles very much take the approach that we have to embed due diligence into understanding the impacts of business models, right. The recent Amnesty International report on Facebook and Google made that point as one of its recommendations: as a first step, due diligence must be applied to the business models of these companies and wider companies.
We also think it's important to engage diverse stakeholders on this because we have to be sure not to drive a purely ideological view. It's broadly acknowledged that the fact that people can access the Internet, or access a service in a product like Facebook, does enable civil society and other actors to fulfill their rights. So we have to be honest and address some of the dynamics there. That's what we hope the project will do as well.
In the scoping paper, we set out some of the questions we'll explore: How are business models, the underlying technical development, and the way products get sold generating human rights risks? What are common examples of this? What do strategies look like? What do good practices look like when things are designed around the technology?
That's the first focus area.
The second focus area is looking at human rights due diligence and end use. Companies have a responsibility to prevent, mitigate, and remediate issues that occur not just in their own operations but across their value chain. That includes in relation to products and services, whether it's in the way those products and services are designed and then used, or the way in which they might be misused, intentionally or otherwise, by third parties. So the guiding principles have a very broad remit of what responsibility looks like for a company, and it includes those issues which are clearly not in the control of the individual company and sometimes not proximate to the company. A technology might involve thousands, even billions, of individuals making use of something, which could be very hard to understand.
What's critically important, as well, is that the guiding principles offer a very simple concept, which is the concept of leverage. Businesses need to lean into these challenges that are systemic in their wider ecosystem. Where we're at in the project is asking: What does that look like? What does due diligence look like in relation to end use? What does leverage look like? How is it successful? What are some of the practical examples? To Peggy's point, what are some of the dilemmas companies come across? There may not be visibility of who is using the product, or there might be a state that is the actor using the product. And in the way the contracts get set up, there's very little leverage in those contexts. We're trying to hone in on that.
What we're going to focus in on are the sharp‑end questions, the really difficult things where we believe clarity of normative expectation, examples of dilemmas and good practice, and also tools and recommendations are very useful.
So in this context, for example, we'll start by looking at what actually exists within the tech sector and beyond where we can draw lessons. What are the methodologies being used? Where are good examples of companies trying to assess risk?
I think there are challenges presented for the industry, and we'll look at those. There's a key aspect in the guiding principles about companies needing to stop contributing to a third party misusing their product, or to harm being associated with their products and services. That can be particularly challenging when you're engaging in a situation where there's a web of other actors that may be contributing to this misuse or harm.
Sometimes when you contribute to somebody's behavior, it can be very subtle. It can be embedded in the design and develop over a period of time. Last week, we were in a meeting around human rights in this space, and somebody said: Sometimes you're contributing to a general environment of speech that is not a direct harm to people, but it can lead to direct harm. We need to explore those questions.
We'll also look at questions around how you engage affected stakeholders, and how we deal with the question of scale and lack of visibility, and really, again, pinpoint some of those issues, which we'll be mapping out shortly.
Just to be clear about what we're not going to do: this is not a project that will be developing a methodology for taking the human rights due diligence methodology and applying it in the context of technology. That's not what we'll do, though we are working closely with those who are, and we'll take a look at those interesting approaches and case studies.
Also, I think it's fair to say that the tendency of our work is going to be around organizational governance processes and systems, not, per se, trying to innovate ourselves around technical solutions. There's the Partnership on AI; there's great work on datasheets for datasets in terms of transparency. We're not going to ignore those issues, but it's fair to say our question is what this means for governance and processes and accountability and relationships with external stakeholders. Part of that is because we believe the pace of change in technology means a level of oversight and engagement around these questions needs to be created across the organization more widely.
The third area ‑‑ and I will be quicker on the third and fourth areas ‑‑ is access to remedy. The U.N. guiding principles are very clear that when things go wrong ‑‑ and things do clearly go wrong, often in quite horrible ways ‑‑ victims need to have access to remedy, right? That's an obvious statement in the context of this room and the human rights conference. There's an interesting project called the Accountability and Remedy Project, and we'll be building on that in the context of this work. Really, we're thinking about what the remedy ecosystem is that needs to exist around technology. In the context of technology, we're certainly going to think about, but probably not dive too deeply into, the notion of operational‑level grievance mechanisms, because it feels like that's not where the value add is in the context of the technology sector.
I do think there are lessons to be learned about how, in real time, technology companies can begin to understand where there are concerns or grievances being raised.
I think there are, again, distinct challenges we'll focus on here. One will be around the fact that there are multiple actors, as I've already said, that might be involved in harm occurring. How do you unpack that?
Another issue that's been mentioned to us is the sheer scale. When there are multiple things going on, at different layers, how do you figure out how to prioritize where to focus? Who are the people that are suffering the most severe harms? How do we process those in a remedy context?
Also, the way we think about remedy has a huge reliance on states. When the state may be the actor that's underpinning or even causing the harm, how do we square the circle of relying on that same actor to provide an ecosystem of judicial and non‑judicial systems for remedy?
The last piece, and then I'm going to pause ‑‑ and you can offer your thoughts and comments and ideas ‑‑ is the state duty to protect, focus area number four. It's clearly the first pillar in the U.N. guiding principles. Here we'll do a few things. Clearly, we want to reinforce and keep pointing to existing state obligations around human rights, and point to other parts of the U.N. system and special rapporteurs that are doing so. We'll talk about what it means around public procurement, trade facilitation, regulation, and corporate governance requirements.
There are so many levers that states have, and we'll be exploring ways in which states can use those levers to create an environment that incentivizes and rewards technology companies and others that are responsible, but that is also punitive towards (?) actors that are irresponsible.
Some of the areas I think are particularly ripe for exploration are what I've mentioned before: public procurement and responsible contracting. We know very well that states are using these technologies across different state agencies, all the way from law enforcement to even social benefit schemes. So what does it mean to be a responsible procurer, and what does responsible contracting look like between a state and a private company when agreements are made to provide services? There's a lot of information we can draw on.
The other area we need to pay attention to ‑‑ and I don't know how we do this ‑‑ is that clearly there's a huge motivation for states to incentivize and benefit from innovation and economic growth. We have to make sure we don't end up with doublespeak in how states deal with this: on the one hand trying to promote economic goals in terms of attracting technology innovation, while on the other hand making sure that doesn't get divorced from commitments to meet human rights obligations. That's a particular challenge for this industry but also more widely.
I'm going to pause there. There's a lot that I've thrown out on the table, but I wanted to at least give you an overview and open up for comments, questions, or concerns about the project, points of clarification about the particular focus areas, or the direction of travel.
>> AUDIENCE MEMBER: Probably just a comment. My name is Angela Castner. I have a procurement background and look specifically at procurement in this area. My comment would be that I'm pleased you've mentioned this. It's a key lever for any government in how they achieve the SDGs and good public services, but also in making sure they understand what's happening with human rights in their supply chains.
I guess one of the things we're finding in this space, and from my own background, is that these are actually difficult things to procure and buy. We don't always have educated buyers, and that's not just in developing countries. I mean, in the U.S. and the UK, governments don't always get this right. So there's also a capacity gap and, I'm sure, issues you've looked at. It's very pleasing to see that's one of the key levers.
>> MARK HODGE: I'm so glad you're here. We've been looking for people like this.
Other comments on this?
Okay. So I'm going to go to this side of the room and then over here. I've got three people.
>> AUDIENCE MEMBER: Hi. Thanks. My name is Mary Jean. I'm here as part of the Open Internet for Democracy Leaders Program. My comment/question is that international human rights, as a field on its own, has very difficult challenges, particularly when it comes to attributing what is considered a human rights violation, given the geopolitical divides between countries opposed on what counts as a violation.
My question is: When it comes to a commercially driven agenda, there are higher stakes. When you incorporate a human rights lens into that, given that human rights on its own already has its fair share of challenges, how are you going to align and harmonize these fields? Is it going to be a red-lining situation in terms of attributing ‑‑ especially if it's state actors ‑‑ what is considered a human rights violation?
>> MARK HODGE: Thank you.
>> AUDIENCE MEMBER: Hi. My name is Morgan Frost, from the private enterprise side. Just one question in terms of incentivizing businesses to sign on to these types of principles: Are you specifically looking into the business case for digital rights? Making sure that there is incentive for businesses to sign on without jeopardizing economic growth? Thank you.
>> MARK HODGE: Thank you.
Two more down there. Individual ‑‑ no? Yes? Wonderful. Gentleman, please come to the table and find a microphone.
>> AUDIENCE MEMBER: My name is (?) from the Internet Society here in Berlin. My question is similar to that one. You know, the human rights founding documents, as we know them, are from an analog world. And the question is, when you do your research or propose due diligence, is there any concept of digital rights you're applying? How do you apply these traditional human rights concepts to these new technologies?
>> AUDIENCE MEMBER: Thank you. I'm the human rights Ambassador of Finland. My question is also about (?) In countries, including Finland, we're discussing new laws or amendments to legislation. I'm wondering if you've been able to look at this side.
>> MARK HODGE: One more.
>> AUDIENCE MEMBER: I just have a short question. I'm Lisa, a senior program manager at Ranking Digital Rights. My question is: Are you aware of our project ranking Internet companies using the U.N. guiding principles? I would just be interested to hear.
>> MARK HODGE: Thank you. So I'm going to answer the last two questions first, and then I will make some comment on the other questions, and Peggy is also going to weigh in as well.
We're very aware of your work. We had Rebecca with us in Tunis, and your colleague was with us last week in Copenhagen. We're familiar with the work. We know you've been paying a little more attention to area one in the rankings. We're eager to learn what you find.
Around the legislative movement, I have a couple of reflections. One is that we'll need to look at whether, as states require due diligence, in the way they structure that and the guidance they give to companies, there's a case for some alertness to the uniqueness of different industries. I think there clearly is. So we hope that this project would be able to feed into any work within governments around what due diligence should be, what it looks like in this context, and what some of the particularities may be.
I also think ‑‑ this is a human rights comment ‑‑ we want to see what other levers states have. Legislation is one critical lever, but, as we've said before, so are public procurement and trade. As we get to our fourth, smart mix focus area, we'll definitely think about due diligence. We need to figure out, in this project, where the value add is. It may be around the levers. It's to be decided, but we can't ignore that interesting movement and development.
On the digital rights piece: I don't think we can be fundamentalist about how rights have been defined. I think we can be fundamental in terms of knowing where the baseline is, but we can't be fundamental about what the particular issues are for this industry and how they manifest themselves, and even about reconceptualizing rights as well. This is an area where my expertise falls off a cliff, but we need to think about that when we talk about the normative framework in which this sits.
In terms of incentivizing, that's interesting. I see it in two parts. On the one hand, we want to keep pushing forward and set a normative expectation. These are not something you sign up to. They are clearly embedded; this is the norm. As the gentleman mentioned, this is embedded around the world: companies are being demanded to respect human rights. On the other hand, I don't think we need to shy away from being clear and identifying in empirical terms the benefits that come from doing the right thing. Those are conversations we'll have with some of the investor community.
We have convenings planned for next year with those who have said a lot about this. There are a lot of VCs in this space, and with the speed at which innovation is demanded, historically we've found that doesn't bode well. I think that's an interesting question for the way the market incentivizes certain approaches.
I think I'm going to hand it over to you now.
>> PEGGY HICKS: Thanks. Just to chime in on a couple of points made. On the business case point, I think it's a really interesting issue, and it came up in an earlier session as well. It's one thing to think that the guiding principles are optional in that sense, but at the same time, you know, we do work in an environment ‑‑ your last comment about venture capital also shows this ‑‑ where, in fact, not only are we not incentivizing companies to do the right thing, there sometimes are strong incentives to do the wrong thing. I do think we need to take that on as part of the conversation as well.
You know, fortunately, that fourth focus area will allow us to talk about how governments can do better in that regard.
I take your point, this isn't easy. I don't think we want to say that, you know, through any one project or any one lens, we're going to be able to solve these issues in a way that's necessarily going to work for everyone. But I think part of what brought us to it is that idea of where we have value added. You started out with attributing human rights violations. I think that's something our office does really well. Sometimes we get feedback from governments that disagree with what we said, but we issue hundreds of reports a year. Because we're sort of a central player that has developed methodologies that are seen as credible and reliable, we have the ability to do that, and it has an impact. We hope this is an extension of that: that we will be solid in terms of the methodology, and that the approach we've shown in terms of public consultations around the scoping paper and the way we're engaging will allow us to develop something that could both push the conversation forward and have the level of credibility and cross‑stakeholder reliability that people will look for.
The other thing I should say ‑‑ and I stress this with the team frequently ‑‑ is that there's a challenge as a United Nations actor engaging in this space. The ways in which we have tended to work are not at the metabolic rate of engagement that's necessary for us to be responsive to the need in the digital space. And I think everyone feels ‑‑ not just our organization ‑‑ that we're all kind of behind the curve on this. There's so much being done, and we're having this conversation now. We want to start moving as quickly as possible to having the guidance and ideas and practical case studies that will have an immediate impact. In terms of the methodology we're taking here, we need to be innovative. That may mean we make steps in directions that could open us up to more criticism than at other times. I think it's worth taking some of those risks in this project. We're going to push to move it forward. That means we'll come back to stakeholders like yourselves to find out how we're doing along the way and adapt the approaches we're taking as we go forward.
The other thing I wanted to comment on briefly is the comments on digital rights and how they fit in. It's a cornerstone of how we work that human rights are rights that operate offline and online. That principle has been stated and embedded.
The question is: Is the human rights framework up to date enough? I think you talked about it being analog. I don't think concepts like human dignity, the need for privacy, and the need for non‑discrimination are analog ideas, you know. What we need to figure out is how we make them useful in a new environment. And that's exactly what this project is geared at: the recognition that how we work on discrimination and how we work on privacy in the digital space does require us to look at things differently and dive deeper on issues, which is what we're trying to do here.
>> MARK HODGE: Thank you, Peggy.
So I'm going to share a few more comments and then just talk in the final minutes around what the beginning of next year looks like, in concrete terms, to give you a sense of that.
I wanted to also just build on two things Peggy said, around the adequacy of existing frameworks for the challenges we face. From my perspective, having worked 10 to 15 years with different businesses, industry groups, and civil society organizations on business challenges, some of the issues I hear about ‑‑ challenges of incentives, challenges of integration, challenges of folks in certain parts of the business really wanting to move fast on innovation while others want to slow down ‑‑ these are challenges we have certainly got some learning to bring to, right, if you think about the way business functions in society. We've got experience from the past decades around that. The industries are not exactly the same; every industry is unique. But I do think there's a lot there in the way the human rights and business field has built frameworks, approaches, tools, and knowledge, and the way businesses have found solutions, that we can learn from and that I don't think is outdated in any shape or form, given the dynamics we're talking about.
On process: the process itself clearly has space to talk about dilemmas. We have to figure out how to respond to that. It's not uncommon for other processes in the U.N. to hear and talk about specific cases that come up, right. We're going to have to be selective and think about our mandate and the mandates of other rapporteurs.
Quickly, in the last few minutes: what does the first half of next year look like, and where do we want to be by the end of next year? We'll be publishing papers around focus areas two and four. We'll be doing research on areas one and three, so business models and access to remedy.
We're kicking off by trying to formalize relationships with partners to do deep‑dive research into the business model questions, and we'll be working with partners on that ‑‑ both on examples of where risks may be and on ways to think about these challenges and what best practice may mean in this context as well. That will be followed by multistakeholder meetings in the first quarter of next year. We hope to have a Silicon Valley meeting on the second focus area, to look at good practices, bring solutions to the table, and get deeper into the sharp‑end questions I mentioned earlier as well.
And we'll also be having a consultation somewhere in Asia, I believe ‑‑ though exactly where is to be decided ‑‑ where we'll repeat some of those conversations as well to bring that perspective in.
Midway through next year, I think we'll begin to ramp up a bit more on the remedy work and the smart mix of measures work. That will realistically kick in then, looking at the aspects that are urgent for the business community here.
As I said, we'll also be publishing a small piece of research around international humanitarian law and technology in conflict situations, I think by April/May.
Where do we want to be by the end of next year? I can't do it in three seconds, but we really want to have focused on these areas ‑‑ what are the big challenging issues, what are the key two or three things we can add value on ‑‑ and to have built a level of multistakeholder engagement in dealing with those issues in a practical way. And we also want proposed deliverables for this project: to be able to say, Okay, now that we understand the problem and have started talking about solutions, what can this project really deliver in terms of guidance, case studies, tools, frameworks, and recommendations across these different focus areas that can then be used moving forward by the wider field. That's our journey. We're going to be delivering things iteratively as we go.
>> PEGGY HICKS: I appreciate everybody being here and indicating interest in this by your presence. Ultimately, this will succeed or fail based on the level of engagement and support we get from people like those in the room. So just really a plea to all of you to continue to follow this project. You know, go to the website I mentioned and look at the materials there. Join us in these consultations. Give us your feedback and thoughts. And for those of you who are able ‑‑ we receive the smallest percentage of funding despite being a pillar of the U.N. ‑‑ we're looking for additional support. We have a unique funding model for this project in that we are obviously very open to government support, as that is, of course, where most of the funding for the office comes from. Switzerland has come in and been generous in that regard. We would appreciate additional support.
Also, this is a project that requires active engagement from companies ‑‑ not just the, you know, "we support it" type of engagement, but a seat at the table that will allow us to address the problems that they see and get their buy‑in to really take up the deliverables and the things that come out of it.
We're funding it, as well, through a consortium model where we're asking for a small investment from companies that are interested in the project. We do that purposely to avoid due diligence issues on our side. In other words, we wouldn't have a project that is fully funded by the corporate sector, but we do want that sort of stake in the game from those that are going to participate in it. So we're looking for companies who want to join us in the work with a nominal contribution to show that they're truly committed and will follow through on what comes out of the work as well.
So happy to have any and all of you come back to us on any of those points. Again, just thank you for your time today.
( Applause )