Reviewing PPI's Participation at the Internet Governance Forum

The Internet Governance Forum (IGF) has just ended, and PPI was fortunate to host a workshop and have several representatives on site, together with our colleagues from PPEU. We have participated in this event for many consecutive years and view the forum as one of our main avenues for influencing transnational governance on an issue central to us: the Internet.
This was the 20th IGF, held this year in Norway under the overarching theme “Building Digital Governance Together”.
PPI’s Ethical Networking Workshop
For the second year in a row, we hosted a Day 0 workshop. This year's workshop was titled “Ethical Networking: Sustainability and Accountability.” Our session was organized in collaboration with the University of Cambridge's Ethics in Mathematics Project and brought together professors and internet experts from around the globe. At the end of this blog we provide a copy of our report, which can also be read on the IGF website:
https://www.intgovforum.org/en/content/igf-2025-day-0-event-197-ethical-networking-sustainability-and-accountability
We also provide a transcript below of the entire discussion, which we hope will be the basis for future academic publications.
Other Takeaways from the IGF
Beyond the workshop, PPI shared a booth with the European Pirate Party (PPEU). Besides meeting with our PPEU colleagues, whom we have known and worked with for years, we were able to network and share information with other organizations. Many people have heard of Pirates but never met one; others have never even heard of our movement. Some are surprised to learn how successful our movement has been on a global scale and how central we are to discussions of internet governance.
Looking ahead, this year's IGF especially highlighted both the accomplishments and the dangers of AI. Prior IGFs have discussed AI, but this year's forum was able to showcase real projects at much more advanced stages of development. Technological developments in AI are happening at a breakneck pace, and the Pirate movement definitely needs to put itself at the center of this debate.
PPI’s Ethical Networking Workshop Report
List of Speakers and their institutional affiliations:
• Daphne Tuncer, Institut Polytechnique de Paris
• Marc Bruyere, Civil Society
• Maurice Chiodo, University of Cambridge, Ethics in Mathematics Project
• Dennis Muller, University of Cologne, Ethics in Mathematics Project
• Alexander Isavnin, Free Moscow University, Pirate Party Russia
• Keith Goldstein, Pirate Parties International, University of Potsdam
• Sara Hjalmarsson, European Pirate Party
Key Issues raised:
• Ethical networking requires interdisciplinary collaboration, which was represented by the mathematicians, engineers, sociologists, and political scientists who participated in this session.
• Technology design must consider long-term sustainability, as minor technical design choices today can significantly affect energy consumption in the future.
• Educating young people on ethics is crucial for responsible technology use.
• Ethical oversight of technologies varies significantly across different global regions.
• Engineers should integrate ethics and sustainability at the core of technological development.
• Societal impact assessments must be made alongside the technology development process.
• Ethical networking demands inclusive international perspectives and dialogue, such as the practice of meeting at the IGF.
• Citizen oversight is essential for governance of data-driven systems.
• Continuous monitoring of technological impacts is critical to prevent misuse, and accountability mechanisms must include long-term enforcement measures.
• Dedicated resources and time are needed for ethical impact assessments and community engagement.
• Human failures are a primary concern in sociotechnical evaluations.
• We must conduct more interdisciplinary studies that examine how humans interact with machines, how they learn state-of-the-art technology, and what types of educational interventions motivate the adoption of new technology.
• Practical research tools like surveys are valuable for understanding these human factors.
There were no formal presentations during this session.
The session began with a broad question: how can network practices be aligned with ethical principles? Daphne Tuncer emphasized the need to make underlying assumptions explicit, noting that network research narratives “are just taken for granted” and arguing that we must “reserve time” to question them. She also warned that fast systems can hinder reflection: “We tend to value high speed as something good… but to some extent, I believe this is not really aligned with ethical principle where we require time to think.”
The next discussion focused on emerging technologies and why ethics cannot be added later. Dennis Muller stressed that “ethics is not an optional extra or a bolt-on,” calling for “a fundamental systemic shift” in design processes. He argued that developers must balance technical success with social good at every stage: “Technical success must be balanced with success from an ethical and sustainability perspective.”
The third discussion addressed IGF’s role. Maurice Chiodo described the IGF as an organization that “breaks down the silos between the technical and non-technical experts” and spreads insights across communities. Marc Bruyere added that IGF should forge stronger ties with standards bodies like the IETF and W3C, urging “more participation from the IGF community” in those forums.
The final discussion addressed the way forward and potential next steps. Sara Hjalmarsson asked the speakers to sum up how they proposed to study the human element in systems. Maurice Chiodo highlighted that socio-technical evaluations must assess not just failures of the technical or human components but also the human-machine interface itself, which he called “the primary site of miscommunication and error.” Keith Goldstein agreed, advocating surveys and other quantitative and qualitative methods to capture human experiences. The speakers recommended creating working groups that mix technical and social science experts, using surveys and questionnaires to track human factors in networks, documenting ethical failures through case studies, and making ethics training part of engineering curricula.
There were two questions from the audience. The first question asked how to democratize ethical networking so that ordinary citizens can meaningfully oversee data-driven public systems. Dennis Muller observed that “we need to respect the different cultures and different regions of this world.” Alexander Isavnin added that “technology cannot insure you in something. You are your own insurance. You have to communicate, you have to oversight, you have to think about what’s going on with your data. IGF is a good starting venue but your participation is also really important.” The second question probed how global North–South power dynamics influence networking ethics. Alexander Isavnin explained that while “Internet and these technologies could shorten the gap between what we call West world and the others, or North and South,” they can also be “easily abused by the government so technology alone cannot close gaps.”
Overall, the speakers identified the lack of dedicated budgets for impact assessments as a key limitation. Ethics must be a core competency for all stakeholders, but it is too often treated as secondary. Bridges must also be built between academics and standards organizations like the IETF and W3C. The speakers urged the IGF to publish periodic policy briefs, and they recommended that local IGF chapters engage with their communities on these issues.
Transcript of Ethical Networking Workshop
(00:00:40.615) [Keith]: Okay, thank you everybody for coming.
(00:00:42.600) [Keith]: It’s a pleasure to see you all.
(00:00:44.805) [Keith]: This is the IGF Workshop on Ethical Networking, Sustainability and Accountability.
(00:00:52.122) [Keith]: Rather than introduce everyone, I’ll turn over to my colleague Sara over here to ask our first question.
(00:00:58.334) [Sara]: Thanks, Keith.
(00:01:00.236) [Sara]: First of all, my name is Sara.
(00:01:03.520) [Sara]: I’m the vice chair of the European Pirate Party.
(00:01:06.764) [Sara]: We have a booth here if you’re on site.
(00:01:09.707) [Sara]: So if you like what we’re talking about, please feel free to stop by.
(00:01:15.133) [Sara]: I’d like to start by letting our speakers introduce themselves first before we start with question one.
(00:01:22.882) [Sara]: So I’d like to hand over to Daphne.
(00:01:30.042) [Sara]: We have Daphne with us.
(00:01:35.050) [Sara]: Welcome, Daphne.
(00:01:35.892) [Sara]: Please introduce yourself and tell us a bit about what you do.
(00:01:39.478) [Sara]: Tell me something about yourself, your project, and how it relates to ethical networking.
(00:01:54.528) [Sara]: We’re having a bit of a… Yeah, now we can hear you.
(00:01:57.475) [Daphne]: Sorry, sorry.
(00:01:58.457) [Daphne]: I couldn’t turn on my mic.
(00:01:59.981) [Daphne]: Sorry about that.
(00:02:00.562) [Sara]: That’s okay.
(00:02:02.487) [Sara]: Happens sometimes.
(00:02:03.369) [Sara]: Go for it.
(00:02:04.371) [Daphne]: Yeah.
(00:02:04.832) [Daphne]: Hi, hi, everyone.
(00:02:05.754) [Daphne]: Sorry, let me turn on the video as well.
(00:02:07.578) [Daphne]: It should be working now.
(00:02:08.621) [Daphne]: Yeah, great.
(00:02:09.563) [Daphne]: So hi everyone.
(00:02:11.808) [Daphne]: Thanks a lot for joining this session.
(00:02:13.451) [Daphne]: So my name is Daphne Tuncer.
(00:02:15.916) [Daphne]: I’m academic.
(00:02:17.038) [Daphne]: My research is in the domain of computer science, more specifically computer networks.
(00:02:22.009) [Daphne]: I’m affiliated with Institut Polytechnique de Paris in France.
(00:02:26.017) [Daphne]: So, over the years, I’ve been trying to work on putting together kind of actionable resources, both for research and education on what I call responsibility in our digital development.
(00:02:41.533) [Daphne]: So, thank you.
(00:02:43.455) [Sara]: Okay, wow.
(00:02:44.616) [Sara]: That’s a big responsibility.
(00:02:45.858) [Sara]: All right.
(00:02:47.660) [Sara]: Thank you, Daphne.
(00:02:51.323) [Sara]: Next.
(00:02:57.286) [Sara]: Next, we have Marc Bruyere.
(00:03:00.841) [Sara]: Marc, are you with us?
(00:03:02.628) [Marc]: Yes.
(00:03:03.371) [Marc]: Can you hear me and see me?
(00:03:04.837) [Sara]: Yes.
(00:03:05.741) [Sara]: Loud and clear.
(00:03:08.002) [Marc]: OK, quickly.
(00:03:09.624) [Marc]: Actually, I did a PhD when I was 40, like 10 years ago, coming from a long path from an industry and so on.
(00:03:18.035) [Marc]: And when you’re actually starting to do research, and you know what implication it is in research, you are actually influencing things and innovating stuff and so on.
(00:03:30.030) [Marc]: And it always questioned me how to do this without hurting
(00:03:35.457) [Marc]: society with an integral way.
(00:03:37.840) [Marc]: Then that’s what we with Daphne had a very first conversation about it.
(00:03:42.126) [Marc]: And I’m actually working for a large company back for 10 years in research for Airbus with everything’s do count in the choice you do.
(00:03:52.841) [Marc]: And it’s very valuable that we are actually all thinking of the impact of the choices we do.
(00:03:58.609) [Marc]: And I really appreciate we have this time together.
(00:04:02.947) [Sara]: Oh, wow.
(00:04:04.449) [Sara]: So you’ve had a lot of insight to share there.
(00:04:07.815) [Sara]: Looking forward to it.
(00:04:09.598) [Sara]: Next, we have Maurice.
(00:04:14.847) [Sara]: Maurice, are you with us?
(00:04:17.110) [Maurice]: Yes, thank you.
(00:04:17.892) [Maurice]: Can you hear me?
(00:04:18.573) [Sara]: Yep.
(00:04:18.994) [Sara]: Loud and clear.
(00:04:20.581) [Maurice]: Excellent.
(00:04:21.562) [Maurice]: Thank you.
(00:04:21.882) [Maurice]: It’s a pleasure to be able to speak here today.
(00:04:24.525) [Maurice]: So my name is Maurice Chiodo, and I’m a research associate at the Center for the Study of Existential Risk at the University of Cambridge.
(00:04:31.511) [Maurice]: I’m also the principal investigator and co-founder of the Ethics in Mathematics Project.
(00:04:36.016) [Maurice]: So a research mathematician by training, I specialized in computability theory and abstract algebra.
(00:04:41.341) [Maurice]: My work now looks at the ethical challenges and risks posed by mathematics, mathematicians, and mathematically-powered technologies.
(00:04:48.087) [Maurice]: I’ve been working on this for over nine years and have insights and industry experience as an ethics and safety consultant in AI and blockchain technologies.
(00:04:56.045) [Sara]: Oh wow, you’ve done a bit of everything.
(00:04:58.991) [Sara]: Thank you, Maurice.
(00:05:00.274) [Sara]: Next we have Dennis, Dennis Muller.
(00:05:03.521) [Sara]: Dennis, are you with us?
(00:05:05.350) [Dennis]: Yeah, I’m here.
(00:05:07.713) [Dennis]: Thank you very much.
(00:05:08.493) [Dennis]: It’s an honor to be here.
(00:05:09.415) [Dennis]: I’m also a co-founder of the Ethics in Mathematics Project.
(00:05:13.139) [Dennis]: I’m currently a research associate at the University of Cologne, where I work on mathematics education for sustainable development.
(00:05:19.986) [Dennis]: And I work with Maurice at the Center of the Study of Existential Risk, where I study extreme technological risks related to AI and the Internet.
(00:05:28.317) [Dennis]: Overall, my work sort of connects to ethics, education, mathematics, and I’m particularly interested in studying how mathematics and mathematically powered technologies are shaping our world.
(00:05:39.595) [Sara]: Okay, wow.
(00:05:42.640) [Sara]: All right, very good.
(00:05:43.862) [Sara]: Great to have you with us.
(00:05:45.806) [Sara]: Next, we have Alexander Isavnin.
(00:05:49.391) [Sara]: Alexander, are you with us?
(00:05:52.600) [Alex]: Yeah for sure.
(00:05:54.362) [Alex]: Hello.
(00:05:55.444) [Alex]: I’m Alexander.
(00:05:56.906) [Alex]: I’m a member of the Council of the Russian Pirate Party.
(00:06:01.392) [Alex]: We live in very difficult countries and our party and citizens of our country constantly need to face ethical and sustainability challenges.
(00:06:13.508) [Alex]: I’m also a mathematician by education, but I have no relation to the Ethics in Mathematics Project.
(00:06:20.515) [Alex]: Thanks.
(00:06:21.376) [Sara]: Okay, welcome.
(00:06:23.498) [Sara]: Next, we have Keith.
(00:06:25.400) [Keith]: And I’ll just introduce myself.
(00:06:26.962) [Keith]: I’m Keith Goldstein, Chair of Pirate Parties International.
(00:06:31.367) [Keith]: I also have been involved with Daphne and Mark here on drafting a research project on computer networking ethics and looking at how humans are able to learn new systems.
(00:06:43.782) [Keith]: Okay, thanks.
(00:06:44.363) [Keith]: So, why don’t we move on to the next question?
(00:06:47.566) [Sara]: So, let’s start with the first question there.
(00:06:51.992) [Sara]: We’re sharing a little bit.
(00:06:54.715) [Sara]: So, how can we ensure that mathematics and computer networking practices align with ethical principles, including privacy, transparency, and accountability?
(00:07:09.893) [Keith]: So, Daphne, would you like to start?
(00:07:14.258) [Daphne]: Yeah sure.
(00:07:14.719) [Daphne]: I’m happy to start.
(00:07:16.042) [Daphne]: I mean so as I said earlier I’m a computer scientist.
(00:07:18.507) [Daphne]: But in the recent year I started working a lot with people from social science.
(00:07:23.739) [Daphne]: And through this collaboration I got to learn a lot about the role of narratives in how this contributes to how we approach and develop new technologies.
(00:07:33.580) [Daphne]: And if you take computer network research as an example, so a lot of the narratives that we have today have to do with hyperperformance, optimization, measurements.
(00:07:45.277) [Daphne]: So of course, there’s nothing wrong with that.
(00:07:47.901) [Daphne]: But my point is that very often, these things are just taken for granted.
(00:07:52.528) [Daphne]: We never really question these narratives.
(00:07:57.115) [Daphne]: And so it does subconsciously influence us, as researchers in computer networks,
(00:08:02.082) [Daphne]: in the way we think.
(00:08:04.707) [Daphne]: So to me, spending time on talking about this narrative to make them explicit and also having a space to confront them is an essential part and also ingredient to get an alignment between our practices, for example, in computer networks and ethical principles.
(00:08:23.632) [Daphne]: So I think what is really important is to reserve time for that.
(00:08:28.058) [Daphne]: So today, and I think this has been driven a lot by all these developments in the computing technologies, we tend to value high speed as something good.
(00:08:36.510) [Daphne]: So it’s fast, it’s good.
(00:08:39.354) [Daphne]: But to some extent, I believe this is not really aligned with ethical principle where we require time to think.
(00:08:45.162) [Daphne]: So I think time is very key here.
(00:08:48.666) [Sara]: Thank you.
(00:08:49.127) [Sara]: Next, Maurice.
(00:08:53.534) [Sara]: What are your thoughts on this?
(00:09:08.857) [Maurice]: Ensuring that networking practices align with core ethical principles requires us to address three distinct but ultimately interconnected challenges of the alignment problem.
(00:09:19.333) [Maurice]: So from the perspective of ethics and mathematics, we must first define what we want to achieve.
(00:09:24.922) [Maurice]: Second, we must determine how to achieve these outcomes by developing the right mathematical tools, technologies, and practices.
(00:09:32.496) [Maurice]: This involves examination of the methods we use.
(00:09:35.562) [Maurice]: For instance, a commitment to privacy requires not just policy, but the implementation of privacy-preserving mathematics from the ground up.
(00:09:42.655) [Maurice]: Third, and most crucially,
(00:09:45.543) [Maurice]: we must ensure that it sticks.
(00:09:46.425) [Maurice]: This is the long-term challenge.
(00:09:47.587) [Maurice]: To get this right, we must scrutinize three areas simultaneously.
(00:09:51.174) [Maurice]: As I said, the ethical vision of our outcomes, the integrity of our tools, and the robustness of our processes.
(00:09:56.785) [Maurice]: Any one of these can undermine the others.
(00:09:59.490) [Maurice]: For example, an ethical process can still lead to a harmful outcome if the underlying technology is flawed.
(00:10:05.017) [Maurice]: Therefore, we must move beyond just analyzing intent and design aims.
(00:10:08.282) [Maurice]: We have to rigorously investigate the technologies and the technologists’ ability to do good or cause harm.
(00:10:14.352) [Maurice]: We must understand not only what they want to do, but also… Oh, we have a bit of a lag there.
(00:10:22.746) [Sara]: We missed the last thing you said, Maurice.
(00:10:26.272) [Maurice]: Oh, sorry.
(00:10:27.554) [Maurice]: So, I was saying that, therefore, we must move beyond just analyzing intent and design aims.
(00:10:32.455) [Maurice]: We have to rigorously investigate the technologies and the technologists’ ability to do good or cause harm.
(00:10:38.545) [Maurice]: And we must understand not only what they want to do, but also what they can do.
(00:10:43.494) [Maurice]: Okay.
(00:10:44.976) [Maurice]: Yeah, that’s a big point.
(00:10:47.761) [Sara]: Let’s see.
(00:10:48.923) [Sara]: We have Alexander.
(00:10:50.666) [Sara]: You have a slightly different cultural environment.
(00:10:54.753) [Sara]: What’s your perspective?
(00:10:56.873) [Alex]: Let me give perspective, not just from my cultural environment, but from my experience.
(00:11:04.203) [Alex]: We all know that technology and instrumentation and tools are being developed much faster than regulations or even spelling norms of what’s going on.
(00:11:17.582) [Alex]: At the beginning of the internet, there was no privacy considerations or security considerations because scientists have created internet
(00:11:26.675) [Alex]: for their own needs.
(00:11:28.798) [Alex]: They thought that only such good guys with scientific approaches will exist on the Internet.
(00:11:37.250) [Alex]: But actually, a lot happens in that.
(00:11:40.635) [Alex]: A lot of people came here, evil people, bad people, governments, corporations, and so on.
(00:11:47.144) [Alex]: So I think that our idea of sustainability and ethical networking should go towards understanding of what people need first of all and only then such formulated needs need to shape technology developments.
(00:12:09.942) [Alex]: Back to my cultural background, in Russia it’s happening always.
(00:12:15.309) [Alex]: The state and the state-controlled corporations are developing technologies.
(00:12:20.997) [Alex]: They are announcing that technologies are for the good of the people, but lately it appears that even network applications are developed for surveillance or control of people’s activities.
(00:12:36.819) [Alex]: Thanks.
(00:12:38.461) [Sara]: All right, very good.
(00:12:39.623) [Sara]: We’ll actually get into that topic in a moment.
(00:12:43.949) [Sara]: In the meantime, we have Dennis.
(00:12:49.917) [Sara]: What’s your perspective?
(00:12:51.780) [Dennis]: I think to truly align our practices with ethical principles, we must understand that ethics is not an optional extra or a bolt-on.
(00:12:58.790) [Dennis]: It’s something that we must fundamentally embed within everything we do.
(00:13:03.723) [Dennis]: Principles like safety and sustainability cannot be bolted on at the end of a project, especially with decentralized technologies such as the internet, where retrospective fixes can be very difficult or even impossible.
(00:13:19.788) [Dennis]: I think that achieving this requires a fundamental systemic shift in how we work.
(00:13:24.917) [Dennis]: We need to communicate, hire and train with ethics as a core competency.
(00:13:31.149) [Dennis]: Technical success must sort of be balanced with success from an ethical and sustainability perspective and this can be quite challenging from my experience and from working with other
(00:13:41.928) [Dennis]: engineers and it requires sort of like an adjustment because engineers can be accustomed to viewing their work as sort of like a technological optimization problems and this perspective demands that technical and non-technical experts and the affected communities of those technologies must find a common language and build a shared understanding of the goals and risk involved.
(00:14:03.308) [Dennis]: And so ultimately, technical expertise and ethical expertise are sort of like two sides of the same coin.
(00:14:10.976) [Dennis]: And only by fostering a community that sort of like equally values forward-thinking responsibility and backward-looking accountability, we can ensure that this happens.
(00:14:21.108) [Sara]: Okay, very good.
(00:14:22.649) [Sara]: Thank you, Dennis.
(00:14:25.873) [Sara]: And finally, we have Mark.
(00:14:29.785) [Marc]: I think multidisciplinary groups and thinking is always a benefit.
(00:14:35.557) [Marc]: And then that’s something in the discipline of engineering, design, and so on, what those imply.
(00:14:42.046) [Marc]: Choices as well; with ethical thinking, ideas and thoughts actually have a place sometimes in industry.
(00:14:49.596) [Marc]: Even in research and so on, that’s very important.
(00:14:52.319) [Marc]: We have also feedback and time to give proper response and ideas, review from other,
(00:15:01.291) [Marc]: people with all their field of research or activities and so on.
(00:15:06.358) [Marc]: For a simple story to illustrate this, we are actually, and I just verified, using IPv4 to communicate through Zoom.
(00:15:15.252) [Marc]: Think of a teeny details who has very profound impact today.
(00:15:20.119) [Marc]: Then the design on that IPv4, they actually place the source address before the destination address.
(00:15:26.869) [Marc]: What do you do when you are actually
(00:15:29.911) [Marc]: checking where the packets need to go.
(00:15:32.895) [Marc]: You’re expecting the destination, not the source, to be first.
(00:15:37.182) [Marc]: And these teeny details is actually using a lot of power and electricity every time for a very long time.
(00:15:44.292) [Marc]: Big impact on the consumption of electricity and so on.
(00:15:49.239) [Marc]: Because all the routers have to wait, have to wait for the destination field
(00:15:54.667) [Marc]: before having the source.
(00:15:57.110) [Marc]: Kind of a teeny mistakes, but big impact.
(00:16:00.615) [Marc]: Then obviously reviewing and so everyone and all disciplinary things for such things is very difficult.
(00:16:07.344) [Marc]: We don’t know, they didn’t know that actually that design they did will remain for that long.
(00:16:12.651) [Marc]: And in IPv6, destination comes first.
(00:16:16.925) [Sara]: All right, thank you very much.
(00:16:20.388) [Sara]: And I think that’s actually a wonderful segue into our next question.
(00:16:26.554) [Sara]: So we’ve already, Mark, sorry, Mark has mentioned the technology that we’ve had for quite a while now and how we’ve learned from that and made things more efficient.
(00:16:43.729) [Sara]: But we’re also seeing emerging technologies
(00:16:46.912) [Sara]: How can these emerging technologies, such as automated language models and artificial intelligence, the Internet of Things, and so on, be ethically developed and deployed to ensure they have positive social, cultural, political, academic, and environmental impacts?
(00:17:10.469) [Sara]: It’s like 10 minutes for that one.
(00:17:13.675) [Sara]: So let’s go back to Mark for that one.
(00:17:19.925) [Sara]: Everyone will have a chance to answer, but we’ll just do it in the opposite order this time.
(00:17:25.114) [Sara]: Go ahead, Mark.
(00:17:28.661) [Marc]: It’s a hard practice to have
(00:17:31.411) [Marc]: all the view and impact of what we do.
(00:17:33.993) [Marc]: But what I actually, when we started to open up ideas and thoughts with Daphne, we did find people who are working hard on those questions from the root practice of what we call computer science today, with mathematicians, or both of your, as Maurice and Dennis.
(00:17:55.413) [Marc]: They put together a lot of questions, a lot of way of asking yourself,
(00:18:00.638) [Marc]: It is a good project, and so on and so on.
(00:18:03.422) [Marc]: That practice needs to be every time for everything, mostly.
(00:18:07.147) [Marc]: It was very hard to have the time for this, but it’s necessary.
(00:18:11.172) [Marc]: Giving time for this kind of practice is essential.
(00:18:15.458) [Marc]: And it does, it has to cover a minimum of different payouts that’s been introduced by their works.
(00:18:23.709) [Marc]: And I think we rely on actually kind of future projects on their approaches, and it’s very valuable.
(00:18:30.558) [Marc]: And that’s why, yes, I let all the people already spend a lot of time thinking of it.
(00:18:38.175) [Sara]: OK.
(00:18:39.016) [Sara]: Very good.
(00:18:39.517) [Sara]: Thank you, Marc.
(00:18:43.501) [Sara]: Alexander, you have something specific, go ahead.
(00:18:49.090) [Alex]: Yeah, you asked a really broad question about the impact on very, very different fields of human society.
(00:19:00.008) [Alex]: But I would like to point two issues.
(00:19:02.693) [Alex]: First of all, for technologies, development of technology is something funny.
(00:19:08.442) [Alex]: So that’s more than young people who are rushing into technology, into education, into testing something.
(00:19:16.736) [Alex]: They don’t think about impact of their activities at all.
(00:19:21.805) [Alex]: So that’s why we have script kiddies, we have young hackers and so on.
(00:19:26.513) [Alex]: That’s, I think, the lack of education, overall education, general education, not technology education.
(00:19:34.902) [Alex]: That’s an issue.
(00:19:36.844) [Alex]: And I remember myself when I was young, the Internet was a university and so on.
(00:19:43.413) [Alex]: I definitely can confess I did some unethical things which I would not do now having understanding all this impact.
(00:19:52.164) [Alex]: So first of all, we need to educate young.
(00:19:54.967) [Alex]: The second approach, and this is actually a kind of experience from local, from Russia, because officials, corrupted officials or corporations which have ties to the government, stating nearly the same things, that technology needs to be ethical, technology needs to provide sustainability and be available for everyone.
(00:20:24.953) [Alex]: But in contrary, technology does not develop.
(00:20:31.443) [Alex]: For example, in Russia, we do not have 5G cellular networks because all their frequencies are stockpiled by few companies or militaries under the name of protecting common resource and so on.
(00:20:49.430) [Alex]: So it’s a development
(00:20:50.452) [Alex]: 5G networks is not possible, not because of sanctions, not because of some retrospective things, but just because somebody tries to keep us sustainable.
(00:21:03.968) [Alex]: So I think that’s two points I would like to bring to the table and maybe discuss later.
(00:21:11.998) [Alex]: Thanks.
(00:21:14.819) [Sara]: All right.
(00:21:15.420) [Sara]: Thank you very much, Alex.
(00:21:17.802) [Sara]: That’s an interesting point.
(00:21:19.784) [Sara]: And of course, we also invite questions from our online participants.
(00:21:26.011) [Sara]: Next, we have Maurice.
(00:21:28.294) [Sara]: Go ahead, Maurice.
(00:21:31.557) [Maurice]: Thank you very much.
(00:21:32.759) [Maurice]: So I’m going to sort of try and give this from an engineer’s viewpoint.
(00:21:36.403) [Maurice]: So from an engineer’s viewpoint, there are three key aspects to ethical development here.
(00:21:41.789) [Maurice]: Perspective, perspective, and perspective.
(00:21:45.700) [Maurice]: Even the most conscientious engineers cannot ensure positive impacts on their own.
(00:21:50.890) [Maurice]: We work deep within technical systems, but technologies like AI and the Internet of Things are fundamentally human endeavors.
(00:21:57.023) [Maurice]: They connect people and the objects people use.
(00:21:59.828) [Maurice]: Therefore, human insights…
(00:22:01.732) [Maurice]: and a range of perspectives must be central throughout the entire development and deployment process, not just as an afterthought.
(00:22:09.000) [Maurice]: This requires a shift in resources.
(00:22:10.803) [Maurice]: Ethical development isn’t free.
(00:22:12.324) [Maurice]: It takes dedicated time and effort to consult with domain experts, conduct impact assessments, and engage with impacted communities.
(00:22:18.752) [Maurice]: This work must be budgeted for as a core project requirement, not an optional extra.
(00:22:23.197) [Maurice]: Furthermore, our motivation must be scrutinized.
(00:22:25.640) [Maurice]: We should focus on applying our skills to solve recognized societal problems rather than inventing new problems to fit a fancy technological tool.
(00:22:33.931) [Maurice]: With every step forward, we have to ask a critical question.
(00:22:36.394) [Maurice]: Who wins and who loses?
(00:22:38.477) [Maurice]: True ethical networking requires us to see and account for everyone.
(00:22:44.605) [Sara]: Absolutely.
(00:22:45.466) [Sara]: So, very good.
(00:22:46.727) [Sara]: Thank you, Maurice.
(00:22:48.410) [Sara]: And Daphne, what are your thoughts?
(00:22:53.285) [Daphne]: So I think kind of the key word here in this question is positive impacts, because positive for who and relative to what?
(00:23:00.416) [Daphne]: I mean, everything is subjective.
(00:23:01.878) [Daphne]: And that’s related to, I mean, what you were saying, Alexander, about the situation in Russia, because what one might consider positive may well be perceived as negative by another.
(00:23:10.612) [Daphne]: Meanwhile, "one" can of course be a person.
(00:23:12.836) [Daphne]: It can be a community, an interest group, a government, et cetera, et cetera.
(00:23:17.804) [Daphne]: So as the question shows, impact is multidimensional.
(00:23:20.508) [Daphne]: So we can’t expect there will be one group of people that will decide what positive impacts are.
(00:23:25.355) [Daphne]: So maybe here, I will answer as a researcher, because that’s my community.
(00:23:28.759) [Daphne]: But I think as researchers, the very important thing for us now is really to engage in a practice that goes beyond the siloed mode of organization that we’ve seen in research.
(00:23:39.715) [Daphne]: So you are a computer scientist, you are a mathematician, you are a biologist, you are a sociologist.
(00:23:43.680) [Daphne]: But at the end, what really matters is that we really work together.
(00:23:47.065) [Daphne]: so that we agree or at least we get some shared value on what positive impact we are aiming at, but also how we assess this impact.
(00:23:57.547) [Sara]: Okay, thank you Daphne.
(00:24:00.042) [Sara]: And Dennis, of course, go ahead.
(00:24:03.928) [Dennis]: The development of technologies like large language models or the Internet of Things hinges critically on understanding the interconnected nature.
(00:24:13.483) [Dennis]: So from an engineer’s perspective or from a management perspective, that means that we cannot compartmentalize ethics within single sub-teams, because things will just get overlooked.
(00:24:24.140) [Dennis]: Nor can we overlook that the social, cultural, political, and environmental aspects are deeply intertwined.
(00:24:32.429) [Dennis]: So we cannot usually address one without affecting the other.
(00:24:35.472) [Dennis]: And so that means for developers, there’s sort of a dual responsibility here, building safety into the technical architecture or into the technical system, and also earning the public’s trust.
(00:24:46.804) [Dennis]: One does not necessarily imply the other in an interconnected world.
(00:24:51.528) [Dennis]: And we cannot assume that engineers or mathematicians or computer scientists by default understand how to navigate this complexity or how to raise the right questions.
(00:25:02.347) [Dennis]: They need to be taught this and given the space to think beyond immediate, localized, often monetary incentives.
(00:25:09.980) [Dennis]: And they need to be taught how to do this in a way that earns trust from society.
(00:25:16.351) [Dennis]: And once again, this requires balancing technical expertise and technical incentives with non-technical knowledge and non-technical incentives.
(00:25:27.324) [Dennis]: In this sense, I can only reiterate what Maurice said.
(00:25:30.207) [Dennis]: Perspective is really what matters here from my perspective.
(00:25:36.214) [Sara]: Okay, thank you very much.
(00:25:37.335) [Sara]: I think we have quite a few overlaps there.
(00:25:41.200) [Sara]: I think a common thread is education.
(00:25:45.022) [Sara]: Education, and integration within interdisciplinary teams and interdisciplinary working environments.
(00:25:57.458) [Sara]: And in that sense, we kind of have this big interdisciplinary environment with the IGF.
(00:26:08.289) [Sara]: And that leads us to the next question.
(00:26:10.751) [Sara]: What role can the IGF and its stakeholders play in promoting sustainable and responsible internet governance?
(00:26:22.082) [Sara]: So let’s start with Daphne this time.
(00:26:24.985) [Sara]: Go ahead, Daphne.
(00:26:27.749) [Daphne]: Thanks.
(00:26:28.611) [Daphne]: Well, I think that really the IGF is a platform to connect and gain visibility on what’s going on.
(00:26:34.285) [Daphne]: So, as I said earlier, I really think that understanding for whom and relative to what a technology, a model, or a development demonstrates certain qualities is not simple.
(00:26:44.759) [Daphne]: So to me the IGF really has the ability to reach out to a worldwide audience.
(00:26:51.557) [Daphne]: It must capitalize on that to provide, I think, a medium through which we can confront our perspectives, especially those coming from different parts of the world,
(00:26:59.879) [Daphne]: because this raises perspectives that we need to embed into sustainable and responsible Internet governance.
(00:27:07.768) [Daphne]: I don’t think we should take a top-down approach where a small group of people decides on the definition of these qualities for governance.
(00:27:15.376) [Daphne]: So I really believe that the IGF has a key role to play in supporting the diversity of backgrounds, cultural heritages, and points of view that are really necessary to design and build this governance framework.
(00:27:28.448) [Sara]: OK, thank you, Daphne.
(00:27:30.313) [Sara]: Maurice, what’s your perspective?
(00:27:32.419) [Sara]: What do you think?
(00:27:35.628) [Maurice]: Thank you.
(00:27:36.109) [Maurice]: So in my view, the IGF’s most powerful role here is that of a convener.
(00:27:41.763) [Maurice]: It provides the room and sets the table for the essential multi-level ethical engagement that sustainable internet governance requires.
(00:27:48.993) [Maurice]: This is the space where dialogue is not just possible, but it’s the primary purpose.
(00:27:53.079) [Maurice]: By its very nature, the IGF assembles a diverse array of stakeholders needed to generate genuine perspective from governments and corporations to academics and activists.
(00:28:02.371) [Maurice]: As we’ve discussed, perspective is the single most critical ingredient for the ethical development of emerging technologies.
(00:28:08.948) [Maurice]: An engineer in a lab cannot foresee and understand all the implications of their work, just as a policymaker cannot grasp all the technical nuances.
(00:28:16.021) [Maurice]: The IGF is a place where these worlds connect.
(00:28:18.505) [Maurice]: It breaks down the silos between the technical and non-technical experts that often exist in industry and governments, which is crucial for finding and nurturing a common language.
(00:28:27.882) [Maurice]: In this way, the IGF already acts as the essential first step.
(00:28:31.149) [Maurice]: It gathers the necessary people and perspectives, creating the foundation upon which responsible governance of a decentralized mathematical technology like the Internet can be built.
(00:28:41.850) [Sara]: Okay.
(00:28:42.632) [Sara]: Thank you.
(00:28:43.073) [Sara]: Thank you, Maurice.
(00:28:46.980) [Sara]: Next, Alexander.
(00:28:48.323) [Sara]: Go ahead.
(00:28:50.461) [Alex]: Yes, for sure.
(00:28:52.624) [Alex]: But first I would like to point out that ethics and sustainability might be really different in different parts of the world.
(00:29:04.860) [Alex]: So I think that the locations where full-scale IGFs have been held have completely different approaches to what’s ethical and what’s not ethical.
(00:29:13.782) [Alex]: And events like the Internet Governance Forum allow us, first of all, to understand each other.
(00:29:20.250) [Alex]: Not to synchronize, but to understand each other’s approaches.
(00:29:24.936) [Alex]: So the Internet Governance Forum not only connects different stakeholders from the same group, but also builds understanding of what’s going on in different regions and different countries.
(00:29:38.352) [Alex]: Overall, the IGF connects all positively minded people who are looking forward to developing the internet for good.
(00:29:51.246) [Alex]: And not just the IGF; other platforms like the World Summit on the Information Society, which actually spun off the IGF 20 years ago, still have forums that are more populated by governmental people.
(00:30:08.390) [Alex]: So I think we should continue not just in the IGF, in our local IGFs, in our local communities, but also have broader interaction within the United Nations and intergovernmental organizations.
(00:30:26.966) [Sara]: Okay, thank you, Alexander.
(00:30:28.309) [Sara]: Denis, what about you?
(00:30:32.572) [Dennis]: This is sort of a follow-up from Maurice’s answer.
(00:30:35.397) [Dennis]: I think that assembling the right people is only half of the process.
(00:30:39.164) [Dennis]: The IGF’s next crucial role is to ensure that the insights also radiate outwards.
(00:30:46.297) [Dennis]: And the IGF is already highly effective at collectively identifying emergent issues.
(00:30:52.969) [Dennis]: I think what can be done next is sort of like how do we
(00:30:57.040) [Dennis]: translate that awareness into action, because our research on ethics and mathematics has demonstrated that many technical practitioners, like mathematicians, computer scientists, network engineers, quite often view their work as separate from ethics, sustainability, and also from policy.
(00:31:15.552) [Dennis]: So while many people who are in this room understand that technology and ethics or technology and sustainability are inseparable, the understanding is not very widespread from our experience.
(00:31:30.608) [Dennis]: And so the primary role that we see here is for IGF stakeholders to act as ambassadors, championing this integrated perspective and spreading awareness within their respective fields and their respective companies,
(00:31:45.404) [Dennis]: and bringing it to people who are not yet convinced that this is important.
(00:31:52.004) [Sara]: Very important point.
(00:31:54.191) [Sara]: Thank you.
(00:31:54.632) [Sara]: Thank you, Dennis.
(00:31:56.257) [Sara]: And finally, Mark.
(00:31:57.622) [Sara]: Go ahead.
(00:32:00.119) [Marc]: Well, when we look into the story of the IGF and why it exists: when we talk about the internet, that’s not something that initially came out of the ITU.
(00:32:13.954) [Marc]: The ITU existed even before the United Nations.
(00:32:20.321) [Marc]: But in 1947, in the very first chapter after the Second World War,
(00:32:26.208) [Marc]: it became part of the United Nations, before UNESCO and so on and so on.
(00:32:30.552) [Marc]: The ITU is still there, handling standardization for telecommunications.
(00:32:37.059) [Marc]: But in the meantime the internet came up, which we all know is governed in a very different way, with a different origin, and the way its standards are decided is very, very different.
(00:32:49.252) [Marc]: And it is actually winning compared to ITU standards:
(00:32:55.418) [Marc]: what we call the RFCs, the IETF, the IRTF, the different things that come out of this community.
(00:33:01.794) [Marc]: Very different.
(00:33:03.197) [Marc]: Then the United Nations created the IGF because they realized that something was missing.
(00:33:09.933) [Marc]: It went outside of the ITU.
(00:33:12.648) [Marc]: The IGF is a good place to bring together many, many people with very different perspectives to take on what we call the Internet today.
(00:33:21.523) [Marc]: But it’s not only the tubes; it’s the way the protocols have been designed, and also the content stored, all the different aspects.
(00:33:30.417) [Marc]: And the fact that it is very, very open, and that we have this occasion today, is very important.
(00:33:35.227) [Marc]: The missing part of it is how we can influence a little bit more,
(00:33:43.597) [Marc]: participating a little bit more from the IGF community, interacting with the IETF, the body that designs the technological standardizations as they are.
(00:33:57.934) [Marc]: There is a gateway, with people coming a little bit more into the IGF from the IETF, and vice versa.
(00:34:03.661) [Marc]: But I think it’s very important as well.
(00:34:05.944) [Marc]: And then W3C and all different aspects and so on.
(00:34:11.532) [Marc]: Now, I haven’t been participating much in understanding the relationships between studies and organizations like these.
(00:34:18.422) [Marc]: But that’s very important.
(00:34:20.806) [Marc]: That could be for the future.
(00:34:22.949) [Sara]: So having those platforms in place gives us more leverage.
(00:34:29.160) [Sara]: Okay, thank you very much, everyone.
(00:34:32.666) [Sara]: Thank you, Mark and everyone.
(00:34:35.171) [Sara]: We’re doing okay time-wise, so we have time for one more question.
(00:34:38.477) [Keith]: Yeah, we have time for one more question.
(00:34:40.280) [Keith]: I’ll take that over as in sort of a question for myself as well.
(00:34:44.224) [Keith]: And then we’ll try and get some questions from the audience.
(00:34:46.247) [Keith]: So we have just about nine minutes left.
(00:34:48.990) [Keith]: And the last question is, how can we evaluate the human component of networks?
(00:34:52.614) [Keith]: We’ve talked a lot about the fact that these aren’t just systems, there’s people behind them.
(00:34:56.859) [Keith]: What can we do to learn more about how we learn to operate new networks?
(00:35:00.723) [Keith]: What practical tools can we use to evaluate computer networking practices?
(00:35:05.068) [Keith]: So go back in reverse order maybe with Mark first.
(00:35:12.663) [Marc]: The human part of it is a good question.
(00:35:17.310) [Marc]: And we have some ways of trying to understand this.
(00:35:24.582) [Marc]: But that is social science work and so on.
(00:35:27.386) [Marc]: The only thing I’ve learned recently, and I mean it’s a fact, is that the quantitative space has nothing to do with the qualitative space.
(00:35:40.507) [Marc]: And we are trying to understand these two different spaces when deciding what quality we want to give to some evaluation we do as engineers, to get a better optimization process or better performance of whatever system.
(00:35:57.683) [Marc]: And so finding the right way to bridge the gap, to be able to get the quantitative design we want, with good quality, is a beginning.
(00:36:06.612) [Marc]: We need people who guide us in pushing the questions and finding the right way of making final choices.
(00:36:20.668) [Keith]: Really quick, so maybe go off to Daphne next.
(00:36:26.204) [Daphne]: Yeah, I don’t know if I have much to add to this question.
(00:36:29.348) [Daphne]: So I think that’s really a typical question.
(00:36:32.452) [Daphne]: We need collaboration across disciplines.
(00:36:35.095) [Daphne]: And yeah, for us, people working on computer networks, that’s quite important.
(00:36:40.221) [Daphne]: We understand the human perspective.
(00:36:41.663) [Daphne]: But we don’t necessarily have the tools that we can use to actually access human perception,
(00:36:48.631) [Daphne]: human feedback on this.
(00:36:50.313) [Daphne]: So I think that’s where we need to collaborate, for example, with social scientists.
(00:36:53.979) [Daphne]: I mean, we started working with you for that purpose, to learn how we can run surveys, how we can do consultations, and how we analyze the feedback we get through these methods.
(00:37:07.358) [Keith]: Okay.
(00:37:08.680) [Keith]: Since we’re short on time, Maurice, Dennis, Alexander, would any of you like to chime in?
(00:37:14.585) [Maurice]: I’d be happy to, at this stage.
(00:37:17.508) [Maurice]: So I think the more pertinent question really to consider here is how to evaluate the network as a socio-technical system.
(00:37:25.237) [Maurice]: So humans and technical components cannot be assessed in isolation.
(00:37:28.340) [Maurice]: Their value and risks emerge from their interaction.
(00:37:31.003) [Maurice]: This becomes evident by looking at a socio-technical system’s potential points of failure.
(00:37:34.807) [Maurice]: So we must assess the potential for a failure of the technical or AI component, a failure of the human component, or a failure of the process or workflow they’re meant to follow.
(00:37:43.677) [Maurice]: Crucially, we must also evaluate the human-machine interface itself, as this is the primary site of miscommunication and error.
(00:37:49.145) [Maurice]: And finally, we must account for failures caused by exogenous circumstances, acknowledging that no system operates in a vacuum.
(00:37:55.334) [Maurice]: This method ensures a comprehensive socio-technical evaluation.
(00:37:58.318) [Maurice]: And as you can clearly see, three-fifths of the problems listed above are neither purely human nor purely technical, instead stemming from their interaction.
(00:38:07.842) [Keith]: Great.
(00:38:09.365) [Keith]: Dennis or Alex?
(00:38:11.329) [Alex]: I just would like to add shortly that our main task is just not to lose our focus and continue observing developments.
(00:38:23.772) [Alex]: If we stop paying attention, even briefly, to the latest developments and technological advances, they could, and I think will, go the wrong way.
(00:38:36.679) [Alex]: So just keep an eye on them, follow them, and communicate with each other.
(00:38:41.188) [Alex]: That’s important.
(00:38:43.668) [Keith]: Last thoughts, Dennis?
(00:38:46.493) [Dennis]: I think the really big first step is to not view human components of a network similar to technical or mathematical components.
(00:38:55.489) [Dennis]: Our experience of working with mathematicians, engineers, but also with users,
(00:39:00.037) [Dennis]: is that their actions, their awareness, and their motivation are almost equally important when it comes to eventual outcomes.
(00:39:08.010) [Dennis]: And the failure modes that Maurice outlined are deeply connected to who a human is.
(00:39:14.181) [Dennis]: So from that perspective, we really need to think about this question, how do we understand who the humans are involved in these networks?
(00:39:23.290) [Keith]: Thanks.
(00:39:24.351) [Keith]: And just to chime in myself: this workshop itself really began as a questionnaire that Mark, Daphne, and I developed ourselves to try and learn about how humans learn difficult new methods for operating computer networks.
(00:39:42.230) [Keith]: And just for our last four or five minutes, I ask Bailey, who’s with us online, to collect some questions from the audience, and maybe she can read them out to us.
(00:39:53.851) [SPEAKER_01]: Hello, everybody.
(00:39:55.875) [SPEAKER_01]: So we do have a couple of questions in the chat here.
(00:39:58.479) [SPEAKER_01]: I’ll start with the first question from Henan Zahir.
(00:40:04.950) [SPEAKER_01]: I apologize if I mispronounce anybody’s name.
(00:40:07.635) [SPEAKER_01]: But her question is, how can ethical networking be democratized to ensure meaningful citizen oversight over data-driven public systems?
(00:40:24.192) [Keith]: Would any of you like to answer quickly? We have three minutes and 30 seconds, so maybe half that time. No? Alex, did you want to... Go ahead, Dennis.
(00:40:42.753) [Dennis]: I think
(00:40:44.105) [Dennis]: It goes back to what Alex said.
(00:40:46.528) [Dennis]: We need to respect that the different cultures and different regions of this world have different perspectives on this very question.
(00:40:53.736) [Dennis]: So in this sense, the IGF should probably try to be even more international and to really bring in these different cultures and perspectives.
(00:41:09.753) [Dennis]: But it’s a hard question.
(00:41:11.257) [Alex]: Yeah, and I would like to reply to this question by noting that technology cannot insure you against anything.
(00:41:22.418) [Alex]: You are your own insurance.
(00:41:24.843) [Alex]: You have to communicate, you have to provide oversight, you have to think about what’s going on with your data and how it’s being used.
(00:41:33.920) [Alex]: So IGF is a good starting venue for discussions like this.
(00:41:39.732) [Alex]: But your participation is also really important.
(00:41:44.763) [Keith]: Great.
(00:41:44.903) [Keith]: And Bailey one more question.
(00:41:46.186) [Keith]: Two minutes.
(00:41:46.988) [Keith]: Question and answer.
(00:41:48.668) [SPEAKER_01]: Yep, so there’s one more question here from Anna Gretel Ichazu, and she’s asking: I would like to know how you think about Global North and Global South dynamics across the issues you are raising?
(00:42:09.109) [Alex]: Yeah, let me answer this question, because I’m from a country which for a long time pretended to be Global North, but now we’re pretending to be Global South.
(00:42:17.677) [Alex]: So, the internet and these technologies actually could shorten the gap between what we call the Western world and the others, or North and South.
(00:42:32.334) [Alex]: But you also have to exercise oversight really carefully, because in less democratically developed countries,
(00:42:39.682) [Alex]: especially in countries of the so-called Global South, technology can easily be abused by the government, which will make the gaps with the North, the economic gaps, the societal gaps, the democratic gaps, much bigger than they already are.
(00:42:58.588) [Alex]: So I will repeat my answer to previous questions.
(00:43:02.674) [Alex]: Technology cannot
(00:43:06.192) [Alex]: close gaps on its own.
(00:43:07.576) [Alex]: You have to maintain oversight really, really accurately and constantly, and never let it go.
(00:43:13.714) [Alex]: Thanks.
(00:43:16.281) [Keith]: Last 40 seconds.
(00:43:17.164) [Keith]: Any other ideas?
(00:43:22.780) [Keith]: Okay, well then I will close off this session and thank everybody for coming.
(00:43:27.265) [Keith]: It was really interesting.
(00:43:29.248) [Keith]: I hope we can make a routine of this and produce some studies that also look into these very difficult questions.
(00:43:36.276) [Keith]: And hopefully we’ll have a publication or some other outputs for you all to read soon.
(00:43:40.120) [Keith]: So thank you everybody for coming.