the quiet living room - by quietsocialclub

Gen Z, Diversity & Responsible Tech Futures with Yasmin Al-Douri and Alexander von Janowski

September 03, 2023 Quiet Social Club

When we speak of the digital future, it seems obvious that the voices of those who will be impacted by our decisions today should be part of the conversation. Yasmin Al-Douri and Alexander von Janowski are here to make sure that the next generation has a voice and a seat at the table. Together, they founded the Responsible Tech Hub, a youth- and young-professional-led nonprofit focused on making technology more responsible. Together we speak about:
 
- Who needs to come together for a better tech future
- Why we shouldn't underestimate the next generation in helping create a better tech future
- Why diversity is so important when it comes to shaping our digital future
- What challenges they are seeing in the roadmap to responsibility

About RTH:

The Responsible Technology Hub (RTH) is committed to decisively shaping the emerging technologies of the present and near future. Our mission is to foster an intergenerational exchange that elevates the voices of young people and allows representatives from industry, academia, policy, and the general public to connect and co-create a responsible technological future.

About Yasmin & Alex: 

Yasmin Al-Douri is the Co-Founder and Co-Director of the Responsible Technology Hub. She is studying Politics & Technology at the Technical University of Munich and conducts research on the topics of "Regulation of New Technologies" and "AI Ethics". Yasmin has worked in various organizations and tech sectors, including as a research assistant at the Future AI4EO Lab, at Techquartier Frankfurt, and until recently as a Business Program Manager at Microsoft, where she worked on "Responsible AI". Yasmin has collaborated with the American-German Institute at Johns Hopkins University, the Foreign Office, GIZ, and Welthungerhilfe, and was named a Young Global Changemaker 2022 by the Global Solutions Initiative.

Alexander von Janowski is the Co-Founder and Co-Director of the Responsible Technology Hub – the first technology hub empowering young voices to shape a responsible technological future. Before founding RTH, he worked for the German development agency GIZ in Eschborn, Tunisia, and Bonn on digital society and public-sector innovation, and as a researcher in the Volkswagen Machine Learning Research Lab on methods for ethical and trustworthy machine intelligence. He holds a Master of Science in Politics & Technology from the Technical University of Munich and a BA in Political Science and Economics from Goethe University Frankfurt, and was a Young Global Changemaker 2023 finalist.

Transcript:


Hello and welcome to The Quiet Living Room, a podcast by Quiet Social Club where we discuss and explore ideas on how to live and work well in the digital world. My name is Liana and I am your host in today's episode. 

My guests today are two very inspiring young individuals who I've had the pleasure to work with over the last couple of months. Yasmin Al-Douri and Alexander von Janowski are the co-founders of the Responsible Tech Hub, a youth- and young-professional-led nonprofit focused on making technology more responsible. 

I'm thrilled to have the two of them with me today to talk about the future of tech and how to make sure that it's aligned with our human interests and values. Yasmin, Alex, thank you so much for being here today. 

Thank you for having us. Yeah, thank you for having us. Tell us a little bit more about why you created the Responsible Tech Hub. What gaps were you seeing around you that made you think this is something that we need? 

Well, I think, and this is actually something really amazing that I love about the Responsible Technology Hub and the people behind it, everybody sees a different part of the problem, and in a way, everybody has their own core motivation that brought them to this. 

For me, it's looking at the landscape of technology and noticing that, with all that crazy technology that's out there and all the crazy and amazing technology that is just around the corner, we need to start putting some flesh on the bones when it comes to creating responsible technology, and when it comes to what we call creating responsible technological futures. That's why we at RTH are on a mission to foster collaboration, to foster understanding of technology, but also to foster action among diverse stakeholders from different age groups and from different disciplines, all united by a shared vision of creating these responsible technological futures. 

And this really is what RTH is for me: an invite for all these people, we call them responsible technology enthusiasts, people who are engaged and concerned with this problem, who want to see change being put forward. RTH serves as an invite to get all these people together, and as a dynamic platform that encourages dialogue, innovation, and ethical consideration, and puts responsibility at the center of all our doing. 

Yeah, I think one aspect that Alex quickly mentioned was the intergenerational aspect. So what really pushed us in founding RTH, as we call the Responsible Technology Hub, is the fact that young voices are not really heard, or even taken seriously, in the space of responsible tech. 

So for those who are well versed in responsible tech, they will know that the space is super academic and not very diverse in many respects: ethnicity, gender, age. And we said, well, our home base is Munich, and we all studied at some point, or are still studying, like I am. 

And we had these amazing conversations within our generation, a generation that is very diverse, in Germany as well. And we never had the possibility to get out and have these conversations on a bigger platform. 

So we said we actually need a space and a platform where we can do that, where our voices are not only heard but also taken seriously, and where we're given credit for the stuff that we say. 

It's not like somebody just takes up your idea and your concept and runs with it; you're actually getting credit for the stuff that you do. And that was the main push, the main motivation behind building RTH. 

And I think it's really important that we continue to include younger generations in this discussion, because, yes, they are the future, but they are also already here and they have agency today. So it's important to keep in mind, when we're talking about technology and technological futures, that young people are not only the future leaders; they are here today, have agency today, and deserve to be included in that discussion. 

And this is obviously one of the main mission statements of RTH: to make sure that the younger voices are also included and heard in the conversation on the future of tech. Now, who is part of this conversation? 

Who is having this conversation? Who's responsible for responsible tech? You already mentioned the importance of an intergenerational dialogue. Who needs to come together? Which stakeholders need to come together to make sure that our future is aligned with our interests? 

I think that's a super interesting question, and I feel like in your question you already hinted at part of what I think the answer is. Before I go there, I think in general we are in very exciting times, because I believe, if we get this right, we are moving into an age of responsible technology. 

We see more and more people picking up this idea, more and more people being concerned about the effect that technology has. We have regulation around the corner here in the EU that is also concerned with giving us what they call trustworthy artificial intelligence. 

And I'm confident that if we look back in 20 years, we will know exactly what responsibility in technology means. But we are just at the beginning of this, and so right now it's still a very complex topic, and it's very difficult to point to a single actor, to point to a single institution, because I think the responsibility is shared among the different actors and institutions in that system. 

They all play their individual part, but they're all connected to one another. And therefore I cannot just point to companies and say, well, you need better products. Or point to users and say, well, you should just have higher willpower and finally close that social media app. 

Or point to regulators and say, you are the ones that should finally figure out how to solve this mess and give us the regulation that we all need. But we should rather understand that these things work in connection, and we need to find the different leverage points, right? 

And exactly: for companies this can mean, well, we adapt our measures of success, we develop products that include some elements of responsibility. For policymakers it can mean we change the incentive structures, by changing subsidies, by changing regulation, and introduce legislation that places responsibility at the center. 

And for civil society organizations like RTH, I think this really means driving forward these positive feedback loops: by raising issues, by leading with positive examples, by educating the general public, and by connecting the people that are engaged in this and concerned with this. 

And I think we have to view it holistically if we actually want to sustainably drive this forward. I'm glad you mentioned the word holistic, because I think to a lot of us, responsible tech is very much connected to artificial intelligence. 

But obviously responsible tech encompasses many more things. Why is it and I'm being a little bit cheeky with this question why is responsible tech so important? And why is it a conversation we all need to have? 

What happens if we don't have it? I think some of the arguments for why we need responsible technology have already been observed in the past years. One aspect is that we live in very unpredictable times, meaning the innovations and the technologies that we bring out are used for specific use cases. 

But we live in a world where more and more use cases can be brought up for a technology to be used in. And one of the major issues here is that developers and startups in particular are developing innovation with a specific use case in mind, 

for example, tech for good: using technology for good, using it to, I don't know, heal patients from a specific disease, or using it to find more sustainable ways to live. And these are aspects that we see in AI. 

But there are also different technologies, for example based on blockchain, where we have these amazing innovations that could lead to breakthrough technology, but we didn't really think of the downside, let's say the negative consequences, it could have for our society. 

Blockchain is a perfect example. So let's talk about blockchain: it was basically used to democratize specific technologies or areas that we already use, for example the internet. So the whole Web3 discussion that we have is very much based on a discussion on blockchain, on making sure that technology is democratized in a way where everyone can use it. 

But the issue here, again, is that we didn't think of the consequences it might have on nature and our environment. Blockchain mining is not sustainable in any way right now, so we are contributing more emissions than we should. 

It's also the case with AI. If we develop AI for a specific use case, or if a startup has a specific use case in mind, that doesn't mean it will only be deployed for that use case. Let me bring in the perfect example, one that came in yesterday. 

So there's a team of researchers that was able to help a woman who had a stroke and who wasn't able to communicate verbally anymore. They basically helped her to communicate again. How did they do that? They used an AI and an avatar, and they basically implanted some kind of electrode inside her brain to read the brainwaves. 

I think that was the case. Whoever's listening, please do your research and make sure I didn't say anything that's wrong. But what I'm trying to say here is this use case is amazing for patients who are not able to talk anymore, to communicate verbally anymore. 

Now, the first use case that came to mind where such technology can be used for harm is by using it to profit off of it, by using it to predict behavior. And these are just small things where we're saying, okay, this is the point where we need to talk about responsible technology. 

Where are we using technology and for what purpose? And when exactly are we using it? Because in the past we've also seen points or examples where tech was used to harm people. It was used to predict behaviors and manipulate behaviors. 

It was used to predict election outcomes in democracies, and to manipulate them and influence voter behavior. So the whole question as to why we need responsible technology is because if we don't, things will go bad. 

And we don't want that to happen. We don't want technology to go bad. We want technology to be used for good. You already mentioned quite a few examples here. Now, obviously, you two are very involved in this subject and you know the technical and the political implications much better. 

I'm curious, and this question is maybe a little related to the fact that you are the youngest guest I've ever had on the podcast. Do you see that in your generation there is a concern for the future of tech? 

Do you have a feeling that Gen Z and younger generations are worried about where we're headed and want to do something about it? I think, and Yasmin, correct me if I'm wrong, but it's a bit difficult to speak for an entire generation, because with these complex societies that we live in, everybody interacts in a different kind of way with different kinds of technologies. 

For me, I think the technologies of our generation are social media and AI, both predictive and generative. And I'm a huge fan of the Center for Humane Technology. Whoever's listening, if you don't know them, probably pause here and go watch something that they've produced, because they have an amazing ability to very aptly describe certain movements and certain things that we should think about. 

They describe social media as our first contact as humanity with AI, and now, with the release of ChatGPT and all these generative models, be it text, videos, images, voice, whatever it may be, as our second contact with AI. Let's stay with social media for the moment. 

And I think, Liana, you know this best, right? We already have all these kinds of problems that came with it: shortening of attention spans, increases in teen depression, rabbit holes, polarization. The literature is a little bit divided on how much you can actually attribute directly back to social media. 

But this has had a massive influence on an entire generation, and that is not going to go away. And I think social media is definitely not yet off the table. But I also think it's amazing how aware young people are, because we have one project, and Yasmin, maybe in a minute you want to tell a bit more about this. 

But we have one project called Social Media and Teens, where we go into schools and talk to these young people. We talked about social media applications, like Snapchat, for example, and went through how they are designed and how they are set up. 

And then the second half was getting the young people to actually think about what their ideal social media would look like. And you'd be amazed how informed they are, how aware they are of these problems, and how creative they are in proposing something else and envisioning an alternative. 

But please, Yasmin, you were part of this project. Maybe you can tell a little bit more. Yeah. So actually, I'm really happy that you brought this up, because I think one thing that we have in common as a generation, and that's probably the only thing that we have in common in a very diverse generation, is the fact that we're underestimated, specifically as teenagers. 

And it's insane because within the Social Media and Teens workshops and project, we actually talked to younger people, to teenagers specifically, and by letting them build their ideal version of a social media platform, we didn't tell them what's wrong with social media. 

They told us what's wrong with social media, and they told us how they would fix it. So what we did, we didn't come with this prejudgment of, yeah, those are young people. They don't know what they're talking about. 

But instead, we just offered them the space to talk and to exchange ideas. Whether those ideas have been critically discussed afterwards or not, that's a whole other question. But the main point that we're trying to make with this project is that young people are heavily underestimated, and their thoughts and voices should be taken seriously. 

And the idea of this project actually came because I had a talk with my niece, who's a teenager, and I actually talked with her about her TikTok consumption, and I was like, yeah, are you aware of it? 

She's like, yeah, I scroll nonstop the entire time. And I'm like, you know, it's designed that way. And she's like, yeah. I also know that they're sending me this kind of content because I Googled this and this, and I know that my Google account is basically feeding them that data. 

And I also know that they're sending me a bunch of content that is making me depressed. And then, when I talked with her about it, I asked her, so are you going to change anything about your behavior? 

She said, well, it's the only platform where I can stay connected with my friends. So the fact that they're aware that these are big platforms where all their friends are, and that they don't have an alternative, shows a lot: they actually know what the issue is, they're aware of how much time they're spending there, that they're landing in rabbit holes, and that they're being fed content that is harmful. 

So I think there are so many different opinions within our generation when it comes to dealing with climate change, when it comes to social aspects, when it comes to equality and equity. But the one thing that really brings us all together is the fact that young people are not taken seriously, although they do have the knowledge, and they are most probably even more knowledgeable about the technology of today than older generations. 

There is a lot you just said, Yasmin, that we could explore and talk about further. But I'd also be very interested to know what were the features that the students wanted in their ideal social media? 

Would you be happy to share some of them with us? We did our first workshop in Berlin with a Gymnasium, a higher secondary school in the German system. And we had three groups. Just so you know, we never actually developed those platforms. 

It was more of a paper prototyping exercise. So we sat down with the students and gave them a paper prototype, which is basically a piece of paper showing the outline of a phone, whatever phone they're using, and they could use that outline to write down or draw concepts, or draw the app that they wanted. 

So one group was focused on saying, yeah, social media is not real, it's all fake, it's a fake world, and we want a platform that is more authentic. So I asked them, okay, this is a problem that you saw; now tell me, how would you possibly solve it? 

And they said, well, we want an app or a platform where we can see people build stuff and do their hobbies, but also a place where they could sell it if they wanted to. So basically combining hobbies that might not be super popular, I don't know, with showcasing your work, showing what you're doing, and bringing in that authenticity that way. 

That was a group of twelve- and eleven-year-olds. There was another group focusing more on attention span when you're using platforms. They basically said, okay, it's taking away my attention; I don't want to be on social media all day; I want to get rid of that rabbit hole and the constant loop. 

So what they did was basically say, okay, we're going to build an application or a platform where, first, there's no hate speech, so there's no way for users to comment. To show your appreciation of content, you give out coins. 

So you're also supporting content creators, and you also have a limit of one hour a day. Now, obviously, this sounds a little bit like, yeah, sure, whatever, but what they were trying to say, the problems that they saw without even defining them 100%, was that they saw content creators not being paid for the work that they do and the entertainment that they're providing, 

and that the platform constantly takes your attention away from other things. These are problems that, I mean, content creators know about, but a 13- or 14-year-old should not know that content creators have this issue. Yet they do know it, because they're dealing with it. 

So these were just two groups. We had more groups who talked about other things, like changing algorithmic recommendation systems and whatnot. So yeah, most of the students who were done with their projects, specifically the eleven- and twelve-year-olds, came up to us and were like, so when are we starting to build this? 

Because we really want to build this. And we were like, yeah, sure, we would love to, but let's see if we can fund this entire project first before we continue. I hope we have some VCs listening right now. 

You know where to find your future tech leaders. There are so many ideas, and so many people really want to see things going well. What do you think are the hurdles that we're facing in making that a reality? 

And what can we do to start laying the groundwork? Because I think a lot of people you and I speak to want to have a better tech future, want to have a good tech future. I think really the majority of people want that, but for some reason we're not quite getting there. 

What is in the way? I think maybe if I just start, and then Yasmin, you jump in wherever you think it's necessary. So I would try to break it down to the very essentials. I think, first off, we need to have a clear vision of where we want to go. 

In these business canvases, this is often called having a mission statement. You have to sit down and reflect upon this, and while you're doing it, you may feel like, why am I doing this? 

But having clarity about why you are doing it is really motivational. And at RTH, we envision a world where technological advancements are driven by a deep commitment to ethical considerations, to sustainability, as well as to the well-being of all individuals and society as a whole. 

And that really gives us our responsible North Star, and that leads the way, right? And this is important because you are faced with something that's super complex: you want to change technology for the better. 

Well, yeah, good luck; maybe in 15 years, if you're lucky. You're setting yourself up for a big journey, and that's good. You should pursue it. And I believe having that vision gives you a clear pathway of where you want to go. 

And then the second thing I would say is finding community. There's a saying: you can walk very fast if you walk alone, but you can walk very far if you walk together. 

And it might be a bit cheesy, but it's actually true, because connecting to a community of like-minded people who are all concerned with the same issue you're concerned with, who see the pathway that you are on, or whose pathway you see, creates synergies that none of these people would have seen on their own. 

And I think this is what can really get an idea off the ground. Maybe you have a motivation, or you see something that's unjust, or you see something that you want to change, or you have a good idea. 

But that is maybe step number one. You need steps two, three, and four to really get these things off the ground and then push forward whatever project you are engaged in, whatever you want to see become reality. 

Yeah, maybe to jump on that. I think one of the biggest issues that we're facing is this mindset, and I'm not calling out a specific group, it's a general mindset that we have in our society, that says: be fast, or, as Mark Zuckerberg said, move fast and break things. 

And this mindset is not only tailored to innovation and technology; it shows up in so many areas, you know: create, create, consume, consume, consume, and don't question any of it. And I think specifically in tech and innovation this is a huge issue, because you can't just bring out tech like that and not question how it's used or how it's deployed. 

And I think that is one of the major issues that we're facing today. And I think there are three ways we could, not necessarily overcome it, but start to overcome it. 

The first is questioning: questioning our use of technology, our consumption of technology, but also collectively asking ourselves how far we want to go with tech, especially when we're not able to guarantee that the tech is not going to harm any norms that we have, specifically democratic norms. 

And this is a point where we're starting to have these conversations and discussions, like right now, but we're not having enough of them, specifically not with the mainstream. And that brings me to the second point, which is learning from and with each other: finding spaces where you can meet like-minded people, but also people who are not on the same level of knowledge, or who just generally don't agree with you. 

And the reason why I'm saying that is because we've started to apply the filter bubbles that we have on social media to our normal lives. And I can say for a fact that I personally really have to work hard to keep my personal life diverse in terms of diversity of thought, and to not be mad when somebody disagrees with me. 

And the reason why I'm doing that actively is not because I want to be a better person or whatnot, but because I need to learn. Like I need to understand what the person in front of me is thinking and how they're thinking of our future and our present. 

Because if I don't do that, how am I supposed to even have an opinion on specific things? So it's about bringing people together, even if they have differing opinions, and letting them exchange and network, and use it not only to their advantage but to the advantage of all of us. 

And the third thing, and that's probably the issue that most nonprofits have: spaces like ours are not invested in. There are a lot of people who are working for a better tech future, and they're doing that unpaid. 

That means they can't put their whole life, their whole heart, and their whole passion into those projects, simply because they have to make ends meet; they have to work while doing these projects. And that's a big issue, because it's mostly organizations that do not have much funding who do the major work when it comes to community building, community learning, and education, but also holding, let's say, corporations or stakeholders accountable. 

And this is something that we don't talk about a lot: creating a space where different people come together also creates a space for corporations to come together, and to come together with people like us, 

so users, for example. And it gives them a major opportunity to connect with people and to understand what the issues are that they're facing. Because at the end of the day, specifically when we talk about democratic systems, we're all living in one system together. 

Meaning if, for example, Big Tech, let's call one out, let's talk about Meta: if Meta is actively working towards influencing voter behavior, that's going against democratic norms. And whether Meta likes it or not, at the end of the day it's going to backfire on them as well. 

So I don't think that these discussions are held within such corporations and it's mostly organizations like ours who have to bring different people together to have these kind of conversations and to exchange experiences. 

So it's actually a huge opportunity for them as well. But the problem, again, is that there are not enough spaces like these where they can come together, or if there are such spaces, they're heavily underfunded. So one other step towards overcoming this issue is to fund these organizations, help them get their projects going, and help us do our work properly. 

Speaking of organizations that are failing to bring responsible tech into their company DNA: are there any examples of organizations, companies, or perhaps even policies that would make good and responsible tech a bit more tangible for us? 

What would responsible tech look like in everyday life? Well, from a policy perspective, responsible tech could look like bringing out legislation that to some degree protects users. The EU started that with the GDPR, the General Data Protection Regulation, and they did that because they wanted to help users understand how their data is used, and also to protect that data. 

As we know, data is the basis for AI development. So when big tech corporations are using data, they're probably using your data as well, data you're handing over for free. So what could that look like in our everyday life? When you go to websites, you have to go through that really annoying banner that tells you to accept cookies. 

Well, this is an example of how responsible technology might not always make our lives more convenient, because now we have to go through that banner and click accept or don't accept. 

Or if we want to reject all of them, we have to click through several steps. These are aspects that should be changed, and there are people actually working on that specific aspect of cookie banners, for example, to make them more user-friendly. 

But it's just one aspect of what responsible tech could look like in our everyday life. The EU is also working on the EU AI Act, where they're regulating AI, its use, and how it's deployed, making sure that corporations are held accountable, but also trying to make sure that AI is somehow safe for society in our everyday life. 

I think there are so many examples of how technology is not used for good that I feel we need more tech that is responsible. And this is one thing, and I hope Alex will add to it: we call it tech for good. 

So technology that is generally used for good. And there is a lot of technology out there that is for good, but nobody really knows about it. One perfect example is also linked with surveillance: when we think of surveillance, the first thing we think is, oh, it's negative. 

And the funny thing is, I'm writing my master's thesis on surveillance, so more on the negative side than the positive. But there is one very positive use, and that is, for example, using satellite imaging to predict the next floods, or using the data harvested through satellites to predict the next droughts. 

So for agriculture it's amazing. It's an amazing way for farmers to know when to go to specific areas of their land and apply what they need to apply. And these are aspects we never think of, because we are just consuming, for example, the food. 

When in fact this food was not only hard labor by farmers, but these farmers actually used that technology to get the best outcome for our food. Also pesticides: using AI to find specific pesticides that are more sustainable for our environment. 

That's also one possibility. But I think more and more corporations and countries are aware of their responsibility when it comes to tech, and having countries within the EU, for example, pushing for that has obviously also pushed corporations to think about these issues. 

But Alex, please feel free to jump in. Yeah, I think you already mentioned a lot of things. As I said, I believe we are really at the beginning of many responsible tech innovations. Many products, services, organizations, and policies will come into place in the next years. 

And we already see the beginning of this. If we look into the policy realm, Yasmin already mentioned it, right? We had the GDPR now almost ten years ago. 

We have the DSA that just came into force, the DMA. These are all legislative acts that see the risks of unregulated technologies and that really want to share that responsibility with the platforms, with the developers of these technologies, and tell them: wait, if you want to introduce an AI application into the European market, it may fall into a high-risk category. 

And of course, we are still debating what exactly high-risk means, and details are still being discussed. But if you fall into this category, then you now have to ask: do you have any bias in your data set? 

Do you have any potential privacy issues that your application could bring about? Is your application fair? I think all of these questions are connected to responsibility, and the same thing applies within the companies. 

We've been bashing big tech companies a little bit, so maybe let me also throw in a good word for them: we see a lot of good research coming out of these companies, be it Google, be it Microsoft, right? 

They're working on different methods for the data sets behind your AI models: how can you actually find out whether you have bias in your data sets? And it's not only them; there are also amazing research institutes across the world tackling these questions head on. And for users, which I think is probably the most interesting part: 

Think about what kind of apps and services you want to use. This can sometimes be a little annoying, because we have this thing called platform power, which basically just means there are some platforms that a lot of people use. 

And if you jump to a different platform, it's going to be a little hard to connect with all of your friends, family, and acquaintances, because they're not using it. So I get that there are limits to this. But when it comes to messaging apps, Yasmin and I, and RTH as well, only use Signal. 

I think Signal is an amazing organization; they're doing really nice stuff. This is something where you could easily switch, and convince some people to switch from WhatsApp to Signal, if that's important to you. 

Since we've also talked about social media a lot: there are apps like one sec, for example, or Time Well Spent, I think that's what it's called, that try to break your habit of opening an app and scrolling endlessly.

one sec puts up a little gray screen, tells you to deeply inhale and exhale, and asks you to become aware of what you're doing and be present in the moment: do you really want to open Instagram for the 20th time? 

I think you can set a personal time, but for me it's after about ten minutes. It asks me: hey, is this really time well spent? Or do you want to go outside and enjoy nature, or call a friend? 

That's it. It's funny, because you can actually set the tone. You can go from very friendly to very aggressive, where the app says something like: close this app right now. You can choose your personal preference, whatever resonates most with you. 

And so I think looking for these apps and alternatives, and also using them, is really important, because it's only when we use them that we can help these companies and organizations grow. 

Because if they only ever have 50 users, they will most likely not grow. And even if right now it's an app that could be improved in many ways, give them 10,000 users and the ability to do that, and they will grow. 

And in terms of what else you can do: if you're looking for other content, we are one organization among, I think, a variety of nice organizations. I mentioned the Center for Humane Technology before, and the Mozilla Foundation. 

They are, I think, very important players out there that are concerned with the same topics we are, and that are trying to provide content, toolkits, and recommendations. And I think we are in a time where we can actually start doing things, not only talk about them and say, well, it would be nice if we had something. 

There are already tangible solutions out there that we can rely on. You mentioned social media. This is something I talk to people about a lot. I have people on the podcast who say that the world would be a better place without social media. 

And I think we all know those videos on YouTube of people who say, I deleted my Instagram account and I feel so much better now. On the whole, we can realistically say that social media is definitely here to stay. 

It will not go away. And we also know that social media is deeply impacting our humanity in the way we relate to each other or to ourselves. How do you think we can make sure that something like social media actually has a positive impact rather than a negative one? 

I honestly don't think social media is going to go anywhere. First of all, the companies building these platforms have generated billions of dollars for the US, so they're a big economic pillar of the US. 

So there's no way they're going away. But the thing we as a society really have to rethink is the model of profit over everything. These business models need to change. 

And of course, this is an ideal thing. And I already know that somebody who's listening right now thinks, yeah, sure, not profit over everything. Sure, tell that to the people who invest into those startups. 

But in all honesty, if we as a society decided to say, hey, we're going to tax any company that is doing harm to our society, and tax them really heavily, then things would be different, because the profit would not be the same. 

And if we start thinking of alternative ways to hold these companies accountable, we can get to a point where social media is actually used for good. The base idea of social media was always to connect people, to connect loved ones who haven't seen each other. 

And I'm going to be very honest with you. If it wasn't for social media, I probably wouldn't have seen half of the world. I wouldn't have traveled and I wouldn't have lived abroad and I wouldn't have lived so many kilometers away from my family because it's the only way to keep up with my family and to see them. 

And that's how I feel connected to them. But at the same time, I know that most social media platforms are feeding us very negative content. Why? Because it keeps our attention, and keeping our attention means increasing profits. 

And I feel like it's our responsibility as a society to say, okay, if you're doing that and if you're willingly bringing up these tools to negatively affect our society just to make profit, then we need to take that profit away from you by taxing you. 

And these are aspects that have been talked about, but not much, because obviously it's not popular to talk about taxes. But I do feel we're at a point where we can safely say: okay, social media is not going anywhere. 

How are we going to deal with the business model they're using, and with the idea of profit over everything? This is one way to do it. Another way is, in all honesty, a very radical thing to do, and I'm not sure a lot of people would do it: 

Start a movement like the Log Off movement in the US, created by young teenagers who logged off social media for a day or a couple of days. And it had a direct effect on the profit margins of social media companies. 

So it's either on us as a society to say we need to push policy, or we say: okay, as consumers, we're going to push these companies to change something. But again, that takes a lot of organizing, and organizing such a movement is not easy, and it's mostly unpaid. 

And try finding somebody who will do this willingly, unpaid, and put a target on their back just for trying to change things for the better. Yeah, it's not the easiest task. 

I think it's funny, you sitting here talking about taxes reminds me of Rutger Bregman sitting at Davos, dropping the T word and creating a lot of blank stares. So let's see where your ideas go. But I would like to add to what Yasmin said, and I think you already said it very nicely: we probably need a more complex solution to a complex problem like social media, right? 

So if we say, well, let's just ban all of social media tomorrow, I would like to ask: what is the problem to which banning social media is the solution? What exactly have we solved? 

And that's a little bit the annoying part of social media. When we talk about the problems social media has produced, we are talking about information overload, addiction, doomscrolling, shortening attention spans, polarization, bots, deepfakes, fake news, right? 

These are all different kind of problems. And I'm not trying to say that they all need individual solutions. We can probably come up with a solution that treats many of these problems. But it possibly needs to be something that's a little bit more complex than just deleting social media. 

The answer probably lies in some form of movement, or in some change to the incentive structure. The users, even though they are very spread out, have a lot of potential influence on the social media companies, if they are aware of what they want and if there is a channel for them to communicate it. Because I can create a Facebook post or an Instagram post and say, well, here are three things I would like Instagram to change, and that probably never reaches the ears of anyone who makes decisions at Instagram. So we need to find a way that actually works. 

At the same time, let's also not forget that a lot of people like spending time on the internet. They like doing esports, or sitting in forums, or all this kind of stuff. Why would you want to take that away if this is where people actually meet? 

This is where they have their time well spent. And so I think looking at different profit incentives is definitely a path worth considering. And this is the issue we try to raise with RTH: finding ways to expand the scope, to expand your metrics of success. 

How can I be a successful, profitable, legal company that is ethically committed and innovates responsibly? If I manage to achieve that, I think I'm an organization, a company of the future. And that also demands that I understand: what do I mean by responsibility? What is my company's North Star that gets me to a place where I can innovate responsibly, where I can bring a product onto the market that is profitable but also benefits everyone who uses it, and that doesn't harm the environment, meaning both the earth and the people around it? 

And like Yasmin said in the beginning, social media companies like Facebook were created to bring people together. I don't think anybody ever sat down and said, I want to create a platform that creates information overload, polarization, addiction, deepfakes, doomscrolling, and all this kind of stuff. But we also shouldn't turn a blind eye to these problems now that we have them. You raise a very important point, because I think we talk about social media in such a negative way, but obviously there are many positive aspects to it. 

I mean, this is why we use our social media accounts on a daily basis. We forget that they can be tools for connection, for engagement, for creating movements and communities. So it seems it's really about understanding: where does this add and enhance, and where does it take away? Where does it positively influence my life, and where does it lead to stress, anxiety, doomscrolling, and social comparison? 

Now, towards the end of the podcast, I always like to ask a big and bold question, just to stretch our thinking a little and serve as a thought experiment. My question to you two would be: if you could implement one law or guideline universally that all tech companies had to abide by, what would it be? 

I mean, I would revert back to this: I don't think there is a one-size-fits-all solution, and maybe Yasmin has a better answer, so I'm curious to hear hers. For social media companies, we mentioned this already. 

This can mean not only maximizing profit, but maybe maximizing time well spent. And for users, this can mean becoming aware of the types of apps that you use, and also the types of apps that you want to use. 

If you want more privacy, then maybe you should consider switching, and consider having people join you. And maybe, let me put it differently: in May of this year, 2023, Sam Altman visited the Technical University of Munich, and he said one thing that I think was really interesting. 

The room was basically full of engineers, and he encouraged them to start a career in tech. He said: start a career in tech, because now we are at the beginning of a new era, and now is the perfect time to start a career in tech. 

And I would agree with him. I would say, get that idea that you have about improving the world out of your drawer and put it into an application and find investors that are willing to fund this and convince them that, looking at our regulatory framework, things are starting to change, and now is really the time of responsible innovation. 

It's no longer just a high-level moral discussion of "we should have this"; it's actually here, and this is the future. Markets and people are becoming aware of this. So maybe I don't have a law, but my encouragement is: start your career in tech, in one way or another. 

Yeah, I also think it's hard to pick one law. There are two things we need to consider. First, regulation generally is not bad. Regulation sets a tone; it sets a framework for people to know how to behave, and for corporations to know what exactly they can bring to the market. 

So when we talk about regulation, we're not talking about limiting innovation or limiting profits. We are simply giving corporations and people a guide to understand what they're doing. When it comes to regulating tech specifically, I think the reason it's so hard is that we imagine a utopian society when we design that regulation, but we don't know whether this utopia can turn into a dystopia. And the example I always bring up is the idea of data dignity. For those who haven't heard of it, data dignity is basically the idea of making any data associated with you your right, meaning you possess that data. 

It's basically yours and only yours to decide who you're going to share it with. So it's not about declaring data ownership a human right, because it already is one, but about actually implementing it. 

So making sure that when corporations are harvesting your data, they're not able to do it just because you clicked yes to use their services, but are actually held accountable for it. In some scenarios, those who are highly in favor of data dignity even say this is how we can create an income stream for people who maybe do not have the means, by basically saying: okay, you own all the data that is associated with you. 

You can also decide to sell it, to monetize it, to make money with the data that you're already sharing for free, meaning you share in the profit of a company that is already using your data. 

And the idea is, through that data dignity approach, to bring more equity into the system, to give more autonomy and control to the users, so they can decide who takes their data. They could also donate it to research: hey, I have all this data lying around; if you can use it, I'm going to donate it to this research group working on this or that topic. So it's about having more control over the data flow out there and over who has access to it. Sounds amazing. 

I love that idea. I love the idea of giving users more autonomy and more control, and actually including users in the profits of these corporations. But then you think a little ahead and you're like: wait, is it really solving the issue of corporations using your data? 

Okay, now you actually sold your data to them and made a profit, but they're still using it. So how is that really solving the issue? And might it even lead to another, bigger issue, where everyone wants to sell their data to make a profit, and all of a sudden we live in a world where, for example, advertisements become a form of compensation if you can't afford something? 

So let's say I go to the supermarket and I don't have money anymore. So they say: you know what, sell your data to us. Or: we will show you a specific customized advertisement that you have to watch for 20 minutes so you can buy a piece of bread, or whatever groceries you need. 

So again, whenever we think about regulation, we have to think not only about whether we are limiting profits or limiting innovation; we also have to think about the consequences of the regulation we set up. 

And I think instead of regulation, we should make sure that tech education is implemented, and that ethics specifically is implemented in the education of developers. And we should make sure that the teams developing and deploying tech are diverse and interdisciplinary, because only that way will you be able to bring in different perspectives, opinions, and experiences, to make sure that the tech we're using and developing is safe and inclusive as well. 

So instead of saying we need this law, we just need more tech education, and we need to make sure that those developing the technology are aware of the ethics behind it. Definitely, and perhaps also make sure that ethics is a more integral part of the curriculum and not just an elective. 

I'm also glad that you brought in the regulation side again and the complexities and difficulty to regulate something that moves so fast that we have no idea of where it's going to go. Sometimes even the people who are creating the thing don't know where it's going to go and I think that's important to remember for us as users as well. 

You two, thank you so much for coming. I feel like this conversation could go on forever. Do you have some parting words for us? Maybe also on how we can stay in touch with the Responsible Tech Hub? Thank you for having us. 

We're really happy to have had this conversation with you. I don't think we get to have too many of these kinds of conversations, so thank you for having us. And for those who are interested, you can always visit our website: 

theresponsibletechhub.com. Our socials are always open, too. If you want to join our cause, you're more than welcome to join us in any way, be it giving your time or joining our events. Or if you know somebody who's really interested in this topic, connect them. And yeah, really happy that we were here today. 

Thank you. Thank you so much for having us. I don't think there's much left to say except: stay responsible. And if you're looking for an organization to join, our doors are always open, as Yasmin said. 

But thank you so much, Ileana, for having us. And of course, we'll make sure to link everything we discussed in the show notes of this episode, including all the exciting events you have coming up at The Hub. 

Yasmin. Alex, thank you so much for being here today. Thank you. Thank you also to you, our listeners, for being here. We'll be back in two weeks with more conversations on living and working well in a digital world. 

If you enjoyed this episode, make sure to subscribe to the podcast. I look forward to seeing you again next time. 