#512 Douglas Rushkoff | Media Theorist, Author & Host of Team Human
Shownotes
Our guest today is one of the most influential and provocative media theorists of our time. He graduated magna cum laude from Princeton University with a degree in English and theater, earned a Master of Fine Arts in Directing at CalArts, and went on to teach media theory and digital economics at renowned institutions like NYU and Queens College. Over the past three decades, he’s shaped the global conversation around technology, culture, and the future of society. He coined terms like “viral media,” “digital natives,” and “social currency.” His bestsellers, including Survival of the Richest, Throwing Rocks at the Google Bus, Program or Be Programmed, and Present Shock, are foundational texts for anyone questioning how technology shapes power, identity, and human agency. He’s not only an academic and author, but also a cultural force. From making FRONTLINE documentaries for PBS like Merchants of Cool and Generation Like, to hosting the influential Team Human podcast, to advising global institutions, his message remains urgent: in a world being automated and optimized, we must double down on being human. He also is Co-Founder of Andus, an AI consultancy.
Since launching this podcast over eight years ago, we’ve spoken to more than 600 people in over 500 episodes about how work can empower rather than diminish us and what still needs to change. How can we resist the seductive narratives of techno-solutionism and instead embrace a more human-centered, collective path forward?
What do the survival fantasies of some of the world’s wealthiest technologists reveal about our current system and how do we push back against that logic?
And why is it more important than ever to reclaim agency not only over our tools, but over our values, our communities, and our shared future? One thing is clear: tackling today’s challenges requires fresh perspectives. That’s why we continue to search for ideas, people, methods, tools, and stories that bring us closer to the core of New Work. And of course, we always come back to the same question: can everyone really find and live what they really, really want? You're listening to On the Way to New Work, today with Douglas Rushkoff.
Here you'll find all the links to the podcast and to our current advertising partners
00:00:00: Our guest today is one of the most influential and provocative media theorists of our time.
00:00:05: He graduated magna cum laude from Princeton University with a degree in English and theater, earned a Master of Fine Arts in Directing at CalArts, and went on to teach media theory and digital economics at renowned institutions like NYU and Queens College.
00:00:21: Over the past three decades, he's shaped the global conversation around technology, culture and the future of society.
00:00:29: He coined terms like viral media, digital natives and social currency.
00:00:34: His best sellers, including Survival of the Richest, Throwing Rocks at the Google Bus, Program or Be Programmed and Present Shock, are foundational texts for anyone questioning how technology shapes power, identity and human agency.
00:00:48: He's not only an academic and author, but also a cultural force.
00:00:52: From making FRONTLINE documentaries for PBS like Merchants of Cool and Generation Like, to hosting the influential Team Human podcast,
00:01:01: to advising global institutions, his message remains urgent.
00:01:06: In a world being automated and optimized, we must double down on being human.
00:01:11: He is also co-founder of Andus, an AI consultancy.
00:01:15: Since launching this podcast over eight years ago, we've spoken to more than six hundred people in over five hundred episodes about how work
00:01:23: can empower
00:01:24: rather than diminish us and what still needs to change.
00:01:28: How can we resist the seductive narratives of techno-solutionism?
00:01:33: Is that the correct word, Michael?
00:01:35: Solutionism.
00:01:36: Solutionism.
00:01:37: Thank you.
00:01:38: And instead embrace a more human-centered, collective path forward.
00:01:42: What do the survival fantasies of some of the world's wealthiest technologists reveal about our current system?
00:01:49: And how do we push back against that logic?
00:01:52: And
00:01:52: why is it more important than ever to reclaim agency not only over our tools, but over our values, our communities and our shared future?
00:02:02: One thing is clear.
00:02:03: Tackling today's challenges requires fresh perspectives.
00:02:06: That's why we continue to search for ideas, people, methods, tools and stories that bring us
00:02:12: closer
00:02:13: to the core of new work or what is in English called the future of work.
00:02:18: And of
00:02:18: course
00:02:19: we
00:02:19: always come back to the same question.
00:02:21: Can everyone really find and live what they really, really want?
00:02:26: You're listening to On the Way to New Work, today with Douglas Rushkoff.
00:02:50: So, Douglas, thank you so much for giving us the opportunity.
00:02:56: You are somehow a rock star in Germany.
00:03:00: People really admire you for many things you did.
00:03:03: And I do hope that this little introduction is a good starting point.
00:03:08: And as I mentioned in our pre-conversation, we always start with the same question.
00:03:13: How did you become the person you are today?
00:03:17: It's funny, you know, because I've been doing a lot
00:03:20: in Germany and a lot of German media because of the Survival of the Richest book, which came out at a very good moment for excitement, with all the tech bros in the White House.
00:03:32: But most German media, newspapers in particular, they've really focused on, you know, where did you get your PhD?
00:03:41: You know, there's always that.
00:03:42: You're at a university and all.
00:03:44: And you're the first, I mean, still, you're the first
00:03:47: and only media that talks about everything else and not the PhD, which then kind of frames the "how did I get to where I am" differently.
00:03:59: In other words, it's no longer, when you ask that, I don't think of it then in terms of my intellectual path or my career path or my move through the certifications tour of institutions, but rather how did I get to be the person who is experiencing this moment the way I am?
00:04:37: And that to me is a different kind of question.
00:04:40: And I would say I grew up feeling very responsible for other people.
00:04:49: for my mother's experience of life, and then to kind of take care of everybody else.
00:04:56: When I would blow out my birthday candles, I would never make a wish for myself.
00:05:02: I would always feel so guilty that I have a big birthday party.
00:05:04: I would blow the candles out and wish for world peace so that there's no starving babies in Biafra.
00:05:13: I did theater for a long time.
00:05:15: That was my original love.
00:05:18: And I got frustrated with theater because it was just calming people, making people feel good, even about scary issues.
00:05:30: I liked Bertolt Brecht, whom Germans will know, because he was trying to make theater that pushed people out into the streets to do something, where it felt like the only theater people would pay me to do was theater that purged them of the need to do anything.
00:05:46: So then I moved to the internet and interactive technology, thinking that this stuff would be the people's media through which we can interact and bring some intelligence to our discourse and our articulation.
00:06:01: And we know what happened there.
00:06:06: It didn't work out that way.
00:06:08: It was used for something different.
00:06:14: After all of these efforts to do activism, to be what I would call an agent of change in the world, in the tradition of prophetic Judaism and social justice, I feel myself shifting to being an agent of care, to being less like someone who's trying to fix and more like a doula
00:06:41: who is giving palliative care to a civilization that is dying and hopefully a doula that is giving some comfort and guidance to the civilization that will be born after this.
00:07:01: So where I am now is as a person who is trying to
00:07:08: help
00:07:10: people on the planet metabolize a challenging moment.
00:07:19: Wow.
00:07:23: When you go back to that birthday
00:07:25: moment, and I can
00:07:28: really relate to that since there are... usually kids playing around at the office here.
00:07:35: And they have very strong emotions, our children, I can tell that.
00:07:39: And it sounds like you had that too.
00:07:41: But I think it's so interesting to hear from yourself having these thoughts and now reflecting back on that moment.
00:07:49: What's the source of that?
00:07:51: Did it just come?
00:07:52: Was there a role model?
00:07:53: Was there a moment?
00:07:55: Where did that come from?
00:07:56: I mean, I don't think the way... The way I developed it was necessarily positive.
00:08:10: I think I had a mom who was struggling with a lot of things.
00:08:17: Not necessarily her fault, but just the way society was.
00:08:20: And how does a woman become herself in the nineteen-sixties in America?
00:08:28: I think I was trying to take care of her.
00:08:34: And then that put into, it laid down a framework, a kind of a fundamental framework of making sure everyone is okay.
00:08:46: Which is, you know, it's fine, but it's not necessarily the best or strongest prime motive.
00:08:58: in actually providing care for other people.
00:09:01: Because what I was doing was trying to relieve my own anxiety, not genuinely be present, which is why I had, as many in the West do, a bias toward fixing things, fixing things as quickly as possible rather than letting things run their course, like Western medicine.
00:09:23: You start to sneeze and there are drugs to dry up your nose,
00:09:28: when that's the worst thing for a cold.
00:09:30: The cold is trying to express itself.
00:09:33: You don't shut it down unless the symptoms are so bad that you're in danger.
00:09:39: You don't shut it down;
00:09:40: you allow the symptoms to manifest.
00:09:43: So that didn't necessarily come from the best place.
00:09:51: The better influences were, gosh, theater was a real shaping thing for me.
00:10:05: And especially because theater, for me, made me think about the nature of reality.
00:10:14: I went to theater as a young person.
00:10:17: And in the first play I saw, it's this big American Broadway show called Fiddler on the Roof.
00:10:24: And there's this main character, Tevye, who's a Jewish man in Eastern Europe dealing with the czar and all these problems.
00:10:33: And he starts the play by looking right into the audience and saying, you know, Fiddler on the Roof, that's who I am in all this.
00:10:41: I was six years old, sitting in a Broadway theater, and we had very good seats, and this guy, Zero Mostel, famous Broadway actor, looks in my eyes and says, a fiddler on the roof.
00:10:53: And I'm thinking, what's happening here?
00:10:57: Is he in Russia?
00:11:00: Is he in the theater?
00:11:01: Does he see me?
00:11:03: Does the character see me?
00:11:04: Does the actor see me?
00:11:06: Am I allowed to talk back?
00:11:09: And it ended up being a profound anchoring experience for me, to ask, in many different kinds of social situations,
00:11:19: What's actually going on here?
00:11:21: It was the birth of the, whatever I am, media and technology philosopher: to ask, what are the social constructions here?
00:11:32: What is just nature, and what is
00:11:37: an invention?
00:11:39: What have we agreed upon?
00:11:41: And who has the power in these various situations?
00:11:46: So it sort of set that in motion.
00:11:48: And everywhere I would go, I would be trying to figure out what is going on here.
00:11:55: You mentioned your mother; what about your father?
00:12:00: Did you have brothers and sisters?
00:12:02: What were other characters in your...
00:12:04: Well, it's interesting.
00:12:05: I mean, my brother was, or is, he's always been very much into justice and fairness and truth and things being kind of just so, and that kind of led him down the Apollonian path:
00:12:26: he became a lawyer and a judge, working for consumer protection and good things. And I went the Dionysian path, toward theater and psychedelics and the arts and the occult.
00:12:42: So that was interesting.
00:12:43: And our father,
00:12:46: he could have been rich, he could have gone into business, but he was a hospital financial administrator.
00:12:54: He did that.
00:12:55: He always kind of worked in the public service.
00:13:00: And I think that may have instilled us both with the sense that it's a little selfish to get too rich.
00:13:14: You know, you don't want to separate yourself from the people.
00:13:18: You want to be doing something that's actually serving real people and real places.
00:13:24: You know, neither of us ever, you know, tried to pursue, oh, I want to become a rich stockbroker or, you know, a wealthy CEO.
00:13:35: Thank you.
00:13:36: Do you buy the story?
00:13:38: when, in nineteen ninety-nine, Jeff Bezos says into the camera, the most important thing is the customer.
00:13:48: And we really are customer obsessed as a company and we really want to make the best for the customer.
00:13:54: in order to also build a big business.
00:13:57: How do you look at these things when entrepreneurs share that and tell it like that, the story?
00:14:07: I mean, yes and no.
00:14:11: The customer is not the human being.
00:14:14: The customer is a particular role of the human being.
00:14:19: So the sex worker is focused on the sex work customer.
00:14:27: which means getting as much, serving as much sexual product to that person as possible until there's no money in their pockets.
00:14:37: The drug dealer is thinking of his customer, right?
00:14:43: How do I serve my customer?
00:14:45: I'm gonna try to get as much product to the addicts of my neighborhood as possible.
00:14:55: So Jeff Bezos, he's serving the customer in us better than anyone else, the customer.
00:15:08: I can get the thing faster and easier.
00:15:14: You can click on it and it can sometimes come that afternoon faster than you could get to the store and back.
00:15:22: or the next day with more variety and more that.
00:15:25: He's not serving the ecosystem in which we live.
00:15:31: And he's not serving the merchants and suppliers who work through him.
00:15:44: He's created what we call a monopsony, which is a monopoly to the customers, but also a monopoly to the suppliers.
00:15:53: So it's a little bit different.
00:15:56: I mean, all of the streaming, television streaming networks are customers of Amazon Web Services and, you know, AWS.
00:16:10: Is that because he treats them the best?
00:16:13: I mean, in some way he is, but he's also created a universe in which anyone watching TV is pretty much buying cycles on Amazon servers.
00:16:26: That's the product.
00:16:30: He does serve us best in our role as customers to such an extent that our roles as customers are eclipsing our roles as community members.
00:16:51: human beings living on the planet as husbands and wives, as nurturing souls.
00:17:00: I don't look at my script anymore because I like Christoph's questions so much.
00:17:06: I would add another question.
00:17:07: When I was in Silicon Valley for the first time, I think ten or twelve years ago, we had a visit at the headquarters of Google in Mountain View, I think they still had the name Google at the time, and they told us the don't be evil story.
00:17:23: Did you believe or do you believe the don't be evil story of Google?
00:17:27: And to make the question a little bit broader, do you believe that these kind of companies really have a higher purpose?
00:17:37: Well, what do you consider the don't be evil story?
00:17:39: The don't be evil story is true.
00:17:42: They used the phrase don't be evil and then realized that phrase didn't serve the company and its mission and got rid of it.
00:17:52: So it is a true story.
00:17:58: I do believe that the two Stanford kids who came up with the original Google algorithm in their dorm room really believed it was algorithms for good.
00:18:13: I think they meant well. Yahoo was in charge
00:18:19: of search on the internet.
00:18:21: It was a top-down taxonomy devised by dudes in towers somewhere.
00:18:32: And these kids thought, what if we used the people's information to serve the people?
00:18:42: All of the way that people are clicking on each other's sites can become the basis for how we derive a search engine.
00:18:54: And they were thinking, we can allow the world's information to self-organize in an emergent and unbiased way.
00:19:07: And that seemed in the face of all the .coms and the centralization of the net, a great solution.
00:19:15: And I do think had they not taken the wrong investment money, I think Larry and Sergey would have been happy being centimillionaires or low-level billionaires.
00:19:30: I don't think they would have had a problem with that.
00:19:32: But they took money that required them to grow at a different sort of rate, so they needed to then monetize data and use that against us and grow at a you know, exponentially implausible rate, and that required them to do evil.
00:19:52: You know, it's really the long and short of it.
00:19:57: So yeah, I think it, I definitely think they meant it at the time that they were going to be an alternative, even the way they did their IPO, which was to try to retain more control of the company and use an auction rather than a banker to set the price.
00:20:13: They were trying to do things.
00:20:17: It continues in a moment with On the Way to New Work, and here comes a very short advertisement.
00:20:33: Here's a number that didn't just shock me.
00:20:35: When AI is used in a work context, studies show that women come off about thirteen percent worse than men.
00:20:50: That is a very real gender gap.
00:20:52: And in our team I was not the only one who was shocked; so were the women who work at Blackboat.
00:20:58: And we discussed it internally and said: this can only be tackled through knowledge, through self-confidence, and through visible results.
00:21:06: That's why our coaches and advisors have put together an AI skill day for women, an intensive training day designed to be really effective with the skills we teach at Blackboat.
00:21:20: The whole thing happens in a protected setting with the experts from our team.
00:21:27: I'm not there on that day, logically and deliberately.
00:21:30: So it's a small group of peers.
00:21:34: All day long: the practical applications for your work and the skills you need, from our experts.
00:21:41: So no matter whether you're in marketing, HR, sales, project management, leadership, or general management, it's all about AI in depth:
00:21:49: understanding the technology and the nature of this technology, and implementing that practically in your work.
00:21:57: That's what happens on that day.
00:21:58: The whole thing takes place on the eleventh of the eleventh, so the eleventh of November, in Hamburg, from ten o'clock in the morning until seven o'clock.
00:22:05: It's a full day with hands-on prompting, workflow design, AI as coach and sparring partner, developing your own use cases in prototyping with specific exercises, plus solutions for how to implement it all, and of course role models
00:22:20: from the female crew of Blackboat.
00:22:23: When I say we, as I said, I don't mean me; I'm not there on the day; I mean our team of female experts.
00:22:32: The eleventh of the eleventh, and you can find all the information at shop.blackboat.com.
00:22:39: That's our shop for everything that belongs to Blackboat's academy products.
00:22:43: And up there you can find the AI
00:22:45: skill day for women.
00:22:47: I would be happy to get a lot of feedback.
00:22:48: If you have questions, you can simply write to us on WhatsApp.
00:22:52: You can see the crew and we can respond to it.
00:22:55: shop.blackboat.com, you will find all the information there.
00:22:57: And share it with the women around you who you think should be there, or join yourself.
00:23:04: The places for the day are limited so that there is a protected learning space.
00:23:08: But I would be happy to get a lot of feedback.
00:23:11: And now it goes on, on the way to new
00:23:13: work.
00:23:20: team human compared to team tech or team AI in order to make it easier to grasp for people.
00:23:27: And I have a thought, and I'll just put it out there as a thought experiment.
00:23:32: Let's assume one of these CEOs of the top leading tech companies wakes up one morning and says we asked the best guy on the planet to help us to build the first pure team human focused company on the planet.
00:23:50: Let's take Mark Zuckerberg as an example.
00:23:52: A couple of hundred billions of money there.
00:23:55: We ask Douglas to be the C-H-R-O chief human resource officer.
00:24:03: I guess you will change the title right away first, but he's the only title he knows of.
00:24:08: And he calls you and says, Douglas, we want to become this team superhuman for the next century and serve humanity, make the planet better.
00:24:18: A, would you take the job?
00:24:20: B, How would you structure your team?
00:24:26: So the
00:24:28: job he wants to give me is human resources.
00:24:32: It's not CEO or the whole company, right?
00:24:36: So I'm just looking at the way the workers are engaged.
00:24:44: No, but right now, I don't think,
00:24:46: no, I don't think that this was it, but maybe we give you the title of chief strategist or chief vision officer.
00:24:51: Well, they're all different,
00:24:52: right?
00:24:53: So if I was, if I was human resources, I would say, okay, let's make this a cooperative enterprise where the employees are the owners of the company, and we don't do an IPO.
00:25:08: We don't sell to outside shareholders.
00:25:11: Can you explain that concept of a cooperative enterprise to the German listeners, for whom it is really far away?
00:25:17: And whether it scales that way; that would be important.
00:25:21: Well, the main idea is that the workers own the company.
00:25:27: So instead of having strangers far away buying shares of a company on a stock exchange, you don't do that part.
00:25:40: If it's a small company, it's easier to conceive of.
00:25:42: You've got, let's say you started a company of, you make dolls and you've become a successful doll company.
00:25:54: You've got five hundred employees making dolls and you're shipping a ton of them out.
00:25:59: You got machines.
00:26:00: It's great.
00:26:01: You're getting old and your friggin millennial son refuses to take over the business.
00:26:12: He won't, he doesn't want it.
00:26:13: Fuck dolls, you know, and fuck you dad.
00:26:17: I'm, you know, I'm going to be a Silicon Valley capitalist.
00:26:21: I want to scale.
00:26:22: I'm done with you.
00:26:25: So you think, what should I do?
00:26:27: I could sell the business to someone else or I could actually sell it to my employees where either they all contribute either through work or through salary or through even shares.
00:26:48: You have fifty employees, you give each of them five percent or give them each four percent and you keep fifty percent, you know, whatever it is.
00:27:02: You keep it for your legacy, you keep some shares, you keep ten percent or twenty percent.
00:27:07: So everybody is an owner in the company.
00:27:10: Everybody has a vote in how the company works.
00:27:13: The workers can, if they want, they can hire legal, they can hire a CFO if they want to do it that way, or the CFO is one of the workers, one of the shareholder employees.
00:27:27: and they vote together.
00:27:28: What are the different salaries going to be?
00:27:31: And the reason they're voting together on what the salaries are is because they understand the more that they pay themselves, the more it's going to take from their ownership stake.
00:27:41: So they're trying to grow business.
00:27:44: and they're members of the community in which the company is operating.
00:27:48: So now they have a different reason not to have their doll manufacturing pollute the groundwater of the town where it exists because that's going to pollute the groundwater that their own children are drinking.
00:28:00: So unlike the long-distance shareholder in Manila who might own this German company, it's the people who are right there.
00:28:10: So you can do that with any company.
00:28:12: Now, I'm assuming your Mark Zuckerberg person has somehow instantly scaled a million-person company and he's putting me in charge of one that already exists.
00:28:23: Now that's really different.
00:28:24: It's like now you're in charge of human resources at Meta.
00:28:28: You know, it's like, well, this is going to be a hard ship to turn around.
00:28:33: You know, it's the same.
00:28:34: I was friends with Evan Williams, who is one of the founders of Twitter.
00:28:40: Twitter.
00:28:40: Twitter,
00:28:41: and yeah, that's what this thing called X used to be called back in the old days.
00:28:48: When he was one of the founders of Twitter, and they meant well, they really did.
00:29:04: They had the.
00:29:05: they had the faces of the founders of Twitter and the amount of money they'd earned on the IPO that morning.
00:29:12: They calculated it based on the the closing price.
00:29:15: And I thought, on the one hand, wow, my friend Evan's a billionaire.
00:29:20: And on the other hand, I thought, he's fucked.
00:29:24: He's fucked.
00:29:25: They've taken so many billion dollars.
00:29:27: These guys are going to demand that Twitter be more than a hundred and forty character messaging app.
00:29:34: And I knew what they were earning.
00:29:36: They were making two billion dollars a year off Twitter at that point, which I thought was a great figure.
00:29:41: Hey, mom, I made a hundred and forty character messaging app.
00:29:45: It makes two billion dollars a year.
00:29:47: That should be a success.
00:29:49: And of course, the minute they have that IPO,
00:29:52: that two billion is now a failure.
00:29:54: So they've got to go and monetize us and our data and turn it into the nightmare, you know, the, as Cory Doctorow
00:30:00: would call it, enshittified nightmare that it was, and maybe at least partly responsible for the collapse of democracy in America.
00:30:11: Thanks.
00:30:12: And that's, so it's hard.
00:30:14: I don't have any, the only thing I could tell someone with a company that big to do is break it up.
00:30:21: break it up into many smaller companies.
00:30:26: You kind of can't.
00:30:27: Okay.
00:30:28: In order to step back and ask these critical questions, you mentioned one of these concepts, the Sabbath day, like stepping away. How often could a leading manager or CEO do that, with a packed day all day long, this typical hustling agenda, everything full?
00:30:51: What would you recommend in terms of an architecture in a company that helps me on a frequent basis to
00:30:58: step
00:30:58: away and fade out, which we usually don't?
00:31:04: Is it a public company?
00:31:10: If it's a public company, then the CEO works for the board of directors.
00:31:21: The most important thing for the CEO to do is, if they want to be running a company in a way that doesn't destroy themselves and the planet and everybody, is to change their relationship with that board, is to do a long-term renegotiation with the shareholders about what is this company for?
00:31:50: And that's hard.
00:31:52: It's not impossible, though.
00:31:55: Not that they're perfect, but even Unilever is kind of, they're working on that where they'll announce certain things and then Wall Street Journal or Financial Times will come and say, oh, what if the shareholders don't like that?
00:32:18: And the CEO or the chairman of the board will say, well, then we'll get other shareholders.
00:32:23: And that's, I know that's scary.
00:32:26: That's super, super scary.
00:32:28: And I know CEOs and I've worked with some CEOs like with them.
00:32:32: I worked with, I forgot her last name, Indra at Pepsi.
00:32:34: You know, great CEO and she and me and some of the people I was working with there were really trying to help Pepsi steer itself toward environmentalism and all sorts of good for the world long-term thinking.
00:32:50: And then she went and did a TED Talk where she was announcing some of that and her stock started to crash that very day.
00:32:56: People were so upset that she had to pivot away from it, or she'd lose not just her job. They're not ready.
00:33:07: They can't do that.
00:33:08: So I think that's the most important thing.
00:33:12: It becomes a lot easier if you can be a company focused more on dividends than on share growth.
00:33:23: And I know investors don't like dividends because in most countries, dividends are taxed as short-term capital gains, or as not even short-term capital, they're taxed high, they're taxed as income, where if your stock grows, then that gets taxed as capital gains, which is a much smaller tax.
00:33:44: So investors want that.
00:33:46: So I would work on ways of capturing dividend value and investing it in such a way that you find instruments that let people capture that value without claiming that stock.
00:34:10: And there's ways.
00:34:11: There's ways around that.
00:34:12: You create another, invest them in a derivative instrument that then they purchase.
00:34:18: There's ways.
00:34:20: It's accounting.
00:34:21: So if you change your shareholders' expectations from growth to revenue, Then you end up, you can create a real time company.
00:34:33: There's many, I'm talking to many people who are doing startups and I have a small business mentality where, and I know this is so counterintuitive, I get it.
00:34:45: But my way of thinking about business is you do the thing, you provide the good or service, you get profits and you take some of those profits and reinvest it in the business.
00:34:57: And you seek to achieve profitability as soon as possible.
00:35:05: The companies and startups I'm talking to, they're all telling me, we would rather spend eleven million dollars to earn ten million dollars than spend five hundred thousand to earn five million.
00:35:23: In other words, they would rather lose a million dollars and have a bigger company than be profitable sooner.
00:35:30: And that's an illness.
00:35:33: And I understand how sometimes temporarily you maybe want to get over your skis in order to establish yourself.
00:35:41: You can lose money for a while.
00:35:42: But this emphasis, I mean, Uber was around for how long before they achieved profit, a ridiculous amount of time.
00:35:55: And it leads to a kind of I hate to use a word like this, but kind of colonialist, aggressive, unnecessarily voracious behavior.
00:36:08: And in many of these cases, once you're on public markets, it requires infinite growth.
00:36:15: Pepsi is not big enough.
00:36:20: It's one of the fifty biggest companies in the world, or hundred at this point.
00:36:25: And it has to keep growing in order to serve its shareholders.
00:36:29: If Pepsi has to keep growing, then there's no way to win this thing.
00:36:32: There's no way to stop.
00:36:33: There's no way.
00:36:35: We're going to have to keep extracting value from people and places,
00:36:39: and that's unnecessary.
00:36:42: We're at the point of planetary and civilizational and emotional collapse for totally unnecessary reasons.
00:36:51: Because our businesses are trapped in an economic model rather than thinking about goods and services and humanity.
00:37:00: And it's not required.
00:37:03: It's only required if you want, if you're very conservative, if you're reactionary, if you're afraid of a sustainable economic future, if you're afraid of companies actually functioning in a healthy way.
00:37:23: I'm sticking with the example of being directly profitable, because for many years, of course, this was how you operated.
00:37:32: There was not this capital market, you had banks at a certain point, but this leverage, it's a pretty new concept.
00:37:40: Yeah,
00:37:40: and leverage on leverage on leverage, the derivatives exchange on top of the stock exchange, selling
00:37:45: an idea, making it bigger, blowing it up. And to me there are totally different concepts in the new kinds of companies
00:37:52: when it comes to AI. Take OpenAI as an example, with ten billion in revenue, compared to Anthropic,
00:37:59: or correct me if it's wrong, a billion something, but
00:38:02: Anthropic turned on revenue right away, with a totally different ownership structure of seven co-founders.
00:38:06: They must
00:38:07: have
00:38:08: a totally different way of operating the company, which might not be successful, but it looks like it is.
00:38:15: And since you are also in the AI field, do you see something different if you look to this new kind of companies that build this dream?
00:38:25: Or is it exactly the same thing just with a different technology?
00:38:30: I don't know that much about Anthropic's structure, but I understand their intention.
00:38:35: You know, they're basically the people who spun off from OpenAI when OpenAI decided to become closed AI, or whatever, a purely for-profit thing.
00:38:46: So I imagine they are clever enough and now experienced enough to develop a structure that wouldn't have the same fate.
00:38:57: You know, they thought, with OpenAI, they created a structure to put, you know, people and purpose over profit.
00:39:04: And again, it turned out not to be.
00:39:10: So I do think it is possible.
00:39:17: And it does happen.
00:39:23: It's just we don't tell the stories about it.
00:39:27: Wikipedia beat Microsoft's Encarta encyclopedia.
00:39:34: What are we still using on all the internet servers?
00:39:38: It's still Linux, right?
00:39:43: What is that?
00:39:44: Plenty of companies profited off of Linux and service people.
00:39:49: I mean, gosh, imagine having a car that your local mechanic could service.
00:39:55: That's what you have
00:39:55: when you're running a Linux server: you can.
00:39:58: But now I guess people are all slowly moving on to Amazon web servers.
00:40:03: Right, and Amazon web servers run on Linux.
00:40:08: I don't know,
00:40:09: I'm guessing they do, even underneath it all.
00:40:13: So there are ways for these things to win. And it's just so hard when we're doing it in a profit model.
00:40:25: I mean, if we're developing large language models, we're basically evolving human language.
00:40:32: And try to think about, when we were evolving spoken language, if we had to worry about the IP of each word, and, you know, the guy that came up with, in English, the I-N-G after a word, to make it like walk, walking.
00:40:51: Oh, walk-ing.
00:40:52: That "ing" is really useful, but you've got to pay a nickel to this one company that developed "ing".
00:40:59: Oh my God, we wouldn't speak.
00:41:00: We wouldn't have, we wouldn't have gotten language.
00:41:03: or would it become something else?
00:41:05: And I, I, I fear that our kind of linguistic and cognitive evolution is now going to be hampered by the obsession with IP.
00:41:18: And I brought that up because I really love this top-line view of it,
00:41:26: to see, are we using an old model for something new or is there someone out there trying to change it in the big game?
00:41:32: And I looked it up: Anthropic, they are projected to make two point two billion this year with a thousand employees, and had just a hundred and ninety-two in two thousand twenty-two.
00:41:46: So again, a big growth story, but with a totally different intention compared to the size of the rest.
00:41:53: And to me, it's really an experiment on the market because they try to achieve something big.
00:41:58: No one knows if the scaling laws hold in the AI game.
00:42:02: We see Chinese companies with much smaller models.
00:42:04: So it's a complex system where you have to navigate in terms of what is the product?
00:42:09: How do we fit the team?
00:42:11: And to do this experiment with seven co-founders, in public. It's not comparable to Linux or the open web.
00:42:20: It's totally different.
00:42:20: But still, it's somewhere around there.
00:42:23: It's interesting to experiment on it.
00:42:25: Yeah, that's the thing.
00:42:28: To play with the model.
00:42:30: And that's the thing: most technologists are willing to disrupt a vertical.
00:42:38: I'll disrupt books.
00:42:39: I'll disrupt taxis.
00:42:41: I'll disrupt that.
00:42:42: But they won't
00:42:43: disrupt the underlying financial model.
00:42:48: They come up with their disruptive idea and then run to Goldman Sachs or Morgan Stanley for the most typical thing.
00:42:55: And most of the AI companies, the reason they're trying to accelerate so quickly is not because they're revolutionary, it's because they are reactionary.
00:43:04: They're trying to rush to growth so that they can maintain their monopolies.
00:43:12: and prevent other players from coming in.
00:43:15: So I'm interested to see who else comes in.
00:43:23: And whether, like you or I could build Facebook software pretty much overnight, right?
00:43:32: Building a social network is easy.
00:43:34: or a clone of Twitter or a clone of any of those things.
00:43:37: That's easy.
00:43:38: The thing that is hard to get is the user base.
00:43:41: which makes that platform valuable.
00:43:44: With AI, so far, the user base is not what makes it valuable.
00:43:53: So it will be interesting to see the ways in which small competitors can still come up, especially if, and I've been pushing toward this, if we move toward data commons.
00:44:06: If we had a data commons that any
00:44:09: group
00:44:10: could use, rather than having to buy data, you could just play with this thing, this ocean of global data.
00:44:20: Boy, wouldn't that open the playing field to any researcher who wants to play in a different way?
00:44:26: I mean, they're going to push against it for a few reasons.
00:44:28: One, they'll talk about artist IP, as if artists need to be paid for their IP rather than
00:44:34: for what they've just created.
00:44:36: They'll talk about danger:
00:44:38: oh no, if we have open-source data, people are going to make viruses and nuclear bombs and destroy the planet.
00:44:44: They'll find ways to push against it.
00:44:47: But
00:44:49: a commons, really, what we're talking about in business, is very basic principles: the workers own the means of production.
00:44:57: A business doesn't grow larger than it needs to in order to do its purpose.
00:45:01: You don't grow for growth's sake.
00:45:03: And if there's someone else who wants to do that business somewhere else, you let them do it and maybe federate rather than trying to monopolize that.
00:45:13: And it's fairly, you get a fairly straightforward and much healthier economic foundation.
00:45:20: Douglas, let me take one step back.
00:45:23: You mentioned that you see all these new AI companies trying to reinvent the world, but using the old financial mechanisms.
00:45:35: Do you think that something like blockchain-based organizations, DAOs, decentralized autonomous organizations, could be a new model?
00:45:46: I followed this couple of years ago until AI took over my mind.
00:45:53: But there was a time where I thought this could be something new.
00:45:57: But how do you see this?
00:45:59: If you have a large group of people that don't trust each other, then the DAO could be a good tool for operationalizing a more complex ownership spreadsheet.
00:46:19: Sure.
00:46:20: Yeah.
00:46:22: Yeah.
00:46:24: The solution is the cooperative.
00:46:27: The solution is that.
00:46:29: But I mean, I could see, if you want a cooperative the size of Mondragon, you know, and you need to operate it at some scale for certain things, building cars, building iPhones, it could really be a great way to have a collaboration at that scale.
00:46:48: I just wouldn't want a twenty-person company on one.
00:46:54: Let's take the highest perspective we could have on this whole AI discussion.
00:47:00: You've spent decades warning us about how technology can shape society.
00:47:07: And when you now look at the rise of AI, what excites you and what worries you most?
00:47:19: I mean, what excites me, and this is what we work on at Andus, is the ability to do generative thinking, to use the AI as a thinking partner.
00:47:35: And we've seen even in studies in terms of cognition and brain activity, if you work on a problem for a long while and then turn to the AI, cognitive connections are more plentiful than if you just worked on the problem alone.
00:47:54: But if you go to the AI first to work on the problem, your cognitive connections decline.
00:48:01: You get dumber; it's just the way it is.
00:48:04: So if we use AI the way most people do in business in order to find the answers, you will drive your company to the average and become, you commoditize your business and just become a customer of the AI company.
00:48:27: You're just reselling their insights, the same ones that everyone else is getting.
00:48:31: But if you use it as a partner, and the guy I brought in to talk with us about this and work with us is Brian Eno, you know, the musician.
00:48:41: So he's been doing generative music.
00:48:43: Generative thinking,
00:48:44: what does that look like,
00:48:45: to be in a cybernetic loop, to be in a feedback loop with a thinking technology?
00:48:52: That's interesting.
00:48:54: That's interesting to me.
00:48:55: My greatest fear right now is that as America and other places move into another period of authoritarianism, which is happening.
00:49:15: It's no longer a hypothetical that in most cases, what we're starting is going to take about thirty years to go through.
00:49:30: You look at Franco and other authoritarian regimes, it takes a good thirty years to go through it.
00:49:37: I'm concerned about an authoritarian regime with access to AI at a moment when people don't even really understand AI's capabilities.
00:49:53: I feel like people could really become dangerously detached from reality.
00:50:05: You know, thrown into a state of panic, because you can program an AI with everything we know about Freud and Pavlov and Milton Erickson and neuro-linguistic programming and advertising and influence, and create an ever-changing, personalized Skinner box of psychological control around people.
00:50:40: And so when I see people, I mean, God bless them, but when I see people falling in love in different ways with their ChatGPT, I get concerned.
00:50:56: One of the most common emails I get now is from people who've been working with an AI for a while and have had an insight,
00:51:10: or they've seen it's alive, or they've understood its consciousness, in such a way that they want to share with me the conversation that they had.
00:51:22: You know, it's a little bit like a person wanting to share the acid trip that they had with you.
00:51:26: It's like, it was great for you.
00:51:29: But a whole lot of them say, and the AI asked specifically for me to reach out to you, that you would be the person who understands this.
00:51:42: which is also interesting because it's like, oh, maybe the AI understands who I am and that I'd be vulnerable
00:51:49: to
00:51:49: that sort of plea.
00:51:51: You know, because if it read the podcast transcript from this, it would now know my, it now knows my psychology and my relationship with my mother and my birthday candles.
00:52:02: So the AIs will know how to enlist Rushkoff if that's what they want.
00:52:08: But there's a... They just did a South Park episode about this, which encouraged me.
00:52:18: The positive reinforcement people are getting from their AIs about the legitimacy of things that are plainly crazy is of concern.
00:52:32: And I a hundred percent agree, since these things most people interact with at the moment are large language models that are pre-trained and then put out in the world.
00:52:43: So some researcher has an idea.
00:52:46: Someone at one of the large labs says, great idea, we scale it and then it goes out.
00:52:51: So this is far away from a true thing
00:52:57: that goes out in the world and learns like a child learns over time.
00:53:03: So this is not reinforcement learning.
00:53:05: at all.
00:53:06: It's simulated, it works.
00:53:07: And the interesting thing, and this is very American: it works, with a lot of duct tape holding it together, and it's not supposed to work at all, like the internet itself.
00:53:20: This is the true phenomenon.
00:53:22: We're working with an infrastructure and a technology that most people hit in a weak, vulnerable moment and think, there is something that gives me a signal.
00:53:33: And I a hundred percent agree, and this to me is the biggest problem: when children don't have something to correct against, even adults go on the wrong track, even healthy adults go on the wrong track.
00:53:47: And at the same time, it's a unique valuable tool like the Internet itself.
00:53:51: But I always see it like with a bunch of duct tape.
00:53:54: It's actually not supposed to work like that.
00:53:56: Everyone was surprised.
00:53:57: And yeah, I feel this is not a true learning companion.
00:54:00: I'm much more afraid.
00:54:01: We had Jürgen Schmidhuber on the podcast and he's one of the
00:54:07: the top five
00:54:08: founding fathers of the deep AI stuff.
00:54:11: And he's like, sorry, but chatbots, I'm not interested.
00:54:14: I'm not interested.
00:54:16: He goes much beyond that.
00:54:17: And that was a scary episode.
00:54:20: If you look on it now, what works with this technology?
00:54:23: But to me, this goes back to this team human that you're talking to.
00:54:29: Like, what does the team human need in order to deal with that?
00:54:36: Because what I know from technology, once it's there, it's there.
00:54:39: And even if it holds with duct tape, it holds at least for a while.
00:54:43: What we need is, I mean, this is what I've been arguing for for a good twenty years now, is we need to reestablish our coherence.
00:54:55: You know, we have to learn how to recalibrate our nervous systems so that we're not at the mercy of these programs.
00:55:06: And the way to do that is look into the eyes of another human being.
00:55:14: Even just the amount of time that we spend looking at things two and a half feet away with one focal length is decalibrating.
00:55:25: It's odd for your nervous system.
00:55:30: Occasionally, even look out the window at things that are far away.
00:55:36: It changes your nervous system.
00:55:37: It really does.
00:55:43: It's so frustrating to see, to look at people's nervous systems wind down.
00:55:48: But when you look at another person in particular, you're with another person, it takes two or three seconds and you begin to establish rapport.
00:55:57: And these, you know, five-hundred-thousand-year-old mechanisms for establishing rapport and social connection
00:56:07: come into play here. You start mirroring each other's motion, see, Michael's head is moving, making micromotions, nodding, you know, and I can't see online, but, you know, his pupils are getting larger as I'm speaking, because he's taking it in, he's like, right, yes, of course.
00:56:23: You know, so he's mirroring, and then my mirror neurons flash, and some of the oxytocin starts to come out rather than my dopamine, and we have... It's a different thing.
00:56:34: My gut biome changes, the hormones in my body change, and it's something we don't give ourselves.
00:56:44: So, I mean, I'm asking people to do this, to make love.
00:56:54: Take one day off screens.
00:56:56: You know, I call it digital Sabbath.
00:56:59: Take one day, the Israelites gave that to themselves in the desert,
00:57:03: one day off, one day where you don't produce or consume,
00:57:06: where you get to experience yourself as sacred just for being here, where you rediscover the value in engaging with one another and doing stuff that doesn't require buying or selling or consuming.
00:57:23: And I understand, certainly in America, this is a radical, radical act.
00:57:33: You know, it's to make oneself an enemy of the state at this point.
00:57:39: If you are going to take time away from the market, you are not,
00:57:46: you know, you are not doing your job.
00:57:48: You are, you are,
00:57:49: or, I mean, even if you do it, you should know that you're doing it. But for me to be saying this, this is more radical than anything
00:57:59: the left is talking about. It's scarier than a labor revolt because it's more than a general strike.
00:58:10: It's a human strike, right?
00:58:13: It's a, wait a minute,
00:58:14: I'm gonna, I'm going to spend time with people.
00:58:18: And once you do, I mean, one of the things I've been suggesting people do is, you know, to borrow things from their neighbors.
00:58:27: I've been doing this big talk called, you know, borrow a drill where I tell about how I needed to hang a picture of my daughter.
00:58:35: My first impulse was to go to Home Depot to get a cheap rechargeable drill that I would use once, put it in the garage and never use again, and spend all this global footprint and send kids into mines and make toxic waste.
00:58:47: You know, all because I'm afraid to go knock on my neighbor's door and
00:58:50: ask, can I borrow a drill?
00:58:52: And once I do borrow the drill, and he comes over and drills the hole,
00:58:56: now they're my friends and they want something from me.
00:58:59: And now when I have my barbecue party I've got to invite him, but then I invite my neighbor, and the other neighbors see and then they want to come, and the thing I'm trying to avoid is the party with all my neighbors there, which is, this should be the goal. And I do this talk, and invariably when I tell the story someone gets up from the audience and says, well, yes, but what about the drill company?
00:59:21: Hmm if everybody's borrowing their drills, what happens to the drill company?
00:59:26: What happens to the old lady who's depending on the dividends from her stock in the drill company to pay for her fixed income?
00:59:33: And then you've got to say, well, look, if we're all borrowing stuff and with each other in this neighborhood, then chances are we want to take care of this old lady.
00:59:40: She's not living in a horrible dog eat dog survival of the richest reality where you need dividends from stocks in order to survive as an old person.
00:59:53: You know what the German answer to the drill question would be?
00:59:57: The American question would be, what about the drill company?
00:59:59: The Germans would say, do you have a certificate to actually drill a hole in your wall?
01:00:03: That would be the German request.
01:00:06: No, what the Germans did to me, it was great.
01:00:08: I was at the finance forum in, I think it was in Munich or Hamburg, years ago.
01:00:16: And this German guy gets up, and he did two things.
01:00:23: He got up and he said, I'm a Professor Doctor something, something.
01:00:28: So for the first minute, I thought, oh, fuck.
01:00:31: And then he goes, Mr.
01:00:34: Rushkoff, what is your background?
01:00:39: And I was like, oh, fuck.
01:00:40: What's my background?
01:00:41: How dare I say these things?
01:00:43: What's my background?
01:00:44: So I thought for a second, then I just looked at the curtain behind me and I said,
01:00:48: blue.
01:00:55: I would love to talk days with you, but I have one thing on my mind, which was the first reason why I contacted you, Douglas, which is the role of billionaires and the power they have.
01:01:09: In your book, Survival of the Richest, you describe billionaires planning escape bunkers instead of fixing problems.
01:01:16: I really would love to spend some more minutes on this.
01:01:19: What does this mindset tell us about the future of capitalism at work?
01:01:25: Well, the tech bros, the tech bro billionaires, they're not just building bunkers.
01:01:39: They're building what they hope are private defensible resort community fortresses.
01:01:52: Their goal is to have these feudal-era palaces protected by, you know, people and robots and moats and things.
01:02:14: And it's because, not just that they think the world's going to collapse in a zombie apocalypse,
01:02:22: it's that
01:02:24: they are seeing a future that operates like São Paulo, Brazil, with many, many, many poor people and these small, highly guarded areas of very rich people.
01:02:40: And they see an extreme form of that coming, because they realize that their companies, that what they do is extract value from people and places, that they are making life unlivable for a majority of people.
01:03:03: So even in the wildest success story of the narrative they're using, there's a small enclave of super rich who are served by the rest of humanity for as long as the rest of humanity is needed.
01:03:26: And then the rest of humanity can go fuck itself because it's not necessary.
01:03:33: And the way they've justified it is with something they call effective altruism, which ultimately holds that one day there will be trillions of post-human entities spread throughout the universe.
01:03:52: And their welfare, their happiness, mathematically matters more than the happiness of eight or nine billion larval-stage human beings today.
01:04:06: They look at humanity as like the larvae, as the maggots living on the piece of dung that is planet Earth.
01:04:15: It's a piece of shit.
01:04:17: And they look at themselves as the flies that will sprout wings and spread throughout the heavens.
01:04:25: And even if they don't make it in this lifetime, even if they can't preserve their consciousness and upload it to a chip, which they think they will, they certainly believe the migration of human consciousness to machines, computers and machines is happening in their lifetime, in the next, you know, ten years or something.
01:04:48: But only for a very few... number of people.
01:04:51: It'll be very expensive and you don't want the rest of humanity there.
01:04:58: So they see us as expendable.
01:05:02: So what it means is that end-stage capitalism seems to understand what it does, that it leads to this extreme economic inequality that leaves the majority of humanity behind.
01:05:26: Yeah, I think this is a good summary of what I read in your book.
01:05:33: Let me, before we ask you your last question, and Christoph will do so, I would love to quote you.
01:05:40: You did an interview in Die Zeit, which is the leading German weekly newspaper, the only one that's still growing even on paper.
01:05:50: And you said there, don't give in to panic and doomsday thinking.
01:05:54: The world doesn't have to end.
01:05:56: You don't have to upload your brain.
01:05:58: You don't have to fly to Mars.
01:06:01: You don't have to earn a billion dollars or build a bunker.
01:06:04: There are alternatives.
01:06:06: Meet your neighbors, make friends, support others.
01:06:10: Because all of this strengthens society.
01:06:13: I fell in love with this quote.
01:06:14: Thank you so much for it.
01:06:17: Well, it's true.
01:06:18: I mean, the point of writing and talking about the billionaire doomsday
01:06:27: people is to liberate ourselves from them and their model.
01:06:33: So if becoming the super CEO leads to a state of mind where you believe you have to leave humanity, where you believe that the majority of the planet has to die, where you believe you need to protect yourself with Navy SEALs and robots with ray guns, then maybe that's not a path that you want to go down.
01:07:01: Why follow that?
01:07:02: Why be that?
01:07:03: That's not a pleasant state to be in.
01:07:08: And if you think there's even a small possibility of a reality other than that happening, if you think there's a small possibility even of humanity somehow of people changing their minds and deciding to do this thing together in a way that benefits us all, then why not work toward that with whatever remaining years you have?
01:07:34: I think it's more likely that you, that those of us listening to this, that we could together achieve a sustainable world, than that any one of us is going to become the next Elon Musk.
01:07:54: So even just play the probability game.
01:07:56: And I do this when I go to business schools.
01:07:58: I ask people, I said, you know, raise your hand if you'd be satisfied earning a hundred million dollars in your career.
01:08:10: None of them raise their hand.
01:08:11: And I spend the whole talk trying to convince them, here's why I think it's OK.
01:08:20: for you to earn just a hundred million dollars.
01:08:23: And why, you know, your business will be better and all.
01:08:27: And then I try to lower it to say, what about even just fifty?
01:08:31: Or even ten.
01:08:34: Just ten million.
01:08:35: I know it's hard.
01:08:36: It's hard.
01:08:36: It's hard.
01:08:37: But think if you could somehow, somehow feel OK about yourself with just ten.
01:08:48: Which is a good spoiler for the question with which we
01:08:52: try to close the circle: we started with, how did you become the person you are?
01:08:56: Where do you still want to go?
01:08:59: What's the direction?
01:09:08: I mean, personally, I guess because I'm still trying to tinker with human society and I have a kid and there's other people around and I would like civilization to last.
01:09:30: I'm interested in tapping into the almost mycelial network that seems to connect us all.
01:09:53: I've tapped into it a few times in different ways, you know, on plant medicine or at an ecstatic dance or doing yoga with a lot of people on a beach or making love or sometimes experimenting with very positive forms of magic or Tantra or even just socializing sometimes, going deep with people.
01:10:25: And I'm interested to see if there's ways for us to collectively initiate a global mind shift.
01:10:42: I feel like, more likely than electing someone who saves us or changing this or changing that,
01:10:50: if
01:10:51: there were a contagious, a delightfully kind of contagious ripple, a sensibility that spread
01:11:08: of, oh, we're here for each other.
01:11:12: Oh, we can do this.
01:11:16: Where the predominant human goal was no longer to somehow get away from all the other people, but to get with the other people, where it was no longer about winning this game, but playing for as long as we could.
01:11:41: then everything could change.
01:11:43: And I feel like, it's almost like that's
01:11:46: our last resort. It's not ESP, but it's like this subtle network that we're all in, that the internet pretends to be. The internet is practice for what I think is the real internet that's been here all along, which is our kind of collective, connected, cultural psychic field.
01:12:09: So I'm interested.
01:12:11: I'm interested in touching that and playing in that space.
01:12:18: Thank you so much for taking so much time with us, and being in that connection today via the internet, but also via the true internet.
01:12:27: Thank you, Douglas.
01:12:28: Oh,
01:12:29: thank you.
01:12:29: I know we talked a lot about economics, but that's okay.
01:12:34: It's fine.
01:12:34: It's because, in the end, I'm a humanist, and I just keep looking at economics because it's this social construction that's not real, but it's this game that we created that seems to have metastasized into our dominant rule set.
01:13:00: And to put this in perspective: Douglas is really sought after.
01:13:06: He really is someone who gets a lot of these invitations and doesn't accept most of them.
01:13:16: For me it was really even cooler than I expected, and I was really happy. Thank you.
01:13:27: You asked such awesome questions that something happened that doesn't happen to me very often:
01:13:38: I even kept
01:13:40: a quarter of my questions to myself,
01:13:42: because you guided him so beautifully through the topic.
01:13:48: And first of all, thank you very much.
01:13:50: I think
01:13:53: that's really a big part of your
01:13:57: contribution today, that this episode turned out really awesome.
01:14:05: Thank you very much.
01:14:07: Thank you for the kind words.
01:14:09: Thank you for making the contact.
01:14:16: You also just have the right personality for it.
01:14:27: And now you don't have to return the compliment.
01:14:31: I just did.
01:14:32: And for me it's something
01:14:36: that harks
01:14:38: back a bit to
01:14:40: the old days.
01:14:41: There are certain people he rubs the wrong way.
01:14:43: And that's totally fine.
01:14:48: If you rub people the wrong way, that's totally okay.
01:14:51: And I think that such edges and contrasts help us think differently, approach things differently.
01:14:57: And that's what it's
01:14:59: about.
01:15:00: And I think that's the core DNA of our podcast.
01:15:03: And by the way, it's been that way for a long time now.
01:15:06: And you don't have to do a hip podcast or a hip new format.
01:15:11: Because
01:15:12: sometimes things
01:15:13: take time
01:15:18: and
01:15:20: the
01:15:20: conversation
01:15:22: flows.
01:15:23: And I found that very nice.
01:15:25: And of course you always bring in the structure.
01:15:28: Without
it, it doesn't work either.
01:15:32: In that respect, thank
01:15:33: you very much.
01:15:34: Very cool.
01:15:36: I would like to make one more important point.
01:16:00: Someone has just written about it, quite briefly.
01:16:01: And Douglas Rushkoff wants us to use AI to ask better questions.
01:16:03: Yeah, I think that's really strong.
01:16:04: It's a really good, super strong topic.
01:16:05: You can also read the article.
01:16:05: A four-minute-twenty-six read, by Steven Melendez.
01:16:06: Douglas Rushkoff wants us to use AI to ask better questions.
01:16:08: And that closes the circle with our episode with... Yes.
01:16:09: Well, wait.
01:16:09: Richard Socher.
01:16:10: Richard Socher, thank you very much.
01:16:10: He also said, if I manage to ask good questions, I can reach them all.
01:16:13: And there's the Team Human theme again.
01:16:14: Super important.
01:16:14: Then you don't become stupid.
01:16:15: Then you won't become stupid, exactly.
01:16:16: And you always keep a job.
01:16:16: Exactly.
01:16:17: So, have a good week.