"The Data Diva" Talks Privacy Podcast

The Data Diva E247 - Michael Robbins and Debbie Reynolds

Debbie Reynolds Season 5 Episode 247


In episode 247 of “The Data Diva” Talks Privacy Podcast, Debbie Reynolds talks to Michael Robbins, Social Entrepreneur and Civic Builder, and a visionary in building human-plus-digital learning ecosystems. We discuss his decades-long journey at the intersection of education, technology, and community, from grassroots innovation to White House policy. Michael shares a compelling vision for the future of AI in education, centered on empowering individuals to create and control their own AI narratives. He introduces his data model, called DOTES (Do, Observe, Tell, Explore, Show), which captures real-world learning experiences and enables the training of personalized AI agents grounded in data integrity and digital personhood.

Our conversation explores the concept of implication models, AI systems that learn from and work for people, rather than exploiting their data. Michael draws parallels between decentralized data governance and the design of AI trusts, where individuals have full control over their digital identities and contributions. We also explore the limitations of current large language models and discuss new frameworks that could rebuild AI from the ground up, centering privacy, consent, and community.

Together, we envision a future where youth and adults alike use AI not as a replacement for human intelligence but as a tool for self-expression, empowerment, and democratic participation. This episode is a masterclass in AI ethics, digital sovereignty, and the urgent need to shift from extractive technologies to human-first ecosystems. We hope for a future where data privacy is not just a legal checkbox, but a fundamental principle of technological design and societal infrastructure.


[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice. Hello,

[00:07] my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with the industry leaders around the world with information that businesses need to know.

[00:19] Now I have a very special guest on the show, Michael Robbins. He is the co-founder of Learning Pathmakers. Welcome.

[00:28] Michael Robbins: Thanks, Debbie. I'm excited to be on the podcast.

[00:31] Debbie Reynolds: Well, I think it's only taken us, I don't know, maybe four or five years to get this going.

[00:36] Michael Robbins: Yeah, a lot has changed, I think, since we first started talking about me coming on here.

[00:43] Debbie Reynolds: Yeah. Yeah. Well, you are a builder of human and digital learning ecosystems. And you and I had talked a number of years ago. We are very connected on LinkedIn. I love the things that you talk about, about education and data modernity, like what's happening next, what we need to be thinking about.

[01:01] And so tell me a little bit about your career trajectory and how you came to this particular juncture at Learning Pathmakers.

[01:11] Michael Robbins: Happy to. For the last three plus decades,

[01:15] I've been working at this interesting intersection of education and technology and community development from the grassroots level to the White House.

[01:27] And it's an interesting moment now to think about how we're learning as humans alongside machines.

[01:36] So when I talk about building human-digital learning ecosystems,

[01:42] that's what I've been deeply engaged in over the last decade.

[01:47] Initially the focus had been more around how do we add technology into human learning ecosystems. And now there's been an interesting shift, which is how do we put people back in the loop and in charge of AI,

[02:03] which in many cases people are concerned is careening out of control.

[02:09] Debbie Reynolds: I think one of the things that concerns me a lot and I hear a lot,

[02:15] I see all these stories in the news. It's interesting now because of,

[02:20] you know, this AI gold rush, as I call it,

[02:23] there are so many different articles, and some of them, I don't know about you, I'm like, oh, that's not right. You know, kind of screaming at the screen about those things.

[02:31] But I think my concern is that, in terms of education,

[02:37] we're either not teaching as much as we should, or we're overly relying on AI, so people aren't getting the kind of deep problem-solving skills that they should have.

[02:50] But I don't know, maybe I'm just going off the rails. What are your thoughts?

[02:54] Michael Robbins: No, I mean, it's really interesting to look at what's happening inside schools. And that was a deep part of my work.

[03:03] Back in 2023,

[03:06] I did a deep dive with a network of schools here in Washington, D.C. where I live and work.

[03:13] And I had this great conversation. This was March of 2023,

[03:17] when the new ChatGPT,

[03:20] I guess that was 3.5 at that point, had come out, and everyone was just wowed. But it hadn't really hit the headlines yet. And so there was a school I was working with here in D.C.

[03:31] and I sat down with some middle schoolers to introduce them to ChatGPT.

[03:37] And they were huddled around me as I'm on the laptop. I said, watch this. I did a prompt where I put their names. I said, you're at your school in Washington, D.C.

[03:48] and you're a team of renegades who are taking control of AI that's gone off the rails. And as the story scrolled up on the screen, their eyes just got really wide.

[03:59] And then we translated into Spanish, and then we translated into a rap song. And each moment they were just blown away. And I said, this is amazing, but is this story about you true?

[04:10] And they said, no.

[04:11] And I said, okay, let's talk about where the data comes from that's in this story. And it took them two minutes to understand this, that they recognized some tropes that were embedded in the story.

[04:21] Some things weren't right. And I said, wouldn't it be better if the story about you was created differently, if you controlled the information that goes in here, so the story that AI is telling about you is true?

[04:33] And that's where they looked at me and they said, oh, so we can have our own AI?

[04:37] And I said, yes. They're like, we want that, right?

[04:40] We want that. They got it. It took them five minutes to get this. And so at this interesting moment in education, there's a lot of teaching about AI. People are increasingly teaching and learning with AI.

[04:53] But what we really need to move towards is how do we put young people in charge of AI for the future? And that's going to require a different approach.

[05:02] And that's what I've been working on, and what I'm really excited to talk to you about today.

[05:07] Debbie Reynolds: That's so cool. That's so cool. That's amazing. And I agree. Well, not only students. I think that's what businesses want too.

[05:17] Michael Robbins: I think all of us want it. Right?

[05:19] And so what does that look like? Three years ago, if I tried to talk to someone about AI and data,

[05:26] they would just look at me like I was from Mars.

[05:29] Now it's part of every conversation. I was in an Uber this morning in Washington, D.C., and in front of us was one of the new Waymo taxis. Have you ridden in one of those yet?

[05:41] Debbie Reynolds: Oh, Lord, no.

[05:44] Michael Robbins: I had a chance last summer when I was out in San Francisco to ride in one. And the first five minutes was a little disconcerting.

[05:51] But after that, I think I was more concerned with figuring out what kind of music I wanted to play. Because you can pick your soundtrack in the back. It was incredible.

[05:59] Right. And so this Waymo's in front of us, but they're not autonomous yet in D.C.; they're training them. There's a person actually behind the wheel. And I think about AI the same way.

[06:10] Right. We have this inflection point now. People are talking about some of the limitations of the way the LLMs, the foundation models, are built, and what it would take to do something different.

[06:23] Right.

[06:24] I think, you know, recently, folks like Yann LeCun at Meta, who's

[06:30] their chief scientist, has been talking about the limitations of large language models, right? That we've thrown all this data that's been scraped from the web from every source possible,

[06:40] and, you know, we just put in more and more compute.

[06:43] We're starting to get some diminishing returns there. And what he's proposed is what he calls a joint embedding predictive architecture, JEPA for short. And this architecture he's proposed would help machines build a model of the world so that they can actively reason.

[07:03] So I think about that as like that Waymo car. It's building a model of the city, it's helping to understand what the traffic patterns are.

[07:11] What people don't realize is that's what Tesla has been doing as well.

[07:16] Right.

[07:17] Only we've been driving these Tesla cars and building the model for them that then they can use to sell it back to us.

[07:24] Right. Tesla isn't just a vehicle company,

[07:29] it's a data company.

[07:31] Debbie Reynolds: Oh, absolutely.

[07:33] Michael Robbins: So instead of JEPA, what I'm proposing is starting somewhere even deeper,

[07:40] which is what I call implication models.

[07:43] And it's how we actually create a new class of AI that learns from us. And so why would we want to train AI?

[07:56] Why would you want to tell it the things that it needs to know in order to learn from us,

[08:01] if that's going to sacrifice our data,

[08:05] if that's going to enrich others instead of us?

[08:08] Right. And so what this leads to, and we could talk more about this, is: what are the cascading things that then happen if we think about implication models as the goal for the next generation of AI?

[08:22] Debbie Reynolds: Right.

[08:23] I think a lot about this,

[08:24] and I want your thoughts about this as well. So when DeepSeek came on the scene,

[08:30] it was jarring for many different reasons. But to me, two things stuck out about DeepSeek when they launched onto the scene.

[08:42] First, the models were open source.

[08:43] Second, it was something you didn't have to run in the cloud if you didn't want to; you could run it on your own equipment.

[08:51] And just by virtue of doing that, that type of thinking could eliminate some of the problems that we have with the larger LLMs. It's more discrete in terms of the data that goes in there, if someone puts more targeted data into that model, and it greatly minimizes the risk of the person having to share their data with these larger models.

[09:18] But I want your thoughts.

[09:20] Michael Robbins: Absolutely. I mean, this idea of creating small language models that are built from a foundation of data dignity, recognizing that our data is really part of our presence,

[09:33] it's who we are. And so there are abilities emerging to create things differently,

[09:40] that have the standard architectures, but with different data protection.

[09:45] And that is a stepping stone on the way to these other models.

[09:50] I talk about these implication models.

[09:53] Debbie Reynolds: Yeah. So tell me a little bit about the cascading. This sounds interesting.

[09:57] Michael Robbins: Sure. So in order to do this, you have to figure out, like, what data do we put in there and why? And so what I'm introducing is a data model called DOTES.

[10:10] And this comes from my work in education,

[10:13] from learning more broadly.

[10:16] And DOTES bridges the way that we experience learning in real life with the way that we need to tell our stories.

[10:24] It's an acronym. It stands for Do, Observe, Tell, Explore, and Show.

[10:32] So the show and tell piece of this is the earliest way that we tell our learning stories. I mean, I think about, you know, my time in elementary school,

[10:39] sitting around the carpet in Mitch Levinsky's class in Camarillo, California.

[10:45] You'd bring something from home or something you made in school, and then you'd show it and you'd tell people about it.

[10:51] Show and tell is actually a learning model.

[10:53] And then I think about later on as I started to do science experiments.

[10:59] You'd observe things,

[11:01] you'd explore what comes next.

[11:03] So the Observe and Explore pieces of DOTES are just that.

[11:09] So to break this down in order,

[11:12] what it is, is actually data about learning through experience.

[11:19] It's how we experience the world.

[11:21] And so we get this small kind of data capsule,

[11:26] a learning capsule. If you think about it as an entry in a ledger: you do something,

[11:32] you can give a narrative description about it.

[11:35] That's the Tell piece. Then Observe: oh, okay, what happened? What did I feel about this? Right? What were my insights?

[11:45] What went well or didn't go well?

[11:47] And then Explore is, okay, well, what would I do next? What would I do differently?

[11:53] And then Show is the evidence. It could be a multimedia object,

[11:59] right? An image, the kinds of things we post on social media.

[12:03] It anchors that whole entry of Do, Observe, Tell, Explore, Show. Now let's think about this.
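To make the shape of the entry Michael describes concrete, here is a minimal sketch in Python. The five field names and the idea of an append-only ledger come straight from the conversation; everything else (the class name, the timestamp field, the sample values) is illustrative, not an official schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DotesEntry:
    """One learning capsule: a single entry in a personal ledger.
    Field names follow the DOTES acronym; the rest is illustrative."""
    do: str        # what you did
    observe: str   # what happened, how it felt, your insights
    tell: str      # narrative description of the experience
    explore: str   # what you would do next or differently
    show: str      # evidence anchor, e.g. a path or URL to a multimedia object
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# A personal ledger is then just an append-only list of entries.
ledger: list[DotesEntry] = []
ledger.append(DotesEntry(
    do="Built a chatbot in class",
    tell="We trained it on our own stories about school.",
    observe="It surprised me how fast it picked up our phrasing.",
    explore="Next time, try correcting its mistakes in a feedback loop.",
    show="photos/chatbot-demo.jpg",
))
```

Each capsule is small and self-contained, which is what makes the "pretty personal information" point below so pointed: the value and the sensitivity live in the same record.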

[12:12] That's some pretty personal information, isn't it?

[12:15] Right. And it's not the kinds of things that exist in our current data sets,

[12:21] right? No one's putting that in. I mean, you can infer some of it,

[12:26] right? That's the thing. Right. The models we have right now are inference models.

[12:31] What this will allow us to do is to get to implication models.

[12:36] What's the "so what," the "what next"?

[12:40] Right. And so imagine then that you start to get a record of your stories that then could be used as a basis to create your own AI representative. We can do that even with the large language models we have now, and what you're talking about in terms of private and secure LLMs and the right kind of data protection and storage.

[13:00] And so we immediately start to get something that's useful. And we've done some demos on this to show what DOTES can do. I did this with the middle schoolers I was working with. That story I told you about led to me teaching an elective class for fifth through eighth graders for four months last school year.

[13:21] Oh my gosh. Middle schoolers are amazing.

[13:25] I learned so much from them.

[13:27] Debbie Reynolds: And that's cool.

[13:28] Michael Robbins: Yeah, but they got DOTES, right? And we

[13:31] worked on building chatbots that showed them that if they

[13:36] collect their data and they protect it in particular ways,

[13:42] they can create AI that works for them.

[13:45] So here we have something that also flips the dominant narrative upside down. Instead of protect your data because people are exploiting it, or protect your data because the AI apocalypse is coming,

[13:57] it's: if you protect your data, you can use it to tell your story, right?

[14:01] Debbie Reynolds: Absolutely.

[14:02] Now I love that. I love it. Because the way that you have students using AI is the way that all of us should be using it where it's kind of augmenting or helping a human.

[14:15] And so the narrative that I don't care for is that AI is going to replace humans. I don't think that's right, and I want your thoughts, because you have deep insights. On experience:

[14:27] AIs don't have experience.

[14:31] That's what humans have. And they don't have wisdom, right? There are a multitude of types of knowledge. And so to assume that an AI tool can be like a human, I think, is just

[14:46] not right. I don't think it's right.

[14:49] It's not complete. Right. Maybe it can do some things.

[14:52] It can augment what a human can do, or maybe do certain things faster, better, cheaper.

[15:00] But again, I don't buy this narrative like, oh, well, it's going to take over, and hopefully they won't be our AI overlords. I feel like people are really not discussing the nature of what intelligence means.

[15:15] And part of that to me is experience.

[15:18] I want your thoughts.

[15:19] Michael Robbins: You're absolutely right. I mean,

[15:21] this also gets us to the part where it's AI that works for us,

[15:27] that has this purpose.

[15:29] And I think about that on multiple levels.

[15:32] One is that just the process of creating this data and training the AI this way has to involve us.

[15:42] Right?

[15:43] We have this paradigm where AI has been created in an extractive way, in an exploitative way, without consent, credit, or compensation.

[15:53] People are really waking up to that and saying, well, what do we do next?

[15:57] And so, number one, there's process here. And then secondly, it leads us into this conversation about, well, what do we do with this?

[16:06] If we're creating these things and we recognize that AI is going to transform the way that we work,

[16:13] what are the structures that we need to build in order to have the value that's created from these new implication models accrue to the people who helped build it? And that's where I'm proposing a model for AI trusts.

[16:30] Debbie Reynolds: Oh, I love that. Tell me more.

[16:32] Michael Robbins: Sure. So AI trusts combine two things.

[16:36] One is what people have been talking about for a long time around data compacts or data trusts. So if you think about, like, a nonprofit credit union,

[16:45] but your account is your secured data backpack.

[16:49] That's your private and secure account. You have control over that, how that data's used,

[16:55] revocable consent,

[16:56] et cetera, the right to be forgotten.

[16:59] Now imagine too that because you have this account,

[17:04] just like money in a bank,

[17:06] if you're banding together with others,

[17:08] the collective value of that data now has real power.

[17:16] So it could be that this is now used to train not just what I'm talking about with implication models, but our existing approaches to LLMs. It could be used to extend this and reframe data governance more broadly.

[17:36] Because the second part of AI trusts is the infrastructure piece.

[17:41] And there my analogy is things like electric cooperatives,

[17:47] which we've seen in rural America over the decades,

[17:52] that there also needs to be collaborative,

[17:57] cooperative infrastructure for data storage and computation.

[18:03] There'll continue to be reliance on cloud providers.

[18:08] But this issue of sovereignty has to involve both data protection and the infrastructure itself.

[18:14] So think about pushing all of this to the edges. These data trusts then allow us to create,

[18:23] along with our DOTES, our AI representatives, our AI reps,

[18:29] which I see as kind of multifunctional.

[18:34] It's more than just about AI,

[18:38] it's more than the web.

[18:40] Debbie Reynolds: Right.

[18:41] Michael Robbins: It's about data dignity and this concept of digital personhood. And we could talk more about digital personhood and what that means not just for the web, but for what's coming next, which is this nexus of spatial computing and ambient technology.

[18:57] Right. So augmented virtual reality and the Internet of Things.

[19:01] Debbie Reynolds: Well, this is a topic that I love. So connected systems,

[19:05] Internet of Things, augmented reality, the edge. So this is great that we're talking about this.

[19:11] Let's expound upon that a bit. And I think what people don't understand,

[19:15] and obviously I'm saying this, and you just tell me your thoughts. So the technology exists to do the thing that you're saying.

[19:24] Right.

[19:25] I think a lot of the incumbents really want to continue to sustain a model where, like at the beginning of the Internet, all your data went into a bucket, into the system, because your own systems weren't big enough, strong enough, powerful enough to do the things that they could do.

[19:46] But now we have technology or even on our phones or our computers that are much more capable of taking on more tasks and more work. And so that means that there are things now that you couldn't do five years ago on your phone, on your laptop.

[20:03] So to me,

[20:05] that creates the opportunity. For a lot of us, we're very concerned about what we share out, right? What goes into the cloud. We're seeing a lot of these data breaches and things. But now we can really make data something that we protect, by virtue of the fact that we're not sharing as much or sharing as widely,

[20:25] and we're using it in different ways that really enhance what we want to do, as opposed to doing what other people want us to do. But I want your thoughts.

[20:33] Michael Robbins: Yeah, I mean, I think back in history, in these cycles of centralization and decentralization.

[20:39] Debbie Reynolds: It's hilarious.

[20:41] Michael Robbins: It is. I,

[20:42] you know, gosh, I

[20:45] wrote my first computer program,

[20:48] like, a real one that I remember, when I was in sixth grade on a TRS-80 Model 12 with 128K of memory.

[20:58] Okay. Now this program was to make Dungeons and Dragons characters for my friends.

[21:03] That was my middle school nerd days. When I got into college, I started programming Pascal on a DEC VAX mainframe. I also was on Relay Chat at night when I should have been debugging my Pascal programs.

[21:18] Right. This is the Internet before it turned into the Web, before Sir Tim's vision there. And I started working for Apple Computer

[21:26] when I was 18, as a student representative. And so I've seen this shift from mainframes to personal computers. And now we've seen this other kind of turnaround, where we have these centralized systems, made possible by open source and server farms, and it has resulted in the centralization of power again.

[21:49] But there's another piece that people are missing and it's in the way that the web evolved. The web evolved without a model for citizenship,

[22:02] actually, without a model for personhood. If we just boil it down even further,

[22:07] we don't exist on the web.

[22:10] Everything was made with all the power on the server side.

[22:14] And so when we look at what's happened in digital society and the imbalance of power,

[22:22] what we don't recognize is that because we haven't existed,

[22:27] we haven't been able to claim our basic rights. I'm excited about what we can do with AI trusts and AI reps built on these models of DOTES, because that's going to give us what we've been missing so far,

[22:41] which is this model for personhood.

[22:44] Debbie Reynolds: Yeah. And so I like the way that you're talking about personhood.

[22:48] As you know, there's personhood talk now, but what they're talking about is another type of centralized system where it's like, give me your eyeball, we'll put it in a big cauldron database and you can identify yourself across systems.

[23:03] So I think that's the opposite of what you're talking about.

[23:06] Michael Robbins: Yeah, exactly. And that's where these AI trusts are so important, and the identity mechanisms that will go along with them, because we don't want our existence in the digital world to be controlled by any corporation or by any government.

[23:25] Because then we can get deleted totally.

[23:31] And we see a lot of things getting deleted right now.

[23:35] That's true in many ways.

[23:37] That's true,

[23:38] yeah. So establishing personhood will allow us to claim like if we just go back to like the Declaration of Independence, life, liberty and the pursuit of happiness.

[23:49] Thomas Jefferson wrote those words into the Declaration of Independence. These came from John Locke,

[23:57] an English philosopher who was writing about natural rights and he talked about life,

[24:04] liberty and property.

[24:06] So if we just look at it: we haven't been able to claim life, liberty, and property in the digital realm because of the way the web has been organized and evolved.

[24:16] We don't exist as individuals.

[24:20] The technical term there I come back to is we don't have a persistent digital identity across domains.

[24:27] Debbie Reynolds: Correct.

[24:29] Michael Robbins: We don't have liberty in many ways. Freedom of association, freedom of movement,

[24:34] freedom to make the decisions without it being constrained by corporate platforms and algorithmic decision making.

[24:42] And property.

[24:44] We can't own our stuff, whatever our conception of that is. Right. Is it data protection? Is it excluding it from others?

[24:51] And because we haven't had these abilities,

[24:55] what we've ended up with is what some people are calling these feudal systems. Right. Techno-feudalism. Because we haven't been able to be people,

[25:04] much less citizens of a digital realm. We're just serfs and sharecroppers on other people's feudal farms.

[25:11] But this gives us an opportunity to change that and we can talk a little bit more about this.

[25:17] It's a different approach than Web3.

[25:20] Yeah. So to go back to another English philosopher,

[25:24] Thomas Hobbes,

[25:26] in his book Leviathan,

[25:28] there's this phrase, I'll paraphrase it, that's still part of popular culture.

[25:35] He said that essentially life outside the kingdom walls is nasty,

[25:40] brutish and short.

[25:42] And he had a point. Right. What he was arguing was actually in favor of the British monarchy,

[25:49] saying that we need these structures in order to have order, so that people don't kill each other. Right.

[26:00] And what we've seen with Web3 is this push for decentralization that's been relying on this faulty premise of self-sovereign identity,

[26:12] that you can exist and that you should exist independently of anything else. The thing is, that's not how humans work.

[26:19] We don't exist,

[26:21] we don't have identity, by ourselves. We build identity in community. Also, it's not something that's given to us, like a blockchain address or a digital wallet or a username or password.

[26:31] Right. It's something that grows with us over time.

[26:36] So this idea of decentralized IDs: these will be things in the toolset. Blockchain and

[26:42] consensus ledgers will be important parts of the toolset for building AI trusts.

[26:49] But mere decentralization is anarchy,

[26:52] right? It leaves people exposed.

[26:56] It goes right back to centralized control.

[27:00] And we've seen that in cryptocurrency,

[27:03] right?

[27:04] The movement back to these centralized exchanges that have been rife with fraud and abuse and criminal negligence and activity.

[27:16] Debbie Reynolds: So how does this new approach in your view, how does it enhance privacy or data protection?

[27:21] Michael Robbins: So what it does is it allows us to band together to create this. And I've been working with a group of collaborators on

[27:29] two key components of this,

[27:34] the thing that's been missing. You know, we have no shortage of technology solutions around this, right? And we wonder, like, why hasn't this come together,

[27:44] right? We have,

[27:45] you know, data protection initiatives, like data vaults.

[27:51] We've had people demonstrate data trusts. We have things that exist to drive AI inside businesses where, you know, it's company confidential,

[28:05] military organizations, the Defense Department,

[28:08] they're deploying these in classified environments, right? We know the technology is there.

[28:14] What's been missing is the pathway to get there.

[28:18] What do we do first and why and how do we build that?

[28:23] And that's where we're proposing a game-based approach that focuses on a particular ecosystem of teens here in Washington, D.C.

[28:33] And what we're calling this is Game Show.

[28:36] And Game Show is a way for teens, but ultimately everyone,

[28:44] to kind of journal with the help of AI.

[28:48] Imagine the way that these AI systems are working now.

[28:54] Like Character.AI or Replika. They're interviewing you, talking to you.

[28:58] Who knows where that data's going, right? But imagine then that these interviews are kind of the part that gets you to creating your DOTES.

[29:09] And then you can have a part of the platform, the game where you are workshopping these,

[29:18] right? You can see them because they're organized in this,

[29:22] right? We have an ontology now for learning stories that works for people and AI: Do, Observe, Tell, Explore, Show. And we have seven different categories of DOTES: Character, Excellence, Service,

[29:35] Relationship,

[29:37] Adventure, Making, and Wellness.
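As a small sketch, the ontology and categories just named could be captured as plainly as this. Only the field and category names come from the conversation; the enum and tuple themselves are purely illustrative, not a published schema.

```python
from enum import Enum

# The five fields of every DOTES entry, in acronym order.
DOTES_FIELDS = ("do", "observe", "tell", "explore", "show")

class DotesCategory(Enum):
    """The seven categories of DOTES named in the conversation."""
    CHARACTER = "character"
    EXCELLENCE = "excellence"
    SERVICE = "service"
    RELATIONSHIP = "relationship"
    ADVENTURE = "adventure"
    MAKING = "making"
    WELLNESS = "wellness"
```

Tagging each entry with one category keeps the ledger browsable by both people and AI, which is the point of having a shared ontology at all.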

[29:39] There's a shortcut to a post that I did today that has more details on DOTES. If you go to Dot xyz,

[29:48] you can see that.

[29:50] But now we have this sort of corpus of DOTES that you've started to collect in a private and secure space.

[29:56] And now you can use that to start training your AI rep. Right now, because we're using small language models or foundation models or LLMs, whichever we want to talk about there,

[30:09] we know that it's going to hallucinate.

[30:13] Debbie Reynolds: Yeah, right.

[30:14] Michael Robbins: It's not going to get the story right, but we can give it feedback and it'll get closer and closer and closer. And that's where the fun part comes in. It's actually training your AI rep.

[30:24] So if you imagine that as like a game show where you get to quiz your AI rep,

[30:29] it gets things right, gets things wrong, you can correct it. And what that gives you an opportunity to do is then some backpropagation of the personal knowledge graph that you're starting to build.

[30:42] Debbie Reynolds: Yeah.

[30:42] Michael Robbins: Right. So we think about now not just retrieval-augmented generation based on DOTES; now you've started to build your own personal knowledge graph.
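As one concrete reading of that, here is a toy sketch of retrieval-augmented generation over a ledger of DOTES entries. The sample entries, function names, and the naive keyword scoring are all illustrative stand-ins (a real system would use embeddings and a vector index); only the DOTES field names come from the conversation.

```python
# Naive keyword retrieval over DOTES entries; a real AI rep would use
# embeddings and a vector index, but the shape of the pipeline is the same.
entries = [
    {"do": "science fair project", "tell": "Measured plant growth under LEDs.",
     "observe": "Red light outperformed blue.", "explore": "Try mixed spectra.",
     "show": "media/plants.jpg"},
    {"do": "built a chatbot", "tell": "Trained it on our class stories.",
     "observe": "It echoed our phrasing.", "explore": "Add a feedback loop.",
     "show": "media/bot.png"},
]

def retrieve(query: str, k: int = 1) -> list[dict]:
    """Score each entry by word overlap with the query; return the top k."""
    words = set(query.lower().split())
    def score(entry: dict) -> int:
        text = " ".join(entry.values()).lower()
        return sum(1 for w in words if w in text)
    return sorted(entries, key=score, reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Ground the AI rep's answer in retrieved DOTES context only."""
    context = "\n".join(e["tell"] + " " + e["observe"] for e in retrieve(query))
    return f"Using only this person's own record:\n{context}\n\nQuestion: {query}"

print(build_prompt("chatbot feedback loop"))
```

The key property is that the prompt is assembled from the person's own ledger rather than from scraped data, which is what lets the rep's answers stay correctable: fix the entry, and the next retrieval reflects it.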

[30:53] Debbie Reynolds: Right.

[30:53] Michael Robbins: Well, why would somebody want to do this? Well, it turns out that when you have this information in DOTES, it's immediately really useful for a lot of different things in school, in work, and in life.

[31:05] Debbie Reynolds: Well, tell me a little bit about.

[31:07] This is definitely different than AI agents, which we're hearing a lot about.

[31:11] But tell me what's different about this than AI agents for people who don't know.

[31:15] Michael Robbins: Yeah. So imagine your AI rep as the AI agent that represents you. Right now, what we're able to do, though, is start with kind of a blank slate. Instead of, I'm going to start by training an AI agent to do my expense reports, or to book travel for me, or to set up meetings, or do my social media posts,

[31:43] I'm going to start with learning data because owning your learning data is a way to learn about data ownership.

[31:53] Building an AI agent that works for you with that data is a way to learn about AI agents.

[32:00] Debbie Reynolds: Totally.

[32:02] Michael Robbins: Right. And I mean, we're going to have this whole world, you know, we talked about

[32:07] the eye scanners and biometric things, and we're going to have this old world of corporate-owned solutions.

[32:16] But we need the solution that works for each of us.

[32:19] In fact, it's the only way that we can stand up to these.

[32:23] Debbie Reynolds: Yeah.

[32:23] Michael Robbins: At the individual level, at the societal level.

[32:27] Debbie Reynolds: I agree. And then how, how would the adults, how would like you and I, how will we collaborate? How would they work that way?

[32:35] Michael Robbins: I mean that's part of what we get to figure out. I mean that's why we're at this amazing moment in history.

[32:41] Debbie Reynolds: Yeah.

[32:42] Michael Robbins: Because imagine like all of a sudden now we have personhood in the digital realm.

[32:50] Right. That's a basis for

[32:52] the future of digital civilization.

[32:55] That's a lot to get our heads around.

[32:57] But it would be like all of a sudden everyone landed on this island,

[33:02] right? Instead of trying to go to Mars and setting up a society there,

[33:07] right?

[33:08] If we imagine the AI trust as like the moon base,

[33:11] but we're not going to live in Mars gravity and eat Mars rocks for dinner.

[33:17] What we're doing is launching an expedition to make the planet that we already live on habitable. I call it terraforming technology on Earth, right? There's no planet B.

[33:33] Debbie Reynolds: No.

[33:34] Michael Robbins: And so,

[33:36] right, so here we have this, we have this moon base of AI trusts with our AI reps. Then we get to figure out everything, right? We get to figure out what does it mean to organize ourselves differently for economic enterprise.

[33:49] What does it mean to organize ourselves for learning ecosystems where we're learning alongside with our AI reps,

[33:58] not just for school, but also for a lifetime of learning in our careers.

[34:05] And then what does it mean for democracy?

[34:08] All of a sudden we have citizenship, and that's where my long background in civil society work, in policy, and in political philosophy comes in. I start to think, like, well,

[34:21] what happens when we can build democracy into AI?

[34:27] Then we can use AI to rebuild democracy.

[34:30] We've been trying to govern AI from the outside and it's a containment strategy.

[34:35] Debbie Reynolds: I agree.

[34:37] Michael Robbins: We're bailing out the Titanic with a teacup.

[34:41] One of the analogies I use... oh gosh, the middle schoolers I taught. We were going through parts of speech, and I had them watch the old Schoolhouse Rock videos, you know. I love those.

[34:54] Yeah,

[34:56] Lolly, lolly, lolly, get your adverbs here.

[35:00] Debbie Reynolds: Great.

[35:01] Michael Robbins: They don't really teach sentence diagramming anymore.

[35:05] Right.

[35:06] And so what I discovered is that AI is, I don't know,

[35:10] a gateway drug to grammar, because the hottest new programming language is English.

[35:16] Debbie Reynolds: Yeah, right. It's true.

[35:18] Michael Robbins: But I also, we were talking about AI reps with them and what it's going to take to stand up to big AI, and we watched Power Rangers. Did you ever watch Power Rangers?

[35:31] Debbie Reynolds: No, no, no.

[35:32] Michael Robbins: Okay, the Power Rangers, you know, there were five of them, in different colors, and, you know, Godzilla-esque kinds of monsters would show up from time to time. In order to battle them, they would have to use what they called their Mighty Morphin powers, where each of them would transform into part of this giant robot,

[35:54] right?

[35:55] And then they'd battle whatever.

[35:57] That's the only chance we have against AI: AI that we control. So you talk about, like, these DeepSeek models that can be deployed anywhere.

[36:06] How do you regulate that? You don't.

[36:08] Debbie Reynolds: Yeah, exactly.

[36:10] Michael Robbins: Right. So we have to create this world of checks and balances. So I think about again, like, what do we do with AI reps?

[36:17] It's about our educational ecosystems, it's about the world of work,

[36:22] and it's about the future of governance.

[36:25] Debbie Reynolds: I agree with that wholeheartedly. Right. I think about this opportunity:

[36:30] one of the big problems that we had was that there was all this data out here, there were all these systems, but we couldn't manage it. Right. So we had to go to these big players to do it.

[36:41] So what AI is doing,

[36:43] I think,

[36:43] is giving us opportunity to be able to take on more of that control that we really want. So I think it'll be really interesting.

[36:52] Michael Robbins: Yeah,

[36:53] I mean, it's a scary time and people are really concerned about what's going to happen with their jobs,

[37:04] future of work.

[37:05] And this isn't going to happen overnight. But we have to commit to doing this now because what happens next isn't just AI on the web, as I was talking about.

[37:17] It's when you're actually living inside this,

[37:20] the real world metaverse,

[37:23] which you can imagine through these augmented reality glasses, mixed reality glasses.

[37:30] Predictions are that for many people, those are going to replace the phones in our pockets.

[37:36] We know that wearables,

[37:38] sensors,

[37:40] all the things that come along with surveillance capitalism, just continue to grow.

[37:44] And so if we don't claim digital personhood and data dignity now, or at least start on that pathway, we're going to be living inside somebody else's metaverse,

[37:56] riding around in the equivalent of a corporate robotaxi that's surveilling and dictating every aspect of our digital lives, which isn't just our digital lives.

[38:11] Debbie Reynolds: Right, right, exactly.

[38:13] Michael Robbins: People talked about the Singularity as being the merger of humans and machines. I look at a different singularity, which is the merger of our physical and digital worlds.

[38:24] And what that nexus is.

[38:28] Debbie Reynolds: Definitely.

[38:29] Michael Robbins: You know, we need new words for a lot of this. It's hard to describe. You know, people talk about, like, even the word metaverse, and people would argue about that, AI, right,

[38:38] blockchain, all that stuff. I've started to think about this word spatia,

[38:44] S P A T I A.

[38:47] It's Latin for space,

[38:50] not outer space,

[38:52] but all the spaces that we interact in all the ways.

[38:57] Right.

[38:58] And so what does this journey into spatia look like for us?

[39:03] Debbie Reynolds: Yeah. Wow, that's so deep. Oh, my Gosh. Oh, my gosh. We could talk for hours about this. We could.

[39:08] Michael Robbins: And I, you know, but the thing is, like,

[39:11] we can't.

[39:12] I may write a book at some point. I've been spending most of my time on LinkedIn,

[39:17] which my teen son has called Facebook, but for even older people.

[39:25] yeah, that one really hit home.

[39:28] But we can't explain this to people and expect it to take hold. They have to be part of building it.

[39:34] Debbie Reynolds: I agree.

[39:35] Michael Robbins: And so that's why I'm not a pundit, I'm not an author.

[39:40] I'm a social entrepreneur.

[39:42] And,

[39:43] you know, so I'm looking increasingly for people who want to help us build this,

[39:49] want to support this. And so this is a really amazing time to be on your podcast because we're at that moment,

[39:56] perfect.

[39:57] Debbie Reynolds: Well, I'm happy that you're doing this. I've watched you for many years, and I love the way you're thinking. So you're definitely headed in the right direction.

[40:06] So I love it. Definitely.

[40:08] So, in the world according to you, Michael, if we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be regulation,

[40:18] human behavior, or technology?

[40:21] Michael Robbins: Yeah, what was it?

[40:23] I've heard it, and I think one of your other recent podcast guests restated this: privacy is the ability to control how your story gets told.

[40:36] And I think that what this ultimately is about is those stories.

[40:41] It's, you know, the line from Hamilton:

[40:44] who lives, who dies, who tells your story?

[40:48] And the most important stories we tell are the ones about ourselves.

[40:53] Debbie Reynolds: I agree.

[40:54] Michael Robbins: And so being able to tell that story on our terms, in a world where there are so many other people trying to tell a different story about us for all these different reasons.

[41:06] Right.

[41:07] Is taking charge of these stories.

[41:09] Stories are so important.

[41:11] I'm a storyteller. I've done The Moth Radio Hour.

[41:16] I've done story slams, and I think a lot about the role of stories in human history.

[41:27] It's how we convey wisdom.

[41:29] Debbie Reynolds: Right. Wisdom that AI doesn't have. Okay, let's keep going. It's true.

[41:34] Michael Robbins: Yeah. It gets at what it means to be human. I also have done intensive studies in public theology, and I look at the variety of spiritual traditions across the globe throughout history,

[41:48] and they're all based on stories, because they get at this fundamental question: who am I, and who are you, and who are we together?

[42:02] Right. It gets at wisdom,

[42:07] and it gets at our relationship to what it means to be human, and our relationship to the greater whole,

[42:17] however we choose to believe or think about that.

[42:20] Debbie Reynolds: Wow. Oh, my gosh. That's amazing.

[42:23] So excited. I'm sorry, so excited. We finally got a chance to do this, and this is a great story that you're telling, and being able to share it with so many people around the world will be just

[42:33] tremendous. So thank you so much. I really appreciate it, and I'm sure we'll be able to talk soon and collaborate in the future.

[42:40] Michael Robbins: I've been so grateful to be on your show. Thanks so much, Debbie. Take care.

[42:44] Debbie Reynolds: Okay. Take care. Thank you.