The Only Podcast on OpenAI You'll Ever Need*

E29 · Esquiring Minds

*By us.

Transcript

Introduction and Banter

00:00:01
Speaker
Three, two, one. Well, no, that was delayed. I already did the three, two, one. We're starting. This is all in the show. No, I'm messing with you. Oh, okay. It's a little behind-the-scenes peek for dear listener. Hello, dear listener. Well, hello everyone.
00:00:16
Speaker
Hello. I'm becoming the guy who says pod that hates it when people say pod. Friends of the pod. You hate it when people say pod? It's not my favorite. It's a weird... Go ahead. It's the most annoying. I think I hear it from Pod Save America and it's really annoying.

Podcast Origins and Hosts Introduction

00:00:34
Speaker
I don't know why. I actually like listening to that because I listen to it and I haven't given up on listening to it, but still saying the pod
00:00:42
Speaker
Yeah, I just realized, I don't think I've heard... they're the only ones. I listened to them back when they were Keepin' It 1600, and I don't know if you remember, if you knew them back then, but they were part of
00:00:57
Speaker
either the Ringer or Grantland, whoever the Bill Simmons group was, whatever it was around back then. Like '08, '09, right? Around that era? Yeah. No, it was 2016. It was 2015. Oh, really? Like leaving office. Wow. Yeah. So it must have been the Ringer. It was as Obama's term was coming to an end. But yeah, I used the pod
00:01:21
Speaker
Cause like all the time, like semi, semi-ironically, but I didn't realize, yeah, I guess they're, are they the only ones that use it? I don't know. I don't know, but it's kind of annoying, but probably because I don't know. They're sometimes annoying. They're also really like clever and insightful and I enjoy listening to them, but I find that one particular thing annoying and now I'm doing it because I've listened to them.
00:01:41
Speaker
Y'all should go listen to a good podcast like that one. There's nothing good to be found here. Like not this one. In any event, hey guys, it's been a quiet week. We haven't even introduced ourselves yet. I'll introduce everyone. This is episode 29, in case you didn't realize that, of Esquiring Minds. It's November 20th, 2023.
00:02:03
Speaker
the show's three lawyer friends just goofing around. We're the three friends. I'm one of them. I'm Andrew Leahy. I'm a tax and technology attorney. And I'm also an adjunct professor of law at Drexel Kline School of Law. Oh, you got it this time. Yeah. I added it, right? As you told me last time. I always follow instructions. But that person who corrected me last time, that's Jake Schumer. And I haven't gotten you down yet. You're a board-certified physician, I think.
00:02:25
Speaker
No, I'm a board-certified local government attorney... land... no, that's not the actual title, but I'm a land use and local government and construction attorney in Florida. That's my... and I'm not a professor of anything. That's my job. You are my favorite. I probably tell you this a lot. You're my favorite land use attorney in Florida. Thank you. I agree. I don't think there's a better one to be found.
00:02:49
Speaker
Certainly the best one on this podcast. Sorry, Jason. Yes, that's you. I am Jason Ramsland. I am an employment litigator. I vindicate employees' rights. I do that primarily in Indiana, situated around the great city of Indianapolis. It is a great city despite what Brett Moore may think. Brett Moore is maybe the first person who we actually know personally who's getting called out on the podcast instead of just
00:03:13
Speaker
So, congratulations Brett, you're in a category with Elon Musk. First podcast beef. Brett Moore, Elon Musk, and Brett Moore. These are the enemies we have on our list. Are these your favorite personal injury attorneys in the Atlanta metropolitan area? I don't know about that. I've got personal injury attorneys in the Atlanta area that I like way less than Brett.
00:03:32
Speaker
Oh, I like Brett. Hold on. I'm going to defend... No, I like him too. I'm just being a pill. Oh, okay. So guys, do we have anything to talk about? Nothing happened this week, right? Oh my God. No, I mean, that's the problem is I really thought coming into this... I don't know what you guys want to really talk about. Nobody else is sleep-deprived from, uh, refreshing threads for updates? This is irony. There's a lot to talk about. So we probably ought to get started with the chit-chat and, uh, skip over the baseball chatter this time.
00:04:03
Speaker
It's a bad week to stop the baseball chatter. We'll go to the meat and potatoes. So I think the meat and potatoes, I expect that I do not have as much of an understanding of all of this as either of you, but I can introduce it that way I can then step back and hear you two fight like animals over a piece of food or

OpenAI Controversy and Microsoft's Role

00:04:22
Speaker
whatever.
00:04:22
Speaker
Our main topic, I think, is going to be OpenAI and the fact that they, over the weekend, canned Sam Altman, their CEO and founder, right? And then attempted to get him back. Well, there's like four founders of OpenAI. We need a term in the popular lexicon for the founder that everyone knows about, right? It's not really material if he's the main founder or the main shareholder, but if you've heard of OpenAI, you likely have heard of Sam Altman.
00:04:52
Speaker
Well, you've probably also heard of Elon Musk, who also founded OpenAI. He founded it as an investor. No, he was a founder. Oh, God, really? If he wasn't a founder, he was one of the first investors, and then they separated. They went different ways. It's a dramatic organization. There's four steps to this story. Jason, if you don't mind, I've been neck deep in this.
00:05:18
Speaker
since Friday at noon. Don't be surprised when I chime in with my little tidbits. I want to give you one quick thing. Yes, Elon Musk was a founder. So everything you hear Jake say, you can now take with certainty that he is... he's absolutely correct. Absolutely. This is a 100% accurate podcast. We've always said this. Never made a mistake. So step one: OpenAI is the company behind ChatGPT, DALL-E 3.
00:05:46
Speaker
They are the... somehow this company is the leader of the AI revolution. Their ChatGPT app is obviously, like, the biggest thing in the world. It is through a combination of gathering
00:06:02
Speaker
basically the most talented AI and machine learning coders and developers and researchers. This is the MIT of AI research. This is the major leagues, the premier leagues. This is where the best minds in AI have congregated. There are good minds in other places, but this is the highest density of just supremely intelligent AI researchers and developers.
00:06:30
Speaker
Yes. The number one destination. Right. And they also had first-mover advantage in terms of basically scraping the entire internet and likely many copyrighted works and various repositories of copyright infringement, you know, like books and textbooks and things like that. So it's through a series of above-board actions and perhaps, let's say, just barely on-the-board actions. I'm sorry, go on. Yes, Jake.
00:06:58
Speaker
And the thing is, it didn't start as a company with a real business model. It was really started as a nonprofit think tank for
00:07:09
Speaker
Uh, so tech types that think too much about AI and about like, uh, you know, responsibly developing for the good of humanity, artificial intelligence. Um, so it is ethics. Right. Yeah. Uh, and so the nonprofit, the for-profit company, uh, is actually held by a holding corporation, which is itself held by a not-for-profit governing board. Um,
00:07:39
Speaker
So it's kind of strange in that way. And then the for profit company, they realized at some point they were going to need to make money because it costs a whole lot to perform all these computations for AI that they want to research and develop. And so they need a for profit wing of this. And they built this company. They started this company, which got investments most notably from Microsoft.
00:08:05
Speaker
who invested $10 billion into it and is one of the like major, major stakeholders in this for-profit company, who invested mostly credits for AI computing because Microsoft has invested like $50 billion of its own money into AI computing centers, data centers. And so they invested mostly computing credits for their stake in this business. So that's where we're at.
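The layered structure the hosts walk through can be sketched as a small data model. This is a hypothetical simplification based on public reporting; the entity names and chain of control below are approximations, not a legal description of the actual documents:

```python
# Simplified sketch of OpenAI's reported governance chain, as described above.
# Entity names and relationships are approximations from public reporting,
# not a legal description of the real corporate documents.

CONTROL_CHAIN = [
    "OpenAI, Inc.",            # the 501(c)(3) nonprofit; its board holds ultimate control
    "Holding company",         # majority-owned by the nonprofit
    "Capped-profit company",   # the operating company investors actually buy into
]

MICROSOFT_STAKE = {
    "position": "minority investor in the capped-profit company",
    "consideration": "reported ~$10B, largely Azure compute credits",
    "governance_rights": "none (no board seat, no notice rights)",
    "economic_rights": "profit share up to a cap, plus a broad IP license",
}

def who_controls(entity: str) -> str:
    """Control flows strictly top-down: each entity answers to the one above it,
    and the nonprofit at the top answers only to its own board."""
    i = CONTROL_CHAIN.index(entity)
    return CONTROL_CHAIN[i - 1] if i > 0 else "its own board"

print(who_controls("Capped-profit company"))  # Holding company
```

The point of the sketch is the asymmetry: Microsoft sits entirely on the economic side of the dictionary, while everything in the control chain runs through the nonprofit board, which is why the board could act without consulting investors.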
00:08:34
Speaker
It's the computer technology equivalent of sweat equity. And they are promising their sweat equity, except in real dollar terms. Yeah: you can use all of our Azure, A-Z-U-R-E... I assume it's "Azure." I've never actually heard it pronounced, whatever. They were one of the premier, they being Microsoft, builders-out of massive data centers, along with Amazon and Google and a few others. So that really was something they could offer OpenAI that is really
00:09:03
Speaker
nearly as good as cash for something like an AI company that is attempting to build a large language model, right? They're going to spend all of their money on processing. And so if Microsoft comes along and says, we'll just give you the processing, that's nearly as good as the investment of cash itself. Yeah. Um, and for Microsoft, if they never use it, if they, if they completely fail, then they don't lose the $10 billion because they're never going to use that compute. So, um, the,
00:09:31
Speaker
So Microsoft invests and, importantly, gets an unlimited, perpetual IP license to use all the stuff that gets developed at OpenAI.
00:09:43
Speaker
So because of that, Microsoft, having its partnership with this corporation, which after the launch of ChatGPT last year became the New York Yankees... maybe the wrong term... now became the Yale, the top-of-the-line company by a lot, the leader of this industry.
00:10:09
Speaker
Microsoft was with them because they have this deep partnership and investment and this, this, uh, this license. So they got to integrate ChatGPT there. Uh, they got to integrate all this AI that they were doing, where they are there at the front of the pack, uh, into all of Microsoft's stuff. So Microsoft has Copilot now in all of its Windows stuff. And, and in
00:10:32
Speaker
Edge, and it has ChatGPT in Bing. So it is thoroughly deep into this organization that is
00:10:41
Speaker
structured kind of strangely. And importantly, Microsoft felt comfortable doing all of that because of the perpetual license, right? Any concern Microsoft might have about, like, throwing in behind a relatively new player in the tech sphere is kind of softened by the fact that they have a perpetual license. And so they start Microsoft Copilot for Office, for Bing, for all these other things, even for GitHub, right? Which is owned by Microsoft. And if things don't go well for OpenAI, they could just pivot to some other AI company.
00:11:11
Speaker
But for now they can throw in all the way, and they know there's nothing, like... basically, OpenAI cannot develop GPT out from underneath Microsoft's hands. They know they have their arms around it permanently. They being Microsoft, right? Yeah, and because they invested with compute units, they know that OpenAI really can't move away from Microsoft, right? Like, they're not fungible. They are both really dependent on each other.
00:11:39
Speaker
Yeah. And so unless OpenAI was for some reason to, like, commit corporate suicide, right? Microsoft's totally safe. Uh, and who would do that? Right? No one. Uh, so, so here we are. It's Friday. Uh, OpenAI just launched this GPT store, um, or announced the ability to create your own GPTs. And eventually they're going to have a store.
00:12:06
Speaker
People went wild for these GPTs and were able to successfully create a bunch of crazy stuff, to the point where they had to turn off new paying accounts. They were actually starting to make money on this stuff, kind of, not really, but like they were getting people to pay money for it. When all of a sudden on Friday at like noon, with one minute's notice to Microsoft, the board of the nonprofit, which governs the for-profit company,
00:12:34
Speaker
um, fires Sam Altman, the CEO, with no notice, claiming in a public statement that he had not been... uh, yeah, there was a lack of candor with the board. Oh, right. Right. Right. Uh, for a little bit of context about that: Sam Altman is himself... was, uh, I think still is, uh, one of the six members of the board, the governing board of OpenAI.
00:13:01
Speaker
Uh, and so as it happens, it was not the entire board. Uh, and Sam Altman was not the only person who was fired in this whole exchange, Sam Altman, uh, and one of the other board members, uh, his name is escaping me at the moment. Jake, you have it. Is that his name? Greg Brockman. Yeah.
00:13:19
Speaker
was the chairman of the board of directors. And so somehow the board of directors of six people convened a meeting. They convened a meeting. That seems illegal, but I'm really wondering why we're just taking this for granted. To some extent, we have the wrong Professor Leahy here. My wife Gina is a biz orgs professor at Drexel. And so she's attempted to explain to my relatively smooth brain
00:13:44
Speaker
how this could be possible. And my understanding is that there's a difference between... so basically, you need to get the bylaws to really see how this could have happened, because oftentimes all you need is a quorum. And so it could be a simple matter of: whoever was there voted to oust, and that's it.
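The quorum point can be sketched as a toy model. The rules below are assumed generic corporate-bylaw defaults (a quorum is a majority of seated directors, and an action carries with a majority of those present); OpenAI's actual bylaws are not public, so this is an illustration, not their procedure:

```python
# Toy model of quorum-based board action under assumed generic bylaws.
# These are NOT OpenAI's actual bylaws, which have not been published.

def action_passes(board_size: int, present: int, votes_for: int) -> bool:
    quorum = board_size // 2 + 1        # a majority of all seats must attend
    if present < quorum:
        return False                     # no quorum means no valid meeting
    return votes_for > present / 2       # majority of those present carries it

# Six-member board: four directors convene and all four vote to oust.
print(action_passes(board_size=6, present=4, votes_for=4))  # True
```

Under these assumed defaults, four of six directors are simultaneously a quorum and a unanimous majority of those present, which is how a bare majority could remove members without the others ever being in the room.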
00:14:01
Speaker
She seems to be thinking that this is a very strange structure, the fact that the majority shareholders don't have a controlling interest. I mean, there's no equity at this nonprofit level. None of them have equity in the for-profit corporation at all, which is interesting. Even Sam Altman.
00:14:19
Speaker
That's kind of unheard of. He was in this, the CEO job, not for the money. He wasn't in it for the money and he got fired, and he didn't have any equity in the for-profit company that he was the CEO of and built into a company that just had an $82 billion valuation, something like that. Right. And the flip side of him having no economic interest in it is that Microsoft has a pure economic interest. That's all they have. They have no say in anything. They have a capped-profit company that they have
00:14:49
Speaker
They're a minority owner. I'm looking at the org chart that they have, right? They're a minority owner in a capped-profit company, which means basically they're entitled to a certain amount of profit and that's it. Nothing. They have no say in anything. They obviously had no right to receive notice, right? That, hey, listen, we're thinking about firing the CEO or we're thinking about wiping out the board or whatever. No rights whatsoever.
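The "entitled to a certain amount of profit and that's it" idea can be sketched as a simple cap on returns. The 100x multiple below comes from early public reporting on OpenAI's first-round investor cap; the actual contract terms are not public, so treat the numbers as placeholders:

```python
# Sketch of a "capped-profit" payout. The cap multiple is a placeholder based
# on reporting that first-round OpenAI investors were capped at 100x their
# investment; the real terms are not public.

def capped_payout(invested: float, gross_return: float,
                  cap_multiple: float = 100.0) -> float:
    """Return the investor's share: gross returns, but never more than
    cap_multiple times the amount invested. Anything above the cap is
    meant to flow back to the nonprofit."""
    return min(gross_return, invested * cap_multiple)

# A $10B stake can earn at most $1T, no matter how large returns get.
print(capped_payout(invested=10e9, gross_return=5_000e9))
```

The cap is purely economic: it bounds what the investor can take out, but confers no governance rights, which matches the hosts' point that Microsoft had money at stake and no say.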
00:15:09
Speaker
So Microsoft put a lot of money in, granted, in compute units, but that is real money to them as well. Right. They are paying for it in some way. It's not a dollar-for-dollar ratio for what it's worth to them. But it's real. A real obligation. A real obligation. And they get no right to even being told that this is going to happen. I bet they would light those $10 billion of compute dollars on fire compared to all of the work that they've done in building their entire ecosystem around having
00:15:39
Speaker
around ChatGPT and all that, and DALL-E 3 and all that. That's really years of work to re-strategize around: oh, we have to build our own app now.
00:15:55
Speaker
Right. So the jumping-off point where I started to derail you, we'll just kind of tie the loose end on that: four out of six board members convened a meeting. And I was listening to Nilay Patel and some other folks talk about this. And they said it was on a Google Meet, which was a jab that one of these guys made: Microsoft Teams
00:16:22
Speaker
can't even get the spotlight over Google Meet when Microsoft has a $10 billion investment in the whole operation. Still can't get them to use Microsoft Teams. But they had this Google Meet where they made the decision, and on that Google Meet, these four members of the board decide to terminate, and then they Google Meet
00:16:43
Speaker
uh, Sam and Greg and say, uh, hey, by the way, you're fired. And, uh, it's also weird to me. So they didn't fire Brockman. No, he resigned. And yeah, but he resigned in protest of the firing of, um... yeah, but also he,
00:16:59
Speaker
probably partially in protest of the fact that he was on the board and they kicked him off the board. So how do you just kick off a board member with a bare majority? Quorum. I think that's the quorum issue. What a silly bylaw, that you can literally... you get a bare majority and you can just kick everybody else off that you don't like. Move fast and break things, man. Oh my God.
00:17:23
Speaker
That's exactly what you want from the AI ethics company, right? The AI ethics nonprofit. These people are going to keep the Terminator in a box for sure, right? Like they can't handle, they're not going to keep, they can't keep this under control. They're going to, if there's any concern with AI, like, you know, rising up and taking over the world, they're the ones who are going to keep it in, you know, in its pen. Yeah. These are the, yes, the, the, the brilliant minds that came up with this decision, the way that this went down, uh, as a, I'm not a corporate governance person, but I,
00:17:52
Speaker
deal with local government structures a lot. It's insane that they can do any of this. Agreed. Agreed. Yeah. So they fire him. Immediate giant WTFs all across the tech
00:18:07
Speaker
paying-attention world of: what? He must've done something really bad for them, not just to fire him, but to burn him in a public statement. Uh, there was that window of time where people were just like, what could it be? Right? There was just open speculation of, oh man, just wait for whatever this is going to be. Yeah. Uh, and then it's like, no... eventually we're finding out, no, there was no, uh, no notice to anybody involved, including the CEO that they appointed behind him.
00:18:36
Speaker
Who is... what is her name? I'm trying to remember her name. Mira Murati. Mira Murati. Yes. Yeah, that sounds right. Mira Murati, who is also highly respected. She apparently had no notice. And it's just, kind of, everybody's kind of wondering what's going on. And eventually this reporting comes out late Friday. And meanwhile, in the markets, Microsoft stock is taking a hit. The reporting was that it was because
00:19:06
Speaker
Ilya Sutskever, the chief AI scientist at OpenAI and a board member was worried that they were moving too fast in developing AI. And this tension, which is part of like a tension that's kind of essential to the
00:19:30
Speaker
the structure of this organization, because their stated mission is reasonable development... or responsible development, for the good of mankind. Apparently that concern just swung all the other members of this board, and that's why they fired him. He was moving too quickly.
00:19:50
Speaker
I am not entirely sure that that message comes through clearly to me because not very long after that, maybe it was on Saturday, maybe it was yesterday, Ilya has apparently posted somewhere online. I'm sure it's on X because this whole thing has been unfolding on Twitter. Ilya has posted something to the effect of
00:20:09
Speaker
I regret that I participated in this. And so it makes it sound. And maybe this is face saving by Ilya. Maybe it's not. Maybe there's more under the surface that we just don't know about. And honestly, might never. But it made it seem like Ilya was brought along rather than being the mover. Maybe that's a PR move. Maybe it's something else. I don't know. But it's not entirely clear. That's a later story development. Oh, sorry. But yeah.
00:20:37
Speaker
And so, yeah, he would eventually say, I am I'm ashamed of my words and deeds. But he said, I don't know that we can trust him. I'm with Jason. I'm not. He might be face saving because in his avatar on Twitter, he's wearing a goofy hat. I don't trust people who wear goofy hats around too much. But so what he said was, I deeply regret my participation in the board's actions. I never intended to harm open AI. I love everything we've built together and I will do everything I can to reunite the company.
00:21:04
Speaker
And everything he can includes, anybody who's listened to us probably has a basic understanding of what happened here. But there was a whole soap opera for the rest of the weekend. If y'all don't mind, I'll like speed through a little bit of this. There was like, will they, won't they? Are they gonna bring him back? Because immediately the interim CEO that they appointed was like, we're gonna work on bringing him back.
00:21:32
Speaker
Meanwhile, the board has no communications of their own at all, apparently. And Sam Altman set a series of 5 p.m. deadlines, which the board ignored, kind of. He was saying: all of you resign and appoint me again, or I'm taking everybody somewhere else. And there's a social media outpouring of support from everybody at OpenAI, basically saying,
00:22:02
Speaker
We're with him. You know, OpenAI is nothing without us. Uh, you know, attacking the board, basically. And there are very public anti-board statements. Like, it's hard to imagine... there's no way OpenAI continues to go on as it is with this board and these employees. Cause God, they hate the board. Um, but so on Sunday, there was like, uh, there was a question, cause something had to come out, because, um,
00:22:30
Speaker
cause Microsoft had to have something ready, uh, for shareholders. Otherwise, uh, otherwise Monday was going to be a terrible day for the market, because, um, you know, for them it was really bad. They threw their chips in behind this company that's now imploding, behind this disaster. Uh, and so on Sunday night, uh, OpenAI, and I'll take a break after this, OpenAI announces their new CEO.
00:22:59
Speaker
The guy who founded Twitch, what's his name? Shear. Shear? Yeah, I believe it's Shear, right? Yeah.
00:23:11
Speaker
Shear, Emmett Shear, CEO. Emmett Shear, who founded Twitch and who everybody at Twitch hated, by the way. We're coming back across my domain. Awesome. All the gamers hate this guy because he sucked. But they announced he was, he's the new interim CEO.
00:23:30
Speaker
And Satya, the CEO of Microsoft, announces that Sam Altman and Greg Brockman, the fired board members, are going to be joining Microsoft and starting a new AI wing.
00:23:47
Speaker
Which is insane. I mean, which is, yeah... there are so many questions. One huge one that I didn't think of until we talked about it, with the compute units thing, is that Microsoft's chief investment in OpenAI has been these compute units, right? Basically what that means, for people who aren't necessarily technically savvy, is that they are allowing OpenAI to make use of Microsoft servers. That's it, right? So a lot of what I
00:24:10
Speaker
Yeah, it's the cloud. Exactly. So a lot of what OpenAI is doing, it's doing on, like, the real estate, the cloud real estate, of Microsoft. Now you're going to have the entire team
00:24:21
Speaker
Now it's up to 95%, by the way, of employees at OpenAI who have signed something, as of four hours ago, saying they will leave and go to Microsoft as well and join Altman. Now you're going to have the entire team move over to the company that owns the servers that are running all of OpenAI's processing. I mean, the incentive to just access that and truly take everything that OpenAI has, in terms of the model or whatever, is
00:24:48
Speaker
quite strong. And if OpenAI wants to continue to exist and be in competition with Microsoft, they have to find another patron. They cannot continue to, like, fly in the face of Microsoft and use their servers; it's not going to work out. Ford can't be using GM plants to manufacture their Mustangs. That's not going to work out long-term. And nobody would partner with them anymore, because now they're an unreliable partner. And you're a husk.
00:25:15
Speaker
Yeah. There's nothing like, what are we partnering with? Let's say Google went like, well, no, I think we want to be, Bard's going to get better. So we're going to partner with OpenAI next. For what? What's there? You don't have any employees. I suppose you have the model, literally. It is structured to operate on Azure servers. I mean, you can fix all of that, but you need talent to do that, right? Who's doing that? Google's talent? They're the people behind Bard.
00:25:40
Speaker
Yeah, that's not working out. It's crazy. Yeah. And the statement from Microsoft said they'll look forward to getting to know Emmett Shear, you know, because he's not an AI guy, except that he tweeted about how we need to slow down AI development from where it's at a 10 now... from a 10 to, like, a one or a two out of 10. So he really doesn't want to move at all.
00:26:04
Speaker
Yeah, yeah, development. He also tweeted some weird stuff about sexual fantasies. Oh, my. Oh, no. You don't need to... Yeah. All of this is insane. If Microsoft actually pulled it off, which is still an open question, it would be, like, the greatest coup, one of the greatest coups I could ever imagine, because they basically get OpenAI for free.
00:26:32
Speaker
They get the technology. The only thing that they don't get is the actual ownership of the IP. This is one of the interesting bits to me here, because this rhymes a little bit with Apple and its investment in, and unlimited license from, Arm, and how they've really taken that and exploited it to great advantage for Apple.
00:26:55
Speaker
It reminds me of that situation a little bit here, because I don't know exactly what the license looks like for Microsoft to be able to use OpenAI's ChatGPT and all that stuff. Basically, if we operate under the assumption that that license is: we get to use all of your stuff until some point in the future, or perpetually... I think it's perpetual. It's perpetual according to some reporting I've seen.
00:27:20
Speaker
So if Microsoft has this perpetual license to use OpenAI's ChatGPT client, like... you don't even have to... like, if Microsoft were just taking the people from OpenAI, that's bad enough, right? Because you're hiring the team that built it, and you're hiring all of the knowledge that they've gathered along the way in doing it. And so, like,
00:27:43
Speaker
Once you've done something, I had this experience with making ethernet cables this weekend. The first one you make takes hours. The second one you make takes minutes. It's like the first pill that you make in a pharmaceutical regime where you develop this new medicine. The first pill costs $40 million and the second pill costs a nickel.
00:28:03
Speaker
And so you've got the whole team there. And the second one's also better. So you've got this team that can go out there and remake this thing if they need to, but they don't even need to do that. They don't even have to start from the starting line. They get to start from wherever they left off at OpenAI because Microsoft has this
00:28:23
Speaker
a perpetual license to use the product. This is the best possible way that they could bring in all of this incredible talent that was at this organization, that built this tool that's already shaking up the world, and be unshackled, for now,
00:28:43
Speaker
and bring them in under your roof. The other thing that I think about, of course the employment litigator is thinking about this, is: surely the folks at this OpenAI company, the for-profit wing, surely they have non-competes preventing them from doing this, right? What are you willing to bet that the non-competes for all of these people had a carve-out for their biggest patron, Microsoft? Because of course there's going to be a revolving door going back and forth
00:29:09
Speaker
between Microsoft, one of the biggest computing giants in the whole world, and this enterprise that is doing really great and interesting stuff. And so I bet you that they don't even have a non-compete problem between these two organizations, because Microsoft has that kind of gravitas in one of those deals, to say, we want to be able to exchange people. You know, I'm not sure... all these companies are based in California or Washington. Yeah, I'm not sure how much there's a non-compete culture
00:29:40
Speaker
in tech. Do you know that? Because I get the sense that there's not, because when this was announced, when Microsoft announced this new AI team at Microsoft, Brockman listed, like, six different people at OpenAI who had not been fired and who had agreed to come over.
00:29:59
Speaker
So, like, this was not, like... uh, did we talk about it in the context of Twitter? We might have, when we talked about, like, 3,000 people getting canned and how NDAs were going to make it that, functionally, there was no place to go work. I thought I remembered something about that. Maybe. Well, as a matter of fact, Jake, that's a great point, because California... I don't think that they've entirely, fully outlawed non-competes.
00:30:23
Speaker
Yeah, but California cracks down on them in a way... we talked about this when the FTC was floating the rule about non-competes and kind of cutting those down. Oh, yeah. Yeah. And, like, California is pretty strict on that. I don't know the extent to which this is strictly California-based. That's an interesting question, but the question's basically solved anyway, because I bet you that they've got an exception written for Microsoft.
00:30:50
Speaker
Yeah, I... yeah, I bet that's not a problem, because otherwise you'd be hearing people talking about that a lot more. Considering... now, on Monday, I think they announced this... Microsoft announced this for two reasons, even though this all happened over the course of, like, 48 hours. There was no other way to stabilize their share price. Oh yeah. No question. There's no way you could have details figured out in that time.
00:31:17
Speaker
Like, let alone numbers for the CEO, but, like, what exactly this wing is gonna be, how it's gonna work with your partnership with OpenAI. This was just, like: okay, we have an agreement. If this doesn't work out for you to come back as CEO, then you're gonna come build out... you're gonna get... I'm gonna give you your own company, you're gonna have a
00:31:40
Speaker
you can have all the money you need to rebuild the whole thing, because Microsoft would definitely pay all that money to do this. Well, here's something interesting. So when you were talking about the license Microsoft has: I found... this isn't in any sort of actual license document, this is just in the press release. But what they said when they partnered with OpenAI was: through a partnership with OpenAI that aims to accelerate breakthroughs in AI, from jointly developing the first supercomputer on Azure that is powerful enough to meet the demands of a very large AI model, blah, blah, blah, Microsoft has a license to the code
00:32:09
Speaker
behind the GPT-3 model that allows it to integrate the technology directly into its products. If they have a license to the code, a perpetual license to the code, they have the code. Yeah. Yeah. They don't even need to bother recreating anything. They just take the code, and that's it. Yeah. They have the license to every, every bit of code other than what's called artificial general intelligence, which is the name for, like... that's the theoretical, quasi-religious name
00:32:38
Speaker
that technologists use for the alive thing that they're actually thinking about and spend all their time on, which is why they destroyed this company, because they want that to be done responsibly.
00:32:56
Speaker
But yeah, they have literally everything. And it's literally already on their cloud real estate. So they are literally just saying, no worries. We'll just switch these servers, literally these accounts, these Azure processing accounts. We could just switch them right on over to Microsoft. No big deal. We could just duplicate it. We could just spin up another image of your moment-to-moment updated model. We can just spin up a second one that is ours.
00:33:24
Speaker
Obviously, there may be very well hammered out licensing aspects for this, but the structure of everything else would suggest that that's unlikely. None of the rest of this seems to be thought out very well. Altman can just be canned with basically whoever happens to be sitting at the board table. One person could just sit there and decide, well, I have a quorum, and guess what? You're out.
00:33:46
Speaker
what are the odds that this license agreement with Microsoft hammers out the kind of details of, like, look, if we ever have some sort of divorce, you can't just take all of this, you know what I mean? Like, you only have access to some small subset of our model. And yeah, they really can't divorce, because Microsoft is an owner, and, you know, they can't divorce from each other at all. And it's built to run on Azure. I mean, like, I can't
00:34:12
Speaker
overstate, the limited amount I've done stuff with cloud: if you know how to work on one cloud, you are not building stuff for another. That's what you build on. That's it. It is its own thing. It's like writing it for Windows or PC. It's not the difference between playing a football game on AstroTurf versus grass. This is the difference between playing a football game on Earth or on the moon.
00:34:35
Speaker
Exactly, right. And the idea that you could just go, oh, well, no worries, Microsoft doesn't want to work with us anymore, we'll just repurpose all this someplace else. You really may as well just say you're going to rebuild it someplace else, to some extent. I'm not saying all the way to there. But a lot of it is a labor-intensive thing. And with OpenAI being the, you know, whatever you said, the '90s Yankees of AI, where is the talent coming from to do that? I don't know that it exists outside of there.
00:34:59
Speaker
This has not been a discipline that has been around for that long. You know what I mean? Like, there's no one with 25 years of experience in AI. It wasn't around 25 years ago. No. You have, like, theoreticians like...
00:35:13
Speaker
Yeah, theory people like Ilya Sutskever, who's the guy, and who, after Sunday, when they announced the interim CEO, Ilya Sutskever, one of the people that voted to remove Sam, said, I'm ashamed of my words and deeds. And he and every single staff executive leader, seemingly, and now 700 of the 750 employees, signed a letter saying
00:35:41
Speaker
resign within 48 hours, another deadline, or we are all leaving. And one of the things they said in there was that the board in communication with the employees said that destroying the company would be consistent with their mission. So that could like
00:36:01
Speaker
If that's the perspective you're taking, like, it might actually happen. Like, it's hard to, like, these people are already never going to be able to show up at a party without somebody kind of looking at them funny, but if they go through with this, they just will have to go live in a shack somewhere or something. Because you're literally destroying an $80 billion company because of your,
00:36:32
Speaker
weird beliefs, your weird beliefs, which are, by the way... the destruction of this company doesn't even accomplish that thing, because Microsoft gets to just pick up the pieces, and the for-profit behemoth definitely doesn't give a crap about all your weird stuff. Right. So congratulations, Microsoft is now in charge of AI. I don't think that's consistent with your mission. Where the S is spelled with a dollar sign, like it used to be when people would criticize Microsoft.
00:37:00
Speaker
I mean, like, that prophecy was foretold 30 years ago, right? That Microsoft would be the end of everything. Well, guess what? Congratulations, you've managed to make it come true. In terms of them being willing to destroy the company, all that is enabled because the main company, OpenAI, Inc., is a 501(c)(3). So there's no fiduciary duty to shareholders to maximize profit. Or shareholders at all. Yeah. Well, right. But I mean, or, I assume, even to continue to exist as a going concern.
00:37:28
Speaker
I have heard that these are massive. So here's the way, I have not looked deeply into this and I'm not a corporate guy, but the nonprofit is the owner, right? The nonprofit, as the owner, still has fiduciary duties to its co-owner of the for-profit entity. So there are no duties to the nonprofit itself,
00:37:56
Speaker
but as co-owners of, as an investor in the for-profit entity, you do have causes of action against the nonprofit entity for destroying the value of the company that you co-own together. That's my understanding. And so they basically said, yeah, we might get sued, we don't care. Come sue us. We have no assets. For destroying this company? Our asset is this company. So they don't care about that.
00:38:26
Speaker
But that lawsuit does exist. I don't know. I mean, I'd be interested to hear that from a corporate law person. Get your wife on the phone,

OpenAI's Structure and Potential Microsoft Acquisition

00:38:38
Speaker
help. Yeah. I mean, the, the public charity element of it is like, I do wonder.
00:38:43
Speaker
So if you form a 501(c)(3), I know, I understand the tax angle of this, but I don't understand the corporate governance portion of this. If you form a 501(c)(3) and your stated goal is something that is potentially at odds with the company itself continuing to exist, can you not pursue that goal? Like, is there a weird, like, duty to not commit suicide for a 501(c)(3), you know what I mean? For a nonprofit entity. Yeah. What's the,
00:39:12
Speaker
You have a fiduciary duty, duty of care, duty of loyalty, duty of... you have a fiduciary duty to your fellow members in a closely held corporation. Like, you're allowed to destroy your own company, but it ceases to be only your own company once you allow investors in, right?
00:39:30
Speaker
So you have a duty to those investors. But that's, I mean, I guess that's what you were saying before, right? That's imputed up to the 501(c)(3). Well, it's right in the same way where, if you owned a share of a company through an LLC, right, that wouldn't stop the LLC from getting sued. If you make a bad decision yourself,
00:39:54
Speaker
through your ownership of a share through an LLC, the LLC gets sued for your decision. It doesn't make a difference that you're tax-exempt, you know, what kind of corporate form you have, if that's, you know,
00:40:08
Speaker
I'm not sure how much it matters because what are the assets that can be taken from this? Well, I also wonder, like you think of a fiduciary duty, you typically think of it as like an obligation to not act in your own best interest at the expense of the company. Would you say this is acting, I mean, it's not, right? So you would say, well, I wasn't personally enriched.
00:40:27
Speaker
Well, here's the thing. It's a business decision. The difficulty I see here is that you've literally just fired somebody because of a difference in philosophy. That doesn't seem actionable. You know, the actual, uh,
00:40:44
Speaker
and you had a replacement who is highly disrespected, but disagreeing with the existing... Yeah, it functionally destroyed it, but it wasn't, I mean, it would be like, if you imagine any other charitable organization, a 501(c)(3), Habitat for Humanity or something, right? And the CEO decides, I think it's high time that we start charging people a little bit more for these houses, right? And you as the board don't think that's a great idea, so you can them.
00:41:11
Speaker
but functionally that is a disaster from a PR standpoint and from everything else. You nonetheless would have the defense of: that's not in our stated goal, that's not our mission. We have our mission right here. You as the general public, or the court we're before, may not think that our mission is valid, but that's not really your concern. We're here to cabin AI and keep it from escaping its pen or whatever. Altman was going to throw the gate open, and so we fired him.
00:41:38
Speaker
Ignore the fact that there are, you know... This corporate board over here, the nonprofit board: you are a co-owner of a for-profit company. There's nothing in that for-profit company's bylaws that says we exist only for the advancement of humanity or anything like that. It does not have a mission statement. So when you make the bad decision,
00:42:06
Speaker
as a member of that board, you don't get to state that justification just because you have something in your own thought process saying, actually, I don't care about shareholder value. That would have to be built into the for-profit company's bylaws. I should not be talking about this. We should be getting Gina on here. Can we just shout? Can we just call for her loudly? Gina!
00:42:31
Speaker
I think she's recording the other podcast in the other room, actually. I'm not sure. Or she's asleep. But yeah, I get what you're saying, though. If the fiduciary duties of one for-profit company gets sort of
00:42:47
Speaker
quasi-imputed into the nonprofit company, it ceases to have any meaning. Like, why would you have the separation, then? Because you basically can't, like, you would never want to have anywhere in the line a for-profit company under the nonprofit. Right, because the nonprofit entity could be held accountable for something that would be against some sort of fiduciary responsibility of the for-profit company. Does that make sense? Like,
00:43:09
Speaker
It has to be separated apart. Okay. Yeah. I'm saying a lot of different words, but like, imagine you're a nonprofit. This happens all the time. Schools have endowment funds, right? The schools themselves have a mission statement. Maybe you invest in a company that is anti that mission statement, right? You don't get to then vote to destroy that company, if you somehow have control over it, because there are other people that also own part of that company.
00:43:37
Speaker
And it's not part of that company's charter to be destroyed, or to only exist to the extent it is consistent with your mission statement. Yeah, because you're owning it in a different stance than as a university. You're owning it just as an owner. So you're saying the fiduciary duty would be to entities like Microsoft that are just minority owners and investors, cash investors. Yes, I think so. Yeah, I can say that. But
00:44:04
Speaker
Yeah, but they have no control. But you nonetheless, I mean, yeah, I guess it makes sense. You don't get exceptions here. You don't get to accept a minority investment and then say, I'm destroying my company. You have a duty to your minority shareholders. I wonder, I do. But I mean, I don't remember the name of that duty. I feel ashamed that I don't remember. The duty of obedience, duty of loyalty, duty of care. I don't know. Fair dealing. Good faith. Something like that.
00:44:30
Speaker
Duty of chaos, probably. Whatever it is, it's not good. Duty to not set the thing on fire.
00:44:34
Speaker
But I do know you also have, like, a pretty wide berth for business decisions, right? Because of the business judgment rule, the BJR. Right. So, because obviously you have to have that carve-out, right? Because people make bad business judgments all the time. And you can't say, well, you violated the fiduciary duty to the shareholders because this resulted in you not maximizing your value, or indeed, like, the company going under, you know what I mean? Like, it can't be that you violated some sort of fiduciary duty simply because you're a bad business person.
00:45:00
Speaker
So how much of this could be swept away by simply saying, they're just bad business people? That's it. Yeah. I mean, that's the thing. They didn't justify it as, it's inconsistent. Well, this is where I think this gets important: if they really said to their employees that destroying the company is consistent with their mission, right,
00:45:19
Speaker
then I think that Microsoft and every other investor could just wipe the floor with them, maybe take over ownership of the company, which then gets you where you want to go. And I think that Satya Nadella, the Microsoft CEO, is definitely flexing that by implication, even if not explicitly: that we'll go after you and we will just take it. And so where's your mission statement then?
00:45:47
Speaker
From what it sounded like, Satya Nadella was stepping in over the weekend and was basically trying to
00:45:54
Speaker
one way or another, keep the ship from sinking, whether that means, or maybe not, maybe that's the wrong part of the metaphor, trying to prevent everything that was on the ship from being submersed under the sea. And whether that means keeping the ship afloat or whether it means evacuating everybody off the boat, one way or another, Satya Nadella was going to see that happen. And I think that's what's going on. And we still just don't know which one it's going to be, although I'm
00:46:22
Speaker
pretty prepared to forecast personally, that OpenAI is itself done as anything other than a vassal state of Microsoft, to the extent that they weren't already pretty well beholden to Microsoft.
00:46:39
Speaker
I think that, based on what's happened here, even if the whole slate of directors that are left resigned and were replaced by people who weren't part of this colossally bad decision-making process over the course of the last week, even so, I think the reputation, the equity that they had, not in terms of money, but in terms of gravitas, in terms of influence,
00:47:08
Speaker
I think that is set on fire and permanently gone. I don't think that you're going to find investors who are willing to invest in it, but Microsoft already being so deeply in it may be willing to say, yeah, okay, we're enough into this that we have a vested interest in keeping it afloat and we're probably the only ones who do. And so as a practical matter, either open AI continues as a little puppet of Microsoft.
00:47:34
Speaker
Or OpenAI just gets straight up absorbed into Microsoft by a talent acquisition on a huge scale of like 700 out of the 770 employees or something like that. I think one way or another, Microsoft ends up in the driver's seat here and nobody else is willing to touch OpenAI except, I don't know, maybe some crazy billionaire.
00:48:03
Speaker
That's a great segue. I know just one crazy billionaire that might be interested in buying that husk. Yeah, xAI can get the desiccated remains of the dead OpenAI. But yeah, I'm totally in agreement. Either they get a new board, which does whatever Microsoft and Sam Altman want, and then
00:48:25
Speaker
Microsoft doesn't even need to buy them, because then Microsoft doesn't have to deal with having a thousand employees, 700 employees now, but certainly probably going to grow. Doesn't have to deal with that employment issue. And they keep getting the development that they like. Or there's some insulation if it goes awry.
00:48:48
Speaker
Right. If it does break loose, Microsoft is free to say it wasn't us, it was those guys, those OpenAI kids. Yeah. Over there. Sure. They're functionally us, but. I don't know. I think the only really terrible outcome for Microsoft is if it keeps dragging on and dragging on. Because I was thinking about this, like, Microsoft is more
00:49:16
Speaker
interested in just hiring 500, 600 people than any other company has ever been. This would be like a perfect merger, because you don't even have to lay the redundancies off. You just never hire them. You just hire the ones that you want, and it's like an automatic layoff. And congratulations. And then you're not even going to pay for the compute units, because the company's going to die.
00:49:45
Speaker
And by the way, Microsoft owns 49% of the for-profit entity. So they are, like, right there. But if it keeps going on and going on, they're not going to know how many people they need to get ready to install. They aren't going to be able to build teams reliably. Right now they could just import whole teams. They could just be like, we're hiring this team, this team, this team, this team.
00:50:14
Speaker
Right. Every single person. And you just come over whole hog with all the teams. I'm not sure whether they'd prefer that OpenAI continue to exist as it is, but with Sam Altman and a real board that knows what they're doing, and, you know, all that. Or if they'd rather absorb and completely own the entire AI
00:50:39
Speaker
structure under Microsoft's umbrella. Yeah. I don't know if OpenAI has that much brand recognition to be particularly worth anything. And Microsoft pretty clearly hedged their bets by moving in the direction of, it's Microsoft Copilot, it's GitHub Copilot. It's not GPT. It's not some sort of, like, you know what I mean? I mean, I think that probably the original thought was, if they ever wanted to pivot away to a different language model, they could do so and it would be sort of invisible to other people. You know what I mean? There'd be nothing to have to change, basically. But
00:51:08
Speaker
You know, I don't think they expected this to happen now, but I could see the same sort of thing: once they develop something in-house that is functional, just quietly move away from OpenAI, and that's the end of it. Let it die. I mean, it's not worth that much, probably, as a brand. Yeah. I would think, I don't know. I mean, ChatGPT is worth it as a brand. Yeah, ChatGPT is synonymous with AI now, right? Yeah. Like, we just had a ChatGPT lunch at my local bar association. Right. It's like that. So.
00:51:39
Speaker
I'm not ready to, and I know we're going to transition here in a second, I'm not ready to say that the board made entirely bad judgments and they were totally uninformed because they're rubes, because I don't think that's true. You have a director of strategy and foundational research grants for the Center for Security and Emerging Technology at Georgetown.
00:52:01
Speaker
Director of strategy and foundational research grants maybe sounds like a fundraiser to me. And so maybe that's not great. In the basement of the academic building or something, for sure. That's one of those things where you go to a school for four years and you never even knew it existed. Maybe. I'm not ready to say something like that, because I'm sure there are titular roles like that at Georgetown
00:52:25
Speaker
that are just there strictly for the purpose of, this person is an alum and we want them, I don't have any reason to believe that that's what's going on here, maybe.
00:52:38
Speaker
And then there's another one who's, like, a management scientist at RAND Corp. I assume that is a reasonably prestigious role. There's the CEO of Quora. I don't know how prestigious that is, but one way or another. And then there's Ilya Sutskever, who is the company's chief scientist. These are not necessarily uninformed people who just made a foolhardy decision.
00:53:06
Speaker
Although I wonder the extent to which, maybe with the exception of the CEO of Quora and for the academic type at Georgetown,
00:53:18
Speaker
I wonder whether the extent of this is they just didn't understand the politics of what they were doing. A CEO should understand better. A CEO of any meaningfully sized operation should know better that before you do something like this, you need to understand the politics and the fallout that's going to come from it.
00:53:40
Speaker
And so maybe they expected it. Maybe they didn't. I don't know. It still is probably going to go down in history as one of the most gargantuan screw-ups that a corporate board has ever made. I like likening it to Apple, but I was also thinking, in terms of just how it will be popularly thought of, I'm imagining it more
00:54:02
Speaker
like, remember the DeLorean cars? Like that kind of thing. This thing that, for a moment, it was coming, right? It was just the big new thing. It was going to revolutionize everything. And then it just flamed out on the tarmac. You know what I mean? The Segway. Yeah. The Segway. Exactly. I'm thinking of it like a Napster,
00:54:20
Speaker
where it was like, Napster came, changed everything, and then was destroyed, and then real things picked up in its place, but the change never went away.
00:54:33
Speaker
Right. It could have been a huge player in there, but for some decisions along the way. But for complete lawlessness and unprofessionalism. A few little minor issues like that, yeah. While we've been on this, news broke that shortly after firing Altman, OpenAI's board approached Anthropic, which is another AI company, about a potential merger. That's awesome.
00:55:00
Speaker
What were you doing? Also, while we've been recording, it came out that before Emmett Shear took the job, OpenAI's remaining board members offered it to two other people who both declined. They were both, they were you guys, right? Yeah, right. It was probably a great career choice for each of them to decline, because this is like the Linda Yaccarino post:
00:55:31
Speaker
Hey, why don't you be the captain of the Titanic while it's sinking?

Leadership Changes at OpenAI

00:55:35
Speaker
Yeah. Good luck, guys. He said he took two hours to think about it, which suggests that the board maybe didn't have... Well, I would say it was a terrible idea to hire a CEO with this little, you know, checking
00:55:56
Speaker
of their credentials. It could have gone way worse. But they had a rebellion of their interim CEO, the one that they appointed to replace the old one, which I guess they didn't expect, because of their thorough lack of understanding of their own company. Yeah. So I guess they were in a bad position, because their own CEO was bad-mouthing them and saying that she shouldn't have been appointed. So
00:56:23
Speaker
really, it's like basic professional, like, board 101. You don't fire a person and stab them in the back without, like, really good evidence. You don't say that you're firing somebody for cause, basically, somebody who is generally popular, without something real good. Oh, by the way, they have been doing this without outside counsel and without their own PR.
00:56:50
Speaker
So it's been radio silence. It's just, like, amateur stuff all around. I've seen some people defending, like, maybe the decision itself to fire Sam Altman. I could see defenses of that decision. Sure. Whatever. I mean, if you give me a detail on that, I would say, oh yeah, sure, got to go. Yeah. Gotcha. Yeah. But, like, there are so many better ways to do it. Absolutely. Yeah. So many basics, and they're still not following them.
00:57:20
Speaker
It really just like...
00:57:22
Speaker
adds to the list of the tech world's just full of insane people. Yeah. Speaking of, I think it's a great segue for

Elon Musk's Legal Battle with Media Matters

00:57:31
Speaker
bozos. Yeah. I don't know if we have that. So you were right before. Speaking of insane people. Also, he has his handprints on this. He was, as you had said, a board member and an early investor in OpenAI, Jake. Elon Musk, it turns out, has added one more thing to his illustrious resume of screw-ups. Is it a thermonuclear lawsuit?
00:57:51
Speaker
a thermonuclear lawsuit against, uh, this is against Media Matters, right? Is that correct? Yeah. Yeah. That's for calling him out for being antisemitic. Well, that's not exactly what happened. That's a little bit reductive. What they actually did was point out how, on Twitter, there were a number, and the exact number is the source of a great deal of contention in
00:58:19
Speaker
what is among the most poorly written lawsuits I've seen. But on Twitter, at some point in the past, there have been pictures, or advertisements, for companies like Apple and IBM and Comcast and Oracle. Like, big companies, big tech companies,
00:58:45
Speaker
that have been, so ads on Twitter right next to some pretty like hot garbage white supremacist antisemitism, like nasty, nasty stuff. Like pictures of Hitler and a bunch of Nazis presented in like kind of a favorable light for those guys. Talking about this is what spiritual awakening looks like and it's like the Nazis. Like bad, bad stuff right next to this ad for Apple.
00:59:16
Speaker
And understandably, when this gets brought up, Apple's not crazy about their ads being shown right next to this, like, vile garbage. Neither is any other reasonable advertiser where
00:59:32
Speaker
They don't want to be associated with this stuff. And so they start talking about, we're going to stop all advertising. I think Disney was among them too. Did Disney stop advertising on Twitter? I think they did. Apple, Bravo, IBM, Oracle, and Xfinity were all specifically named. Ubisoft is no longer...
00:59:52
Speaker
There have been like one by ones that's kind of slowly moving. Yeah. It was just this huge wave over the weekend. And of course, it's a great time for advertisers to be pulling off of a platform right before Black Friday, the biggest shopping day of the year. And so Elon Musk.
01:00:12
Speaker
I hesitate to say understandably, is upset by this. All of the biggest advertisers are dropping off of Twitter because of the toxic cesspool that he has allowed it to become, and probably goaded it to become. And so he filed this lawsuit. The lawsuit is, it's not good. It is a 15-page complaint.
01:00:38
Speaker
I think I've gone on this specific diatribe before about what a good lawsuit looks like under the Federal Rules of Civil Procedure. This is a federal lawsuit. The Federal Rules of Civil Procedure specify what a good lawsuit looks like: a short, plain statement of the facts
01:00:55
Speaker
giving rise to your claim for relief. And this is not short. It is not plain. It is mostly not facts. And so it is not a serious lawsuit written by serious lawyers for a serious client. This is an unserious lawsuit that is drafted for the purpose of PR, basically laying out, what is it, three claims? One, two, three, three claims.
01:01:21
Speaker
The first, that they intentionally interfered with a contractual relationship. The second, that they disparaged the business of X. And the third is that they interfered with a prospective economic advantage. I will tell you that is a tort with which I'm completely unfamiliar. It probably arises under Texas law, although I don't know how that would be any sort of different from intentional interference with a contractual relationship.
01:01:48
Speaker
or disparagement. I don't know how that's differentiated, but when you talk about interference- It's not likely to be real. None of it is likely to be a real claim. Yeah. There are certain best practices for how you write a complaint. One of my early mentors in the law told me,
01:02:05
Speaker
to read The Old Man and the Sea by Ernest Hemingway and write like that when you're writing a lawsuit. And it's a great tip, great advice. This is the exact opposite of that. And the problem they're going to run into here is, when you get to business disparagement, you're going to have a problem with truth, because truth is an absolute defense to disparagement. If what you said was true, it's not disparagement. And they've got the screenshots. And unless somehow these folks can
01:02:33
Speaker
forensically show that these screenshots that were posted by Media Matters online were doctored, and by doctored I mean more than just, I took a screenshot that shows only a portion of the screen at the time. If they show that it was actually materially altered, maybe. Well, I can do you one better. They contradict themselves, because here I'm looking at
01:02:57
Speaker
paragraph 11 of the complaint. He says, or they say, Media Matters omitted mentioning any of this in a report published on November 16th, and displayed instances Media Matters found on X of advertisers' paid posts featured next to neo-Nazi and white nationalist content. Nor did Media Matters otherwise provide any context regarding the forced, inauthentic nature and extraordinary rarity of these pairings. So by saying extraordinary rarity, you're saying, well, it does happen. It just doesn't happen all that often. And so,
01:03:26
Speaker
X, Twitter, Elon and his lawyers are going to be very quickly introduced to Federal Rule of Civil Procedure 12(c), which allows for judgment on the pleadings, because you have basically confessed that this thing that you said was disparagement actually is true and happened. The second cause of action, business disparagement, is probably going to be bopped out on a motion to dismiss, for
01:03:49
Speaker
It might be 12(b)(6), it might be 12(c). It probably should be 12(c) as a matter of legal strategy, but who cares, inside baseball. The interference with a contractual relationship is a little bit quirkier, because sometimes the elements of an interference with a contractual relationship
01:04:09
Speaker
have to show, there has to be some element, this is not always the case, and I haven't researched whether this is the case in Texas, but there has to be some element of wrongdoing, where in some instances intentional interference torts have to involve a crime or other civil wrong in the process, like fraud. And I think you're going to have a hard time proving that, because what they did looks an awful lot like journalism.
01:04:39
Speaker
And then it can't just be that you were going about your own business and you interfered with somebody's contractual obligation. It has to be, you did something with the intent to interfere, I would imagine, right? Yeah. Like, you're not just reporting on something, you were intending to interfere with this contractual relationship. So you get into, what is the actual intent? And this goes into, like, first-year law school criminal law stuff. Like, what is intent?
01:05:04
Speaker
Right. If the natural and probable consequence of it was going to be that this would, that interference would happen and there's some sort of reason why that is wrongful, uh, then that's fine. But I think they're going to have a problem with the wrongfulness prong. Like, yes, their intent was to point out this bad thing that was happening, but that's not a wrong thing to do. That's what journalists and watchdogs do. They point out the bad things that people are doing to try to get them to stop doing bad things.
01:05:32
Speaker
And the one thing I knew about this complaint was that the complaint actually confirms that Media Matters saw what it says it saw. Cause it says that this specific pairing only happened to one user, and that was Media Matters. So in other words, yeah, it happened. Thanks for the confirmation of the thing that you are claiming was disparagement.
01:05:56
Speaker
And you're saying we doctored it or whatever. Yeah. We forced it, it was not organic. It was forced, it was not organic, which, like, you know, doesn't give the context to the rarity, but of course they don't say. Here's the funny thing. This opens it up to discovery of, oh, so you know exactly how often it happens. He says it's rare. So we get to find out exactly how often
01:06:20
Speaker
white nationalist content gets seen next to ads and it's going to be a lot. You are in some way tracking or coding white nationalist content. You know what it is. You're choosing to permit it to exist. In other words, you can identify it and you've just decided to keep it on your platform. Right, which was always the thing with like why Google didn't want to get involved with like filtering search results because the idea is like once you sort of open the door that it's possible,
01:06:45
Speaker
All things are available to you. Everybody can come after you in terms of like, well, why was this person able to Google for how to make a thermite bomb or whatever, right? Why can't you filter that? Showing it's possible to track this or to curtail this opens up the door for all kinds of wonderful stuff. And of course, it's not like, I don't think it's believable at all, the numbers they're talking about.
01:07:09
Speaker
Like, I don't think they can actually identify, fringe content is the way to describe it, I don't think they can identify it. Um, but I would love to see the discovery for that. I would love to see, you know. I don't think they're going to want to give that to Media Matters.
01:07:25
Speaker
One of the other interesting things about this too is that they accuse Media Matters of manipulating the algorithm so that presumably they were more likely to get this sort of Nazi content delivered to them. And it's interesting to me that X is pretending that Media Matters is the only person for whom that has occurred. I'm confident in saying that there are plenty of people who use Twitter
01:07:53
Speaker
And they carefully curate their algorithm to give them the nastiest, vilest stuff because that's what they're into. They're called Nazis. Yeah. And they're on there and you can find them. They're right next to the Comcast ads. Training an algorithm to suit your needs is one of the new skills of the new age. Right. Yeah. Yeah. And the idea of even how to push it, how to make it give you the content that you want.
01:08:19
Speaker
Right. And the idea then that that has not occurred to anyone who is not intentionally attempting to see Nazi content is also, like, another thing, right? But the guardrails on this are so, like, precise. It's certain that only if you really want to see it will you see it. Yeah, I don't think it's going to reach discovery, but I hope it does. If this reaches discovery, this is going to be one of the most cataclysmic
01:08:46
Speaker
electronic records discovery cases ever. Like the sheer volume of content that is on Twitter that's going to be ingested into whatever, Relativity, is that one of the ESI tools? The sheer amount of material that is going to be ingested into these discovery tools is going to be unbelievable.

Conclusion and Farewell

01:09:16
Speaker
We'll have things to talk about. Yeah. Between OpenAI and Twitter, each racing the other to bash their heads against the rocks, uh, we'll have stuff to talk about for shows to come. Litigation moves slowly. OpenAI moves quickly and breaks things, including those things, or itself. So, uh, we'll see. I think the OpenAI question is going to be relatively resolved within the next month. By the time we talk next, I would expect there's major
01:09:46
Speaker
movement and you basically know the direction it's heading. I think you're both right. Satya Nadella is not going to have a very peaceful Thanksgiving, I don't think. No, probably not. Probably not. Okay.
01:09:58
Speaker
You guys want to move on to recommendations, and we'll conclude this and let everybody go to bed. I can get my recommendation out very quickly. It's simple. I don't think I've ever recommended a book before, but I want to seem like the kind of person who reads books. And I think Jake or one of you recommended a book a couple of weeks ago, a couple of episodes ago. And I want that kind of aura of intelligent person. Um, it's like 10 episodes ago. Yeah. Time doesn't have any meaning for me anymore. Um, it's called A Mystery of Mysteries. It's about the death and life of Edgar Allan Poe. And it's by Mark
01:10:27
Speaker
Dawidziak, and I didn't know anything about Edgar Allan Poe prior to this. It's a very good book. He's a super interesting character, and I'm made to understand there's something on Netflix that is like a compendium of a bunch of his stories and stuff. The Fall of the House of Usher. It's not really his stories, it's like plays on his stories. The Fall of the House of Usher is a book that he wrote, but this has basically no relationship to
01:10:57
Speaker
it other than the broadest strokes. OK, but it also has other stories in there anyway. Oh, well. Interesting book, just basically about his life and a little bit about his death. That's my recommendation. What you got? My recommendation, I'm going to jump in in front of Jake. Yeah, fine. My recommendation, before we get to the video games, I'm going to do another book. This one is by Esau McCaulley, who is an Anglican priest and
01:11:27
Speaker
a professor at Wheaton College, which is like the Harvard of Christian colleges. But he wrote a book called How Far to the Promised Land. The subtitle is One Black Family's Story of Hope and Survival in the American South. Really interesting. It's written in the style of a memoir. It probably is a memoir of a man who's not that much older than I am, but has experienced a vastly different and interesting and compelling life so far.
01:11:56
Speaker
strong recommend, Esau McCaulley. In addition to just being a well-written book, it is a story of tragedy and hope, and I have really enjoyed it. Good book, strong recommend.
01:12:11
Speaker
So even when I recommend a book, you one-up me on the intellectual front. It's not a much better book. It is not highly intellectual. It is very accessible. It is something that is
01:12:26
Speaker
easy reading, uh, in the sense of it is not dense. Uh, it is not easy reading emotionally all the time, but it's good. And sorry, I'm a jerk. No, sorry. I think you probably had that chambered, and you were waiting, and you would have busted that out whenever I had a book as my recommendation, but I'm fine with it. I want to really quickly say, before, uh, Jake potentially recommends, uh, Alan Wake 2 again, I've been playing it. I'm only a couple hours in. Excellent game.
01:12:50
Speaker
Very good recommendation. I know, your recommendation. Thank you. So you're OK. I'm glad you're liking it. It's so strange. Anyway, I'm not going to recommend it, Alan Wake 2, again. I'm not going to recommend a video game. I'm taking a much needed break from the insane year that it was. I'm going to talk about Marvel.
01:13:11
Speaker
uh, because Loki season two ended, uh, and also I saw The Marvels, and, uh, Loki season two ended really
01:13:21
Speaker
cool. It was really good. And it's like a reminder that they make good stuff sometimes. You know what I mean? What'd you say, Jason? I said, was it though? Was it really good? It was. The ending? It was good. I thought the ending was very beautiful. I didn't like the first three episodes of the season, three or four episodes of the season. I was like, this is too much. What's going on? It needed like an episode or two of like
01:13:48
Speaker
Calming down because everyone's like constantly running and I was like, I don't know what's going on
01:13:54
Speaker
I really liked the end of the season, of season two. And then The Marvels, I'm giving you a hard time. The Marvels, uh, all of the hate that it's getting is undeserved, and like, you know, it's bombing at the box office, no doubt. But like, I almost left Thor: Love and Thunder in the middle. I hated it so much. Um, The Marvels, I had a good time. Like, I actually liked, like.
01:14:19
Speaker
I'm not mad that I spent money to see that in a theater. It had two of the best sequences of any Marvel movie. Uh, like two really imaginative, cool sequences, one of which I was just laughing out loud at for like five minutes straight. Um, so like, it's not bad. You can wait for Disney Plus, because the parts of that movie that make it a movie are not very good. Like the actual plot. No, no, it's not very good.
01:14:48
Speaker
It's a good time. So I'm not optimistic at all that Marvel is going to get good again. But I am optimistic that it will get good sometime in like years down the line, maybe.
01:15:05
Speaker
Because they can still make good stuff on occasion. An interesting cultural experience I'm getting to have with all of this, and I feel bad that I think both of you guys are not having, is as someone who hasn't seen a Marvel movie in like 20 years, I'm certain that this movie has come out about 13 times, and you've both talked about a movie called The Marvels.
01:15:27
Speaker
every other week. And on Slack, you've talked about it. It seems like it is so, I cannot explain how confusing it is to attempt to follow along with what's going on with any of the comic book movies. There was Iron Man, and then I think they came out with Iron Man 2. I didn't see that. I lost the plot there and I've just been proceeding through life ever since, and listening to all these things and the names you say and the stuff you throw around, and you could all be screwing with me and I'd never know. Well, it's great though. I really enjoy it. I'm not joking.
01:15:57
Speaker
The Marvels name specifically: you got Captain Marvel, and then The Marvels is the sequel to Captain Marvel, and then in between was the TV series Ms. Marvel. Uh, so it's, it's not the most
01:16:13
Speaker
And then, of course, it shares its name with the series, with the whole thing itself. Right, the whole company, which definitely doesn't help the public. Yeah. But you've seen Iron Man, the original Iron Man. Yeah. So the first one. Yeah. OK. I mean, that's a good one. That's one of the best. OK. Well, at least I went out on a high note, which we could do. I know. Good timing. Good timing. Good delivery. Yeah. Proud of you. Thank you. I appreciate it.
01:16:44
Speaker
I hope you both have a great Thanksgiving. Yeah, we do, definitely. That's all I want for Christmas. That's what I'm going to do over Thanksgiving.