
Ep 57: Bridging the gap between Politicians & Corps. with Jules Polonetsky, CEO, Future of Privacy

S4 E57 · The Abstract
75 Plays · 1 month ago

Join Jules Polonetsky, CEO and Founder of the Future of Privacy Forum, as he shares lessons he learned from the front lines of ad tech starting in the early 1990s. Starting his career in politics as a New York assemblyman and working his way into consumer protection-focused public roles, he became one of the first chief privacy officers in the tech industry at companies like DoubleClick and AOL at a crucial moment in the history of online privacy. Now his think tank connects policymakers with corporations, and helps both sides answer tough questions about privacy and data.


Listen as Jules discusses his uniquely-positioned understanding of privacy issues, the future of AI governance and the chief privacy officer role, advice to lawyers who want to move into the privacy space, and much more.

Read detailed summary:  https://www.spotdraft.com/podcast/episode-57

Topics:
Introduction: 0:00
Getting a start in New York politics: 1:57
Running for elected office: 11:46
Taking one of the first Chief Privacy Officer roles in the industry at DoubleClick: 16:43
Considering the necessary training to be successful in privacy: 24:28
Founding the Future of Privacy Forum: 32:02
Questioning the death of the Chief Privacy Officer role: 44:42
Favorite part of your day-to-day work and professional pet peeves: 51:54
Book recommendations: 56:41
What you wish you’d known as a young lawyer: 1:00:01

Connect with us:
Jules Polonetsky: https://www.linkedin.com/in/julespolonetsky/
Tyler Finn: https://www.linkedin.com/in/tylerhfinn
SpotDraft: https://www.linkedin.com/company/spotdraft

SpotDraft is a leading contract lifecycle management platform that solves your end-to-end contract management issues.

Visit https://www.spotdraft.com to learn more.

Transcript
00:00:00
Speaker
Hey, the experts in Washington, the Federal Trade Commission, the people who looked into this and really know this stuff, who have the same statute that you do (you have a state version of that FTC Act), they said it's good. So we're good, right? No, no, not good. So we ended up having to sign a settlement promising to do everything exactly the way we were already planning on doing it, but also paying a million-dollar fine.
00:00:28
Speaker
What's it like to found a forward-looking DC think tank? Is AI going to subsume privacy? Are chief privacy officer roles going to remain relevant? Today, I am joined on The Abstract by Jules Polonetsky, founder and CEO of the Future of Privacy Forum. FPF is a nonprofit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. I will also say, as someone who's worked in privacy and talked to a lot of folks in the DC think tank world, that FPF is considered a truly practical think tank, one that actually understands the underlying technologies that are being debated in policy conversations. It's a great forum. Before launching FPF, Jules was the chief privacy officer and senior vice president for consumer advocacy at AOL, and before that the chief privacy officer at DoubleClick. He also had a prior career in politics,
00:01:34
Speaker
first working for a congressman from Brooklyn, then for now-Senate Majority Leader Chuck Schumer. He held elected office himself as a state assembly member from Shorefront Brooklyn, and served as the appointed commissioner of the New York City Department of Consumer Affairs. Jules, thanks so much for joining me today for this episode of The Abstract.
00:01:55
Speaker
Tyler, great to be with you and the audience.

So, as I mentioned, your career was not a privacy one at first. It was actually sort of destined to be a political career. I want to start there. What inspired you to get into politics?
00:02:10
Speaker
You know, I wasn't inspired, and I didn't know anything about politics. My career has been a happy coincidence of maybe being bold enough to seize opportunities, or help create them. I started out as a pre-med student who suddenly didn't really want to go to medical school and wasn't sure what I was going to do. I thought maybe I'd go for a PhD in industrial psychology. Still not sure what that is.
00:02:35
Speaker
I took the LSATs on a lark, and when I got into good law schools, I said, okay, I'll do that. I didn't even know a lot of lawyers. I grew up in a very blue-collar home. My father was a postal carrier and my mom worked in the local school system. Doctors were the important people that I knew, but I didn't have business people in my family or among my neighbors in our very immigrant community in Brighton Beach. Somehow law school seemed, I don't know, something that I could do, because I needed to do something, and NYU was a good enough school to go to. I did enjoy law school, but I didn't have any real vocation. I liked real estate. I liked architecture. I liked buildings, the architectural structure and the history and the magnificence. And so I thought I should be a real estate lawyer when I showed up.
00:03:25
Speaker
I spent a year proofreading leases at a big Wall Street law firm, where doing a good job meant being eagle-eyed at spotting typos or finding clauses. I hated it. I wasn't good at it, and I hated it. And I really wasn't sure what to do; other law firms seemed pretty much the same. Then a friend of mine's mom ran for local office, and he said, hey, could you help us out? And all of a sudden I was showing up and handing out flyers and walking into political clubhouses, and my eyes were opened to this world where people were passionate about things that really affected the local community.
00:04:03
Speaker
They really cared about crime and funding for their schools and how come their local government wasn't responsive to things they needed. And I discovered that you could show up, and if you were alert and had some decent social skills, you could immediately get involved. I learned that I was actually very lucky. You know, there's a famous story of somebody who walked into a Chicago political clubhouse and said, hey, I'm here to help. And the head of the clubhouse in that local Democratic ward said, well, who sent you? And the guy said, well, no one sent me. And they said, well, we don't want no one that no one sent.
00:04:39
Speaker
And I was lucky enough that either I was sent by somebody, my friend's mom, who was a local political player, or the community was elderly and going through a transition, and some eager person showing up and saying, hey, I'm passionate about working with you, was well received. So I stumbled into local politics, and I ended up leaving my law firm job, taking a huge salary cut, and working for my local congressman.

That's great. I think we'll talk about this as the podcast goes on. A big theme here is around non-linear career paths and how they can lead to really meaningful opportunities. What a great origin.

You know, one piece of advice that I really like to give to, say, young people, but when I say young people, I mean people who have the capability of
00:05:27
Speaker
taking risks. You know, right now I have a mortgage. I have a spouse. I have college tuitions. There are some things that come up. I love my job, and there's a lot more to do at FPF, but occasionally a really interesting senior government role or something comes my way. I can't easily take risks.
00:05:47
Speaker
I've got to pay bills, and I've got an organization that I'm responsible for. But at that time I was young and single, and although I had a law firm job, I could go make $25,000 working for my local congressman and move back home, because I had that safety net. And then even later on, when other opportunities came up, I didn't have kids. I eventually had a spouse, but it was just us. I was able to take chances: I'll run for office, and if I lose, that's fine, I'm not going to go hungry. I urge people, and this can happen at any point in your career. Sometimes people who've been working as a CPO or been at a company are at a transition, thinking about what to do next. And I say to them, do you have the leeway? Maybe you were laid off, or maybe you weren't laid off and you just have an opportunity. Can you take a risk? What if you make no money for a year? And some people are like, hey, I can do that, I'm being paid out, or I made good money. So whether you're young or whether you're further along, there are only certain windows in your life where you can actually go for the gold ring and say, I'm going to take a chance. There's never any guarantee; even changing jobs is taking a chance. So grab those opportunities when you have the room to do so, because there are other times where you're going to have to say no. Hey, it's your job to pay the bills, and you've got to stick with the stability.
00:07:09
Speaker
I've got a couple more questions for you about your time in politics. The first is that Chuck Schumer is the Senate Majority Leader and a huge figure in American politics today. Any stories or impressions from your time, things that you learned working for him? And then I want to ask you about running for office too, and what compelled you to go and put yourself out there, because maybe it was easier at the time, but that is still a big risk, and a big shift in the division between a job being something slightly private and it being sort of your whole life, and very public.

You know, there were really a lot of compelling lessons from working for Chuck. Chuck, Senator Schumer now, is a driven person. Many of us show up and we're asked to carry out a task and we kind of do it and we hand in the results. That wasn't enough for Chuck. You needed to drive the agenda. If you were leading and supporting him on an effort, and you went and did X and Y, and then you had to report to him, here's where this is going, he had a dozen things that you should have thought of. When people told you no, you should not have taken no for an answer; you should have just kept pushing it, doing it with grace and dignity, but really getting things done. It's like being a startup founder. And I find that sometimes we'll hire people, and they do what they're asked, and then they run into a hurdle and come back to you, right? Or they say, well, it couldn't be done. And as somebody whose job at the end of the day is to make things happen, you know you could never start a startup that way. A million things go wrong, and someone has to be triaging and getting it done and pushing it through. So A, I learned a lot about what it meant to really push and not take no and keep going at something.
But I also learned the importance of communicating in ways that matter to ordinary people. The first member I worked for was a wonderful congressman named Stephen Solarz. He was a great foreign policy leader. He was looking to make peace in the Mideast and in India, and he was all over the world.
00:09:22
Speaker
And Chuck was involved as well in Washington, in gun control and lofty, important issues. But he came home to the district every weekend, and he showed up at every community event. And he also had an agenda of very local, practical things: a stoplight, a local police precinct, a local school board. And that's what mattered: being able to connect the issues that he was working on to the very daily, practical issues, instead of it being kind of intellectual or theoretical. And that's something that a lot of us in the policy or legal world don't always do. We get wonked out with the issues we're working on, and we forget the ordinary person, the ordinary legislator, the state legislator. We do a lot of work nowadays with state legislators, and I see these different trade groups and policy organizations coming in from where they sit,
00:10:15
Speaker
and then explaining to the policymaker why they shouldn't advance state legislation. And I say to them, hey, do you understand where the policymakers sit? They sit in a world where you're basically telling them: go home, do nothing, and lose your next election.
00:10:29
Speaker
You can't just say, don't do anything. When I was a state legislator, it was my job to advance state legislation. So I didn't really care at the time that you might have compliance challenges because you might have to do it differently in a different state (I'm a New York state legislator), or that maybe there will be a federal bill, or that you would prefer there were a federal bill.
00:10:50
Speaker
So people need to put on the hat of the person on the other side of the table, the person you're negotiating with. What do they want to hear? Not, well, your proposal really affects my ad tech business model. Okay, but you didn't elect me. The people who elected me are saying, I don't like that, and advocates are telling me, hey, we're going to salute you for doing this. So now you tell me why I shouldn't do it. Now, there are reasons why, but you've got to put their hat on and say, hey, what you're doing is going to affect the local retailer who only wants to advertise locally because he only has one store, or the local plumber who only wants to advertise locally. So instead of saying, here's what I need about location data and this and that, you need to understand what he or she or they are going to be responsive to, and meet them where they are. So I learned a lot of that from Senator Schumer, and it's been a great life lesson.
00:11:45
Speaker
And you decided to run for office yourself. Tell us a little bit about that. What was going through your mind at that point in time, and what was the experience like of representing people in Brooklyn?

When a seat opened up, I said, you know what? Hey, I can do this. I had represented that area for Congressman Steve Solarz, my first boss in politics.
00:12:10
Speaker
And it was one of the areas that was in Congressman Schumer's district at the time. So I had a lot of good relationships. My family still lived there. And I always felt, even when I was in Washington working in the Capitol and so forth, that I was this kind of person who came from a family that had to work hard to put the kids through school and had to pay the bills.
00:12:34
Speaker
You know, I just had a very, almost immigrant sensibility that it was important to be able to help people make it through their day and be treated fairly by companies and by government. So I took a chance and threw my hat into the ring, and I was lucky enough to get elected. It was a great learning experience. I've had the pleasure of working in federal, state, and city government.
00:12:58
Speaker
As someone who now has to deal with policymakers at all those levels, it's really helped me. And then, eventually, being on the other side, not only, hey, I want to talk about legislation, but later on, when I was at DoubleClick, having to deal with an attorney general who was running for office but also bringing actions against the companies I was working for. It helps you understand, again, the perspectives of policymakers and the people and constituencies that they look to. It also started giving me an opportunity to develop a kind of consumer expert portfolio. At the time, I had a very senior district, and the funeral industry was rapidly consolidating. The local funeral homes were run by people thought of as quasi-clergy-like figures who come in and decide what's appropriate for your loved one. You don't have time to shop around. You don't call up three funeral homes and say, Riverside will bury my husband for $10,000, can you beat that price?
00:13:59
Speaker
You go in, and you're not in a great state of mind. Sometimes it's a very tight timeline; in some traditions you bury immediately, the next day. And I saw how the people in my district were shocked by this huge expense. For many people, a funeral is the third most expensive purchase you make in your life, after a house and a car. And here they were having to make it without information. Disclosures didn't exist. Price comparison wasn't possible. And you didn't know that there was a big global corporation that owned much of the market actually behind that local community funeral home. So I began advocating for greater transparency and dealing with some of the challenges of the consolidation in the funeral industry. I got involved with the safety of the blood supply, and I started realizing that you needed to work with the media, you needed to work with civil society, you needed to find the industry stakeholders, and that's how you could get things done.
00:14:58
Speaker
So, a great experience. But I did eventually get tired of commuting up to Albany. I did think I was going to run for Congress for Chuck Schumer's seat when that opened up. Things went in a different direction, and I became the Consumer Affairs Commissioner for New York City. And I learned that it was far more fun to be on the executive side. Being a state legislator and advocating, you're one of many, and you only get things done collectively. So you spend your time thinking, well, there ought to be a bill and I'll draft it, or I'll write a letter to the governor, but you're not in charge of anything, right? You have just your vote and maybe your ability to put together influence and coalitions and get a thing done. And even then, it's so hard to get things done. Being on the executive side was different.
00:15:42
Speaker
Well, it wasn't my business to deal with every issue, but on the consumer protection issues, I had statutes and laws that I could enforce. I had inspectors and fining authority, and I could draft the proposed rule because I was the one on the other side of it. And that was super, super fun. I often have young people who come and say, I want to get into politics and run for office. I'm like, what do you really want? If your goal is to give speeches at local community events, rah-rah, and you really want to be involved in every possible issue, great. But if your goal is, I want to actually advance policy matters in a particular area, go work for a government agency, or get elected to a role where you are on the executive side. So it was a fabulous experience enforcing New York City's consumer protection laws, which are very similar to the FTC's consumer protection laws, and learning what it meant to have a deceptive practice, what it meant to have unfairness, what it meant to do rulemaking. Really a thrilling experience.

I mean, what fantastic perspective, I would say, to be able to bring to a company that's going to have to navigate regulators and navigate the press. And then you took the job at DoubleClick and moved into a chief privacy officer role. First of all, I can't imagine there were that many chief privacy officers out there at the time. If you weren't the first, you must have been in the first ten. Talk to us about what privacy was like as a field at that point in time. Had you had a lot of exposure to it through your role as a regulator? Where did you think it was going to go? What did the landscape look like then?
00:17:20
Speaker
Yeah, first of all, the first chief privacy officer was probably Jennifer Barrett Glasgow of Acxiom, or perhaps Ray Everett-Church of AllAdvantage, who may have been the first online digital-advertising-related one; whichever came first, there was a small collection of us, but it was pretty early. And I think we were all figuring out what the job actually meant. So there was a lot of opportunity to define what the role ought to be. DoubleClick was a particularly interesting place to come into because they were under siege for buying Abacus and intending to link offline data to online data.
00:18:01
Speaker
The interesting lesson from the DoubleClick experience was that they had done a lot of things right. Before they did anything, they did consumer surveys to learn what people think about cookies, about being anonymous online, and when they want to opt in and when they want to opt out. They worked with the leading academic of the time, Alan Westin, the father of US privacy scholarship, and he did these surveys and interpreted them for them. They gave people prominent notices and choices, some better than what we have today, when they planned to link data. And they had a privacy policy; people didn't even have privacy policies in those days. And they built an opt-out, which wasn't even required, and they built these tools. And they briefed advocates.
00:18:53
Speaker
Some of whom initially were actually very positive, because they said, we're used to dealing with personal information, like people having credit report problems and data breaches, and hey, you're not going to know people's names, it will just be a cookie, and people can even clear their cookies. Now, the advocates quickly realized, well, actually, maybe it wasn't explicitly personal, but boy, it felt very personal. Sure. And yes, people had browser controls, but maybe they didn't even know they had those browser controls. And so it turned sour. But there was this early period where DoubleClick and the other early tech companies were the champions of democratizing access to technology. The alternative was you had to do business with AOL. There was no ad market.
00:19:40
Speaker
You didn't have salespeople. Big companies couldn't go talk to 10,000 little websites; they only wanted to deal with Lycos and with AOL. And so the first crop had this rosy period where they felt they were the good guys. Hey, we're helping small websites get revenue so they can do their work, because the alternative is that AOL is going to buy you or crush you, or the advertisers are only going to do business with the biggest players. And they said, we're rolling up all these companies and representing them en masse to Coca-Cola as an ad buyer, or to compete with AOL. And then all of a sudden this turned against them. They went from being the upstarts, the great startups making people millionaires and helping thousands of small websites have a business model, all the things people were excited about, to: no, you're evil, you are violating laws.
00:20:36
Speaker
The founders of DoubleClick had worked with big outside counsel, hired ex-FTC people as their representatives, and had a good in-house team working on these issues, and all of a sudden: no, you're actually a criminal. I said, wow. So I had to help the business understand that, well, this isn't fair, but guess what? It is what it is. The politicians will do what they do. When we resolved the issue with the FTC, for instance, I had a closing letter from the FTC saying, hey, DoubleClick, you didn't do anything wrong, and you didn't do all the things people are alleging, so, based on those representations, it's all done. And I took it to the AGs who were investigating DoubleClick, and I said, hey, the experts in Washington, the Federal Trade Commission, the people who looked into this and really know this stuff, who have the same statute that you do (you have a state version of that FTC Act), they said it's good. So we're good, right?
00:21:31
Speaker
No, said Eliot Spitzer, the attorney general of New York, a friend of mine who I had worked with as consumer affairs commissioner, but who was running for higher office and was leading this pack of attorneys general. No, no, not good. So we ended up having to sign a settlement promising to do everything exactly the way we were already planning on doing it, but also paying a million-dollar fine. Let alone the class action suits, where of course we had to pay. So you see how business people experience the other side of this: hey, I did things right. I asked my counsel. I did all these extra outside things. I'm the leader in my sector. Oh, and by the way, everyone else is doing what I'm doing, but maybe not even as properly. But since I'm the high-profile company,
00:22:14
Speaker
I'm getting punched in the nose. Now, again, those were great lessons that I try to impart when I talk to business folks. I explain to them: yes, Facebook paid a billion-dollar fine for transfers from Europe to the US, and yes, that could have been you, but the complaint was against them. And now Uber's got a fine of hundreds of millions of dollars, and yes, that could have been you as well.
00:22:40
Speaker
So helping them understand that you've got to be a little cynical, but then there are things you can do about it: you can try to work and set the stage and index on transparent practices and be confident and proud. You know, many years ago, I worked with some of the team at Disney when they rolled out MagicBands. They knew people were going to like them. They knew that it was a pain to have to run around and get your FastPass loaded, and that people would be able to pay for things,
00:23:09
Speaker
or even give it to their teen and say, yeah, go ahead, spend, I put money on it; you don't have to come bother me, I'm going to be on the other side of the park with your little kid sister, waiting in line. Right. But yet a prominent senator, when he heard about these MagicBands that Disney could use to track what you did in the park, wrote a letter. Well, Disney had minded all their P's and Q's. They had worked with their legal team, their privacy team. They had outside input, including from us; we walked through the park with them. They were able to turn right back to Senator Markey at the time and say, excuse me, what we're doing is
00:23:48
Speaker
for our customers, to make their experience better, and we're confident about it. So what are the things you need to do so that, if you were challenged by a critic who maybe doesn't understand what you're doing or doesn't like it, you can say: we're doing the right thing, and here are all the things that we did to spell out that path?

Hear me out: I hear these stories about DoubleClick, and I just have to think, as much as things change, oh, how they stay the same, right? Oh, how the same conversation around ad tech is happening today. I guess that's a good lead-in to maybe my next question for you, which is: what type of training do you think
00:24:30
Speaker
is necessary to be successful in privacy and to help companies navigate this variety of stakeholders who are going to push back, or have different incentives or constituencies they need to serve? Is that legal training? Is it something else? Tell us your view on that.

Well, one thing is the ability to learn a little bit from history. You mentioned ad tech and how things never change. It is like Groundhog Day. It's 20, 25 years later, and we're still debating: are cookies okay? Is ad targeting okay? And guess what? Because we're still debating it, the critics have already concluded. And guess what they've concluded? They don't like it. They want to break it. They think it's bad. They think it's harmful. They think there's discrimination happening. They think that it's too creepy. What is it about the advertising industry, which is supposed to be expert at communicating with users, that it cannot figure out how to make its case? Because there is a good case to be made. And yes, plenty of things got out of control and draw legitimate criticism.
00:25:36
Speaker
There are companies sending data to the government. That's not what anybody expects when they opt in or opt out, or when they're told that their visits and posts are shared with third parties: that it might be going to law enforcement. There is personal information constantly leaking. Every year someone does a study: look at this, look what's being sent. We have never managed to communicate to users what the value equation is, when many of them would perhaps choose an ad-supported model. And we can't even explain to policymakers why contextual advertising, for instance, might not be a solution, when perhaps the media have the best case: hey, we need media, we need strong independent media.
00:26:22
Speaker
And they are, in large part, ad-supported. And guess what? If you only allow them to have contextual ads, look at your newspaper's front page today. What's going on? Hamas and hostages, Ukraine, earthquakes. Exactly what kind of contextually targeted ads do you think belong on, hey, Russia just killed 40 people in Ukraine?
00:26:45
Speaker
So you incentivize all your stories to be clickbaity things about autos and finance and stuff that is contextually appropriate for advertisers, right? But we need to explain: here's why we need location data, and this is what it's going to enable. So there have been plenty of excesses, but we don't learn; we say the same things over and over and over. Oh, if you take away this tech, there are going to be walled gardens. Guess what?
00:27:12
Speaker
Policymakers aren't nervous about, quote unquote, walled gardens. They care about competition, absolutely. But the notion that there will be safe places, is that what a walled garden means? Industry says it over and over because to them it means something, but to the policymakers it doesn't. Now, there are ways to talk about competition that policymakers do indeed care about. So what are the skills you need? You've got to learn. There are so many examples of companies getting punched in the nose, or advocates getting upset, or legislation being provoked, or compliance meltdowns, that you can take lessons from about what actually is risky and harmful. The biggest thing I would say, and this is particularly for the legal and policy audience, is that the more you can actually understand the business and the technology, the better. A lot of us are so busy reading legal things, following legal stuff, and then we spout our conclusions back to the business. The best legal and policy people
00:28:10
Speaker
end up becoming strategic. Kristin McNeely recently became chief privacy officer and head of data strategy at the global level. You see the most sophisticated general counsels becoming the president of the business. Whether your title stays privacy or not, being a trusted advisor sometimes means having to know the business and the data flows almost as well as the business people. And sometimes better — and I don't want to be arrogant about it, right? The business person you're dealing with maybe knows their product and has their mission: go make more money selling this widget. But you are talking to people across the company about how they use data and where it goes. And you are often in a position to say,
00:28:55
Speaker
Yeah, I get that. But you know, we built that system six times in six different ways already. And so, yes, you want the one that is customized to exactly the bells and whistles that you need. But for the company's purposes, having something standardized means we will actually be able to get you the data you need, because it'll be in the right formats, instead of just creating skunkworks databases. And so: why are you all going to the same legal and privacy conferences over and over? Where are the conferences or the trainings that give you business insight? Are you listening to the earnings calls of your competitors and other companies?
00:29:32
Speaker
So, A: know the underlying economics of the business and the industry. It's amazing how often people are just so busy trying to keep up with what they need to know about law and policy that they never get to that. Number one. Number two, if you're dealing with senior executives, understand how to distill the key points. They don't want to understand your complicated, complex legal memo. They're operating very often —
00:30:01
Speaker
You know, they're making gut decisions, right? And this is sort of ridiculous, but this is the way it is for lots of senior business executives. They've got a lot of stuff reporting to them, they have general frameworks and views, and they expect you to know the issue better than they do. And you have to give them a best-informed judgment based on the kinds of factors they are listening for. They can't assess the legal risk; you somehow need to learn to. And this only comes, I think, with experience and judgment and having a network across other companies. We have so many issues of first impression and so many new laws that are gray. And you are the one who has to make really sensitive decisions, like: we can't comply with this perfectly, but this is reasonable.
00:30:46
Speaker
This is what we reasonably can and should do now. And so, yes, I am creating some risk, but I think I've laid my groundwork appropriately. So: business judgment, technical and business understanding. The easiest part is what people spend all their time on — reading the law and analyzing the law. Yeah, you've got to do that. But please realize that that is not your entire job.
00:31:14
Speaker
The abstract is brought to you by Spotdraft, an end-to-end contract lifecycle management system that helps high-performing legal teams become 10 times more efficient. If you spend hours every week drafting and reviewing contracts, worrying about being blindsided by renewals, or if you just want to streamline your contracting processes, Spotdraft is the right solution for you.
00:31:35
Speaker
From creating and managing templates and workflows to tracking approvals, e-signing, and reporting via an AI-powered repository, Spotdraft helps you in every stage of your contracting. And because it should work where you work, it integrates with all the tools your business already uses. Spotdraft is the key that unlocks the potential of your legal team. Make your contracting easier today at spotdraft.com. The thing that I love about everything that you said is, I don't think I've had someone on before who's been an elected official — I might have had some folks who were regulators, certainly no one who's run a think tank — and who, I think we can say, is really, really good at framing things and talking to policymakers in the way your experience allows. I do talk to a lot of general counsels, including a number who have taken on that president role or COO role.
00:32:31
Speaker
And the advice that you just gave for how people can be great in privacy and in legal — that's what we hear from them too. So I just want to highlight: isn't it cool that your cumulative experience has led you to the same conclusion for privacy professionals that some of the best legal minds in businesses today have also reached? I want to ask you about why you decided to start FPF and be the founder of a think tank. You know, at one point, AOL was losing all of its dial-up customers to broadband, and we were looking to do whatever we could. My boss, one of the early leaders of AOL, Ted Leonsis — now perhaps better known for his sports ownership —
00:33:22
Speaker
you have the Wizards and the Caps and the major sports arena here. But back then, Ted owned the brand and the trust and those issues. And at one point, Ted said, we've got to stop doing pop-ups. People don't like pop-ups. We'll stop doing pop-ups; maybe they won't leave us for broadband. Of course, they were leaving no matter what we did — we were doing whatever we could. And we stopped doing pop-ups. It was a huge financial hit. And people still got pop-ups, because they got pop-unders and pop-overs and the pop-up left by the site they were at 20 minutes earlier that was leaving garbage behind, and so forth. And Ted said, Jules, go talk to the various trade groups. I mean, yeah, we made our decision, but look what they're doing. They're burning all of their customers. Everybody on the Internet is going to be unhappy if every site you go to is bashing you with these pop-ups.
00:34:12
Speaker
So I go to some of the trade groups and they say, well, yeah — and of course they do nothing about it. And then I come back a few months later and I say, I heard there's a new technology, pop-up blockers, and people are downloading them. And they say, those people, they're not allowed to do that. They'd be stealing our content.
00:34:30
Speaker
And I come back a few months later and I say, no, no, no, I heard that Internet Explorer, which was the dominant browser at the time, is going to build the pop-up blocker into the browser. They can't do that. That's anticompetitive. And I said, I don't know — it's the Internet, these things work out. In the end, consumers kind of get control of these things. And then something goes off in my head. Wait — the pop-up companies are 10 percent of this trade group's members.
00:34:59
Speaker
They're not going to boot them out, even if it's in the best interest of the industry. Because even though I, with my self-interest — hey, we stopped doing this, we think everyone else should — am asking for something clearly in the best interest of consumers, they can't do what I'm asking without pissing off their constituency. And it's revenue.
00:35:19
Speaker
Right. It's a trade group. It's revenue versus revenue. And hey, it works, people are clicking on it, so they must actually like it, right? And I realized that a challenge many of the trade groups had was that they had to get a lot of consensus, or there had to be a massive disaster coming immediately, before they could get their members to agree on things that maybe even 80 percent of their members agreed needed to happen — in the best interest of all, but perhaps fatal to a small number.
00:35:49
Speaker
And on the internet and in digital, almost no company can be a good actor without the cooperation of vendors and partners and competitors and a whole ecosystem. You can't be a good app without Apple having certain rules in the App Store, and without your advertisers and your publisher — and those companies don't always talk to each other, even though they're in business together. Their privacy people aren't working out a cooperative experience. The end-user experience for my app on a phone is dependent on all kinds of things and controls and settings. And so I said, wait a second, nobody can really fix things — even clear things that everybody would benefit from in the long run, good for consumers and for business — because we need a lot of cooperation, and because the trade groups are constitutionally not suited.
00:36:39
Speaker
At the same time, I thought a lot of the advocacy groups and civil society groups didn't really understand the technology. They'd read what they knew in the newspaper, and companies might be too scared to talk to them. And even if the advocates had technical chops, how things were actually working behind the scenes wasn't always obvious. And some of them actually made their money by getting pieces of settlements when lawsuits were brought
00:37:02
Speaker
on the issues that they raised, so that was fine with them. Or they would get quoted in the media, right? That's their job. And people would go try to convince somebody, whose job is to fight you, not to criticize you. I don't get a New York Times quote in the top graf unless I say: the consumers I represent are unhappy with that big bad actor, so whatever you're calling to ask me about, yes, that company did wrong and I'm here against it, right? But I understood that, because that's how politics worked. You didn't go to the NRA and say, can I discuss with you your position on guns? Could you be a little less for guns? That's their job. There's this famous quote —
00:37:42
Speaker
you can never convince a man of something when his job depends on him not understanding you, right? And not to generalize — there were some great civil society groups, and sometimes they were absolutely right. But there were very few folks who lived in the middle, like me and my peers at companies. By then there were thousands of chief privacy officers around the world, and I was always connecting with them and learning from them and working with them. And they were generally good people who wanted to figure out how to get things done right. Their job wasn't maximizing the revenue, even if it included giving some strategic advice; their job was to help you navigate by doing reasonable things that are good for your customers. But I didn't think there was any organization in the middle. And so when AOL was slowing down and I was thinking, hey, what am I going to do next — I'd already been a chief privacy officer at prominent companies twice, and I didn't want to run for office again, not yet; I now had the house and the mortgage, I couldn't afford to take the chance — my vision was: can we put together an organization that is in that middle of the road, that works with advocates and listens to the economics and works with the leading companies and the trade groups
00:38:57
Speaker
and policymakers, and tries to move things forward in a reasonable, practical way. Optimistic about tech and data, but let's learn from history and put the rules in place. And even if it's not a perfect law, let's get the rules in place so that we can move forward. What has happened over the years as we've grown — we have 200-plus member companies — is that frankly, the issues have gotten so complicated that nobody knows the answers to some of these things. We've evolved, in some sense, into being the place where the senior executives in privacy come to talk to each other and learn from each other and develop best practices and standards and codes of conduct, and engage with the advocates in civil society and academia who are also part of our world and our advisory council. So on one hand we are the organization of the senior chief privacy officers of the world, and we support — not their government affairs, not their advocacy — we support their challenge of understanding the technology,
00:39:58
Speaker
tracking what's happening globally, because now everybody does business globally, and every country in the world is somewhere along the path of legislating on privacy. So we have a team in Africa, we have a team in Europe, we have a team in APAC, we cover South and Latin America and the States, helping them take in the firehose of laws and regulations and conflicting policies and helping do analysis. And then we say to them: now leave us alone, we're helping you.
00:40:24
Speaker
We hear you, we get the pain, we hear what's not working. Now let us go help policymakers do their jobs well. We're not your trade group. Your trade group will go lobby them, and they'll be treated skeptically — that's their job, to go say everything's fine and here's what industry wants. And civil society is going to go tell them that the sky is falling — and sometimes it is.
00:40:45
Speaker
Let me and our team go to them and say: how can we help you? We get that you're busy, you're fundraising, you're going to the local school board, you don't have a big staff, you're a state legislator — but you're smart and you're savvy and you want to do the right thing. How can we help you? And of course we have views, we have ideas, but we're more interested in showing them: hey, you want to regulate AI? Well, here's what NIST is doing — they don't have staff following what NIST is doing. You want to know about the EU AI Act? They can't read 180 pages, plus all the implementing measures. But we can say, look, here's an expert on our staff, or here's a great external expert, who can walk you through it. And my hope is that, even if you don't agree with everything they're doing, a smarter,
00:41:31
Speaker
better-informed legislator — one who also actually understands the practical needs, the business needs, understands what will break or not, and gets candid, honest input; yes, optimistic and friendly too: hey, we need advertising, we need AI, we need commerce — is a win. That's what we do, and it's been a great run. Well, two questions for you, kind of combined. One is: what are you most proud of having accomplished so far through the Future of Privacy Forum? And then, if folks want to get involved — I've got a bunch of GCs who listen to the podcast, other folks who are on in-house legal teams, including a number of privacy professionals; I'm sure a number of them already work with FPF — but if people want to get involved, how do they get in touch?
00:42:15
Speaker
You know, when we started out, we were a small organization with me at the center, whirling around with a bunch of junior people — let's do this, let's do that. Now it's a $10 million budget, 60 staff. And I'm super proud that it is not about me. I'm no longer the smartest person in this place. There are people who know AI.
00:42:32
Speaker
There are people who know ad tech, people who know health data, people who know cars and mobility data, right? So I'm super proud that in the 15 years FPF has been up — that my co-founder Chris Wolf and I have been here — there are a generation or two of people who came in as interns and are now chief privacy officers, senior staff at regulators, or in academia.
00:42:55
Speaker
There is an entire community of hundreds of people who learned, grew, developed, and are now leaders all across privacy. So that's what I'm proud of: my legacy will be the people I mentored, the people who started and built careers here and went on to being amazing leaders. I'm also proud of the fact that we have put in place multiple codes of conduct, best practices, and standards that are actually being used. Almost 400 companies have legally committed to our Student Privacy Pledge and are on the hook for the commitments they make for how they use student data.
00:43:33
Speaker
The companies that do consumer genetics — Helix, 23andMe, Ancestry — negotiated with us a set of rules for how your consumer genetic data is handled. This became a model for what California passed as legislation. We did the same with ADP and Workday and LinkedIn,
00:43:52
Speaker
all of whom use AI in recruiting and hiring tools, and who have tools that many, many companies use. And we hammered out with them — with input from outsiders and experts and advocates — a set of standards that they are using to help shape how the tools they build are used. So there are dozens of examples like that, where we've been able to collaborate with the leaders in a space to set a standard, a best practice, a code of conduct that has, I think, helped shape practices for the good. People can get involved by emailing us via our website and saying: how can I join? How can I support you? We've got working groups, peer-to-peer meetings, educational trainings, legislative tracking. I think it's a great value.
00:44:36
Speaker
Two more substantive questions for you before we have a couple of fun ones and wrap up. There's been some conversation recently — I don't agree with this, by the way — and there was a Wall Street Journal article, I think, about the death of the chief privacy officer role. Google's chief privacy officer left after a number of years. Let's talk about that.
00:44:59
Speaker
Do you think that chief privacy officers are declining in influence or in numbers, or is this sort of hand-wringing? I was quoted recently in one of the publications saying that the role of the CPO is at an inflection point, and by that I meant a few things that are happening.
00:45:17
Speaker
The early crop of chief privacy officers were in largely unregulated industries. It wasn't the banking sector, where there were laws and lawyers told you how privacy should be handled. It was DoubleClick, where there was just general consumer protection law.
00:45:36
Speaker
Yet there was concern and consternation and proposed legislation. So you needed people who could say: well, yeah, you're not strictly violating consumer protection laws, but here's what hell will rain down upon you if you go track people in this way. Here's what regulators are going to think, here's what legislators are going to think, here's what the media is going to think, here's what it means for our brand. So you needed people who were shaping policy and practice and thinking through compliance. And now we have state laws and GDPR and enormous
00:46:15
Speaker
legislative activity, and companies know how to deal with complicated legislation. They've got legal and compliance teams who do this in labor, in environmental, and all sorts of other areas that are really important as well. And so privacy has moved from being largely unregulated, with sector-specific exceptions, to a highly regulated industry. And highly regulated industries aren't the ones who've led the way or who've had the most prominent chief privacy officers. Google is now a highly regulated entity. There are lots of people working on legal and compliance and litigation and so on. And there are lots of organizations that are moving from having some dramatic executive officer title to
00:47:07
Speaker
having senior expert legal compliance people who run a legal, ethics, and compliance program, of which privacy is one part of many. So that maybe is a bit of a downdraft — not on the fact that we need lots and lots of people working on lots of laws and lots of regulation and lots of compliance. We need more of them.
00:47:28
Speaker
But you need somebody who's sitting in a different place in the chain of command — maybe you have a head legal privacy person, but you also need a chief privacy officer. So that is a trend. But at the same time that you have that regulatory trend — which might be downgrading this grand poobah with the sophisticated executive skill set who runs a program — we have the complication of laws like the various European digital strategies, which are all in conflict with each other, all still developing, and it's not clear how they're going to fit together. You've got this expanded digital governance mandate. How is your privacy world living with your security world? And then you have the whole AI influx, of which a lot — not all, but a lot — is privacy, and a lot of the tools.
00:48:16
Speaker
Even where they go beyond individual privacy, there are assessments, and you're dealing with data protection regulators who are in many cases the AI regulator, and in some cases not. So there's a group of privacy leaders who are seizing that AI mantle, playing a key role, being empowered with broader digital governance and digital strategy: who best understands all the legal restrictions on data that comes my way — whether partner restrictions, legal privacy restrictions, or international restrictions — and understands my business and can help navigate the complexity of these evolving, not yet clear or final laws? So there is a
00:49:02
Speaker
group of people who are getting digital governance, who are getting new, broader titles: CPO and digital governance. Caroline Louveaux at Mastercard has a very broad digital governance scope; Anna Zeiter at eBay is CPO and AI lead. You've seen AI showing up in some of those titles; you've seen digital governance; you've seen data strategy. So it's at an inflection, right? In some places you can see it being pushed down: hey, it's just another regulatory issue, we now handle that, we have a lot of regulatory infrastructure. And then you do see, at the same time, sophisticated companies recognizing that the complexity is far more than compliance. There's a program to run here. There are privacy operations. There's training. There are things that go perhaps a bit beyond. So it's at an inflection point, and it's not yet clear
00:49:50
Speaker
how it goes. It's absolutely a role under massive development. Not everybody is suited to become that executive. There are legal compliance people with a CPO title who, frankly, are really doing legal compliance work, and that's all they do — and maybe nobody ever should have called them a CPO. But there are others who have the executive skills, have the business gravitas, and who are jumping in and seizing it, and I think there's a lot of opportunity in those roles.
00:50:17
Speaker
I was going to ask you about AI governance and how it relates to privacy, but I feel like you expertly answered that question in this one too, right? You know, there are folks — I just spoke to a senior chief privacy officer at a big company, and she said, I don't want it.
00:50:35
Speaker
I'm busy, I've got a lot on my plate already, and nobody's offering me 10 new staff for it. A lot of the work is heavily technical work that maybe belongs with our CTO or CISO — if it's testing the AI system to determine bias, am I actually running those very technical tests? If it's red teaming to see whether the model can be broken, am I really doing that? I'll give my input, but I don't want to seize it. And then you've got others who are like: gimme, gimme, gimme, this is important, this is the future, this is where things are headed. So I see —
00:51:14
Speaker
and when I say I see, we host dozens and dozens of convenings among the senior people who are talking to each other — do you want it? Should you get it? Oh, you got it in your title, but are you really leading this, or are you just a member of the committee? How is it working at your organization? So I don't think that's fully cooked.
00:51:32
Speaker
Some of it is dependent on what kind of organization you are. Are you really building AI models, or are you a consumer of these models who needs to do the compliance work? The role maybe looks very different at a frontier model company, or at a company that's really building these tools — it's a very different type of role, also at an inflection point. I've got some fun questions for you as we start to wrap up. I'm curious about your favorite part of your day-to-day. I love doing product reviews. We've got companies who come to us all the time, and I miss being in the bowels of product. People come and say, we're going to try to do this, and I'm like, I thought the technology didn't do this. They're like, yeah, but if you do this, then we do that. And issue-spotting and saying, well, okay, here's a path forward on that; that does seem pretty creepy, but maybe if you do this, then we do that. And so people love coming to us with the hard stuff where they don't have a good answer, they don't have consensus — legal says one thing, public policy is nervous, advocates are critical — and saying to us: can we walk you through what we're looking to do? It keeps me smart. It keeps our staff super smart. And then I love
00:52:41
Speaker
seeing that — hey, we proposed you do it this way or design it that way — and then having some impact on even some of the biggest companies' practices in ways that I see rolled out. So I love doing that. That's great. I totally get that. I think product counseling work, being product counsel, is maybe one of the best jobs in a business, and certainly one of the best on the legal team. Do you have a professional pet peeve? Wow. I guess what frustrates me is that we are now in this wave of legislating and legislating and legislating — and that's fine, but build on what is already the current law when you are legislating. Let's take dark patterns.
00:53:24
Speaker
Everyone's banning and restricting dark patterns, because who would be for them? And what I keep asking is: isn't that already illegal? What are you looking to expand? We have a hundred years of FTC and state AG history on what is deceptive, what is unfair, what is prominent disclosure, what is going to confuse people, what is good enough.
00:53:46
Speaker
If you're telling me that wasn't good enough, okay, let's talk about where we need to expand it. The CFPB has authority that covers not just deceptive practices but abusive ones, and that maybe means something more. Professor Lior Strahilevitz of the University of Chicago wrote a wonderful paper in which he posited what might be a dark pattern and yet legal under current law. He did a bunch of consumer tests, and he says, you know what, there are some examples where it passes muster — it's disclosed and all that — but when we poll people afterwards, they regret the choice they made. They don't say, well, I didn't know there was some catch there, I didn't see the opt-out. They say: I wouldn't have done that, now having understood what really ended up happening. But it was
00:54:38
Speaker
adequately disclosed, right? So maybe that's where we're going, but that's not where the policymaker discussion is. People are banning things that are already illegal. Now again, maybe we're changing the standard and saying: no, no, it needs to be equal, we care so much about this. We don't just let you donate your kidney. We say, hey, you need to know what you're doing, we've got to sit down, it's not just a matter of filling out a form. You want to tie your tubes, you don't want to have any more kids? You go for a psychological evaluation first, right? You've got to do all this stuff. Now, if we're saying, as perhaps our European colleagues have — or frankly, even California law — hey, it needs to be equal: yes, I want the tracking, no, I don't want the tracking, as opposed to letting me try to persuade you a bit. No, it's got to be equal and fair. Okay, then we've moved the bar. But we're not saying that. So the European digital strategies, again —
00:55:32
Speaker
here's what we want: you must share data with researchers if you're a platform, but don't violate GDPR. If you could share already without worrying about GDPR, there'd be more sharing. There are legitimate barriers. So AI — we're regulating AI without recognizing what's already there. We did a report that looked at all of the automated decision-making cases under GDPR. There are many of them. That's AI, right? Automated.
00:55:59
Speaker
And so we said: look, you want to regulate AI? Awesome. But here's what your current law already is. Now build on top of it, or amend it — amend GDPR. Oh, we can't do that. So: conflicts of laws, not understanding what the current legal structure is and building on it, instead just jumping in and legislating or regulating without an understanding of the needless, complicated confusion you might be creating.
00:56:30
Speaker
Super interesting answer. Yeah, our listeners have a lot to learn from you and from the work that FPF is doing. Okay, last couple of questions for you. I'm always looking for books to read myself, and also recommendations for our audience. It doesn't have to be privacy-related, but it might be fitting if it is — a book or a couple of books that you think folks should pick up for their next flight from LA to San Francisco.
00:56:58
Speaker
A historian named Sarah Igo wrote a phenomenal book — The Known Citizen, a history of privacy in the US — and you read it and you're like, oh my God, this is the debate. Exactly what we're seeing today happened when Social Security numbers were rolled out, and some of it isn't intuitive. There are so many lessons from history about how new initiatives, government or private sector, were treated, and about the cultural and societal context. So it's a wonderful, wonderful book.
00:57:33
Speaker
It's got feminism and privacy, power and privacy, government and privacy. Highly recommended. The best privacy-by-design book is one by a Dutch academic, Jaap-Henk Hoepman, called Privacy Is Hard and Seven Other Myths. It really walks you through a number of scenarios and analyzes what went wrong and what you could do about it. That's another book I really appreciate. Orly Lobel, an academic in California, has a wonderful book about AI and data called The Equality Machine, and she goes through all the things that can go wrong, but then all the things that could go right and be better if you did it right. We're worried about AI in hiring — we want humans in the loop — but guess what? Humans are terribly biased.
00:58:21
Speaker
And so maybe the technology is biased, but we can examine it and we can perhaps fix it and tweak it. I can't really fix what's in your brain when it turns out that you look at people, and if they have a certain name or a certain skin color or certain hobbies, you think they're better employees — and that's buried deep in your heart. So she does a really good job of unpacking the challenges of the technology and the solutions. Nita Farahany is a wonderful academic at Duke who's the leader in neural rights and brain privacy. And you think, that's kind of futuristic, science fiction — Musk is implanting things in people's heads. But she walks through all the ways our thinking and our emotions and our brains are already being used, manipulated, and managed, and all of the things that are coming quickly our way. So it's a very timely book that helps us get beyond the cookies and the tracking to the much broader issues that are starting to evolve as cameras and technology and sensors
00:59:26
Speaker
come to know even more about our thinking and our sentiment. Again, she's an optimist. It's great that we will be able to control computers and drive cars and use our brainwaves to ensure that people with disabilities can better interact with technology and the internet. But here are the downsides, and here's what we need to do to manage this in a way that will be good for industry and good for society.
00:59:52
Speaker
Great recommendations. We will add those to the show notes. My last question for you, and the traditional closing question that I have for all of our guests: if you could look back on your days as a young lawyer, maybe when you were at the law firm practicing real estate law, just getting started, what is something that you know now that you wish you'd known back then?
01:00:18
Speaker
I'm really gratified by the enormous number of people that I've been able to mentor, whether junior or senior, and to share advice with. I was not a good mentee as somebody junior. I just didn't find mentors. I worked for important leaders and I worked in spaces where I could have, but I was just too awkward or maybe just not comfortable enough
01:00:48
Speaker
to take advantage of building those relationships. So I kind of got lucky, and I was ambitious and things worked out. But take the time. I probably would have thought I was imposing on somebody back then if I asked them for help or asked them to mentor me. But the reality is,
01:01:10
Speaker
people who are in a position to mentor want to do so and are gratified by it, and you don't take up too much of their time. I didn't know that. And I see people here who really do take advantage of it and make sure to build those relationships, and then others who are maybe too shy or too embarrassed or don't want to impose. So I wish I had been bold enough or confident enough to seek out and build those relationships, and I encourage young people to take advantage of the fact that people would like to share their perspective with you.
01:01:57
Speaker
That's a really thoughtful answer. Jules, thank you so much for joining me for this episode of The Abstract, and for such an interesting conversation. Great to be with you. And to all of our listeners, thanks so much for tuning in, and we hope to see you next time.