Introduction to Microsoft’s Role in Elections
00:00:09
Speaker
Hello, you're listening to Observations, a podcast where we talk about democracy, how it works, and how it could work better. Today, I'm joined by Microsoft's Senior Director of Global Elections, Dave Lichtman. He's going to discuss how and why one of the world's largest technology companies takes an interest in safeguarding elections happening in democracies around the world. Thank you so much for joining us, Dave. Thank you for having me, Lily. I appreciate it.
Why Microsoft Supports Democratic Elections
00:00:37
Speaker
So why does Microsoft take an interest in elections happening worldwide? It's a great question, and one we get a lot. People look at us as one of the largest companies in the world, and they say, why would Microsoft care? Why does Microsoft do this? And it's actually a pretty easy case to make. We make something like 95%...
00:01:00
Speaker
It might be higher; it depends on the year and circumstances. But 95% of our revenue comes from democratic countries. And that shouldn't be surprising to anybody who understands the relationship, whether you agree it should be that way or not.
00:01:14
Speaker
Capitalism and democracy are inextricably linked. We do well where rule of law is standard, where we know what the regulations are going to be, where we know we can't just be arbitrarily kicked out of a city because we get crosswise with the government.
00:01:31
Speaker
Those are good things. They lead to stable conditions where we can do good business. And so we often say: democracy is good for business, so business has to be good for democracy.
Election Work as Corporate Social Responsibility
00:01:42
Speaker
So is this about doing something that's good for shareholders or for society? Both. I mean, why not both, right? The work we do is part of our corporate social responsibility mission. My team is not in sales or marketing. Sometimes we're even agnostic about technologies when we go and talk to election stakeholders. But I think it's really important that...
00:02:10
Speaker
It's good for us to have a little self-interest in the work we do, because it means that it's core to the business and it'll survive. And I think some of our listeners will be interested to learn that Microsoft actually has an elections department and a global head of elections. So can you tell us a little bit about what your role actually involves and what your day-to-day looks like?
Lichtman’s Role and Focus on Democracy Support
00:02:32
Speaker
Sure. Yeah. It is unique. A lot of other companies have elections directors or do election work, but it's very sales or marketing oriented, especially among our competitors in the big tech space.
00:02:46
Speaker
But we look at this as an opportunity to serve, sometimes existing customers, sometimes the space in general, in a way that helps them do their jobs better and protects them, right? Look, there's a group of stakeholders we'll call those that are important to societal resilience, which is actually what our larger team is called, Societal Resilience: human rights defenders, journalists, social good organizations, and so on.
00:03:23
Speaker
Election observers are a great example. These are groups and stakeholders that are critical to society, but they're often under-resourced. They're often highly targeted, and they need help, right? They need help with security. They need help with productivity. They need help doing their jobs and doing them securely and safely. And so that's where we step in.
00:03:44
Speaker
Oftentimes I'll work with existing customers, and there are a lot. In the US, it shouldn't surprise you that something like three quarters of elections offices in the country use us for email or for cloud services. And that lets me come in and provide extra free resources to keep folks safe and secure.
00:04:06
Speaker
Okay, and what resources is Microsoft as a company putting towards
Composition of Microsoft’s Societal Resilience Team
00:04:10
Speaker
this elections work? So can you tell us anything in terms of figures, like the amount of money going into this or the number of staff working on this project?
00:04:18
Speaker
Our societal resilience team is about seven people working in different areas: media literacy, information integrity, our larger societal resilience work. We do a lot of work on frauds and scams.
00:04:33
Speaker
So it's varied work. I'm the one person who's solely dedicated to elections, so you can imagine that's a lot, right? It's me, and I have a project manager, and I do have the rest of my team; we all help each other with our various subject areas. But I'm responsible for the whole world.
00:04:52
Speaker
So I can't get into dollar figures, but it is a commitment for the company, and we've restated our commitment every election cycle. I expect we'll do the same this year in 2026.
00:05:06
Speaker
And you talked about democracy being existentially important to Microsoft: Microsoft doing well in democratic nations and preferring that system of political thought and process. But what does Microsoft perceive to be the biggest threat to democracy today?
Major Threats to Democracy: Misinformation and Cyber Attacks
00:05:26
Speaker
The biggest threat to democracy is
00:05:33
Speaker
Those external forces, usually state actors from countries that disagree with us ideologically, that are trying to erode the fundamentals of democracy in various ways, right?
00:05:46
Speaker
Polluting the information ecosystem, attacking the very fundamentals of democracy, attacking us on multiple fronts: in cyberspace, physically in some cases. There are lots of different attack vectors, but the answer is oftentimes state actors who disagree with us anyway and are trying to directly impact our democratic values.
00:06:18
Speaker
Okay, because one area that Microsoft seems to focus on is this problem of foreign interference. So what kinds of interference attempts by foreign nations or groups have you seen in relation to trying to destabilize democracy or interfere with elections?
00:06:35
Speaker
When we talk about foreign actors, we usually talk about the big three: Russia, China, and Iran. Those are the three countries that are most active in the space. And they tend not to operate as a first party, necessarily.
00:06:53
Speaker
They often will contract groups or spin up entities within their various state security apparatuses, or external apparatuses, their equivalents of external security services. And those would be set to task, right?
00:07:11
Speaker
Here, you're going to do this thing to this election; you're going to do that thing to that election. And that might mean a disinformation campaign. That might mean a hacking attempt. Russia has always been strong on the disinformation game. You know, the word disinformation comes from Russian.
00:07:30
Speaker
It's been a thing of theirs for the last 120 years, so it shouldn't be surprising that that's their main weapon of choice. The Chinese often focus on hacking, right? They're more cybersecurity focused. They're trying to infiltrate accounts and exfiltrate data.
00:07:48
Speaker
The Iranians are kind of a mixed bag. They've done everything in between, and even things like attempting to recruit folks on LinkedIn, which was an interesting vector that we saw in 2020. Yeah.
00:08:06
Speaker
Yeah. That's affected the UK context as well, I think. So how widespread is this problem? Are you seeing it happening across democracies, or is it something you've mainly seen focused in the US context?
00:08:18
Speaker
What happens is there's a baseline, right? These groups are all semi-active in some sense on a regular basis, in various ways. And then around certain events, they will activate. We saw a lot of focus from external groups around the EU parliamentary elections in 2024,
00:08:41
Speaker
around the US elections in 2024, around the Olympics in 2024. 2024, the year of elections, was a big year for these external groups.
00:08:53
Speaker
Less so last year, and we haven't seen a lot this year. But there also haven't been any really major events. Okay.
Strategies Against Misinformation Campaigns
00:09:02
Speaker
And something the rather catchily named Microsoft Threat Analysis Center picked up on in the 2024 US elections was that some of this misinformation being spread online by foreign groups was actually trying to amplify concerns about the election being rigged, voter fraud, and other electoral integrity issues. So does this misinformation tend to be focused on the actual integrity of the election process, and...
00:09:31
Speaker
is undermining trust in elections really what they're trying to do? Oftentimes, yes, though I will say there's a bit of an organic focus on that, right? There's now a large chunk of the United States population that believes that our elections in 2020 and 2024 were rigged or stolen, which is unfortunate because they most certainly were not.
00:09:52
Speaker
In fact, they both went in opposite directions, and so sometimes I wonder how people can hold those two thoughts in their heads simultaneously. But it is what it is. Oftentimes, these groups seek to amplify existing narratives, right?
00:10:05
Speaker
That kernel of conspiracy theory, or whatever it is, already exists in society, and they'll drive a wedge. They'll take that thing and amplify the hell out of it to make it more contentious, to make more people fight about it. And they'll oftentimes work both sides of the issue, right? They won't just drive one side; they'll drive both. That way it creates acrimony, and that's destabilizing. Yeah.
00:10:34
Speaker
Oh, obviously. Yeah. It's very destabilizing, and it's incredibly impactful to democratic values. It's attacking a very fundamental thing to democracy, right? Elections. Yeah. And can you tell us a bit about the scope and scale of Microsoft's involvement with
Microsoft’s Tools for Election Security
00:10:49
Speaker
elections? Because there's the elections department, of which you're in charge, I suppose. And then there's also the Microsoft Threat Analysis Center. And you've got things like ElectionGuard, which tries to protect the results of elections, and AccountGuard, which prevents phishing attempts against political candidates, among other things. So can you just talk us through the full scale of what you're doing with elections?
00:11:11
Speaker
Yeah. Look, the bad guys are working full spectrum, so we have to also, right? They're attacking every little piece of elections where they can. And so it's incumbent on us to figure out how we can exercise Microsoft technologies or resources to protect every bit that we can.
00:11:31
Speaker
Let's talk about cybersecurity first. Microsoft is a huge technology company. It shouldn't surprise you; I think I said something like three quarters of US elections officials use us for email. A lot of them use us for cloud services. The other chunk is Google and AWS, right? If you got...
00:12:00
Speaker
Microsoft, Google, AWS, Apple, and let's say Cloudflare into a room together, that's probably 95% of the technologies that all elections officials use in some sense. So oftentimes there's not a single elections official we talk to who isn't using Microsoft technologies in some aspect of their work.
00:12:22
Speaker
You know, in the US, people are overly focused on voting machines, but look at the whole process end to end: there are websites that people run, voter registration systems, results reporting, election management systems, poll books, check-in, the e-verification systems that everybody's using in Africa now. Those are all technological solutions, and half of them touch our technology in some way. And so we come to elections officials and say, look, if you're using our stuff, let us help you use it more securely.
00:12:56
Speaker
Let us give you these other free resources to keep you safe. A great example: oftentimes we don't know if an elections office is a customer. I know that sounds silly, but we don't know all of our customers. We're such a big company.
00:13:11
Speaker
And oftentimes elections officials are buying things at rack price on their own, not even with a government deal. And so we don't know that they're customers.
00:13:22
Speaker
Simply finding out they're customers is helpful to us, because we can protect them further. We can identify them. You mentioned our Threat Analysis Center; we also have an entity called the Microsoft Threat Intelligence Center, which focuses on cyber actors.
00:13:35
Speaker
And simply flagging the existence of an elections authority on our products is helpful to our threat intelligence center, right? Because they can see what's happening with them.
00:13:47
Speaker
Sorry, I was just going to say: does it create a risk if all of the elections officials are using the same technology source? Oh, absolutely. Absolutely. So Microsoft is embedded in all of it. Okay. Yeah, and I'm not going to pretend otherwise. Again, it's good that we have skin in the game when we're out there doing the work we do, because it gives us staying power, I guess; the fact that we're not going anywhere. We're also mitigating risk for the company, because if somebody's out there using our products unsafely in the conduct of something so societally critical and things go wrong, that looks bad for them and for us, and it's bad for society. So we have an obligation and a self-interest in protecting these folks. Yeah.
00:14:36
Speaker
You mentioned AccountGuard. We actually have a whole program around our email infrastructure to make sure that people are using our email in the most secure way, because still, and this can't be said enough times, email is still the number one way that people get compromised. Phishing campaigns are still the number one way that people are attacked online.
00:14:58
Speaker
It seems crazy that foreign interference can still be happening through something as simple as a phishing email. Well, let's talk about it from an observer lens. The bad guys will go to the easiest available target.
00:15:14
Speaker
And in 2016, that was political campaigns, right? We saw high-profile attacks; the DNC in the United States was attacked. But then campaigns got wise, and they started rolling out protections and training their staff.
00:15:28
Speaker
So then the bad guys went one level down. They started attacking think tanks and nonprofits that were associated with the political process, in 2020. Then everybody got wise, started protecting themselves, started training their staff. And so now the bad guys are going one level down again: attacking personal accounts, attacking maybe the next rung of civil society that's not as well trained or protected. So it's critical for civil society stakeholders around elections, like observers, to protect themselves as best they can.
00:16:00
Speaker
And do you feel like governments and security services are able to keep up with the development of technology and the development of different techniques that foreign actors or malicious actors can use to try and interfere with elections?
Global Collaborations for Election Protection
00:16:14
Speaker
I believe that some are. I will give a shout-out to the NCSC, the National Cyber Security Centre in the UK. They're one of the best in the world at what they do.
00:16:26
Speaker
And they have elections top of mind every year, in a great way. We usually have, well, I was just on the phone with them last week. They're excellent to work with, and they care a lot about protecting even local elections. Well, I'm sure they'll be pleased to hear that if they're listening.
00:16:48
Speaker
And how else are you involved with elections in the UK? Are there any other organisations that you're working with, perhaps the Electoral Commission? Yes, we do work with the Electoral Commission. And in 2024, during the general election in the UK, we worked with Democracy Club. So Microsoft has a whole lot of technological solutions, clearly. We also have a whole set of front-end, general consumer products in the information ecosystem, right? We have search.
00:17:23
Speaker
Yes, most people use Google, but a not insignificant chunk of people use Bing for their searching. And Copilot is one of the largest AI chatbot solutions. So it's incumbent on us to make sure that our answers around elections are correct and secure, that they're not disenfranchising anybody, and that if you ask where to vote, when to vote, who's on your ballot, things like that, we're always giving the right answers. And so we've worked with entities like Democracy Club in the past to make sure that we were always producing correct results.
00:18:01
Speaker
And in the US, we have a deal with what's called the National Association of State Election Directors to do the same. And on that point, it feels like AI tools have opened up new possibilities for spreading disinformation, such as the creation of fake videos of political candidates, and that's one way we're seeing foreign interference happening. So could you just explain what is meant by a deepfake and how easy it is to actually create one?
Deepfakes and Their Impact on Elections
00:18:24
Speaker
Sure. So a deepfake is generally an unauthorized use of someone's likeness to deceive somebody else, oftentimes in video form.
00:18:37
Speaker
I think that's typically what people mean when they say deepfake. Unfortunately, deepfake gets thrown around in a lot of different contexts, but when we say deepfake, we specifically mean a malicious intent to deceive.
00:18:49
Speaker
Because there are positive uses of fakes, right? I often hold up Imran Khan: when he was running for prime minister in Pakistan's election in 2024, he would ship speeches to his staff, and they would generate fake video of him giving the speech,
00:19:09
Speaker
which is awesome, because he was in prison and couldn't give that speech from prison, right? So that's a good use of likeness. But when we saw fakes of Kamala Harris giving a speech she didn't give in 2024, that's obviously a deepfake. That's a bad use. We take it very seriously at Microsoft.
00:19:31
Speaker
Actually, one of the ways we work with the UK commission and other commissions is a constant update of candidate lists. During an election, we actively block election candidates from generation on our systems.
00:19:49
Speaker
So if somebody's running right now... We actually haven't done it yet for the UK, but we will in the coming months: when all the candidate filing deadlines have passed, we'll start loading lists for the Scottish Parliament and the Welsh Parliament, and we'll make sure that they're all blocked from generation. So it's time-limited; it's during election periods that you block generation, but outside of them people could potentially be creating these.
00:20:18
Speaker
Yeah, there's an argument that you should be able to make a video of Keir Starmer for free speech reasons, for satire, things like that, right? He's a public figure.
00:20:35
Speaker
But if there were a general election called, we would still block generation on his name during the election. And is there anything that listeners can do to spot when they're seeing content that's been faked?
00:20:46
Speaker
Oh, absolutely. I mean, I think it's interesting. We thought in 2024 that it was going to destroy democracy, right? People were going to be deceived by deepfakes left and right, they wouldn't be able to tell what's real and what's not, and in the end it would be devastating to democracy. And it just didn't pan out that way. And we think that people are somewhat inoculated.
00:21:11
Speaker
People are inoculated because, especially when national figures are faked, they've seen them enough that they know when something's wrong. The human brain has a really unique ability to detect when something's off a little, right?
00:21:25
Speaker
So you see a video of Kamala Harris and she's saying words that make you think: wait a second, would she actually say that? And I'll give credit to the Australian Electoral Commission. They had a brilliant campaign last year during their election in May called Stop and Consider.
00:21:42
Speaker
Very simple messaging, right? If you're about to share something, stop. Think about whether it's real or not. Think about whether the person would actually say these things or not. It's a simple message, but I think it deserves hammering as much as possible societally. I often say to my staff: take a beat, right?
00:22:03
Speaker
You're about to do something? Take a beat. Think about it. Stop and consider. Yeah.
Microsoft’s Ethical AI Use in Elections
00:22:08
Speaker
But I suppose putting the onus on the viewer to spot problems with AI is a narrative that's quite favorable to tech companies. So just to put this out there: Microsoft positions itself as the solution to problems with AI, and to protecting people from misinformation and deepfakes and the problems they're causing in relation to elections. But actually, Microsoft and other tech companies have helped to fund and develop and disseminate these tools to the public in a way that can be harmful. So do you also view Microsoft as being part of the problem?
00:22:42
Speaker
I view the generative side of AI to be part of the problem. I do not view Microsoft to be part of the problem. I believe that we are very disciplined and principled internally on the way that we allow use of our tools for generative content.
00:22:57
Speaker
That is unfortunately not the case at some of the competing tech entities, and without naming names, I think you can tell who I'm implying here. We can only do our part, right? I only have control over what Microsoft generates.
00:23:13
Speaker
And so I actually genuinely believe that we are not the problem here. Okay, fine. But again, the ecosystem is large, and we only have our piece of it that we can control.
00:23:27
Speaker
Okay. And one area that Microsoft has been involved with in elections is observations.
Global Election Observations
00:23:34
Speaker
So Democracy Volunteers, the organization behind the Observations podcast, trains volunteers to observe elections, to make sure they're carried out in a free and fair way. So we're very interested in this type of work. So where have you been observing elections, and can you talk about how it feeds into your work?
00:23:51
Speaker
I've personally done a lot of observation locally in the US with various political entities, but I was privileged to go to Zimbabwe with the Carter Center in 2023. And the woman who runs our team went with IRI to Kenya in 2022.
00:24:10
Speaker
We've been invited on various other missions, and, you know, scheduling hasn't worked out, but I'm hopeful that we can continue working with some of these other groups: with the OAS and the Carter Center and IRI and NDI specifically.
00:24:23
Speaker
And actually, I'll just say, I appreciate that I was just invited to join Democracy Volunteers in Scotland in May. I don't know if I'm able to join, but I appreciate the invite. Well, I'd like to see that.
00:24:36
Speaker
Yeah. Look, I think it's absolutely critical. The UN considers elections observers to be human rights defenders.
00:24:49
Speaker
And so do we. A lot of our team's work is in defending human rights defenders. Human rights are critical to our company: upholding those values and protecting those who defend them.
00:25:03
Speaker
If you'll allow me a personal point here, I'd actually just like to call attention to something, and people can see more about it on my LinkedIn. The head of what we call GNDEM, the global network of election observer networks run by the National Democratic Institute, NDI...
00:25:25
Speaker
The chair, Dr. Sara Bariti, who is Ugandan, is currently in prison. She was arrested on December 30th, ahead of the Ugandan election.
00:25:38
Speaker
And for about the last three weeks, she's been in prison on charges that can only be political. She is a political prisoner, as far as I'm concerned, and the Ugandan government, which is holding her as a political prisoner, needs to release her. They conveniently set her bail hearing for yesterday, a week after the election; she's been denied bail for a week and has another hearing next week. There is a GoFundMe to raise legal funds for her defense.
00:26:10
Speaker
It's an absolutely ridiculous situation, one that the international community should not tolerate. Okay. Well, thank you for raising that. I just want to say: for you, Sarah.
00:26:22
Speaker
And were there any practices that you saw in the course of your election observations that have stood out to you and informed what you're doing at Microsoft?
Beyond Election Day: Broader Democratic Processes
00:26:32
Speaker
Yeah, I think something was really eye-opening for me. I mean, I knew it, but when you see it, it's different, right?
00:26:41
Speaker
And I think all the people who do this observer work fully appreciate this: we have a tendency to focus on elections over everything, and we tend to unfairly associate elections with democracy.
00:27:02
Speaker
as the like We conflate them. And they're not the same thing, right? ah I think that if you look at recent elections in the last year in Georgia,
00:27:13
Speaker
and, I'll say, in Zimbabwe, where I witnessed it: you can run a great election on election day. The elections authority can do a great job running the election.
00:27:28
Speaker
But that's not the thing, right? The thing is everything that goes on around it. If you disqualify opposition candidates, and you make it difficult for people to vote, and you knock people off the rolls, and then you announce results in darkness, it doesn't matter whether the election itself was run well if everything on either side of it was not.
00:27:49
Speaker
And so it's critical to observe the process end to end, right? And to make sure that the whole world sees, and that transparency is brought to the whole process.
00:28:00
Speaker
And it's also critical for people to know that elections are often used as a fig leaf. We see fake election observers being invited to a lot of, we'll call them,
00:28:14
Speaker
anocracies, right? Places where democracy is a show, but it's not actually happening. I think Uganda is a great example. But even in areas like Zimbabwe, where they actually ran a pretty decent election, the election was not free and fair.
00:28:33
Speaker
Yeah. Right. And in terms of the work that you're doing, I mean, it's quite involved: observing elections end to end, and thinking through problems that you're
Microsoft’s Diplomatic Role in Cybersecurity
00:28:44
Speaker
perceiving with the process. And it feels in some ways like that goes beyond the scope of what you would expect a tech company to be interested in and working on. And so does Microsoft engage in what is basically diplomacy and statecraft and intelligence and these other kinds of areas? Is that something it's involved in, and how does it see itself as an entity?
00:29:07
Speaker
I appreciate that you said that, and I don't want to dive into that too much, but you're not wrong to say those things, right? We're a very large entity, and in some sense we are unavoidably on the international diplomatic scene, especially when it comes to cybersecurity and threat intelligence. Look, I will say that it's not just elections officials who use our email.
00:29:37
Speaker
It's everyone in the world. And in fact, most email in the world runs through us or Google. Yeah. And so you can imagine the mountain of threat intel we sit on top of.
00:29:49
Speaker
It's incumbent on us to work with governments to share the threat intel that we process as a result. Yeah. Yeah.
00:29:58
Speaker
And yeah, thank you, that's interesting. I appreciate you can't answer it in full, but it's a point that's interesting to me, because on the website it talks about tech diplomats at Microsoft. And obviously that's the kind of term you use when you're thinking about statecraft and international relations.
00:30:19
Speaker
And we have tech diplomats. We actually call the team Digital Diplomacy. They're out there working on cyber regulation and things like that.
00:30:31
Speaker
Okay. And in the observer context in the UK: you've been talking to other election observers and election officials about how they could be using technology and AI. So could you tell us a bit about how you see technology being embedded into elections?
AI’s Role in Elections and the Need for Oversight
00:30:48
Speaker
Well, we've been talking a lot lately with election stakeholders about what we'll call productive use of AI in their work. One thing we're doing right now is actually a hands-on workshop, where we'll sit down with elections officials and go through exercises.
00:31:07
Speaker
And the exercise we do is kind of fun. It's a tabletop. We say: here's your old elections law, here's your new elections law, and you have a snap election in 10 days.
00:31:19
Speaker
Now, I want you to communicate to your voters the changes in the election law and the upcoming election. And we give them tasks. We say: write a press release, create some social media graphics, create a calendar for your social media campaign, translate into three different languages; really tactical things that you can do with AI. And I think it's cool. It's been really eye-opening for a lot of AI officials... sorry, elections officials, I mean. We're not there yet.
00:31:52
Speaker
A lot of them have been scared to use AI, because they see it in the context of deepfakes and the badness it brings to the elections and democracy space, and they're afraid to experiment with the good stuff. So we try to help them see that, deepfakes aside, most of this is actually mundane. Most of this is just helpful in your communications process, in alleviating some of the grunt work you might have to do, some of the mundane stuff that
00:32:23
Speaker
might take you three hours normally, but here you can do it in five minutes, right? To create a graphic or whatever. That's a huge positive. Most of these elections offices are dramatically under-resourced, which is unfortunate.
00:32:35
Speaker
But giving them an extra pair of hands in the form of a generative AI engine is usually helpful. So we're just out there showing them how they can do their jobs better, more productively.
00:32:50
Speaker
Okay, because I feel like when we talk about embedding technology into processes, we think about greater efficiency and improvement. But in the UK, we have a very paper-based system for voting: you pick up your poll card, you go physically into a booth and you vote on a slip of paper, and then that's counted physically, in person, by the election officials. And you can watch that happening over election night; you see it happening in centres. Isn't the lack of technology actually part of what builds trust in the process in the UK context?
00:33:27
Speaker
Sure. I mean, you say lack of technology, but again, I'll point out, you're focusing on a very narrow slice of the process, which is election day, right? Which is just...
00:33:38
Speaker
I'm here, I'm given a ballot, I'm casting it, it's being counted, all in the span of 12 hours. But there's so much that happens around an election. The qualification of candidates, the... you know, our friends at the University of Bristol have worked with the commission on processing receipts for campaign expense reporting with AI. Like, that's...
00:34:00
Speaker
That's undersold as a huge thing. The transparency around the process, the reporting and mechanics that go into filing, making sure that people are on the lists, reporting out the results, things like that. So there's so much more to the process that is technologically enabled outside the narrow 12-hour window of Election Day.
00:34:21
Speaker
Okay. And if one of the election official teams, or teams of observers, somewhere like Zimbabwe started using Microsoft technology to help with its election process, would they be getting that for free?
00:34:35
Speaker
No, that's usually illegal for us to give to a government. Are you saying government officials or elections observers? I suppose I'm asking about both, but we can treat them separately. Governments, we often can't provide things for free.
00:34:49
Speaker
There are just laws. And so we do where we can and we don't where we cannot. If they're already using our technology, we can usually come in and provide them with security extras for free. Nonprofits, yeah, we either significantly reduce the cost or provide it for free for not-for-profit organizations, NGOs.
00:35:11
Speaker
Okay. And do you give much consideration to the flip side? We've talked about the adoption of AI and Microsoft's use of AI in the way that it safeguards elections and protects democracy, but conversely, do you think about the ways that it could undermine democracy or trust in democratic systems? For example, if lots of workers lose their jobs to AI, or the expansion of the big data centers powering AI leads to electricity bills going up, or to land use issues and environmental pollution, won't those problems undermine public support for democracy long term?
AI’s Impact on Jobs and Democracy
00:35:48
Speaker
Sure, and I think it's important to call attention to this: my boss, Brad Smith, the president of Microsoft, made a pretty big announcement last week in DC about our commitments to paying our fair share, making sure that electricity prices do not go up because of our data center usage, and making sure that our environmental impact is entirely mitigated.
00:36:14
Speaker
We consider those very important at Microsoft. Okay. And in terms of the risks of embedding AI into lots of different industries and processes, I'm really thinking about things like outsourcing critical thinking skills to AI. Do you think there are going to be problems long-term with that? Because we've talked about using AI to summarize the differences between old legislation and new legislation and to write press releases, but you don't want people to lose those skills.
00:36:44
Speaker
That's totally fair. I mean, that's a societal issue as we deal with a new technology, right? People probably thought when books were first printed that people were going to lose the ability to tell a good story. I'm certain that attitude is slightly overblown, that we're all going to lose critical thinking skills as a result of having a work helper in the form of AI. But point taken. Yes, and one thing that I skipped over before is that when we're working with election stakeholders, we stress over and over again, based on the criticality of the exercise, that human involvement is paramount. So we say human in the loop.
00:37:32
Speaker
I'll say it 50 times when I'm doing a training: human in the loop. So we don't encourage the use of AI in anything that is considered critical. So, for instance, the tabulation of votes.
00:37:45
Speaker
That's off the table. We do not recommend that you use AI in any form right now in the tabulation of votes. But if you're creating social media graphics to tell people about an upcoming election, sure, that's a great way to use it, right? And even then, human in the loop, right?
00:38:01
Speaker
Don't just have AI create something and then throw it online. Make sure that somebody's reviewing it before you do. Okay. So it's basically better for superficial uses?
00:38:14
Speaker
I mean... There are lots and lots of uses. But when we're talking in the elections context, like you said, elections are about trust. And we would never, ever recommend anything to an elections official that might degrade trust in the process.
00:38:29
Speaker
Yeah. Okay. Thanks. That's really helpful.
Legal and Ethical Implications of AI Content
00:38:33
Speaker
Something I just wanted to come back to is this idea of someone creating a video of Keir Starmer and that being protected speech. Because is that not a nightmare from a defamation perspective?
00:38:45
Speaker
And is that something that you have to consider? I mean, I don't want to overly weigh in on this right now, because I believe there is actually an acrimonious argument happening between Keir Starmer and the US government over this very thing right now. I believe the UK wants to block all instances of that on TikTok, and the US believes that is an undue limitation on free speech. So I'm not here to take sides.
00:39:15
Speaker
We follow the law. Okay. That's a diplomatic... I'm sorry. I say we follow the law, but we do go an extra mile when it comes to protecting candidates for election.
00:39:26
Speaker
And we do go the extra mile when it comes to protecting people from unauthorized use of their likeness for certain suspect purposes, so non-consensual intimate imagery or child sexual abuse, those kinds of things. I think we do a very good job keeping that off of our generative systems. But this argument has come up recently because of X, and Grok specifically, and its inability to prevent those things from happening.
00:40:05
Speaker
Okay, well, thank you so much for your time. It's been really fascinating to hear what Microsoft is involved in when it comes
Lichtman’s Commitment to Supporting Democracy
00:40:13
Speaker
to elections. But is there anything that you'd like listeners to take away or any closing thoughts that you have about elections?
00:40:19
Speaker
Yeah. First of all, thank you for having me. I really appreciate it and enjoyed the discussion. I can't overstate that Microsoft is committed to this work, and we're in it for the long haul. And look, I personally just love this. I love working with people who care about protecting democracy. I didn't say it at the beginning, but this is actually recent: I just joined my local board of elections last week.
00:40:46
Speaker
I live in Arlington, Virginia, which is just across the river from D.C. So now I'm one of the three people in Arlington who certify the election on election day and oversee the operation. You know, just a board member; I wouldn't claim that I'm doing the hands-on work or anything. I'll say we have the best elections department in all of the United States. But I'm just excited, because I love elections so much I do it in my spare time, too. Well, you're in the right place then. Yes. So the opportunity to talk about this stuff with people who care is always a joy.
00:41:19
Speaker
And what inspires that interest in elections and making sure that they're carried out freely and fairly? Again, back to my earlier point: it's not the only thing, but one of the foundational pieces of democracy is making sure that every voice is heard and that everybody has the opportunity to contribute their voice to democracy.
00:41:39
Speaker
And that's a beautiful thing. It's shown historically to be the best way to run a society and to allow people to participate. And that's wonderful. And it needs to be protected.
Podcast Conclusion by Democracy Volunteers
00:41:52
Speaker
Well, thank you again for your time. It's been fascinating talking to you. Thank you, Lily. All right. Goodbye.
00:42:08
Speaker
The Observations podcast is brought to you by Democracy Volunteers, the UK's leading election observation group. Democracy Volunteers is non-partisan and does not necessarily share the opinions of participants in the podcast. It brings the podcast to you to improve knowledge of elections, both national and international.