Podcast Introduction
00:00:11
Speaker
Hi there, and welcome to the first ever episode of Infovercity. This podcast is coming to you from Syracuse University's School of Information Studies, where we investigate the intersection of technology, business, and humanity.
Meet Professor Jennifer Stromer-Galley
00:00:25
Speaker
Today, I'm joined by Professor Jennifer Stromer-Galley, a senior associate dean at the iSchool and renowned researcher in the field of social media and politics. Hi, Jenny. Howdy.
Digital Presidential Campaigning
00:00:37
Speaker
In the second edition of her book, Presidential Campaigning in the Internet Age, which came out in 2019, she traces the history of presidential campaigns and how they have adopted and adapted to digital communication technologies from 1996 to 2016.
00:00:53
Speaker
She has more recently been working to make it easier for journalists and the public to track what political candidates say to their social media followers, and she received a John S. and James L. Knight Foundation grant to build an online dashboard that tracks candidate spending and messaging. Jenny has also been a principal investigator on a $5.2 million project called Trackable Reasoning and Analysis for Collaboration and Evaluation, or TRACE,
00:01:22
Speaker
funded by the Intelligence Advanced Research Projects Activity, or IARPA. The project experimented with different tools and interfaces to support complex reasoning in intelligence analysis. And earlier this month, she received a grant from the knowledge graph database company Neo4j to track misinformation on social media and in online political ads in the 2024 US election.
00:01:50
Speaker
This topic is of particular interest as we kick off a major election year in the US and abroad.
Impact of Digital Media on Politics
00:01:57
Speaker
And in fact, 2024 is slated to be the biggest election year in global history, with an estimated 4 billion people going to the polls in 60 countries. So welcome, Jenny. Thank you for joining us. No, it's my pleasure. I'm delighted to be part of the inaugural podcast for the iSchool. I think that's pretty cool. And I really thank you for being here.
00:02:20
Speaker
That was one heck of an introduction that we had for you. That was amazing. I mean, your accomplishments are just incredible. Could you spend a little bit of time and talk about how you got involved and got started in your interests in political elections? Yeah, I can.
00:02:42
Speaker
I don't know if it's embarrassing to say, but I actually got started by studying Bob Dole's website. So this is back, imagine it's 1996. And this really old guy named Bob Dole, who was a senator, had decided to run for president against Bill Clinton, who was running
00:03:04
Speaker
for a second term as president. And this is back when there's no social media, there are barely websites. The World Wide Web is just really starting to diffuse. People in higher education had email addresses and they might be able to create a little website or web page, but it was still a really novel technology. And so I was really struck by the fact that candidates could use
00:03:32
Speaker
digital media to talk directly to supporters. Because imagine before the web, before email, before social media and all the things we currently are consumed by, we mostly got our information from three television stations for news. And maybe by the 90s, we had cable. So we had Fox News, we had CNN,
00:03:57
Speaker
And then we had a newspaper that was delivered, and radio stations. And so the mass media was a very powerful intermediary between political candidates and politicians and the public.
Social Media's Role in Campaigns
00:04:10
Speaker
And so with the internet coming along and potentially democratizing, little-d democratizing, the ways that the public could engage with the political elite in the country and vice versa, I was really fascinated by that. So I wrote my master's thesis
00:04:27
Speaker
on Dole's website and this new political campaign medium. And that's kind of where things started. That's fantastic. You've obviously been involved in social media for a very long time. What do you think has changed about social media since its inception to today?
00:04:47
Speaker
So in the context of how political campaigns use digital technologies, there's been a lot of evolution. You mentioned in the intro the book that I wrote, Presidential Campaigning in the Internet Age. And, you know, it's a history. It looks at two decades of how candidates evolved their
00:05:09
Speaker
campaign strategies as the digital media environment changed, right? Because they both change hand in hand. And we don't give a lot of credit for this, but actually innovations by campaigns do drive technology. And there's actually a long history of this. And so for example,
00:05:30
Speaker
When radio was just starting to become a thing in households in the 1920s and 30s, radio producers used the debates to sell radios, to try to get households to adopt radio and bring this national event of a political debate into people's homes.
00:05:55
Speaker
You see similar kinds of interactions, if you will, between the companies and the technologies, these kinds of innovations in information technology, and campaigns, because presidential campaigns in particular are when
00:06:12
Speaker
you have kind of a mass media event, right? The public, generally speaking, is sort of tuned in. And so it allows for the technology companies to really leverage their technology to say, hey, if you use us, then you can gain additional connection to this kind of national event. So looking across from 1996 through 2016,
00:06:40
Speaker
So much has changed, actually. I was thinking about it as I was coming in today, just how much what we think of today as normal, even a decade ago, would have been seen as really quite, I don't know what the word is, amazing, extreme, problematic. I mean, I personally think we are at a very challenging moment in our democracy.
00:07:05
Speaker
And I do think that political actors can now directly talk to the public, bypassing the news media. You know, the news media has been called the fourth estate, right? So it sort of serves as this intermediary
00:07:23
Speaker
in between the public and political figures. And the news media, journalism as an industry, has really been challenged by digital technologies. The business model for journalism has eroded. And so you see newspapers
00:07:41
Speaker
folding all over the country. You know, here in Syracuse, our daily newspaper is actually only delivered three days a week now, and largely what most people are getting as news is national news from syndicated media like AP or Reuters. And so local news has really eroded, and
00:08:02
Speaker
I think that has a potentially problematic effect, especially around misinformation, which I have a hunch we'll talk a little bit about next. But I think, looking at the current political moment, when you have
00:08:13
Speaker
so many elections happening across the globe, there is a real opportunity for ordinary people to get involved with those political campaigns, and for political actors and political parties to directly talk to their supporters and the public. And that's a good thing. I actually think there is a democratizing dimension to that that's really quite beneficial. The challenge is that
00:08:40
Speaker
bad actors can also misuse that direct connection and can spread incorrect information, whether it's accidental or intentional. And of course we're more worried about the intentional spreading of false and misleading information. And that wasn't a concern back in, say, 1996. In 1996, the concern was really more
00:09:07
Speaker
the fact that the public couldn't really communicate and connect with political actors. There was this intermediary between them. So there's a bridging of the distance today, I think, that social media enables, allowing the public to be more involved in political
00:09:28
Speaker
action and political engagement, finding communities of people who share their perspectives. That's good.
Technology and Political Behavior
00:09:35
Speaker
But then there is always, unfortunately, I think with these technologies, there's always a dark side. And human behavior, there's both good and bad, and the technology enables both to happen. So it seems like if I may
00:09:50
Speaker
sort of digest a lot of what you said there. That's all right. That was great. It seems like throughout history, in the political landscape, there's always been informing, and technology has enabled informing and not informing in some ways. And what we're seeing now, with this explosion, is that the ways someone can be informed and misinformed are becoming exponential.
00:10:16
Speaker
And that presents itself as a problem and a unique challenge, I guess, which is a good way to think about what you're saying there. I thought about that radio example, and I was like, okay, there's a way to inform, like buying people radios. That's how you get to your electorate, right? You've got to let them listen. Now there are so many channels, right? So I think that's one of the challenges.
Strategic Platform Use in Campaigns
00:10:36
Speaker
There's so many channels. If you're a political candidate, what do you do? I mean, like, would you go on Mastodon? Do you go on Twitter?
00:10:46
Speaker
Like, say you want to be mayor of a small town. You don't have an army of people to manage social media for you, but there are so many different channels. So can you talk maybe a little bit about how that works? Yeah, for sure. No, it's funny, because back in, like,
00:11:04
Speaker
2000 and 2004, right? So in 2000, candidates were embracing email. Oh my gosh, email. Wow. But honestly, email, believe it or not, is still an incredibly powerful fundraising tool for candidates. Subscribe to my newsletter. Exactly. Yeah. And so, you know, any supporter, any citizen who's willing to give an email address to a political organization, whether it's a party, a candidate, or
00:11:30
Speaker
an activist group, if you're willing to give up your email, that means that you're committed to some degree. So that becomes this reciprocity of, cool, you care about our cause, then why don't you give us a little money? So email is still really, really powerful. And it never gets talked about when you think about social media, but honestly, email was the first social medium invented in 1972.
00:11:53
Speaker
So we've had social media, actually, for a really long time, but the proliferation of channels is really remarkable right now. And, you know, again, think about 2000. So, email; there were blogs, actually, kind of early-generation blogs; and there were discussion boards, like threaded discussions.
00:12:11
Speaker
I was a big Usenet user way back in the early 90s. I was a Usenet person. Yeah, for sure. Usenet is sort of like the Twitter of two decades ago, right? What comes around goes around. They're different evolutions of the same core thing of allowing people to yell at each other.
00:12:31
Speaker
So, and then 2004 comes along, blogging really becomes hot. MySpace is a thing. And then by 2008, you start to see MySpace, which was still a thing, Facebook and Twitter, email, blogs, and of course the traditional website, because the website's still a staple today for campaigns. And so basically from 2008 until 2020, for the most part, it was Twitter, Facebook,
00:13:01
Speaker
Instagram kind of grew in importance, but not to the same degree as Facebook and Twitter. And Snapchat actually was a little bit of an experiment for some campaigns, in part because of the geofencing. So if you were holding a political rally in Iowa, you could encourage your supporters to use a Snapchat geo filter to sort of say to their friends, hey, I'm at this event for Ted Cruz. And that seemed kind of cool and hip.
00:13:30
Speaker
But Twitter, I would not have predicted, actually, the sort of shift that happened with Twitter that has then enabled and opened up, I think, a new proliferation of social media platforms like Bluesky and Mastodon. There are also, on the political right, new social media platforms like Truth Social, which was started by Trump,
00:13:54
Speaker
as well as Gab, which is kind of a religious slash conservative social media platform. And then some others that come and go; they get pulled down from the internet for various reasons. So there are many, many choices that campaigns now have. Strategically, campaigns generally go where their supporters are.
00:14:16
Speaker
So for the most part, that means that they're going to Facebook, in part because that's where older voters now are. And Facebook still reaches roughly 80% of the public. Now, it doesn't necessarily mean everybody's active on Facebook, but it still has massive reach. And so Facebook is sort of a standard go-to. Twitter, I don't know what's going to happen on Twitter this election season. Things are so unsettled with that platform and its utility.
00:14:45
Speaker
Candidates are on it. We just did an analysis looking at where the candidates are right now for the 2024 election for president, governor, senator, and select House races. Almost all of the federal races have Twitter accounts. They have Facebook accounts. They have YouTube as well. I forgot to mention YouTube. Oh, YouTube has been central really since 2012
00:15:07
Speaker
as a way to take advertisements that you might be running on television and then put them up online to reach potentially other voters and supporters.
00:15:16
Speaker
And of course, now there's TikTok as well, and you do have candidates. It's an uneven smattering, if you will, of candidates that are using TikTok, but they're there too. So as a candidate, you have to make decisions. Where do I think my supporters are? Where can I get the most bang for my buck, if you will, in terms of talking to people that matter?
00:15:38
Speaker
Generally speaking, candidates are not on every platform. It's not worth their energy or time strategically. So where's your voter? Talk to those voters and try to get them to actually turn out to vote on election day, because at the end of the day, that's what a campaign is about.
Research in Social Media Data
00:15:52
Speaker
So let's turn to your research a little bit. So with all this going on, right? And everyone's using these different platforms. And then if you're in a Senate race, you might be looking more at these higher-end platforms. If you're local, you might be looking at different platforms. How do you as a researcher go about collecting this information? And then what do you do with it? So
00:16:20
Speaker
I'm trying to think here. So roughly, so I started here at the School of Information Studies in 2013.
00:16:29
Speaker
From 1996 through 2012, I was looking very closely at how campaigns were using digital media as part of their communication work. There wasn't a systematic way to study, say, Facebook posts by Barack Obama and Mitt Romney in 2012. There was no access point, if you will. I actually was looking back. I have screenshots.
00:16:53
Speaker
like physical, digital screenshots that I took from both Romney's and Obama's Facebook accounts back in 2012, because there wasn't any other way to try to capture what they were saying on those accounts.
00:17:09
Speaker
But there's the beauty of being in the School of Information Studies, combined with platforms like Facebook opening up what they call APIs, which are application programming interfaces. You can think of an API as basically a door between me and the database at Facebook. And Facebook can open up that door. And when they do, then we can actually more easily pull down
00:17:32
Speaker
in database form the candidates' messages, when they posted those messages, and how many people reacted to those messages. So that's been transformative. And so when I started here at the school, I started working with Jeff Hemsley, another colleague in the iSchool, and a few others who have now retired, looking at and collecting social media messages.
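To make the idea of an API as a "door to the database" concrete, here is a minimal sketch of what pulling a candidate's recent posts through a platform API can look like. The endpoint, field names, and token below are placeholders for illustration, not the interface of any specific platform; real platforms (Facebook's Graph API, for example) require registered apps, access tokens, and their own parameter names.

```python
import requests

# Hypothetical endpoint and parameters, for illustration only.
# A real platform API has its own registration steps, endpoints, and fields.
API_URL = "https://api.example-platform.com/v1/accounts/{account_id}/posts"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # issued by the platform

def fetch_candidate_posts(account_id: str, limit: int = 100) -> list[dict]:
    """Pull a candidate's recent posts: text, timestamp, and reaction counts."""
    params = {
        "access_token": ACCESS_TOKEN,
        "fields": "message,created_time,reactions.summary(total_count)",
        "limit": limit,
    }
    response = requests.get(API_URL.format(account_id=account_id), params=params)
    response.raise_for_status()
    return response.json().get("data", [])

if __name__ == "__main__":
    for post in fetch_candidate_posts("candidate_page_id", limit=25):
        print(post.get("created_time"), post.get("message", "")[:80])
```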
00:17:57
Speaker
I've long been concerned about the challenges of really tracking what the candidates are saying to their publics on social media. There's so many messages. As you mentioned, there's a lot of different platforms and it's just really hard to keep track of it all. And I was concerned that for journalists who are following the political elections, many of them don't have the computational skills or the ability to do that kind of
00:18:21
Speaker
collection and analysis work to really see what the messaging looks like. And then there's micro-targeting, which I think is especially a concern. Micro-targeting is basically a candidate saying one message to one constituency and a different message to a different constituency. And I think,
00:18:41
Speaker
thinking about misinformation and all of the challenges in our communication environment right now, when you've got multiple social media platforms and you've got many, many messages, it's easy for bad actors to hide bad messages in the massive volume of information. It's a needle-in-the-haystack problem. The goal when we started back in 2014,
00:19:03
Speaker
looking at governors' races in 2014 and then preparing for 2016, was to start collecting and then building computational classifiers to
00:19:12
Speaker
categorize the type of messaging that we're seeing from political candidates. And so you mentioned the John S. and James L. Knight Foundation grant that we got. So in 2016 and 2020, we built an interactive dashboard. And if you go to illuminating.ischool.syr.edu, you can actually go and see the dashboard and play with it. We only have the presidential data up. I have other data, like governors and senators and stuff, but for the most part, that dashboard is presidential.
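As an illustration of what "building computational classifiers" can involve, here is a generic sketch, not the Illuminating project's actual models or categories: a supervised text classifier trained on hand-labeled messages (for example, labeled "attack" versus "advocacy") and then applied to the full stream of collected posts. The labels and examples below are made up for the sketch.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy hand-labeled examples; a real project would use thousands of
# annotated posts and validate against held-out data.
messages = [
    "My opponent voted to raise your taxes again.",
    "Join us Saturday to knock on doors for our campaign!",
    "She has failed this district for eight years.",
    "Proud to announce our plan for affordable childcare.",
]
labels = ["attack", "advocacy", "attack", "advocacy"]

# TF-IDF features feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Classify new, unlabeled posts pulled from the collection pipeline.
new_posts = ["Their record on jobs is a disaster.", "Here is our roadmap for clean energy."]
print(model.predict(new_posts))  # e.g., ['attack', 'advocacy']
```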
00:19:42
Speaker
And so you can see, like, how much is Donald Trump attacking Hillary Clinton in the 2016 election? And here's a fun fact. It turns out that Hillary Clinton actually was more negative. That is, she attacked more in her advertisements and on her social media accounts than Trump did. So there are these assumptions that say Trump was more negative
00:20:03
Speaker
than Clinton in 2016, but actually Clinton was more negative. Now, there are different types of negativity. And if you look at incivility, which we have a classifier for too, it turns out that Trump is a lot more uncivil than Clinton, or, coming to 2020, than Biden. So that sort of interactive dashboard, that's the beauty of data and data science: the ability to help
00:20:28
Speaker
digest large volumes of data and make it easier for other people to understand it, make use of it, and write stories about it. So yes, those are some of the things that we do with the data that we're collecting. There must be a tremendous volume of data that we're talking about here, right? Because when you think about all of the
00:20:50
Speaker
signals that happen on a daily basis during a major election across all the different social media platforms. This has to be a large, a very large quantity of data, right?
00:21:01
Speaker
Yes, I think somebody said that we filled up 13 servers worth of social media messages. However much that is, I don't even know.
Political Advertising and Misinformation
00:21:10
Speaker
But yeah, and especially because of ads. So one of the things we've been doing since 2018 is collecting and analyzing ads that are run on Facebook and Instagram by candidates. So we started looking at Facebook and Twitter, basically the candidates' accounts, and then Facebook, in 2017, after a scandal involving
00:21:31
Speaker
Cambridge Analytica, which was a private company that was basically selling to political candidates the ability to micro-target advertisements on Facebook based on the personality characteristics of the targets. There's a holy grail quest in political campaigning, going back to micro-targeting, to try to match the right message to the right voter
00:21:58
Speaker
such that you'll pull them to you as a candidate. And so that's what Cambridge Analytica was selling. It's snake oil, honestly. There's a bunch of issues there, but that's what they were selling. And of course, there were data breaches in that, because the way that Cambridge Analytica built their predictive algorithms of
00:22:18
Speaker
Facebook users was actually based on basically ill-gotten Facebook data. So Facebook said, this is a problem. And so in 2017, they created the Facebook ad library. So since 2017, we've been collecting the candidate advertisements. And I have to say that's where the volume is most spectacular. Because Mike Bloomberg, when he ran for president in 2020,
00:22:42
Speaker
wealthy billionaire, he ran oodles and oodles and oodles of ads. He broke all of our collections because there were so many advertisements. And as we come into this election season, we're running into the same issues because we're not only collecting candidate advertisements now, but we're also collecting ads around the candidates. So what's interesting is that if I
00:23:07
Speaker
give Facebook evidence of my existence, basically my driver's license. I take photocopies of my driver's license, I give that to Facebook, and they will approve me to run ads. And so ordinary people run ads on Facebook around candidates. And then, of course, political action committees, activist organizations, and potentially
00:23:28
Speaker
bad actors from other countries all can potentially run ads on Facebook. And tracking all of that is going to be one of our biggest challenges, actually, as we come into this election season.
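For readers curious what collecting ads "around the candidates" can look like in practice, here is a rough sketch of a query against Facebook's Ad Library. The ads_archive endpoint, parameters, and field names below are assumptions based on the publicly documented Graph API and may change between versions; an approved access token tied to a verified identity is required, so treat this as an outline rather than a working recipe.

```python
import requests

# Sketch of a query against Facebook's Ad Library API (ads_archive).
# Endpoint, parameter, and field names are assumptions based on public
# documentation and may differ by Graph API version.
ADS_ARCHIVE_URL = "https://graph.facebook.com/v18.0/ads_archive"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def fetch_ads_about(search_terms: str, limit: int = 50) -> list[dict]:
    """Pull political/issue ads whose text mentions the given search terms."""
    params = {
        "access_token": ACCESS_TOKEN,
        "search_terms": search_terms,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": '["US"]',
        "fields": "page_name,ad_creative_bodies,ad_delivery_start_time,spend",
        "limit": limit,
    }
    response = requests.get(ADS_ARCHIVE_URL, params=params)
    response.raise_for_status()
    return response.json().get("data", [])

if __name__ == "__main__":
    # Ads run *about* a candidate may come from PACs, activists, or ordinary
    # people, not just the campaign itself, which is what makes tracking hard.
    for ad in fetch_ads_about("candidate name"):
        print(ad.get("page_name"), "-", (ad.get("ad_creative_bodies") or [""])[0][:80])
```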
00:23:42
Speaker
So again, we're back to that exponential growth. I wouldn't even think of someone like myself, a common voter, paying for ads to support a political candidate. Yeah, it's wild, actually. It's like the new Wild West. It's wild. You'll have people who will pay for an ad, and it's basically them doing, like, this: they're just talking into a microphone to some anonymous public, and then they slice it up, and then they run ads from it. Yeah, the impact and effect is not clear, but hopefully we'll have a better sense of that this time.
00:24:12
Speaker
So let's go back to something that you said earlier. We know there was evidence of Russia attempting to meddle in our presidential election. In 2016. In 2016, right? Was that a good example of micro-targeting? Or was that
00:24:29
Speaker
Was it advertisement-based? Was it actors pretending not to be who they were and just spreading, you know, saying, this is a message that I want people to connect to? Yeah. So 2016 was interesting because there are a couple of different dimensions. Some of what Russia was attempting to do was to actually hack election machines. So there were municipalities
00:24:57
Speaker
in a couple of the southern states: Florida, I believe, is one; Georgia might have been another. Don't quote me on that because I haven't looked in a while. But there was a set of municipalities where Russia was actually trying to hack into electronic voting machines. They were unsuccessful. There's no evidence that there were any
00:25:16
Speaker
inappropriate vote tallies coming out of that effort. So there's that. And that's a whole other conversation, just how we vote and susceptibility. Those are air-gapped, though. You know, they're not even on a network. That's exactly right. So that's one dimension. But the other dimension that caught the attention of Congress and the intelligence agencies was the effort by the Internet Research Agency, which is basically
00:25:46
Speaker
kind of a Russian troll farm: people hired by the Russian government to spend time building social media accounts in the United States. So these are generally Russians who have good English, who then spend time building up what look to be legitimate Facebook or Twitter accounts.
00:26:04
Speaker
And then at some point, they kind of shift from being posts about puppies and what I ate for breakfast today to being political messages. And so there were coordinated efforts, especially in Florida, to try to organize rallies in support of Trump, where it was actually these Russian operatives organizing those events. So it wasn't ordinary citizens in Florida saying, yay, Trump. It was actually Russian actors trying to mobilize American citizens.
00:26:31
Speaker
In addition to that, Russian operatives were also creating a proliferation of Twitter accounts that basically were trying to sow division within the electorate, especially along racial lines. So they had both, I don't know how to put this, kind of anti-civil-rights, racist messages as well as kind of pro-
00:26:58
Speaker
civil rights messages in an effort to push Americans to basically hate on each other. And that effort to sow division in the United States is something that was really prevalent in 2016 and then actually continued a little bit in 2020.
00:27:21
Speaker
One of the challenges with the internet, right, is it brings out the worst in people, brings out the best in people.
Challenges of Misinformation in Politics
00:27:26
Speaker
And there's a lot of misinformation out there. You know, you have children, I have children, and I'm constantly correcting the facts that they get on the internet and things like that. And so can we talk a little bit about the current landscape as far as misinformation and what's going on in elections with misinformation?
00:27:49
Speaker
So there's a lot of concern about the potential for misinformation in this election season. It's hard not to know a little bit these days about the concern over deep fakes, for example, which are basically ways for people to create videos that look like they might be, say, a video of former President Barack Obama
00:28:17
Speaker
saying or advocating for something, when in fact it is entirely a digitally created avatar and a digitally created message that was puppeteered by somebody. So deep fakes, you know, the concerns about generative AI, that is, basically the ability to pretty easily and quickly create fake
00:28:44
Speaker
representations of politicians, or journalists actually, is, I think, a genuine concern. And, going back to what you mentioned earlier, because of the proliferation of so many different social media platforms, it's really hard to monitor, to track. The messaging comes from many different sources, including politicians themselves, in terms of sort of pushing misinformation.
00:29:13
Speaker
And so it's a mess, honestly. And I think that it will be important for the public to be savvy about
00:29:35
Speaker
I don't know if I can say this, but detecting bullshit. And when something doesn't look right or feel right, to really question it and not just pass it on and say, look at this crazy thing, because that's how misinformation tends to spread. But it requires ordinary people to have a stronger bullshit detector. And I don't know how we help them with that.
00:29:57
Speaker
Congress has completely abdicated its responsibility to properly legislate this environment. So we still, at this moment, have no strong legislation other than a requirement that the social media platforms indicate when an advertisement
00:30:17
Speaker
on, say, Facebook or on Google has AI-generated content. That is not enough to help us navigate this election season. I was on Reddit, one of my favorite places, and in the Midjourney subreddit, there was someone who generated a photo of Hillary Clinton and Joe Biden sharing drinks. And it was AI-generated.
00:30:43
Speaker
And it looked very convincing, other than the fact that you wouldn't think Hillary Clinton and Joe Biden would be drinking like that. Right. So it's a hard problem to get to the bottom of. But it is a problem that in this next decade we're going to have to wrestle with as a culture. Absolutely. Because it's only going to get better, in my opinion. You said better, not worse?
00:31:09
Speaker
Better. It's going to get better. The AI is going to get better and the situation is going to get worse. That's a good way to put it. And you know, it is very concerning for things like elections, where you can be misled by falsehoods that are represented as truth, and by sowing division, right? I mean, it could just be as simple as
00:31:32
Speaker
misrepresenting facts, but it could be even bigger in terms of just sowing division. Absolutely. And it's hard to be aware of what is truth and what isn't. Do you have any advice? Because I just said how hard it was. Yeah, no, it's super hard. I mean, I think it's a multi-pronged effort that will be required, right? There's not a single solution to this. I mean, one is the AI companies.
00:31:58
Speaker
So if you think about ElevenLabs, it's one of the companies that makes it very easy to generate audio,
00:32:08
Speaker
AI voices. And so, you know, I could sit down and pretty quickly build an AI-generated, basically a fake, Barack Obama voice that I could then layer onto, you know, a set of stills, right, just photos of Obama,
00:32:31
Speaker
and stitch that together into what looks like maybe an advertisement where Obama basically is attacking Joe Biden. And ElevenLabs and the other AI companies that are making these tools and technologies so readily available, I think, have a responsibility to think about how we're going to also build detection tools
00:32:54
Speaker
that the platforms and ordinary people can use as part of their browser, right? You know, a little browser add-in or a little app on your phone that allows you to say, this was made by AI. Because honestly, if I'm a bad actor and I run an advertisement on Facebook, why would I say, oh, this was AI-generated? But you wouldn't. So it's entirely up to the advertiser to sort of report that they're the one who made this ad and that it is AI-generated.
00:33:22
Speaker
The companies that are building the generative AI need to do more and do better to help in the public sphere. The platforms, social media platforms, Facebook, Twitter, TikTok.
00:33:37
Speaker
They also, I think, have a much stronger responsibility than they're currently taking on to help support good information in the public sphere. The problem is that Facebook, TikTok, Twitter, et cetera, these companies are for-profit companies, and their bottom line is to shareholders, not to the citizenry, not to the public good. And because of that, in fact, they have actually reduced
00:34:07
Speaker
their integrity teams coming into this election season. I don't honestly know if X or Twitter even has an election integrity team this round. I know there have been layoffs at Facebook. I don't believe there have been any kind of new efforts to build up a set of staff whose job it is to really think through the policies and processes and practices by these platforms
00:34:32
Speaker
to help ensure a healthy information environment. So those are failures, right? So the AI companies, there are failures. The tech companies, there are failures. Congress, again, has not done enough, quickly enough, coming into this election to really tackle the problem. When the senators' average age is 70, it also further challenges their ability to even understand
00:34:55
Speaker
what the technology is and how to regulate it. The Supreme Court currently is looking at a case that might even further challenge the federal agencies' ability to regulate in this space.
00:35:09
Speaker
Then that leaves the citizens and the politicians, right? So the politicians themselves, we have a new norm of, I don't know what the word is, it's almost, the word that comes to mind is lawlessness, and that's probably not the right word. But once upon a time,
00:35:27
Speaker
Even a decade ago, there was an expectation that politicians held themselves to a certain standard of truthfulness, of decorum toward the opposition, toward the news media, toward people that they disagreed with. That currently doesn't exist. And as a result, there don't seem to be any norms
00:35:47
Speaker
that are holding politicians back from saying what's not true in the spirit of expediency. So that's a huge problem. And then the public, right? So then it's like, okay, well, you, American citizen, you better just get better at this and figure it all out. But that's also very, very challenging, right? We're busy. We have busy lives. We are overwhelmed with information.
00:36:10
Speaker
I'm a political junkie. You know, this is my lifeblood, but for most ordinary people, politics is just one of many, many things they might pay some bit of attention to. And, unfortunately, the phrasing in political science is low-informed voter, which is kind of a derogatory term, but the idea is that ordinary people who aren't junkies like me don't have a lot of
00:36:36
Speaker
knowledge of the actors and events. And as a result, they're more susceptible to misinformation because they don't have a strong bullshit detector. And so that then leaves them more susceptible to incorrect information. Then you fall into the problem of ideology and identity. You know, ordinary people, some of them are really beholden to their political party and it's part of their identity. And when that happens, when information comes to them that is false,
00:37:03
Speaker
but aligns with their beliefs, they're more likely to then believe it and then pass it on. And so that's also very much a challenge. So I've just laid out all the problems. I don't know what the solutions are other than a multi-pronged effort by educators, by ordinary citizens themselves, by politicians, by Congress and the courts, and by the platforms to really
00:37:30
Speaker
take this problem seriously. Otherwise, I'll be honest, Mike, I actually fear for the health of our democracy. I can see that. I totally can see that. I am not a political junkie. I'm probably classified, I won't go there, as a low-informed voter. I don't want to say low-informed, but maybe a medium-informed voter.
00:37:54
Speaker
But I would agree with you. I think, based on my understanding and knowledge of what the capabilities are as far as deep fakes in video, for my own classes, I'm thinking about using voice printing so that I don't have to just keep speaking all the time. I could just write out what I want and then AI-generate my own voice, right? You're going to end up replacing the professor. I just replaced myself.
00:38:20
Speaker
But you can see how these things can be used for not so good purposes, right? And yes, it isn't very realistic to expect that everyone out there is going to be able to figure out whether or not they're being duped.
Technological Impact on Political Communication
00:38:41
Speaker
And someone, if it aligns with their ideology, might not think about it as being duped. Exactly. And so there are a lot of different challenges here. And I think the biggest one, if I may summarize all the things that you said, which was so informative, is that the scale has changed, right? I think these problems have always been there in some way or another. It's just that technology, as it does with everything, becomes this enabler that makes us all work harder and faster. And now things are
00:39:10
Speaker
doing what they do at a breakneck pace, right? And what becomes concerning is that, you know, we used to have the grapevine game, where you'd play and spread a rumor, and by the time it got to the tenth person, it wasn't the same rumor. Or I guess that's the telephone game. The telephone game. Yeah. Those sorts of things now happen at breakneck speed, right? And the news cycles are so short.
00:39:34
Speaker
And I guess one way that we deal with a crisis nowadays is to just let it pass, because it will go away. And so that ends up being a really big challenge. And so, yeah, I hope we do find something. I think at some point we will level out with this, because like so many technologies that are disruptive, they do find a way
00:39:54
Speaker
to come out even. But I think the biggest challenge, and one thing that you mentioned before, is that there are so many all at once right now. Yeah. Between the deep fakes and the generative AI, we can generate text, now we can generate video, we can generate audio. It's a lot all at once. Yep. And it's all on your phone, right? So it's so easily in your hands all the time. And so it
00:40:20
Speaker
It's almost like a virus, right? It sort of infects, and, I hate to take this metaphor too far, but it ends up infecting the electorate with problematic information. And then the public doesn't vote
00:40:40
Speaker
in their best interest. They're in effect being manipulated. My home discipline is the discipline of communication. The communication discipline got its start primarily during World War II with the study of propaganda. I feel like we are in a new moment of propagandistic
00:41:02
Speaker
tendencies in democracies around the globe. And social media, like radio in its time, allows for this kind of rapid spread of information, and it allows bad actors to really potentially manipulate. And anything that I can do as a researcher
00:41:23
Speaker
to help protect the public or inform the public about what they should be watching out for or worried about, and also to help journalists and policymakers to think about how to support a more healthy information environment. That's what motivates me as a researcher.
00:41:44
Speaker
Well, that's fantastic. I appreciate you spending your time with me this afternoon and sharing your story. Is there anything else that you wanted to talk about before we wrap up? Oh my gosh. I feel like we've been on such a gloom-and-doom note. I know. I know. It's very sad. But on the flip side, is there a flip side? I don't know if there is a flip side. Well, the way I look at it, one of the things that I was thinking about is,
00:42:12
Speaker
you know, you're mentioning propaganda, right? And it's like, propaganda used to come from the authority, you know, like the news media outlets could spread propaganda and the governments could spread propaganda, but now anybody can spread it. I guess if I have enough money, I could even do some advertisements and spread some propaganda. Maybe I don't want a flat earth; I want an inverted sphere for my earth. But, you know, I guess that's not a good thing by any means.
00:42:42
Speaker
But it does allow people's voices to be heard. Exactly. Yeah. So I mean, I guess one of the good things in there is that the little person can have a bigger voice now than they've ever had. Yes. Right. Yes. Absolutely right. And that democratizing potential. I mean, going back
00:42:57
Speaker
to 1994 and 1995, when the web was starting to really disseminate, that was the big dream, that this new digital technology would democratize. And it has. But it's gone a little warped. And so I think that's the effort as we come through the next year or two: to figure out how we put some checks and balances in place so that people can have a powerful voice, but it isn't
00:43:26
Speaker
creating a toxic information environment. I think that's the challenge.
Future of Information Environments
00:43:30
Speaker
And again, I think in the information studies space, we have a unique set of skills and understandings and talents that really allow us to speak to and help the public and politicians
00:43:43
Speaker
in my context anyway, to help make a difference. Just as you're helping companies think about how to use information technology for good, I think in my world I can help us think about how to build information technology for good in the public sphere. That's the upside: the power of the tools, the technologies, to make a difference.
00:44:06
Speaker
I don't know. We'll see. It's going to be a really weird 2024, I think. But hopefully we come through this a healthier democracy than we are right now, anyway. Well, I appreciate that. And thank you so much for your time.
Episode Conclusion
00:44:21
Speaker
It was great talking with you. And I guess this wraps up our very first podcast.