
Scam-Proofing Social Media: The Good, The Bad, and The Reality, a conversation with Assaf Kipnis, ex-Facebook Integrity Team lead

S1 E20 · Scam Rangers
661 Plays · 1 year ago

In this revealing episode, we sit down with Assaf, a former member of Facebook's integrity team, to explore the complexities of combating scams and fraud on social media platforms. Assaf shares candid insights into the challenges the team faced in preserving user safety while maintaining platform integrity. We delve into shutting down fake accounts, the impact of PR on the team's work, and the importance of industry collaboration, particularly with law enforcement agencies. Join us as we uncover the strategies and considerations involved in making social media a safer place for users, with a focus on building productive partnerships and preventing potential scams.

Find Assaf on LinkedIn: https://www.linkedin.com/in/assafkipnis/

This podcast is hosted by Ayelet Biger-Levin  who spent the last 15 years building technology to help financial institutions authenticate their customers and identify fraud. She believes that when it comes to scams, the story starts well before the transaction. She has created this podcast to talk about the human side of scams, and to learn from people who have decided to dedicate their lives to speaking up on behalf of scam victims and who take action to solve this problem. Be sure to follow her on LinkedIn and reach out to learn about her additional activities in this space.   https://www.linkedin.com/in/ayelet-biger-levin/ 

Also check out https://scamranger.ai if you have received a message that you suspect is a scam



Transcript

The Role of Sectors in Online Scams

00:00:00
Speaker
It's time to talk about social media.
00:00:03
Speaker
I often talk about the scam lifecycle and different unwitting accomplices like telcos, mobile OS vendors, social media, and messaging platforms, and of course, financial institutions and law enforcement. None of them want to induce scams, of course, and many of them do their part in preventing scams. But many of them do enable communications, money transfers, and they don't necessarily focus on online scams.
00:00:31
Speaker
At the same time, there's been a significant evolution in governments and regulators' attention to the problem of online scams, and in particular, authorized push payments, where victims are coerced into transferring money themselves. For example, the UK Payment Systems Regulator announced legislation that will mandate financial institutions to reimburse customers for scams in mid-2024. And there are many other initiatives globally.
00:00:59
Speaker
Earlier this year, UK Finance, a trade association for the UK banking and financial services sector, released a report that showed that 78% of authorized push payment fraud begins online, with another 18% starting with phone calls. Recently, the Financial Times reported that UK Finance said its data had found that 61% of all reported authorized push payment fraud by volume is connected to Meta,
00:01:29
Speaker
to Meta social media platforms: Facebook and Instagram, as well as WhatsApp. So criminals use scam phone calls, text messages, and emails, as well as fake websites and social media posts, to trick people into handing over personal details and passwords, and then subsequently use that information to convince people to authorize a payment. So it could be an impersonation scam, or an extortion scam, or having enough information about them to make
00:01:56
Speaker
criminals dangerous, like the fact that they have children and the children's ages and things like that. And we're very familiar with the grandparent scam or the "hi Mom" scam and such.
00:02:09
Speaker
And UK Finance's managing director of economic crime, Katie Worobec, said that no matter how advanced the technologies being used at financial institutions are, be it AI connecting multiple behavioral, device, and network elements,
00:02:26
Speaker
as long as fraudsters can convince victims to let them access their personal information and authorize payments, it's back to square one. And they also go on to say: you can have as many locks on the door as you like, but if someone gives you the keys or even opens the door for you, all bets are off. Scam Rangers,

Assaf Kipnis: Career and Insights

00:02:51
Speaker
a podcast about the human side of fraud and the people who are on a mission to protect us. I'm your host, Ayelet Biger-Levin, and I'm passionate about driving awareness and solving this problem. Welcome to Scam Rangers.
00:03:09
Speaker
Today's scam ranger is an ex-integrity slash trust-and-safety team member at both Meta, in particular Facebook, as well as LinkedIn. His name is Assaf Kipnis, and I'm very excited to have this conversation with him. Hi Assaf, it's great to have you on the podcast finally. Oh, thanks so much for having me.
00:03:31
Speaker
We talked a few times a couple months ago and now a few weeks ago, and I'm really looking forward to the conversation today. We're going to touch a few hot topics and talk about the role of social media in preventing online scams. Before we jump into the big story, I wanted to ask you to share a little bit about your background and your roles with trust and safety teams at the companies that you worked with.
00:03:56
Speaker
Sure, happy to. Going all the way back, I'm originally from Israel. I got to the U.S. when I was 23, I think. I spent four years in the military in Israel in the artillery forces. And then I got here, long story, ended up getting my degree here in cybersecurity, which was kind of cool. It was, if I'm not mistaken, the third cybersecurity class in that school, which was a long time ago.
00:04:26
Speaker
After that, I worked as a security engineer in a couple of places, and then I got my start in trust and safety when I went to LinkedIn. I was part of the investigations team at LinkedIn, which was the team dealing with the most prevalent issues. I was doing a lot of anti-scraping work; that's actually gotten a lot of notoriety again lately. I was dealing with anti-scraping, account takeovers, extortion,
00:04:57
Speaker
Things that you'd be surprised by; I get a lot of people being really surprised. Like, oh, things like that happen on LinkedIn? Yeah. Exactly. Where there's user-generated content, there are people abusing it.
00:05:10
Speaker
After that, after a little under three years at LinkedIn, I moved over to the e-crime team at Facebook, where, if you think about it, it's a little similar, but at a much, much larger scale and complexity, dealing with financially motivated actors. And I was there for four years as an IC, an individual contributor.
00:05:38
Speaker
And then a year as a manager of the team. I dealt with things ranging from phishing scams, social engineering, large-scale inauthentic behavior, and coordinated inauthentic behavior.
00:05:56
Speaker
Clusters during 2020. And in my last stint, most of my work as a manager was a really strong focus on scams, including pig butchering scams and military impersonations, and work on at-scale mitigation through investigations. Wow. And one of the things that
00:06:23
Speaker
you know, is really intriguing to me as we talk about that scale of work is, you know, both LinkedIn and Facebook. Let's focus on Facebook for a second, because that's the organization that you were a part of within Meta. Both of those are
00:06:41
Speaker
very large scale organizations, I'm assuming when it comes to trust and safety because of many reasons, not just security, right? The integrity of the data, there's a lot of news around groups that are starting and misinformation on these platforms. So I bet a lot of that, and you focused on financial crime, correct? So
00:07:04
Speaker
I think although your focus is one, you really need to take into consideration everything. So what would be a day in the life in your role, let's say, at Facebook? What are the things that you look for? What are considerations that your team took into account in the daily life there?

Strategies to Combat Scams

00:07:24
Speaker
So if I focus more on the scam activity that we did, which I'm very proud of the way the team was working on, because originally when I started on e-crime, it was very much "let's go find the bad guys and let's shut them down," which is a lot of fun. And you can do a lot of very nuanced investigations. But then we kind of moved into, like, okay,
00:07:52
Speaker
how do we actually attack the problem at scale? So when we started working on scams, there was a very big focus on, okay, how do we identify these clusters of bad activity, versus identifying the who. Identifying who is doing it is important, but that wasn't the main goal. The main goal,
00:08:22
Speaker
if you go into it, so if you look at the scam infrastructure at all, it's a lot of small, if I think about it as a graph, it's a lot of small nodes that are doing scams. It's very disparate and it's very broad. Right, and if you shut down one node, then another one will pop up, so that's not effective. Exactly.
00:08:49
Speaker
That was one of the things that I was very proud of, and still am, where I was in the structure at Meta: we weren't the team to take down one thing and move on. We were the team to say, OK, here's one thing; let's see the thousand things that are connected to it. So kind of backtracking into the bigger picture. Exactly. I think my spiel was always: you take a cluster, you make it bigger, then you make it smaller again, then you make it bigger, and then you narrow it down.
00:09:17
Speaker
So that is kind of what you do. You get a cluster from whatever source, and then you start identifying the TTPs, the tactics, techniques, and procedures of the attackers. And then you try to generalize it. What are they doing? How are they doing it? Now that I know kind of what they look like on the platform, how can I generalize this so I can find more of them, and then make sure that I have good precision?
00:09:45
Speaker
Recently, there started to be more of a focus on like, let's preserve user voice, let's protect actually good users from false positives.
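The "expand the cluster, then narrow it with a precision check" workflow described above can be sketched roughly as follows. This is a hypothetical illustration, not Meta's actual tooling; all function names, signal types, and thresholds are invented for the example.

```python
# Rough sketch of the cluster workflow: start from a seed of known-bad
# accounts, expand along shared infrastructure signals (generalizing the
# TTPs), then gate the expansion on sampled precision so legitimate
# users are not swept up (the "preserve user voice" constraint).
from collections import defaultdict

def expand_cluster(seed_accounts, shared_signals, min_overlap=2):
    """Expand a seed cluster to accounts sharing >= min_overlap signals.

    shared_signals maps account_id -> set of signal values
    (e.g. profile-photo hash, device fingerprint, payment handle).
    """
    # Invert the map: signal value -> accounts exhibiting it.
    by_signal = defaultdict(set)
    for acct, signals in shared_signals.items():
        for s in signals:
            by_signal[s].add(acct)

    # Collect every signal seen in the seed.
    seed_signals = set()
    for acct in seed_accounts:
        seed_signals |= shared_signals.get(acct, set())

    # Count how many seed signals each account shares.
    overlap = defaultdict(int)
    for s in seed_signals:
        for acct in by_signal[s]:
            overlap[acct] += 1

    return {a for a, n in overlap.items() if n >= min_overlap}

def precision_ok(candidates, label_sample, threshold=0.9):
    """Gate the expansion: sampled precision must stay high before acting.

    label_sample maps account_id -> 1 (confirmed bad) or 0 (good).
    """
    labeled = [label_sample[a] for a in candidates if a in label_sample]
    if not labeled:
        return False
    return sum(labeled) / len(labeled) >= threshold
```

In practice one would alternate: expand on generalized signals, sample and label, tighten `min_overlap` or drop noisy signals if precision falls, then expand again, which mirrors the "bigger, smaller, bigger, narrow it down" loop.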
00:09:54
Speaker
The people that worked with me were actually security engineers, so there was a lot of technical work to really distill the data to, okay, here is actually what it looks like. And what I really enjoyed was working on a large cross-functional team with machine learning and engineering and product to really identify how do we
00:10:21
Speaker
holistically solve the problem. Which is funny, because that was towards the end of my work, but I feel like that's something we were striving for for a really long time. I always wanted to figure out the holistic way to solve these problems. I called it like,
00:10:38
Speaker
I don't want to just close the door that they came through; I want to build a wall there. And then they'll find another way, which is fine. And looking back at other work that was done, I think most of my successful work was when we were doing this at-scale mitigation. We were just more empowered to in the last year. Yeah, and that's really good to hear. And so tell us what your next steps are now.
00:11:06
Speaker
So a couple of things. I'm trying to figure it out. I'm trying to figure out kind of what do I want to do? Do I want to continue working, go back to working for a larger company? I don't know. I have just decided to start kind of like a venture on my own for consulting and advising. I'll see how it goes.
00:11:26
Speaker
I've been trying to find out what Assaf likes to do, and I really like to connect people, and I really like talking to people. Since I got laid off, all I do is kind of talk to people, and I find it very energizing, spreading the knowledge and talking about new problems. So I'm trying to do that while at the same time, you know, you've got to put food on the table, so I'm seeing
00:11:53
Speaker
whether I can get, and whether I want, a full-time job or another type of leadership role in the space. That's great, and good luck. And now I want to ask you a few hard questions, which is why we're here today. So first of all, I'll say, of course, you're not speaking on behalf of Meta, and you no longer work there, although you are
00:12:15
Speaker
very loyal and you like the company and you appreciate what they're doing. So that's great to see, but I did want to put that disclaimer in here. And one of the things that's really kind of bothering me and I reached out to you to talk about this was, okay, let's put it on the table.

Social Media's Accountability in Scams

00:12:32
Speaker
Let's put the hard stuff on the table. What's happening today is that people are being scammed at larger scales than ever before. We can see that
00:12:42
Speaker
everywhere in the world, in the FBI reports that are coming out, in the UK Finance reports that are coming out, in the initiatives that the governments of Australia and the UK are taking, and here in the US, the Senate is also putting pressure on companies to do more. Now the financial sector is under scrutiny all over the world, and what we're seeing as the regulatory landscape unfolds in the UK,
00:13:08
Speaker
where the Payment Systems Regulator just released an announcement a couple of days ago saying, you know, here's a draft of the regulation that's going to come out in October, and we expect banks to have it in place in 2024,
00:13:24
Speaker
what's the first thing that happened? Finger-pointing at social media. Because in this whole scam lifecycle, we have the incoming text messages and calls, or messages on social media, or scam ads on Google. And actually, myself and a colleague of mine just presented to fifth graders,
00:13:48
Speaker
And we presented about cybersecurity yesterday at school. And we presented a video showing how it's so easy to find fake ads, scam ads, and scam videos on YouTube and Google search. So it's definitely not just Facebook. But those are places where all these scams originate. And then, yes, later on in the life cycle, there is the financial transaction that happens. And, yes, financial institutions are involved.
00:14:15
Speaker
But it seems like all the pressure is on them because that's where the money is. And the question becomes, you know, what about social media? So for example, TSB stated a few days ago, or maybe it was a week ago, that eight in 10 scams come from Meta platforms. So we're talking about WhatsApp, where the pig butchering scams often start, Facebook Marketplace,
00:14:38
Speaker
Facebook chat, all these groups, Instagram, where there are a lot of new scams. This is something I also talked about with a few other UK banks, and it's common knowledge; maybe it's not eight in ten, maybe it's seventy-five or seventy percent. First of all, how is this happening, and why is so much coming from Meta?
00:14:56
Speaker
So first of all, I want to caveat, to kind of explain my quote-unquote loyalty to Meta. My loyalty is to the people who work on integrity. They deserve to be represented correctly, because they are the ones doing the work. Companies make company decisions, but I will say that the integrity team at Meta, they all want
00:15:22
Speaker
the best for users. I think that's their intent. And they are just amazing people. And I can say that about LinkedIn as well. So I have big feelings about articles like that. I'll start from the articles and I'll go down to, like,
00:15:39
Speaker
the nitty-gritty. So I've been thinking about this since we spoke, and I really like analogies, so here's my analogy for that, and I'm ready for people to challenge me on it. Saying that most scams come from Meta, to me, is like this: imagine that in the world of car manufacturing there was one manufacturer that made 90% of the cars. And then you said,
00:16:10
Speaker
most accidents happen with that manufacturer's cars. Yeah.
00:16:17
Speaker
That makes sense; most interaction happens there. If we go back to social media, first of all, most interactions happen there. There's a giant user base, and it's been around for what we could now call a really long time. So when I see an article like that, I usually kind of roll my eyes a little bit, because
00:16:44
Speaker
yes, there's always truth in there, but to me it's very clickbaity. It's like, look at this, look how bad it is. It's frustrating because, yeah, obviously. Working at Meta is sometimes exhausting just due to these news articles that just beat you down.
00:17:11
Speaker
There's a lot of people that look at this and say, oh, they don't do anything. It's like, dude, that's all I do day and night. And it's difficult. So that's why I have big feelings about that. It's kind of, to me, it's a little bit of lazy reporting. I'm not surprised.
00:17:32
Speaker
But let me double-click on that. You talk about, as an integrity team member, the fact that these news articles come out and it's exhausting. So I wanted to double-click on that from two angles. One is, just generally speaking, how do you feel the press impacts Meta's ability to act on these types of things, on scams?
00:17:55
Speaker
How do these articles impact that type of work? And the other question is, I think people don't know enough about what Meta does to stop scams. And I think that's the gap here that we might be able to close a little bit. So let's start with the first part. How does the press impact Meta's ability to act?

Meta's Integrity Challenges

00:18:14
Speaker
So I don't know if I'm saying it right, but I think it's kind of like a self-fulfilling thing. It's like you have the media
00:18:21
Speaker
sometimes saying good things, sometimes saying bad things. I'm not going to dog the media, but saying these things or pointing things out, and not only media, advocates too. Which, by the way, I've met Erin West multiple times. I love Erin; we had conversations about this.
00:18:39
Speaker
Sometimes people only see their own universe, which is fine, but they forget that the Meta universe, I'm not going to say the other thing, but the Meta universe is enormous and worldwide. So what happens is, and this kind of goes into those reports, advocates, and news sites,
00:19:05
Speaker
I can tell you the perspective of a down-in-the-weeds person. My perspective as a down-in-the-weeds person is this: the media reports something; that trickles down through the hallways of Meta to whoever it needs to reach, and it creates a fire, a PR fire of sorts.
00:19:28
Speaker
What happens then is that the company, I think, has no choice but to go to integrity and say, whatever it is that you're doing now, stop. Fire. Go deal with the fire. And integrity in general and trust and safety in general is fires, but it's very frustrating as someone in the front lines
00:19:52
Speaker
to constantly get diverted. So this is what happens: you consistently get diverted to fires. Which, by the way, in my role I got very, very good at saying no. My team was very focused, and I said no to all fires, but that was 90% of my job; I called myself a goalie. But here's another analogy that I was playing around with: imagine it's an ER and someone comes in with a brain injury, and you're dealing with the brain injury. But you know what?
00:20:22
Speaker
Their legs are broken and they have burns on them. But you're dealing with a brain injury right now. That's what I'm dealing with. I need to keep this patient alive. But then you have their family members or their friends who are screaming at you and clawing at you. Deal with the broken legs right now.
00:20:45
Speaker
That's untenable. We're going to have to deal with the bigger thing: let's keep this patient alive. I know it's a little morbid, but imagine yourself as the doctor dealing with it, getting pulled in these new directions. And while the doctor has all the power to say, you know, "go away," integrity people have a hard time, because you can't constantly fight it. So what happens is two things. One,
00:21:12
Speaker
you get pulled in a million directions because someone saw a person's picture a hundred times on the platform, on a hundred accounts, which, I will concede, is a problem. But at the larger scale, someone's picture being used on a hundred or a thousand accounts is a vanishingly small problem.
00:21:35
Speaker
Like, it's annoying. It sucks for that person; they're used for romance scams. Agreed, agreed. So what I recently saw, there was a thing, I think I talked about it with Erin, about why can't Meta take down the pictures of that guy. Because one hopes, which is what we did,
00:21:56
Speaker
we're dealing with the core of the problem. The pictures of the guy on accounts, that is the most minute of symptoms. If we keep getting pulled into dealing with symptoms, we're never going to be able to deal with the problem. It's funny, it's kind of a dichotomy. There's a perception that the team is small, which it is not. And then there's also, like, the team's too big,
00:22:26
Speaker
or not too big, but the perception that the team is so big it can deal with everything. I completely agree. So from my perspective, as someone who doesn't work at Meta, I expect thousands of people to be thrown at these problems. And I don't know how many, and I'm not going to ask you how many are working on this right now, but I also understand that larger organizations
00:22:50
Speaker
drive the need for swim lanes and clear goals, and not, you know, stepping on each other's toes, not just politically but really to orchestrate the organization in this shared mission. And that leads to big-company problems, which is clear. But then my question is:
00:23:08
Speaker
what is Meta doing? And I think that gap of knowledge is creating room for these accusations, which are real, that 8 out of 10 scams are coming from Meta platforms. Yes, but how many did you stop? Maybe it's 8 out of 10 scams, but maybe the number of scams would otherwise have been, you know, 3 or 10 or 15 times
00:23:29
Speaker
more than it is now, and you're actually stopping a lot. We really don't know. And I think Meta's, and I'm not going to ask you about Meta's PR policy right now, but Meta's PR policy, from my view, has always been to err on the side of silence and avoidance. But I
00:23:47
Speaker
think the time will come, and I'm not asking a question here, this is a statement that I'm making, where Meta, and not just Meta, by the way; we're picking on Meta just because we have access through your work there, but it's going to be other companies too, like Google with YouTube and Google Search,
00:24:06
Speaker
will need to give some information, not necessarily to me, but definitely to regulators and other stakeholders about what they are stopping and what they are doing and how they're planning to
00:24:20
Speaker
collaborate with other institutions in order to collectively fight scams, be it financial institutions, be it law enforcement, be it retailers and others, I think that's going to need to happen as well. Yeah. And I think that's the point I want to get at. In the end, let's pick on LinkedIn for a second. And don't worry, LinkedIn, I'm not going to really pick on you.
00:24:47
Speaker
These pig butchering scams start on LinkedIn, but they only start on LinkedIn. If you think about what happens there, it's very minute, and it's interesting. It's very minute because it's like, hey, let me talk to you about my money, blah, blah, blah. Let's go to WhatsApp. Let's go to Telegram.
00:25:11
Speaker
It's small. The amount of data that LinkedIn has to work with is very, very small. But then they have kind of the stigma that it started on LinkedIn. But then again, it's small, and because it's so small, the level of work in engineering is a lot bigger. So something that I thought about is: the more that is happening on your platform,
00:25:42
Speaker
Actually, the easier it is to in fact see it.
00:25:46
Speaker
Okay, so let's go to those platforms. If we're talking about Meta, something might start on Facebook, and then they take it off-platform to WhatsApp or Instagram. What about those opportunities for collaboration, for cross-referencing things? Okay, so yeah, that's where I was going before I went on a tangent. My hope, or I think it's everybody's hope, is that everybody collaborates. It's
00:26:14
Speaker
the tech platforms and the banks and the regulators. I believe there's an honest attempt to do that across all industry. And with law enforcement, it's just every piece of the puzzle, they all have their things that
00:26:32
Speaker
"I can't say, I can't share." And I'm not picking on Meta, it's everyone. Or you go to law enforcement, and it's not big enough, it's not prevalent enough, or law enforcement sometimes doesn't know how to deal with it. So it's funny, I talk to my friends at Meta and they're frustrated with law enforcement, because they're like, here's stuff, do something. And law enforcement has a giant queue of stuff they need to do. So the ideal situation
00:26:59
Speaker
is that everybody works together. I will say that within Meta, the cross-pollination works really well. It's a lot of work; it's sometimes more complex than others. I can't speak to WhatsApp, but it does happen, and we were really working to figure it out. I think, going back to something we talked about a little earlier, personally I felt,
00:27:27
Speaker
while I don't want to attack the press by any means, I felt that sometimes the constant dogging of Meta created the situation in which Meta doesn't speak. I'm not going to say it's good or bad, or that Meta comms is good or bad, but it creates this situation of: I can't speak, I'm not going to put anything out there, because whatever it is that I put out there,
00:27:55
Speaker
high possibility that we'll get spun into something negative about me. That's why I'm hesitant about saying things and I needed to make sure that I can say things because I don't want it to get spun around to something
00:28:09
Speaker
negative. And then you have, if you think about it, the others, let's say Google. Does Google need to be attached publicly to Meta and get attached to bad press? Because Meta is the golden child of getting bad press, whether it's their fault or not. That's where I feel that if people really want to help, and this is not just media,
00:28:31
Speaker
try to bring the problem up. I saw this thing on LinkedIn and immediately it's like, "Meta doesn't care, Meta is a scammer." And it's like, what does that narrative give you beyond feeling good in the moment? It doesn't help anybody.
00:28:47
Speaker
Oh, that makes me want to ask a lot of questions, but I will focus on one specific area of weakness that I've seen. You talked about LinkedIn, and I see it on Reddit as well, but one area of weakness, in my mind, is Facebook Marketplace. That is like a haven for scams. It's known for being
00:29:10
Speaker
very scammy; for every one good reply, there are many other bad replies. What is going on there? I know that you didn't work on Marketplace, but... Yeah, like you said, I didn't work on Marketplace. I don't know the inner workings, so I won't speak to their inner workings. I'm not very surprised that it happens on Marketplace, because Marketplace, that's where people exchange money.
00:29:37
Speaker
If you think about all the other areas you could go to to scam people, Marketplace makes the most sense as a scammer; that's where people exchange money. It's hard because it's really hard to detect intent. It's really difficult. True, you can continuously chase the reports, and, you know, if anybody hears this,
00:30:07
Speaker
one thing they can do as an external person: when you see something, report, report, report. Because if there are no reports, it's very hard to find these people. I think it's a classic case of taking things off-platform too, right? You can't know who's good and bad if you don't have the data, as you said, in the reporting, and Marketplace is kind of
00:30:31
Speaker
the facilitator of the communication, but then people do take it off-platform, and the scams actually happen later, when people say, you know, "my cousin will come pick this up, I'll send you a check," all these fake check scams, et cetera. The other thing: I think in the last two years or less, scammers have gotten a lot more sophisticated. I've seen all kinds of things with pig butchering, and we're seeing on the other side,
00:31:01
Speaker
like, someone's selling something and they even have a support line that you can call. We're no longer in this very simplistic romance scam age. If you fall for a romance scam, it doesn't mean that you're simple; it's just that their mechanisms were fairly straightforward. And if you look at it, they weren't even going for that much money.
00:31:30
Speaker
Now it's: let's get as much as we can. It's becoming a business. And once it's becoming a business, if you're a business and there's an adversary in front of you whose business is to defraud your business,
00:31:48
Speaker
that's very reminiscent of nation-state to me. Sure, I didn't work on nation-state, and they're not as complex, but they're working towards getting there. They're working towards the organization of: this is what we do, we are a company, and we're here to defraud people. And defending against that is very, very difficult. Yeah, from military experience and other experiences,
00:32:16
Speaker
Attacking is a lot easier than defending and a lot less time and resource consuming.
00:32:25
Speaker
And by the way, I think it's not considered nation state, but I think from a financial loss perspective, this is getting close to nation states. It's not, it's not attacking the government infrastructure or systems, but it is attacking our economy and global economy, but definitely the US economy as part of the global economy to the scale that these scams are happening, which we're yet to
00:32:49
Speaker
really grasp and understand. So clearly this is a huge problem, and clearly we don't know what Meta

Impact of Layoffs on Scam Prevention

00:32:56
Speaker
is doing. We only know what others are saying in the press, that there are a lot of scams happening on Meta.
00:33:02
Speaker
But also, on the flip side: you had a really important role on the integrity team. You talked about the strategic role you were playing, kind of being the gatekeeper and focusing on keeping the patient alive, really focusing on the bigger picture and understanding the MOs. But unfortunately there were a lot of layoffs at Meta, including yourself. Again, you had a lot of knowledge and know-how, and so did others on the integrity team. So what does that mean about the importance of this topic to Meta in general, and Meta's ability to fight these things?
00:33:50
Speaker
Regulations are coming, if they haven't come yet already. The business has made decisions to streamline some stuff. Streamline, get working on things that are more critical. In the short term, maybe it makes things a little more thrashy, but I don't think it's affecting the efforts at large. I really don't. I mean, even catching up with people,
00:34:19
Speaker
The work continues, the focus continues. At least where I am, the focus on this work has not shifted. Things just got shuffled around for efficiency's sake, or whatever it is that means I don't work there anymore. Yeah, I don't see it as something that will be a detriment to the work we've done.
00:34:46
Speaker
Well, that's good to know. I wanted to ask a question about shutting down accounts, which you mentioned earlier with the fake pictures and the fact that you see things that might imply there's a bad account. I also know, speaking as someone who used to be on Facebook a lot, that in the last few years I'm not really on Facebook. I am using other Meta platforms,
00:35:15
Speaker
but not Facebook. The number of users is declining, and I'm assuming that has some impact on shutting down accounts. Do you see any changes in shutting down fake accounts? How easy is it from your perspective? What do you need to prove in order to shut down an account that was reported by platform users? And how has the process changed in this whole era of preserving voice? That's a really good question.
00:35:45
Speaker
It's funny, I've had people internally say, why don't you just take down all the fake accounts, which always made everybody chuckle. Like, would you like to show them to us? If we only focus on fake accounts, there are a lot of signals that an account is fake. It could be a picture used many times. It could be other pieces of infrastructure that you can say are overlapping, that something's going on there.
00:36:13
Speaker
This doesn't look right. When someone reports a fake account, I see a lot of it with external people reporting 1,000 fake accounts. Like, here's 1,000 fake accounts; they're fake because I said so. Which is fine. If it was me looking at them, I'd be like, oh, these are probably high-fidelity fakes.
00:36:33
Speaker
In the era of preserving voice, you can't just say, X said so, so these are fakes. Even if I saw it and I said so, I need to align it to a policy. Because whenever you take something down, you need to have evidence.
00:36:52
Speaker
and you need to have the ability to stand behind it, if one day someone goes after you and says, hey, this was actually not a fake account. The other thing that I personally had a hard time with is, if I find a fake account and I find all the accounts that are connected to it by infrastructure, I would say, these are all fake accounts. Boom, down. You can't do that, because it happened to me in the past that I took a very large swath of things down and then
00:37:21
Speaker
that one false positive gets to the news, and then there's assumptions over assumptions. It was really funny to see those assumptions while sitting inside, like, oh, they did this to silence us. I'm like, dude, it was a compromised account, you just didn't know. And it was just little old me at Facebook making decisions. So
00:37:41
Speaker
you have to be very careful, even if you have these thousands of accounts. Let's say we have one or ten people dedicated to looking at these thousands of accounts. They need to make sure that it's fake, make sure that it's fake under a specific policy, and then, by the policy, understand what exactly they can do.
00:38:07
Speaker
Do you take down the account? Do you give the account the ability to come back? What's really interesting is that sometimes, even as an investigator, you'll look at something and you'll say, why would a good user ever do this? There's no chance that this would ever be done by a good user.
00:38:26
Speaker
And then someone in another business area, or a user, will say, oh, we are good users and we do this all the time. And that's the thing that sometimes gets frustrating, because you'll get railed on: why didn't you take these accounts down? It's obviously fake.
00:38:43
Speaker
It's obvious to you. It might not be obvious to everybody. It might not be fake. It might be used for something, I don't know. And I don't want to get pinned on, oh, you said this is not fake. I don't know. But the process behind it is more than just clicking a button and taking things down because you said so or because I think so. It's a lot more cumbersome, and
00:39:13
Speaker
My team, for example, is not the team to take down fake accounts. Actually, it's difficult for me, because I'd rather these fake accounts stay for a minute so I can investigate them. But that's difficult, because you can't just leave harmful accounts on the platform.

Cross-Platform Collaboration with Law Enforcement

00:39:32
Speaker
So the more accounts you take down, the less information you have. It's not like you can investigate them later; it's gone.
00:39:41
Speaker
So yeah, it's a lot more complex than people would like it to be. That's a really tricky balance that you need to maintain there between
00:39:53
Speaker
I also understand your perspective from a kind of intelligence view. You want to leave them up. You want to understand the trail, the MOs. And if you shut them down, you don't have that visibility, and then you have to go look for the other fake accounts that they created on the platform because you shut down this fake account. So it's that game of whack-a-mole, which you don't want to play. You want to catch the MOs. And that brings me to another question about
00:40:21
Speaker
industry collaboration. This whole conversation, we were talking about what Meta can do for us to preserve our safety better. And now I think the question is: how can we help Meta? What industry collaboration could improve Meta's ability to execute? You talked about law enforcement earlier. What other forms of collaboration do you think will help solve this problem?
00:40:48
Speaker
I'll start with the last one, law enforcement. It's funny, I've listened to Erin talk a couple of times, and these are her words: law enforcement has a technology problem, it appears. One, it's difficult for law enforcement to investigate these crimes, especially when you can't really prove the amounts. There's always going to be a threshold below which the FBI can't get involved in every single thing.
00:41:19
Speaker
So that's difficult. I will say that we always had the appetite to bring things to law enforcement. As far as my team went, it was, let's send this to law enforcement. From an enforcement perspective, I really like sending things to law enforcement, because it's cool and it's exciting. How effective is it?
00:41:45
Speaker
I don't think it's super effective. So think about it as a large-scale scam that's prevalent in a location; pick a location where it's prevalent. Okay, I found the foot soldiers, I found the person, and then I found the head of the snake and I got him arrested. It's very difficult to measure what actual difference that made, because there's
00:42:11
Speaker
a million snakes. So I spent a lot of time and energy, or law enforcement spent a lot of time and energy catching that one bad guy. But that type of abuse is so prevalent in that location, that's the job. That's what they do to feed their children. They're not going to stop, probably. Someone recently said,
00:42:33
Speaker
that a lot of the questions come towards law enforcement, come towards the social media companies. But I don't hear a lot of questions about the socioeconomic issues in these locales that breed this. How can we help fix that?
00:42:53
Speaker
Right. And whose responsibility is it, right? So we'll just narrow it down to, okay, the US is being attacked and these operations are happening in Southeast Asia, for example, Cambodia. So what will the US do in Cambodia in order to change the situation? That's the type of
00:43:12
Speaker
question we're asking here, and that's a good one. The issue here is that it's so abstract, so it's really hard. There's nothing I can do about the U.S. policy, not much I can do there. So it's too abstract to attack.

Preventative Measures in Social Media

00:43:26
Speaker
So I'll just go down to why is law enforcement not doing anything?
00:43:29
Speaker
Right. But one of the things that I also talked to Erin about in the previous episode of Scam Rangers, on this topic, is that even they had to redefine success when it comes to these types of scams. And success is not necessarily going to be catching the criminal,
00:43:46
Speaker
because this criminal is not necessarily in-state; they can't necessarily get to them; it's virtual, it's harder. But they can seize the money. And I think one thing that social media platforms can do is prevent scams: not just catch the fake accounts, but actually prevent the scams by looking at the nature of the luring, the psychological impact, the emotions, really looking at what a scam looks like, and then maybe somehow
00:44:16
Speaker
cautioning the user. So, yes, there's a bunch of things that are getting done and being worked on around this.
00:44:29
Speaker
Yes, there's the question of how do we augment the Help Center, how do we send a message every time there's something that looks fishy. That's one side of the scope. When you work on a cross-functional team, those things happen, because you have the people that bring up the dirt from the bottom and then the people asking, okay, how can we fix this on a product level?
00:44:52
Speaker
which does happen, and I really love seeing it happen. The other space is this: it's going to be very hard to stop them from consistently creating new accounts, but you can stop these accounts at creation, when they're doing something, at account takeover. You can do that, but that takes
00:45:17
Speaker
a lot of work, a lot of time. It takes time to identify these things. So if you ask how external sources can help: some of what external sources are doing now is reporting. There's ScamHaters United, for example; they send giant reports. And those reports are great, but you can't just put those into classifiers and say, learn from this, because I don't know for a fact.
00:45:48
Speaker
But I remember starting to work on pig butchering. I don't have evidence of pig butchering on platform. I don't. But if I get law enforcement reports and I get activist reports,
00:46:02
Speaker
The one thing I will say is missing from this equation, that would make it better: you have to give the people who are doing this work grace. You have to understand that when you say jump, they cannot immediately do what you asked them to do. And what happens is this: someone external gives information. A team, and I'm not going to talk about Meta,
00:46:30
Speaker
but other companies that I've talked to, starts working on it. And then that person becomes impatient and belligerent. And at that point, I want to fight the bad guys, but I'm a person too, and I don't want to work with you anymore. I will go find my intel somewhere else. So you made yourself obsolete. Because, one, I don't want you to go to the press.
00:46:59
Speaker
I don't want you to yell at my people. It didn't happen to my team, but again, I've seen this at another company. If it was my team, and you're giving my team information but at the same time you're berating my team, I am shutting that down. I am worried about my team. I'm not worried about you. I'm worried about my team's well-being. And if they get beaten down by you,
00:47:23
Speaker
I'll find another way to find the bad guys. But what you did at that point is made me not want to work with you anymore. And then you can rail all you want and have all the information you want, but you're completely useless at this point.
00:47:36
Speaker
I think that's a really, really strong point, Assaf. At the end of the day, we're all humans trying to make things better and make the world a safer and better place. And it might be frustrating if you have intel and you want it to be acted upon and executed against, but processes take time and there are many considerations to
00:47:55
Speaker
put this puzzle together, not just taking the input from one party, but really putting them all together in a way that takes a bigger perspective. And I do appreciate that we're talking about huge companies with multiple considerations.
00:48:11
Speaker
I wanted to thank you so much for being honest and open and willing to answer very tough questions, which I think are really important ones and sharing your insight with us. And I'm looking forward to speaking again soon and good luck with your new path. Thank you for having me and thanks for your time. Appreciate it. Thank you. I hope you enjoyed this episode. I would love to hear your thoughts on this one. Feel free to DM me on LinkedIn.
00:48:40
Speaker
Ayelet Biger-Levin.