
How Addictive is TikTok Really? Applying the BrainSafe Framework

The Dopamine Slot Machine


This episode uses the BrainSafe framework to break down how TikTok’s design drives engagement, focusing on its opaque algorithm, highly personalised “slot machine”-style content feed, lack of natural stopping points, and subtle friction that makes disengaging harder than starting. It argues that while social features play a role, the platform’s core pull is algorithmic, continuously learning what keeps users hooked and delivering it with increasing precision—particularly impacting younger users.

Transcript

Intro

00:00:11
Andrew Wilmot
Good morning, good day, good evening. Whenever you are, welcome to the Dopamine Slot Machine, the podcast that discusses what you need to know about the technology your children are using. How are they designed to get your kids hooked?
00:00:21
Andrew Wilmot
How do they make money from your children? And what can you do to make sure that your child's relationship with technology is a positive one? My name is Andrew, I'm a dad of two and a lifelong gamer, and today we're taking a long and careful look at TikTok.
00:00:33
Andrew Wilmot
Again, actually. So we've previously discussed this in quite a freeform way, but this episode I'm applying the BrainSafe Standard to TikTok. If you didn't listen to the last episode, the BrainSafe Standard is a framework that we've built for assessing how addictive a digital product actually is.
00:00:50
Andrew Wilmot
Today, I'm going to be showing you that framework in practice using TikTok and scoring TikTok against this framework, dimension by dimension, just like we do with a paying client. And at the end, we'll give you a cumulative score, run the red flag check and tell you what it would take for TikTok to actually pass.
00:01:07
Andrew Wilmot
If you haven't heard the BrainSafe episode, don't worry, I'll do a very quick refresher on the framework before we get into the assessment. The short version is this. The BrainSafe Addictive Interface Assessment Framework, or the BAIAF for short, evaluates digital products across six dimensions of addictive potential.
00:01:23
Andrew Wilmot
Each dimension is scored from one to five, and the scores combine into a severity classification. Now critically, the framework is designed to assess the structural properties of a product, not just whether it uses a known bad feature, which means it can evaluate something like TikTok properly instead of just ticking off
00:01:41
Andrew Wilmot
checkbox questions, does it have infinite scroll, yes or no, and instead asking what the product actually does to the people using it. Now, TikTok was a really obvious first choice to give an example of the BrainSafe framework, for three key reasons. Firstly, the scale of it. It's got 1.9 billion monthly active users globally, and about 14% of those, and that's probably an underestimate, are aged 13 to 17. That's around 270 million people.
00:02:17
Andrew Wilmot
TikTok reports that 90% of users open the app every single day. And globally, the average user spends 95 minutes, over an hour and a half, a day on it.
00:02:27
Andrew Wilmot
For children, the figure from the parental controls app Qustodio is around 75 minutes a day. It's the most used app amongst children worldwide. Whatever this product is doing, it's doing it at a scale that almost nothing else in human history has touched.
00:02:42
Andrew Wilmot
Second, there's the regulatory relevance. In February of this year, the European Commission issued preliminary findings that TikTok's core design features, infinite scroll, autoplay, push notifications, personalized recommendations, constitute addictive design in violation of the Digital Services Act.
00:02:58
Andrew Wilmot
So that's not coming from a campaigner, that's the executive arm of the European Union in formal proceedings. Additionally, Keir Starmer has highlighted two key features, almost as if he was thinking of TikTok when he wrote his Substack article: infinite scroll and autoplaying videos, as things that the government wants to crack down on.
00:03:16
Andrew Wilmot
So there's already regulatory pressure here. And finally, TikTok is such a clean case study of what the BAIAF is built to evaluate.
00:03:28
Andrew Wilmot
If you wanted to design a product from scratch that maximally exploits the six dimensions the framework measures, you'd probably end up rebuilding TikTok. So to start off, a quick refresher. Six dimensions, each scored one to five, where one means this product is absolutely fine and five means genuinely concerning.
00:03:47
Andrew Wilmot
The dimensions are autonomy analysis: can the user make free and informed decisions? Variable reward profiling: does the product engineer unpredictability to sustain engagement?
00:03:58
Andrew Wilmot
Friction asymmetry: is it harder to leave than to arrive? Temporal boundary erosion: does the product remove natural stopping cues? Social obligation engineering: does the product manufacture social pressures? And vulnerability amplification:
00:04:12
Andrew Wilmot
Are these mechanisms being deployed on populations who are more vulnerable to them? You add the scores together and you get a cumulative severity: 6 to 12 is low, 13 to 18 is moderate, 19 to 24 is high, and 25 to 30 is critical. We've also got a separate red flag check, so there are four specific combinations that trigger automatic concern regardless of cumulative score, and we'll touch on those later.
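As a sketch of the arithmetic described here, a hypothetical helper could look like the following. The function name is illustrative, not from the BrainSafe white paper, and the boundary of the middle band (13 to 18 as moderate) is my reading of the bands as stated:

```python
def classify_severity(scores):
    """Sum six BAIAF dimension scores (each 1-5) into a severity band."""
    assert len(scores) == 6 and all(1 <= s <= 5 for s in scores)
    total = sum(scores)  # cumulative severity, ranges from 6 to 30
    if total <= 12:
        band = "low"
    elif total <= 18:
        band = "moderate"  # assumed boundary; the audio garbles this band
    elif total <= 24:
        band = "high"
    else:
        band = "critical"
    return total, band
```

For example, six scores of one sum to 6 and land in the low band, while six fives sum to 30 and land in critical.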
00:04:34
Andrew Wilmot
So that's the framework. Now let's apply it to TikTok. We'll start with the first dimension, which is autonomy analysis: can the user make free, informed decisions about their engagement with this product? The For You page, which is the algorithmic feed that is the default and primary service of TikTok, is one of the least transparent recommendation systems in mainstream technology. The user doesn't know why they're seeing what they're seeing. They don't know what they are not seeing. They cannot meaningfully audit how their behavior shapes the next video or the one after that.
00:05:02
Andrew Wilmot
The optimization objective is not disclosed. The model is not disclosed. The guardrails, if there are any, are not disclosed. Now there is a following feed and a friends feed available, but the For You page is the default service. It's the one that loads as soon as you open the app, the one that has been internally optimized to be as compulsive as possible.
00:05:20
Andrew Wilmot
Switching away from it requires a deliberate action that most users never take. Notification defaults are aggressive. Onboarding pushes contact syncing, location permissions and notification opt-ins. There's a series of screens where the affirmative option is bright and prominent.
00:05:36
Andrew Wilmot
The decline option is greyed out or buried. That design architecture is not neutral. It is built to produce a specific outcome, which is maximum permission, maximum engagement, with minimum friction on the way in.
00:05:49
Andrew Wilmot
Now, you can delete your account, right? But this requires a 30-day waiting period during which any login reactivates the account. So even the act of leaving is harder than signing up.
00:06:02
Andrew Wilmot
And so for all those reasons, TikTok scores a five. The cumulative effect, across the opacity, the default configuration and the asymmetry, is comprehensive.
00:06:14
Andrew Wilmot
A user on TikTok cannot in any meaningful sense make informed decisions about what is being done to them whilst it is being done. And so it's the textbook definition of a maximum severity finding on the autonomy analysis dimension.
00:06:27
Andrew Wilmot
Dimension number two, variable reward profiling. This is, you can imagine, the slot machine dimension: does it engineer unpredictability to sustain engagement? And yet again, if there was one single product on the planet that most embodies this, it is the For You page on TikTok.
00:06:45
Andrew Wilmot
You open the app, a video plays. Some videos are funny, some are emotional, some are infuriating, and some aren't engaging in any way whatsoever. Some are what you want, some are what you need before you knew you needed it, and you don't know in advance what you're going to get. You swipe up, the next video plays.
00:07:04
Andrew Wilmot
There's uncertainty to it, anticipation, the possibility of a reward. And to be clear here, when you're gambling, the reward is the financial win but also the light and sound feedback.
00:07:18
Andrew Wilmot
With TikTok, the reward is an engaging video. This is variable ratio reinforcement. It's the same reinforcement schedule that BF Skinner identified in the 50s with rats and pigeons. It's the same reinforcement schedule that gambling regulators have spent decades writing rules around.
00:07:33
Andrew Wilmot
It's the most addictive form of reward delivery known to behavioral science. When the slot machine player pulls a lever, they don't know what's coming. When a TikTok user swipes up, they don't know what's coming. The mechanism is the same.
00:07:45
Andrew Wilmot
The neural circuitry being engaged is the same. What's different is that a slot machine in the UK has to be in a licensed venue, age restricted, with harm minimization requirements, whereas TikTok is on the phone in your child's bedroom at three o'clock in the morning.
00:07:59
Andrew Wilmot
It goes further, right? Because this is not just generic variable reward, it's personalized variable reward. Slot machines will deliver the same reinforcement schedule to every player, but TikTok's algorithm learns your engagement patterns. It learns which content types make you stop scrolling, which make you watch all the way through, and which make you rewatch.
00:08:17
Andrew Wilmot
It builds a model of your specific reward responsiveness and then tunes what you're seeing to maximize that. A static infinite scroll is a fishing net cast over a population, but the TikTok algorithm is a fishing line cast specifically for each individual, with bait selected based on that individual's observed behavior.
00:08:37
Andrew Wilmot
There is no slot machine analog for this. The slot machine industry would not be allowed to do this if it tried. It is, in a meaningful technical sense, a more aggressive reward system than a casino floor.
00:08:50
Andrew Wilmot
The slot machine test we use in BrainSafe asks: if the engagement loop of the product could, without meaningful alteration, be described as a slot machine, variable input, variable outcome, reward optimized for anticipatory activation rather than delivered value, so it builds and maintains anticipation rather than delivering something of value to the user, then it fails this dimension.
00:09:12
Andrew Wilmot
TikTok could basically be this test. It's impossible to score this lower than five. There is no argument for scoring it lower than five. This gets maximum possible marks against it in this category.
00:09:27
Andrew Wilmot
Friction asymmetry mapping. Dimension three, is it harder to leave than to arrive? Engaging with TikTok is effortless. You tap the app icon, you're on the For You page, it's already playing a video.
00:09:39
Andrew Wilmot
There's no login prompt. There's no friction whatsoever, right, between the impulse to open the app and being inside that variable reward loop. In fact, when I had TikTok, I remember opening it and not remembering why I had opened it. Not even remembering opening it. It was just so automatic.
00:09:59
Andrew Wilmot
Disengaging is harder. You can close the app. That's fairly straightforward, although it does take more actions to close the app than it does to open it. What's harder, though, is reducing your engagement level: turning off notifications, hiding the app from your home screen so it stops catching your eye, or deleting the account.
00:10:16
Andrew Wilmot
Notification settings are layered: you navigate to your profile, then to settings, then to notifications, then through individual toggles, and the defaults are all set at maximum. As mentioned, account deletion has a 30-day cooling-off period, and there's no single-step, delete-my-account-today-permanently option.
00:10:32
Andrew Wilmot
What pushes this dimension to a 4 rather than 3 is the targeted nature of this friction. TikTok's algorithm doesn't just choose your content, it shapes your retention experience. Users showing signs of churn can receive different in-app messaging than highly engaged users.
00:10:47
Andrew Wilmot
That's very calibrated. However, it doesn't quite get a five, because the basic engagement versus disengagement asymmetry is less aggressive than a paid streaming service's cancellation flow or a Snapchat streak penalty.
00:11:03
Andrew Wilmot
You can stop using TikTok without losing anything immediate and tangible. The friction is real, but it's not catastrophic. Now, something to mention here is that TikTok has been experimenting with gamified messaging streaks.
00:11:18
Andrew Wilmot
And so we would want to revisit this dimension in the future, particularly if that became a more core part of the platform. Dimension four, temporal boundary erosion. Does the product remove natural stopping cues?
00:11:30
Andrew Wilmot
We can just describe it, right? The For You page is an infinite vertical feed. There is no natural stopping point. There is no pagination. There's no end, no progress indicator, no indication of how many videos you've watched.
00:11:42
Andrew Wilmot
Each video plays the moment the previous one ends. And there is no are-you-still-watching prompt, no built-in session timer, nothing. Now, there are some tools: users under 18 are now defaulted to a 60-minute daily limit, after which they're prompted to enter a passcode to continue.
00:11:58
Andrew Wilmot
It does rely on users being honest about their age, but whatever. There's a family pairing feature for parents, and a digital wellbeing dashboard. And these are real mitigations.
00:12:09
Andrew Wilmot
But the reality is the 60-minute prompt is a passcode away from being dismissed. It does not stop the session, it interrupts it. The digital wellbeing tools exist, but require the user to navigate to them and configure them. So the default experience is unbounded, autoplayed,
00:12:25
Andrew Wilmot
paginationless engagement. It never stops by itself; it stops only when the time you have available to spend in that environment runs out. And you see this in the actual behavioural data. The average user opens the app 20 times a day.
00:12:39
Andrew Wilmot
The average global daily session adds up to 95 minutes, and for children, 75 minutes a day. That's the predictable outcome of a feed designed to suppress the natural moments at which a user would pause and ask whether they want to continue.
00:12:54
Andrew Wilmot
There's a term for this that I came across recently, time fog: the designed absence of temporal cues. It produces what gambling researchers have called the machine zone, the state where users come out of a session and genuinely don't know where the time went.
00:13:10
Andrew Wilmot
TikTok has perfected the production of that state at scale, at an intensity no slot machine floor has ever achieved.
00:13:18
Andrew Wilmot
In fact, the European Commission's preliminary findings against TikTok explicitly cited infinite scroll and autoplay as central to its concern. The regulators have looked at what I'm looking at and are reaching the same conclusion. So again, no way on earth that this could score less than a five.
00:13:35
Andrew Wilmot
Moving on: social obligation engineering. Does the product manufacture social pressures that compel engagement beyond organic desire? This was a little bit harder to think about, and I want to explain why, because it would be easy to score every dimension a five and walk away. TikTok has elements of social obligation engineering.
00:13:54
Andrew Wilmot
Direct messages have read receipts. Live streams create real-time attendance pressure: creators ask viewers to stay, to send gifts, to comment, to keep the stream visible to the algorithm. Comment culture creates social expectations around responding to friends' posts, and follower counts and like counts are publicly visible on every account.
00:14:10
Andrew Wilmot
But compared to a platform like Snapchat, TikTok's primary engagement mechanism is not social, it's algorithmic. The pull isn't, my friend will be sad if I don't reply. The pull is, the next video might be the best one I've ever seen.
00:14:23
Andrew Wilmot
Now, again, TikTok is experimenting with streaks. I don't think I've described the streaks properly, actually. It's a Streak Pet: virtual evolving creatures in direct messages, designed to encourage daily interaction between users. So if you message a friend for three consecutive days, you can then invite them to hatch and co-parent a pet, and you have to keep up daily interaction to keep that pet going. But that's, I believe, still being rolled out; it's not available to everybody yet. So again, something we would have to revisit as that expands.
00:14:55
Andrew Wilmot
Anyway. Snapchat streaks are an extreme example of this bilateral obligation, but TikTok, still experimenting so far, doesn't have anything equivalent. You can stop using TikTok without your friends noticing.
00:15:11
Andrew Wilmot
I want to caveat that, though. Culturally, there is a thing about sending each other TikToks and making sure you watch them all, and that kind of thing is known and observed.
00:15:23
Andrew Wilmot
That's not necessarily social obligation engineering, though. That is an emergent social obligation that's come from natural use of the platform.
00:15:34
Andrew Wilmot
Right. The platform, so far, the Streak Pet aside, isn't actively encouraging users to do that. It's not trying to build that. So the social obligation pressures are real, particularly around live-stream culture, the parasocial relationships viewers form with creators, and also the need to keep up with TikTok trends to know what your friends are talking about as well.
00:16:02
Andrew Wilmot
But that is an algorithmically driven compulsion rather than a socially engineered obligation. So it's contributing, but not the main problem.
00:16:13
Andrew Wilmot
I do want to pause on the parasocial point because, again, this dimension needs ongoing review. Children are forming genuine emotional attachments to TikTok creators they've never met. They feel obliged to keep up.
00:16:26
Andrew Wilmot
They feel they're letting the creator down by missing posts. And the platform doesn't engineer this directly, but it benefits from it and does nothing to mitigate it. So overall, we're going to score TikTok a four on social obligation engineering.
00:16:37
Andrew Wilmot
The final dimension, vulnerability amplification. Are the mechanisms in dimensions 1 to 5 being deployed on populations who are particularly susceptible? TikTok has 270 million users aged 13 to 17 worldwide.
00:16:53
Andrew Wilmot
About a quarter of 3 to 5 year olds in the UK are active users of TikTok. And this is a conservative count: the platform's age verification is a self-declared birthday on sign-up, and there's loads of evidence that under-13s are using the platform in significant numbers.
00:17:09
Andrew Wilmot
So it's going to be much larger than the formal teen demographic. And children are spending a load of time on here. I've said it a few times this episode already, but 75 minutes a day, more than on any other social platform.
00:17:21
Andrew Wilmot
We know that teenage brains, and children's brains as well, respond more intensely to social validation cues than adult brains. They've got less developed prefrontal regulation, less ability to override impulses generated by this reward system.
00:17:35
Andrew Wilmot
The variable reward mechanics that an adult might find annoying can be compulsive for a teenager.
00:17:41
Andrew Wilmot
But the structural problem remains. The For You page algorithm is the same algorithm. The reinforcement profile is the same reinforcement profile. The personalized reward calibration learns from adolescent engagement patterns just as it does from adult ones.
00:17:56
Andrew Wilmot
And because adolescents, children, teenagers are more responsive to this type of reward, the algorithm will, by design, deliver them a more intensely calibrated experience.
00:18:07
Andrew Wilmot
The teen who spends three hours a day on TikTok is not seeing the same content, is not really using the same product, as an adult who spends 30 minutes a day. They're seeing a version of the product that has learned from them more deeply, is calibrated to them more intensely, and is more compulsive for them than for a less engaged user.
00:18:24
Andrew Wilmot
The more you use the product, the more compulsive it gets. So this is what we call vulnerability-responsive personalisation. The system doesn't have to be designed to exploit vulnerable users.
00:18:36
Andrew Wilmot
An algorithm that optimises for engagement will mechanistically concentrate the most intense experience on the most responsive users. And the most responsive users are going to be the most vulnerable.
00:18:47
Andrew Wilmot
The system amplifies vulnerability as a structural property. So again, there's no version of this dimension that could be scored lower than five. The core demographic is young and the core mechanism is engagement optimised personalisation.
00:19:03
Andrew Wilmot
So that's five for autonomy analysis, five for variable reward profiling, four for friction asymmetry,
00:19:14
Andrew Wilmot
five for temporal boundary erosion, four for social obligation engineering and five for vulnerability amplification. So that is a cumulative score of 28 out of 30.
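As a quick sanity check on the tally, the arithmetic is just a sum over the six dimensions (the dictionary keys here are my own shorthand, purely illustrative):

```python
# TikTok's six dimension scores as given in this episode.
tiktok_scores = {
    "autonomy_analysis": 5,
    "variable_reward_profiling": 5,
    "friction_asymmetry": 4,
    "temporal_boundary_erosion": 5,
    "social_obligation_engineering": 4,
    "vulnerability_amplification": 5,
}
total = sum(tiktok_scores.values())
print(total)  # 28 of a possible 30, inside the critical band (25-30)
```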
00:19:28
Andrew Wilmot
Critical. For context as well, the BrainSafe white paper includes worked assessments of several products: Snapchat, which scored 26; Netflix, which scored 16; Times Tables Rock Stars, which scored 19; and Signal, which scored seven. TikTok is the highest score that we've applied to a real product.
00:19:48
Andrew Wilmot
It's more severe than Snapchat, which, given Snapchat's streak mechanic, is a really extraordinary thing to say. I mentioned as well that we've got some red flag conditions, and any one of these triggers additional scrutiny, regardless of the overall score. Let's run through them.
00:20:01
Andrew Wilmot
So, red flag one: any single dimension at maximum severity, a score of five. This has been triggered four times with TikTok. Red flag two, what is called the roach motel: both high friction asymmetry and low autonomy.
00:20:18
Andrew Wilmot
So users cannot freely disengage and the system structurally prevents it. This has been triggered. Red flag three, the gambling mirror: variable reward and vulnerability both at four or above, which mirrors the structural features of gambling addiction on a vulnerable population.
00:20:34
Andrew Wilmot
It's the cleanest case of this that we've looked at so far. And red flag four, open-ended erosion: temporal boundary erosion at four or above with no built-in time-awareness mechanism functioning at default.
00:20:49
Andrew Wilmot
And again, this is triggered. So all four red flags are triggered. Applying the BrainSafe methodology, we can definitively say that TikTok is not BrainSafe. It would not be eligible for any tier of certification without fundamental redesign of basically the entire product.
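The four red-flag checks, as described in this episode, could be sketched like so. The function and key names are hypothetical; the four-or-above threshold on the roach motel is an assumption on my part, while the thresholds for the other flags are as stated:

```python
def red_flags(s, default_time_awareness=False):
    """Evaluate the four BAIAF red-flag conditions for a score dict `s`."""
    return {
        # 1. Any single dimension at maximum severity (a score of five).
        "max_severity": any(v == 5 for v in s.values()),
        # 2. Roach motel: hard to leave plus low autonomy (threshold assumed).
        "roach_motel": s["friction_asymmetry"] >= 4
                       and s["autonomy_analysis"] >= 4,
        # 3. Gambling mirror: variable reward and vulnerability both 4+.
        "gambling_mirror": s["variable_reward_profiling"] >= 4
                           and s["vulnerability_amplification"] >= 4,
        # 4. Open-ended erosion: temporal boundary 4+ with no default timer.
        "open_ended_erosion": s["temporal_boundary_erosion"] >= 4
                              and not default_time_awareness,
    }
```

With the dimension scores given in this episode, all four checks come back true.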
00:21:07
Andrew Wilmot
There's no world in which you can make a safe-for-kids TikTok, whatever Coverstar say. And I've got my own beef with Coverstar, as you might have heard in a previous episode. But let's take a what-if test here, a thought experiment. We take the digital design pattern, strip away the digital context and reconstruct it as a real-world interaction.
00:21:28
Andrew Wilmot
If the physical version would be considered unacceptable, then the digital version warrants scrutiny. Imagine a building. You walk in. And by the way, having to walk into the building means that even in this example there's more friction than TikTok. But you walk in. Inside, there's a row of chairs facing a stage.
00:21:46
Andrew Wilmot
On the stage, a series of performers begin appearing one at a time, and each performance lasts between 15 and 90 seconds. As soon as one performance ends, the next one begins. There's no break. There's no interval.
00:21:57
Andrew Wilmot
There's no programme telling you how many performances there will be. And behind the wall, there's a team of behavioural scientists watching you. They observe which performances held your attention, which ones made you laugh, which ones made you cry, which ones made you stay sitting forward.
00:22:12
Andrew Wilmot
Based on those observations, they select the next performance specifically for you. They are in real time optimising the line-up of the show to keep you in your seat. You can leave. There's a door.
00:22:22
Andrew Wilmot
There's no natural break in the performance at which leaving feels like the obvious thing to do. The performance just keeps coming. There's no end. There's no thank you and goodnight. If you stand up, you are interrupting something.
00:22:34
Andrew Wilmot
The chairs are full of children. Most of them have been in the building for 90 minutes. Some have been there for three hours. The performances continue. The behavioural scientists continue to observe and the line-up continues to be optimised.
00:22:46
Andrew Wilmot
That is the For You page transposed into a physical room. If we saw that arrangement in the real world, we wouldn't let our children walk into it. The fact that it happens in an app rather than a building doesn't change that.
00:22:58
Andrew Wilmot
So, difficult question: how could TikTok remediate? If a product like TikTok came to us and said, what would it take, where would we even start? So firstly, defaults.
00:23:10
Andrew Wilmot
Default to the following feed, not the For You page, for all users. Let the algorithmic feed be a deliberate choice, not the surface that opens when the app launches. Second, break the autoplay.
00:23:22
Andrew Wilmot
Tap to play every video. Insert deliberate pauses into the scrolling. A next-video-in-three-seconds, tap-to-continue mechanism is not a hostile change; it's the same kind of natural breakpoint that television commercial breaks provide.
00:23:39
Andrew Wilmot
We accept these in television; we can accept them here. Third, make session time obvious, not buried in a wellbeing dashboard: on the main interface, a counter that says you've been scrolling for 43 minutes, you've watched 183 videos.
00:23:56
Andrew Wilmot
Be honest about the time that your product is taking up. Fourth, differentiated personalization. Constrain the algorithm's adaptive intensity for under-18 accounts. Cap the rate at which the system is allowed to update its model of a child user's engagement triggers.
00:24:11
Andrew Wilmot
This is very technical, but it's exactly the kind of structural change that a serious safety regime would require.
00:24:18
Andrew Wilmot
Fifth, account deletion. Single-step, immediate, permanent. Forget the 30-day cooling-off period; it should be as easy as creating an account. These aren't utopian asks.
00:24:28
Andrew Wilmot
It doesn't fundamentally make TikTok a financially unviable product, and it would meaningfully shift the BAIAF score downward.
00:24:40
Andrew Wilmot
Now, could TikTok's commercial model survive this? I don't know. Perhaps they would need to advertise more aggressively. Perhaps they would need to monetize more aggressively. Perhaps it would count as, a lovely buzzword at the moment, enshittification, which is the process of technology products getting worse as companies stop being able to fund themselves off venture capital money and instead need to actually make money from their users.
00:25:10
Andrew Wilmot
It's a difficult question. But frankly, if TikTok can't survive without harming its users, then maybe it doesn't deserve to. Now, what can you as parents do? So first and most importantly, if your children are using TikTok and you've decided not to take them off TikTok, talk about what the algorithm is.
00:25:29
Andrew Wilmot
Show them in structural terms how it works. Sit down and look at TikTok together. Who decides what they see? Why this video, not another? What does it know about you?
00:25:42
Andrew Wilmot
Kids, teenagers, they don't like the feeling of being controlled, as any parent can attest. And that also applies to the technology they use. They don't like the idea of algorithms being refined to capture their attention.
00:25:57
Andrew Wilmot
The algorithm loses some of its grip once its mechanism is visible. Awareness isn't a complete defence, but it's a good starting point. Second, set up screen time controls properly.
00:26:07
Andrew Wilmot
Put a hard limit on. Turn off push notifications; there's no notification from TikTok worth interrupting your child's evening for. Make the account private. Turn DMs off. This doesn't solve the problem at its root,
00:26:19
Andrew Wilmot
but it does reduce the scope of the issue. And finally, model the behavior you want. If you don't want your child to use TikTok, delete it yourself. Stop using it yourself.
00:26:31
Andrew Wilmot
It's harmful to adults as well. I've removed all short-form video and I don't regret it. It really adds nothing to your life; it certainly added nothing to mine. It's one of the best things you can do. Just remove it.
00:26:47
Andrew Wilmot
If you're going to ask your children to remove it, delete it yourself. You do not need TikTok. You will not suffer any consequences from removing TikTok. You will find other ways to entertain yourself, other ways to fill time when there's actually time that needs to be filled, but you will also be more connected to the people that matter around you.
00:27:09
Andrew Wilmot
Longer term, there is an ongoing policy conversation, respond to the UK consultation, deadline 26th of May, speak to your MP, demand change. So, to summarise, TikTok scored 28 out of 30 on the Brain Safe Addictive Interface Assessment Framework, putting it in the critical severity band.
00:27:27
Andrew Wilmot
All four automatic red flag conditions were triggered. It is not brain safe. If you'd like to read more about how the framework works, you can find the BrainSafe white paper at brainsafestandard.co.uk. We're also publishing dimension by dimension breakdowns on LinkedIn. Search for BrainSafe and follow us there. But that's all we have time for today. Thank you so much for joining us.
00:27:46
Andrew Wilmot
Don't forget that if you've got some questions for us, or if you or your children have been impacted by the issues we've discussed today, you can get in touch with us on our website, thedopamineslotmachine.co.uk. This has been the Dopamine Slot Machine.
00:27:57
Andrew Wilmot
Thank you, and see you soon.

Outro