
Coverstar: TikTok for Kids?

S1 · The Dopamine Slot Machine

This week on The Dopamine Slot Machine, Andrew takes a look at Coverstar, an app often marketed as a safer alternative to TikTok for kids.

But how safe is it really?

In this episode we break down how Coverstar works, why short-form video platforms are so addictive by design, and the risks parents should understand before letting their children use it.

We discuss:

• Why removing direct messages does not eliminate stranger interaction
• The reality of children posting public videos online
• Algorithmic feeds, validation loops, and the “dopamine slot machine” effect
• Livestreaming and the risks of broadcasting to strangers
• Practical steps parents can take if their child is already using the app

If your child has mentioned Coverstar or if you are being told it is “TikTok but safe,” this episode will help you understand what is really going on behind the scenes.


Transcript

Intro

00:00:11
Andrew Wilmot
Good morning, good day, good evening, whenever you are. Welcome to The Dopamine Slot Machine, the podcast that tells you what you need to know about the games your children are playing: how they're designed to get your kids hooked, how they make money from them, and what you can do to make sure that your child's relationship with video games is a positive one.
00:00:26
Andrew Wilmot
My name is Andrew. I'm a dad of two and a lifelong gamer. And again, I'm not talking about a video game today, but it's definitely something related, as there's a lot of video-game-related content on there.
00:00:37
Andrew Wilmot
I'm talking about an app called Coverstar. If you've not heard of it, and I'd only heard of it a couple of times before yesterday evening, it's an app that describes itself as a safer alternative to TikTok, sort of TikTok for kids. Now, I had a parent in one of the group chats I'm in ask if anybody had ever heard of it.
00:00:58
Andrew Wilmot
And so I went and downloaded it, and I spent some time on it last night, and some more time on it this morning over a cup of coffee, and I've read over the website and done some research on it.
00:01:09
Andrew Wilmot
And so I really want to dig into this claim of a safer alternative for kids, because I think it's a really interesting one. But before we get into how it claims to be safer, what actually is it? Coverstar is a short-form video social media platform where users record videos: as far as I can tell, usually dancing, lip syncing and performances. I saw a few child musicians on there, but also video gaming clips, and actually also content moved straight from TikTok. I saw a number of people who've been quite successful on TikTok making comedy content literally just reposting the same videos on there.
00:01:49
Andrew Wilmot
You can scroll through a feed of these videos, like them and comment on them, follow accounts. If you are familiar with TikTok, it will feel very, very familiar to you here. It's basically the same user interface.
00:02:03
Andrew Wilmot
it It works almost exactly like TikTok or YouTube Shorts. Where Coverstar is trying to be a bit different is that it's positing itself as being safer for kids. The main thing is that there are no direct messages. Now on TikTok, you can message other users, you can have a conversation, you can use it as a chat app.
00:02:21
Andrew Wilmot
You can't do that with Coverstar. You can't message other users. Now, that's a really good thing compared to most social media. If that was rolled out across every platform that caters to kids, which TikTok does (I don't care what they say, TikTok is absolutely geared towards kids, and kids are using it), if on every platform that children used you weren't able to message each other, that would be a massive boon for safety. However, I actually need to caveat what I'm saying, because I just said that you can't message each other. That's not strictly true. You can't direct message each other, but you can comment on videos. So there is still stranger-to-stranger interaction here. And typically, when you've got that, you need moderation.
00:03:14
Andrew Wilmot
That brings us to human and AI-assisted moderation. Apparently, every piece of content on Coverstar is reviewed using a combination of trained human moderators and next-gen AI-assisted technology.
00:03:24
Andrew Wilmot
I can't comment on how well this works, because I'm not going to start posting a load of material on Coverstar. It is 99% children, and if adults are on that platform, then it's generally with their kids, as in: the kids are managing a social media account and featuring their parents. It also says that the feed is designed to be positive and age-appropriate, not addictive.
00:03:47
Andrew Wilmot
They say, "we do not promote endless scrolling, radicalization, or echo chambers found on other platforms." And it is true that I didn't come across any political content or inappropriate content in the way that I have on TikTok.
00:04:02
Andrew Wilmot
We'll come on to the claim about endless scrolling in a moment, but I saw no direct evidence of how this safer feed works, and I'd be really interested to understand the mechanics of how they do it.
00:04:13
Andrew Wilmot
Is it just that they're pitching at an audience that is unlikely to be creating political videos? I don't know. Are they judging what content is being posted and choosing to weight it based on whether or not they consider it radicalisation or an echo chamber?
00:04:29
Andrew Wilmot
There's not really much in the way of documentation available on this. In a similar way, they say that they enforce strict content guidelines to protect users: sexualized content, harmful trends, self-harm themes, and adult-style content are not allowed on Coverstar.
00:04:43
Andrew Wilmot
That first one, sexualized content: there is a lot of wiggle room here. Now, again, I didn't see any explicit content in my time there. I don't know whether it would be possible to post it, and I'm not going to try.
00:04:59
Andrew Wilmot
What I did see is a lot of young people doing quite sexualized dances in outfits that I wouldn't be comfortable with. I mean, I wouldn't be comfortable with my children having a social media profile and making content for anonymous users on the internet anyway. But I don't think it's too much of a stretch to say that some of the content that is directly available on the site
00:05:31
Andrew Wilmot
would appeal to people who have an unhealthy interest in children. So yeah, it's not explicitly sexualized a lot of the time, but particularly when it's dance trends that have come over from TikTok, which often do have a suggestive element to them, I don't think that claim is entirely true. Harmful trends: I'm going to come on to that in a second.
00:05:59
Andrew Wilmot
Now, again, I didn't see anything along the lines of TikTok challenges. However, they do have their own form of challenges. Next: transparency and reporting. "We publish clear safety rules, moderation standards and content guidelines." These are less clear than they'd like to say. And then finally, it's got global regulatory alignment: "Coverstar is COPPA compliant and designed to meet leading international youth safety standards worldwide."
00:06:21
Andrew Wilmot
So firstly, being COPPA compliant isn't a boast, right? That's like saying, "we have a job that pays minimum wage." It's the bare minimum. Just so you understand a little bit more about it: COPPA, the Children's Online Privacy Protection Act, requires operators of websites, apps and online services directed at children under 13, or those with actual knowledge that they are collecting data from children under 13 (so this does actually include TikTok, Facebook, etc.),
00:06:47
Andrew Wilmot
to provide clear privacy policies, obtain verifiable parental consent and secure data. Key steps include limiting data collection, implementing security, and offering parents access to review or to delete the child's information.
00:06:59
Andrew Wilmot
So that's what it claims, right? But fundamentally, children are still posting videos that are visible to other users on the platform. There's no strict minimum age, so you do end up with very young children posting videos online.
00:07:12
Andrew Wilmot
You know, that's typically dancing, lip syncing, talking to a camera. But adults can still create accounts. I put my birth date in honestly, so the platform knows I'm a 31-year-old man, and it just let me go and watch these kids. And then it's got live streaming. It actually gates being able to live stream behind having 20,000 followers.
00:07:37
Andrew Wilmot
And that means not everybody can do it, so it's not freely available. But I wonder whether gating functionality behind follower counts creates a perverse incentive for children to try and maximize their follower counts.
00:07:55
Andrew Wilmot
And even then, you've got some young people, including very young ones, live-streaming their videos to whoever happens to be watching: real-time broadcasting to strangers. And sure, let's take them at their word about the quality of their content moderation, but they won't be able to do that in real time.
00:08:13
Andrew Wilmot
So if a child broadcasts something inappropriate live, that's then online basically forever. Now, a lot of parents assume that because an app is aimed at children, it's going to be some kind of closed community. It's not.
00:08:27
Andrew Wilmot
I'm a 31-year-old man. I set up an account. I could still interact with these kids through the comment section, even though direct messages aren't a thing. I can follow their accounts. I can leave a comment like, "do you have Snapchat? Follow me on Instagram." And then that interaction moves somewhere else, somewhere with far fewer protections.
00:08:45
Andrew Wilmot
And then there's the addictive design. That's why you come to this podcast: you want to know about the addictive design. Like every short-form video platform, Coverstar uses algorithmic feeds and endless scrolling.
00:08:56
Andrew Wilmot
The goal of this user interface is simple: to keep users watching for as long as possible. We know the impact of infinite scroll. We know the impact of endless variety just a swipe away. That is well documented.
00:09:12
Andrew Wilmot
If you've been listening to this podcast for a while, you don't need me to go into that. If you haven't, just know that short-form video platforms based around a scrolling mechanism, like TikTok, YouTube Shorts, Instagram Reels and Facebook Reels, I think they're called, are highly addictive.
00:09:28
Andrew Wilmot
This is documented. We know this. Coverstar introduces some other functionality which I don't see elsewhere. It's got reward systems, which you normally find in games. There's a virtual currency called Star Coins, which you can use to buy things in the app.
00:09:48
Andrew Wilmot
In the same way that you might purchase a skin for your character in a game, you can customise your channel with things that you pay for with Star Coins.
00:10:03
Andrew Wilmot
And then there's something called Star Power, which is a type of score that you get by posting and interacting with followers: getting likes, comments and followers through your use of the app.
00:10:15
Andrew Wilmot
So they talk about how they're trying to focus children on creativity. I don't see what following TikTok trends has to do with creativity, but we'll put that aside. This scoring, this gamification, makes it less about creativity and more about approval seeking. They're
00:10:34
Andrew Wilmot
able to quantify how good you are at the app. I came across one parenting site that put it quite well, saying that making videos and hoping for likes can affect a child's self-worth and make them obsessed with checking the app.
00:10:48
Andrew Wilmot
That was care.com, by the way.
00:10:51
Andrew Wilmot
So Coverstar has these really strong validation loops. Kids post a video, then they check the app: how many likes did it get? Did anyone comment? Did their follower count go up? It's exactly the kind of design we talk about here. It is a dopamine slot machine.
00:11:06
Andrew Wilmot
Every time you check, you might get validation or you might not, and that unpredictability is what will keep children coming back and wanting to create content. And that's putting aside all of the very addictive features in just scrolling. It's putting aside the really aggressive use of Star Coins, right? If you've been listening to this podcast a while, you will know this already, but by making these purchases through a digital currency, as opposed to paying with real money, by abstracting it, you make it even harder for the user to judge what they're spending. And this works on adults, right? Let alone kids potentially as young as five, who have no idea of the value of money. I say five; there will be children younger than that using this platform. I mean, God knows, a quarter of three-to-five-year-olds are on TikTok, according to the latest Ofcom stats.
00:11:57
Andrew Wilmot
It abstracts away the value of money and makes it more difficult to judge whether what you're paying money for is actually worth that money.
00:12:07
Andrew Wilmot
It is a really underhanded trick when used on adults, and I don't think it should be allowed to be used on children at all. I think it's one of the worst things you can do in an app that is geared towards children.
00:12:23
Andrew Wilmot
And I haven't even talked about streak mechanics and challenges yet. So here's where it does something that's even worse than TikTok, something TikTok doesn't do. I'm going to scroll through it right now. It's got this challenge section. You can create your own challenges, right?
00:12:38
Andrew Wilmot
I think you have to get premium for that, which costs £6 a month. Premium also gives you longer videos and no ads, and you can do hauls,
00:12:51
Andrew Wilmot
cool videos, "get ready with me" videos and vlogs. Anyway, challenges. These are time-limited challenges where you get prizes. After the challenge ends, rankings are locked in and badges are displayed. Rankings are based on Star Power received during the challenge time frame.
00:13:16
Andrew Wilmot
Reading the challenge rules: like-for-like may be removed from the challenge; spamming on other people's posts and reposting someone else's video will get you removed from the challenge; please follow the challenge prompt correctly or you may be removed from the challenge. So these time-limited challenges, these streak mechanics, create a sense of FOMO, and combined with that Star Power value, that's going to be hugely powerful, hugely impactful for children.
00:13:38
Andrew Wilmot
And then there's groups. You can join a group to take part in group challenges. Just going through some of these now: Dancers, Pickle Gang, Dance Forever, Soccer Gals, Jesus is the Best.
00:13:53
Andrew Wilmot
There are a few "Jesus is the Best" ones, actually. Lulu Girls. Anyway, the more friends you invite to your group, the more friends of yours that join,
00:14:05
Andrew Wilmot
the higher up in the rankings in this group you are as well. So you're being incentivized to bring your mates into whatever group you're in. Now, anybody who has ever dealt with a runaway WhatsApp group will immediately feel the hairs on the back of their neck stand up here.
00:14:22
Andrew Wilmot
Incentivizing children to form groups like this is a safeguarding risk. It is hugely impactful in terms of addictive design. It just should not be done.
00:14:33
Andrew Wilmot
So, in Coverstar, we've got a social media platform that I can definitively say does not have direct messaging, and that is a plus compared to other social media platforms.
00:14:47
Andrew Wilmot
However, it has even more aggressive use of addictive design, as far as I can tell, than TikTok. I'm not going to count what it says about an algorithm that doesn't try to be addictive, because that claim goes against the fundamental UX choices they've made with the infinite scroll.
00:15:04
Andrew Wilmot
I'm not going to give them credit for that without evidence. And I did not see any evidence of that claim whilst I was exploring the app. It's very aggressively monetized compared to TikTok.
00:15:16
Andrew Wilmot
And here's the thing: I talk about aggressive monetization quite a lot on this podcast, and TikTok's just not appropriate for kids. It's not appropriate for adults, really. It is mind-poison.
00:15:27
Andrew Wilmot
Getting into watching short-form videos is one of the worst things I think you can do for your cognitive function as an adult, let alone as a child. But TikTok is not aggressively monetized. It'll have ads.
00:15:38
Andrew Wilmot
You can buy things on TikTok Lives and TikTok Shops, but it's not part of the core experience. Scrolling through Coverstar, I was getting prompts to do exactly that. I was getting prompts to take part in challenges. It was breaking the infinite scroll loop, but only to try and get me to engage with even more addictive features.
00:15:58
Andrew Wilmot
And it's still social media. So you're still bringing with it the same risks of public exposure, attention-seeking reward systems, interactions with strangers. It is still just children posting content online for an audience they don't control.
00:16:13
Andrew Wilmot
It's scary. It's not appropriate for children. Now, most of my audience are going to be firmly embedded within the smartphone-free childhood movement, and so it's probably unlikely that any of you have your child using Coverstar.
00:16:30
Andrew Wilmot
But if you do, there are a few positive things that you can do here. Firstly, if your child is using the platform, the safest thing you can do is to remove them from the platform; the safest thing you can do is to remove them from their smartphone or tablet. This is a mobile-only app: you cannot have it on a desktop, you cannot have it on a laptop, you can't have it on a games console.
00:16:50
Andrew Wilmot
Now, it has parental controls, but the user can change those parental controls with no oversight. There's no account-linking functionality. Typically with parental controls, you'll have them linked to a parent's account, and that parent then controls what the child has access to, controls the specific settings remotely. Here, it's all within the child's account, so the moment you look away, your child can swap those settings back.
00:17:15
Andrew Wilmot
Still, if your child is using the platform, make sure the account's set to private. Review followers and comments together with them regularly. Never, ever buy Star Coins. Remove any payment cards linked to the account and to the phone; that's just good practice anyway with any device you give your children.
00:17:34
Andrew Wilmot
Then try to make usage intentional rather than passive. If they want to create a video, do it with them, post it, leave the app. Don't let them get into the endless scrolling.
00:17:46
Andrew Wilmot
And keep devices in shared spaces in the house as well. With a lot of this, the risk and harm profile is drastically reduced if they can't have the device in their bedroom. If they can't have the device in their bedroom, they can't be scrolling at three in the morning. If they can't have the device in their bedroom, there's always an element of passive supervision,
00:18:04
Andrew Wilmot
and they're less likely to be trying to engage in unsupervised contact with strangers. But the best form of defence is always going to be an open and honest conversation with your child. Now, the age range for Coverstar is going to be quite large here. You might have teenagers who are interested in it; you might have very young children who've come across it with their friends and think it's just cool videos, asking "can I watch Coverstar at home?"; or somewhere in between. So you have to
00:18:37
Andrew Wilmot
gauge your child's maturity when speaking to them about what sort of content they might see. As a general rule, if your child is too young for you to feel comfortable talking
00:18:50
Andrew Wilmot
about unprompted, unsolicited sexual contact from strangers, they're too young to be on social media. The same goes if they're too young for you to talk to about upsetting content, and I don't mean, "oh, if there's something that upsets you"; I mean talking about, "hey, you might see somebody die on the internet someday, you might see gore, you might see somebody hang themselves on the internet." Now, again, they claim there's a lot of safety here, so maybe that's less likely on this platform.
00:19:18
Andrew Wilmot
But I'm speaking generally here. If you do not feel comfortable talking to your children in explicit terms about the type of content they may come across on social media, they shouldn't be on social media.
00:19:30
Andrew Wilmot
If you are, keeping that open line of conversation is going to be really important, really helpful in building a positive relationship between your child and technology.
00:19:42
Andrew Wilmot
Make sure your child understands that once something's uploaded, it's basically there forever, and other people can see it, including people they don't want to see it. So have a talk about it: hey, you want to post TikTok dances of you on Coverstar like your friends are doing? Have you spoken to your child about online predators? Have you spoken to them about grooming and paedophiles? Do they know what sex is?
00:20:09
Andrew Wilmot
Do they know, and are you comfortable speaking to your child about, the fact that there may be adult men watching their profile and masturbating over it? Are you even comfortable saying that out loud? If you're not comfortable speaking with your child about that, then you shouldn't be letting them on this platform.
00:20:28
Andrew Wilmot
Yeah, so we're going to see a lot of apps like this, by the way, apps trying to position themselves as child-safe social media. This isn't the first I've seen, either. There's one I've been sent a couple of times, Trebella. That's had a lot of airtime in the news as well.
00:20:44
Andrew Wilmot
I haven't had the chance to actually explore it yet. It's like, "oh, it is now available"; "oh, join the waitlist on Google Play." I have an Android, not an iPhone, and so I haven't been able to explore this. From what I've seen, I'm not very positive about it.
00:20:57
Andrew Wilmot
I don't think there can be a child-safe social media platform. I don't think that unlimited access to the internet is safe for children in any form, particularly not through a mobile device.
00:21:11
Andrew Wilmot
But whenever we hear these claims about child-safe social media, or a child-safe alternative to TikTok, or a child-safe alternative to YouTube, your first thought should be scepticism. It should be: let's look at all the ways this is not child-safe. Never take it at face value.
00:21:30
Andrew Wilmot
I will give a shout-out to one site, which is not social media; it's actually a search engine: Kiddle. I'm a big fan of Kiddle. Kiddle is a child-oriented search engine.
00:21:45
Andrew Wilmot
We use this when we're doing research for projects with our eldest; we use it as the search engine. We're quite comfortable with her using it, with her directing rather than me typing. So she can drive whilst we're there, and it's still supervised; we're happy for her to use Kiddle.
00:22:01
Andrew Wilmot
The difference between Kiddle and other sites is that the results are either hand-picked and checked by their editors or really strongly filtered through safe search filters.
00:22:14
Andrew Wilmot
Even searches containing bad words will be blocked by the AI they use. I've tried to break Kiddle; I always try to break anything that claims to be child-safe, and 99% of the time I manage to, easily.
00:22:29
Andrew Wilmot
I've not been able to break Kiddle, and that gives me a lot of confidence using it. And it's not a social media app. It's like Google: it's being able to look up stuff on the web, look up facts.
00:22:44
Andrew Wilmot
It's got its own sort of Wikipedia equivalent. It's got its own image search. It's a great example of what an actual child-safe alternative to something looks like.
00:22:55
Andrew Wilmot
It doesn't look like taking everything, removing one specific feature (that being direct messages) and implementing a load of extra addictive design. That's not child-safe at all.
00:23:05
Andrew Wilmot
And that's all for today. Don't forget that if you've got any questions for me, or there's a game or app that you'd like me to cover, you can get in touch at thedopamineslotmachine at gmail.com, or through our very fancy website, thedopamineslotmachine.co.uk, which is stuffed full of guides that I've written. I've actually just published a new version today with loads of extra guides, including Coverstar, Five Nights at Freddy's, Poppy Playtime, 4chan, Clash Royale, Genshin Impact and Stumble Guys.
00:23:35
Andrew Wilmot
So if you haven't looked at it for a little while, go look at it again. There's some really cool new materials on there. I'm particularly proud, by the way, of the memes and lingo page. Quite often I'll see memes and lingo guides for parents shared elsewhere.
00:23:49
Andrew Wilmot
And, you know, I've in the past been a little bit, what's the phrase, chronically online. As a teenager, I very much grew up spending a lot of time on 4chan. I was there when Gamergate and the proto-red-pill movement started; I've seen all of that firsthand. And so a lot of the time I'll see meme guides for parents that are just missing the point with a lot of them.
00:24:13
Andrew Wilmot
So I'm proud of this because it's... At least I like to think that I understand what the kids are talking about a lot of the time.
00:24:23
Andrew Wilmot
I understand what looksmaxxing is. Everybody's talking about looksmaxxing these days after the Louis Theroux documentary, which didn't really have anything to do with looksmaxxing. But I've seen loads of follow-ups on what Louis Theroux talked about, the looksmaxxing community, from people who don't really understand what looksmaxxing is. It is horrific.
00:24:43
Andrew Wilmot
But at its core, it starts off with a 14-year-old looking up on the internet why girls don't like him. Anyway, go check out the website, even if you've already checked it out. There's some cool new stuff on there.
00:24:55
Andrew Wilmot
This has been the Dopamine Slot Machine. Thank you, and see you soon.

Outro