By Felicity
Almost half of children in the UK say they worry about staying safe online, according to the UK Safer Internet Centre. That anxiety isn’t coming out of nowhere. New tech like AI and VR keeps changing how kids learn, play, and talk. The good news: most risks can be managed with a mix of clear rules, simple habits, and quick reporting when something feels wrong. The gap is knowing what to look for and what to do next.
Safer Internet Day has helped push these conversations into classrooms and living rooms. But the everyday questions remain the same: How do you spot a fake? How do you respond to a cruel comment? Which settings actually protect a child’s account? And what if the problem happens inside a game that looks harmless?
What children say they worry about online
Misinformation is at the top of the list. Young people scroll past newsy claims, AI-generated images, and viral videos that look convincing. The fix starts with a pause: Who posted this? What do they gain from it? Can you find the same fact from a mainstream outlet or a known expert? If the claim is huge but the source is tiny, treat it as unproven. Reverse-image tools and basic search tricks help, but the main skill is healthy doubt. Kids don’t need to distrust everything—they just need to check before they share.
Online arguments are another flashpoint. Disputes escalate fast in group chats and comment threads. The safest move is to slow it down: don’t hit reply in anger. Save the messages (screenshots help), mute or block the person, and report the behavior inside the platform. If threats appear or the pile-on continues, tell a trusted adult right away. A short break from the app can reset the whole situation. None of this is about “giving in”—it’s about not letting someone else control your mood or your time.
On social media, age checks are still imperfect and kids can bypass them. That means families need to use what’s already built into the apps: private accounts, restricted DMs, tight friend lists, comment filters, and limits on who can tag or mention you. Turn off location sharing for young users. Audit followers every few weeks. If a child feels pressure to accept every request, agree a simple rule like “we only add people we know offline.”
Roblox deserves special attention because of its size and the way it works. It’s a platform of user-made games, which means safety tools must keep up with endless new spaces and chat rooms. The NSPCC and Childline say calls about Roblox-related concerns have grown fivefold since 2020, with cases involving cyberbullying, inappropriate content, and strangers asking for personal details. One child told Childline that constant insults and threats in a Roblox game wrecked their mood for days. Another said a friendly-looking pizza game led to creepy messages from older players. Roblox does have filters, parental controls, and reporting tools, but they aren’t automatic protection. Adults should set the chat level, review who can message, and turn off invites from people outside the child’s friend list. Keep an eye on spending settings too—small in-app purchases add up fast.
It’s also worth naming what “unsafe” can look like in plain terms. Red flags include a person asking to move the chat to a private app, pushing for images or personal info, using flattery to build fast trust, or promising rewards for secrets. If the contact gets angry when boundaries are set—or tries to make the child keep something from family—that’s a clear stop sign.

What families and schools can do right now
Start with a short, written family tech plan. It doesn’t need to be formal or perfect. Agree on which apps are allowed, what privacy settings are on, when and where devices are used, and what happens if something goes wrong. Put the “tell me if you’re worried” rule at the center. No blame for asking for help.
For children and young people, try this quick safety checklist:
- Pause before sharing: check the source, date, and author. If you can’t verify it, don’t amplify it.
- Lock down privacy: private account, no location sharing, restricted DMs, tight friend list.
- Control your space: use comment filters, block, mute, and report tools early—not as a last resort.
- Keep personal info off-limits: full name, school, address, phone, passwords, and routine details stay private.
- If someone makes you feel weird or scared, stop, save evidence, and tell a trusted adult right away.
For parents and carers, a few practical steps go a long way:
- Set up devices together. Create the account with your child and walk through the safety settings on the spot.
- Use platform parental controls. On Roblox, review chat settings, friend requests, spending limits, and what games are accessible.
- Co-play and co-scroll. Sitting alongside a child for 15 minutes shows you more than any guide—ask what they enjoy and what bugs them.
- Keep devices in shared spaces for younger kids, and set clear rules about devices in bedrooms at night.
- Model the behavior you want: no oversharing, respectful comments, and healthy time limits.
- Make reporting normal. Practice where the report button is on each app your child uses.
For schools, the basics matter just as much:
- Have a named online safety lead and a clear reporting pathway for students and staff.
- Teach digital citizenship early and often: source-checking, bystander behavior, privacy, and consent.
- Run scenario drills: “What would you do if…?” helps students practice a response before they need it.
- Keep parents in the loop with short guides and termly briefings, not just emergency emails.
- Use peer mentors. Students often listen best to other students trained to help.
Trusted learning resources can help make all this stick. BBC Stay Safe uses characters and quizzes to make privacy and kindness easy to understand. Safe Surfing with Doug breaks down common risks in simple, age-appropriate language. Grid Club offers a controlled space for kids to try skills without the noise of a live social feed. Kidsmart splits advice by age—under 11s need different guidance from teens—so families can pick what fits.
When a situation crosses a line, there are clear paths to help. Use in-app tools first: report, block, and mute. Keep evidence: screenshots of messages, usernames, timestamps, URLs. For worries about grooming, sexual messages, or exploitation, report to CEOP (the Child Exploitation and Online Protection command, part of the National Crime Agency). If a child is at immediate risk, contact the police. Schools should log incidents and escalate to safeguarding leads without delay.
Not sure when to escalate? Treat any of these as urgent: threats of violence, requests for sexual images, persistent contact after blocking, doxxing or attempts to find a child offline, or talk of self-harm. If your gut says “this feels wrong,” act on it and seek help.
Regulation is changing too. The UK’s Online Safety Act now puts more pressure on big platforms to assess risks to children and reduce harms. That should improve things over time. But families don’t need to wait for a policy to kick in. The everyday settings—private accounts, strict chat controls, time limits, and quick reporting—deliver immediate protection.
And remember the goal isn’t to scare kids off the internet. It’s to help them use it with confidence. Curiosity is healthy. Games and chats can be joyful and creative. With a simple plan, a few checks, and a promise that adults will listen without overreacting, young people can explore, learn, and have fun while keeping online safety front and center.