Convenience or Safety – How Do You Choose Your Child’s Messenger App?
Back in the early 2000s I got my first phone. I was probably one of the first people in my grade 12 class to even have one. It was a blue flip phone on a pay-as-you-go plan. I gave myself a budget every month and would load that amount onto my phone after going to Radio Shack to purchase the prepaid card. Every minute spent talking cost me money, and so did every text message sent.
Fast forward to today, where I don’t even blink at my phone bill, with unlimited text messages and calls within the country. I use an app to message my 13-year-old niece and to have group conversations with my brothers. My other niece is 7 and not on any messaging platform. I get updates from her mom on social media, but my direct conversations with her are limited to when we see each other on the occasional weekend or at a family event.
As someone who grew up in a tight-knit community, I am conscious of the importance of the relationship I have with my nieces and nephews, and my role as the ‘cool aunt’. Since they don’t live in the same area as me, my influence becomes limited and messaging platforms become important forms of communication.
When presented with all the options for connecting digitally with family and friends, how do we choose? Convenience, really.
But convenience shouldn’t be the only consideration. The real question is: what makes an effective messaging app for families? What does it take for my 13-year-old and 7-year-old nieces to participate in and enjoy the same spaces?
The Children’s Online Privacy Protection Act (COPPA) restricts what information websites, apps, and online games can collect and store from children 12 and under. These spaces also have to actively protect children’s Personally Identifiable Information, or PII, meaning that a child cannot post their full name, phone number, address, and so on. The fines for violating COPPA can be fairly hefty, which is why many companies state that their platforms are for users 13+. That way they don’t have to follow COPPA guidelines, and any user who lies about their age on the platform is at fault instead of the company.
COPPA is extremely important for protecting children’s identity (giving away PII can put children at risk for identity theft) and protecting them from risky interactions.
Any messaging platform that includes children needs to be COPPA compliant.
At Mazu, we talk a lot about core values and their importance. They are central to everything we do, and we believe that any platform for children needs to have core values at its center as well.
Core values are how children learn to navigate life and how to interact with others in healthy and positive ways.
Children learn by observing. What are our children observing when they enter digital spaces? Is it negativity and inauthenticity? Or are people talking about truth, respect, and hope?
In the current ecosystem there is no ‘Beginner’s Guide to Online Interactions.’ No one teaches our children the proper way to engage with these spaces. As the adults who were slowly introduced to these platforms, or perhaps even created them, we tend to have a better understanding. Plus, like me, many of us were raised in communities before digital life took over, and we take the lessons and values we learned on the playgrounds of our childhood with us when we go online.
Our children are being raised in a digital village, and so online platforms need to reflect back and mimic our real-life villages. They do that by having and promoting core values.
While the debate between free speech and moderation rages on our adult platforms, most people can agree that children’s spaces need moderation. We would never let a 7-year-old see an R-rated movie; the language, content, and imagery are not appropriate. So why would we allow that same 7-year-old to inhabit spaces where R-rated content has the potential to come through?
Human moderation is a key component. AI can identify inappropriate language, and although it is getting better with video, images, and language context, the human element provides an extra layer of security.
What does it mean to be verified? Verifying parent accounts ensures that people are who they say they are. That they’re real parents, and that, yes, they can have control over their children’s accounts.
COPPA compliance also restricts advertising: behaviorally targeted ads cannot be served to children without parental consent, which is why many children’s spaces are ad-free. We think it’s important to highlight this, because children don’t always recognize when they are being advertised to. “Some 82% of middle-schoolers couldn’t distinguish between an ad labeled ‘sponsored content’ and a real news story on a website” (Wall Street Journal). Ads are disguised as product reviews or made to look like shows. Even when they are clearly ads, children are very susceptible to the messaging.
Think of yourself as a 7-year-old: what kind of influence did commercials have over what you put on your Christmas list or what you bought at the toy store?
At a younger age, brand familiarity is what drives purchases, instead of thinking more critically about things such as quality, company ethics and working conditions, where the product is made, etc. Instead of being provided a variety of choices to look at, advertising narrows what children are exposed to — the brands with more money to spend on targeted advertising are more likely to be seen by users.
At the end of the day it’s the parents and guardians who are responsible for making the decisions that will affect their children. It’s not about helicoptering, but maintaining some control over what your children have access to, when they have access, and how much.
As parents (aunts and uncles too), our team at Mazu has thoughts about what makes a good platform for children and families. But we don’t always have all the answers, or the answers that are right for each individual family, which is why it’s important for us to give power back to parents.
We’d love your thoughts on what you want to see in a messaging app for families. Leave us a comment, or visit our website and have your say. Let’s build it together!