Why Instagram May Not Be Safe for Your Kids
Parents may want to get familiar with popular apps, because platforms like Instagram may not be safe for youngsters, especially if their kids are using it (which they probably are). It is one of the most popular apps for kids and teens, even though it's not the best place for them to be.
As it turns out, Instagram is also a popular place for stalkers, scammers, and even creeps and pedophiles to congregate. That startling finding has been documented in a series of investigative articles in the Wall Street Journal that began in early 2023.
Shocking as it is, this reporting somehow doesn't make the nightly news or the big headlines. Maybe it isn't sensational enough yet, but it should be, and thanks to the Wall Street Journal it's starting to get attention.
PROTECT KIDS ON THE INTERNET WITH PARENTAL CONTROLS
Not only is the news disturbing, it is also getting the attention of people who are concerned about the safety and well-being of children on the Internet.
Is Instagram safe, and does Meta (the company that owns both Facebook and Instagram) truly care about kids?
The articles came about after the Journal opened new Instagram accounts while "pretending" to be in their early teens. The goal was to find out what kind of content Instagram's algorithms would recommend to a 13-year-old boy's or girl's feed, and what kind of ads a 13-year-old might be exposed to.
The results of the experiment started rolling in quickly, and they were surprising. Surprising, that is, if you think a very public and popular app should protect kids from seeing content they never explicitly asked for.
The WSJ "teens" were exposed to the R-rated side of Instagram (maybe even the X-rated side) in no time at all.
And as it turns out, the Journal wasn't the only organization to conduct this type of experiment. The Center for the Protection of Kids in Canada ran a similar test entirely on its own and got similar results, which raised serious concerns about the integrity of Instagram, a platform that claims it cares about protecting children from the negative aspects of the internet.
An insider’s concern about Instagram’s attitude toward safety.
Another story that appeared in the Journal in November 2023 provided even more evidence that, at a minimum, apps like Instagram can expose young users to an experience they didn't ask for, don't want, and can't easily put an end to.
The article presented an insider's account, from a former Meta consultant, of the issues at Instagram when it came to filtering content and responding to user concerns.
Here's how he worded his letter, as it appeared in the Journal: "I wanted to bring to your attention what I believe is a critical gap in how we as a company approach harm, and how the people we serve experience it."
His letter told the company this: even though Instagram was saying (and perhaps even believing) that it was doing a good job with its safety efforts, it was in fact "deluding itself."
He included a few statistics he had gathered firsthand which 1) he felt the leaders at Meta needed to know and 2) showed the types of abuse and inappropriate behavior that routinely took place on Meta's platforms.
Among those statistics:
- One in 8 users under the age of 16 had experienced unwanted sexual advances.
- The problem was ongoing. The teens said the sexual harassment had occurred within just the past week.
Meta seems to minimize negative reports.
According to the consultant, Meta seemed to have no genuine interest in responding to problems that fell outside its own beliefs, statistics, and systematic way of handling complaints.
The situation revealed a dilemma not only for Meta, but for any business: if your own assessment is positive, how much credence do you give to problems that seem to be popping up in every corner of the platform?
After all, do we expect companies to have a perfect record? And if they say they address all the problems that come their way, do we believe them?
The consultant had raised concerns at various times with the higher-ups at Instagram about some of the shortcomings of Meta's content-filtering efforts. He and his team were aware of some of the issues that users had complained about.
According to the Journal article, before leaving Meta in 2015 the consultant outlined his concerns about the gaps in Instagram's content-filtering efforts in a detailed report.
He was certain, or hopeful, that Meta would take his observations and suggestions seriously.
While they accepted his report, they requested that he "tone down" the language in it to suggest that Instagram had been aware of, and addressing, these issues all along.
His own daughter’s experience.
One complaint Instagram users have is that the burden falls on the user to block outsiders who post offensive or unwanted comments on an Instagram page. When the consultant's daughter posted photos of a car restoration project she was working on with her dad, she received crude comments from adult men who cruise the platform.
As the consultant discovered, there was not much his daughter, let alone any Instagram user, could do to stop that type of response, which could be seen as a form of unwanted attention or even abuse.
He decided to let Instagram know that its protections for teenagers still left a lot to be desired. He contacted a handful of executives and, as a former employee who had worked on content safety, told them the problems he was seeing were serious.
What Instagram did was interesting: they offered him a new job with the organization. He accepted the position, hoping that he could effect real change and progress this time around.
That didn’t happen. In fact, the battle for an Instagram that’s more responsive to real problems affecting real kids is ongoing. And now U.S. politicians know more about it.
The consultant testified before Congress at the end of 2023.
Instagram may not be safe—is that okay with everyone?
Without stating it outright, though it comes close, the Journal is pointing out to the public, and to you, that Instagram is a company that seems more concerned with keeping users engaged, adding subscribers, and selling ad space than with protecting kids under 18 from unwanted sexual harassment.
The answer isn't clear. What does seem clear is this: until Instagram is called out on its inconsistencies by a higher authority, it will be business as usual for the company. And that business means saying it's working on the problems while not truly fixing them or changing its business model.
To listen to interviews with experts on dangers facing children online and offline, follow the Easy Prey podcast, hosted by Chris Parker, CEO of WhatIsMyIPAddress.com.