Meta, TikTok and Snap are participating in an online safety ratings system

Major Social Media Platforms to Undergo New Safety Ratings System

A growing number of prominent social media companies, including Meta, TikTok, and Snap, have signed on to a new rating system designed to evaluate how effectively they safeguard adolescent mental health. The Safe Online Standards (SOS) initiative, spearheaded by Dr. Dan Reidenberg, Managing Director of the National Council for Suicide Prevention, aims to establish clear standards that social media platforms must meet to protect young users from exposure to harmful content.

The SOS program marks a significant development in the ongoing push for greater accountability from tech giants on user safety and mental health. Participating companies submit documentation on their policies, tools, and product features, which an independent panel of global experts then assesses. Upon completion of this review, each platform receives one of three ratings: "use carefully," "partial protection," or "does not meet standards."

The highest achievable rating, "use carefully," carries a blue badge indicating that the platform has met the standards for protecting adolescent mental health. Critics argue, however, that even this top-tier rating sets a low bar, since its requirements include basics such as having "reporting tools accessible and easy to use" and having "privacy, default, and safety functions clear and easy to set for parents."

The involvement of major social media companies in the SOS program follows years of criticism over their handling of user data and young users' exposure to suicide-related content. Meta, in particular, has faced intense scrutiny following allegations that it buried internal research showing the negative effects of its products on users' mental health.

The Mental Health Coalition's long-standing partnership with Meta dates back to 2020, when the organization launched a campaign aimed at destigmatizing mental health and connecting people to resources during the COVID-19 pandemic. Since then, the group has worked closely with Meta to develop initiatives such as "Thrive," which allows tech companies to share data regarding materials that violate self-harm or suicide content guidelines.

Many see the SOS program as a crucial step toward holding social media platforms accountable for user safety and mental health. As the technology landscape continues to evolve, it remains to be seen whether the ratings will have a meaningful impact on protecting young users from harm.
 
 