Marc Grano’s Post
Concerning #ICYMI news out of New Mexico: The state is suing Snapchat, alleging the platform's failure to prevent child sexual exploitation. See the article here: https://lnkd.in/gTeh9hZy This lawsuit raises important questions about social media companies' responsibility to protect vulnerable users. It will be interesting to follow the legal developments and their implications for online safety. #ChildSafety #OnlineSafety #NewMexicoLaw
More Relevant Posts
-
I could not agree more with this analysis from Elly Hanson on OnlyFans and the latest Reuters investigation. No amount of good PR can change the reality of what happens on OnlyFans, as outlined so clearly by the Reuters articles: the sharing of Child Sexual Abuse Material involving very young children, explicit videos of underage girls who have been trafficked and exploited, videos of rape and sexual assault, and the non-consensual sharing of explicit images and videos. Nor can it change the impact of such a site on the normalisation of commodified sex, as Elly Hanson highlights in her essay.

The commercial sex industry is built on the exploitation and abuse of primarily women and girls. OnlyFans is not separate from this; moving from the street to online does not make the selling of sex any safer or less oppressive. Decades of research tell us that women who sell sex (online and offline) suffer from higher rates of PTSD, suicidal ideation, drug and alcohol abuse, and a plethora of other harms, including online stalking, abuse and doxxing. Women who sell explicit imagery on OnlyFans report being pushed to perform ever more hardcore acts in order to make any money, going past their own limits. As one creator explains, she will only breach her own boundaries for a substantial amount of money. When asked what that amount would be, she said: “I’d probably start at about £50.” https://lnkd.in/euHkn2ze

OnlyFans generated $1.1 billion in 2022.
“A wound that will never heal.” Another must-read from Reuters: their report, released today, into child sexual abuse and exploitation on OnlyFans. It is distressing, shocking and far-reaching in its implications. https://lnkd.in/e52xxUrP

OnlyFans CEO Keily Blair loves to boast about how safe the site is, e.g. “we know the age and identity of everyone on our platform”. But as the Reuters exposé makes clear, either claims like this are untrue or OnlyFans is tolerating the abuse of children. The bamboozling boasts of OnlyFans have been very effective at convincing many in the child protection space that it is a safe platform. I truly hope that this article, alongside various others recently published, causes people to think again and brings the scrutiny and challenge that have been largely missing of late.

The OnlyFans business model is highly lucrative, yet it carries significant risk for those who find themselves caught up in its world. I explore the wider harms of its normalisation of commodified sex in my recent essay at Fully Human: https://lnkd.in/eTS5qSyS

A few weeks ago CEO Keily Blair posted on LinkedIn that “the haters are my motivators”, gaining a massive metaphorical round of applause from fans responding with the sentiment “haters gonna hate”. Instead of such self-congratulatory rhetoric, can we please have empathy, humility, acknowledgement and change?
OnlyFans vows it's a safe space. Predators are exploiting kids there.
reuters.com
-
UK Regulator Challenges Apple’s Efforts to Combat Child Sexual Abuse Content https://lnkd.in/dsrY_h5R To get all the latest news and updates, join our WhatsApp group: https://lnkd.in/dnztZkZg #apple #childprotection #icloud #imessage #encryption
UK Regulator Challenges Apple’s Efforts to Combat Child Sexual Abuse Content
english.newstracklive.com
-
💸🌐 The growing trend of self-generated sexual content by children and young people on platforms like OnlyFans is alarming. Motivated by financial gain and the pursuit of internet popularity, an increasing number of young people are drawn to these platforms. A recent investigation by Reuters highlights a troubling scenario: despite strict safeguards, explicit content involving minors still emerges, causing profound emotional distress for the victims and their families.

🔍 Later this month we will field our survey of 2,000 young people in Thailand for our "Leaked" Project, which aims to delve deeper into why children create and share such content. The new data will help identify the factors driving young people to create sexual content for sites like those named in this investigation - critical information if we are to address the harms that can result. 🛡️ https://lnkd.in/dK8eQQit
OnlyFans vows it's a safe space. Predators are exploiting kids there.
reuters.com
-
The ease of use of, and features such as end-to-end encryption in, some of these messaging apps are contributing to the increased sharing of CSAM by pedophiles. For instance, what is WhatsApp doing to ensure that its platform is not being used for the production and distribution of Child Sexual Abuse Material?
In the wake of the Huw Edwards scandal, we warn that nothing is stopping child sexual abuse imagery spreading on WhatsApp. We're urging Meta to do more to stop the sharing of this criminal material on the platform.
'Nothing stopping' child abuse sharing on WhatsApp, group warns
bbc.co.uk
-
AI-generated child sexual abuse videos are a growing threat. Tools such as advanced #WebFiltering can help law enforcement detect and prevent the spread of such harmful material online. https://bit.ly/4ddS9rQ
AI advances could lead to more child sexual abuse videos, watchdog warns
theguardian.com
-
The increasingly dangerous and exploitative world of sexual offenders and those who trade in #CSAM: some of the most prolific and tech-savvy offenders operate in this sphere. We need to support law enforcement agencies, and especially the Internet Watch Foundation (IWF), in their efforts to limit and expose this horrific crime. But we should also demand that national governments and the technology industry do more to prevent images from being uploaded (or created) in the first place.
AI advances could lead to more child sexual abuse videos, watchdog warns
theguardian.com
-
The Wall Street Journal: Snap Inc. Failed to Warn Users About Sextortion Risks, State Lawsuit Alleges -- https://lnkd.in/gQNqmdzR More and more is coming out about Snapchat after the state of New Mexico filed a lawsuit against the platform. Internal documents show that Snapchat has known about deadly threats like sextortion for years and has done little to nothing to warn children or protect them. Snapchat is:
❗️ The #1 parent-reported platform for the sharing of child sexual abuse material
❗️ The #1 platform on which most minors report having had an online sexual interaction
❗️ The #2 platform most used for sextortion (after Instagram)
❗️ The #3 platform on which minor users report having had a sexual experience with an adult
❗️ The #1 most identified platform for the recruitment of sex trafficking victims
Sound like a safe platform for our kids?
-
🚨 The NSPCC is urgently flagging that Apple is vastly underreporting child sexual abuse material. As reported by The Guardian: "In a year, child predators used Apple’s iCloud, iMessage and Facetime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC."

Read that again. Apple's services enabled child sexual abuse at higher rates in England and Wales alone than the company reported worldwide. That's a terrifying discrepancy.

As Heat Initiative CEO Sarah Gardner also warned: “Apple does not detect CSAM in the majority of its environments at scale, at all. They are clearly underreporting and have not invested in trust and safety teams to be able to handle this.”

At the same time, Apple continues to reference its now year-old statement (which was really a leaked email to Heat Initiative) as to why it won't do more. And yet other privacy-forward companies are navigating how to support the rights of all users, including victims of child sexual abuse.
UK watchdog accuses Apple of failing to report sexual images of children
theguardian.com
-
A recent Freedom of Information request by the NSPCC has shown a 25% increase in child abuse image offences in the UK in 2022/23. We know from services like Childline that young people are targeted by adults on various platforms, and the data shows widespread use of social media and messaging apps to commit these crimes: Snapchat was involved in 44% of instances where the platform was known, and Meta platforms in 25%. The social media and tech companies really need to step up, take responsibility and act swiftly to address this, and Ofcom needs to become stronger and more ambitious in its approach, effectively enforcing the Online Safety Act. #nspcc #childline #onlinesafety #childprotection #safeguarding #ofcom #meta #facebook #instagram #snapchat Snap Inc. #abuse #crimes #policing #ukpolice #regulation #onlinesafetyact #onlineharms #childabuseimages #internet #action #act #petition #everychildhood
As child abuse image crimes increase, we’re calling on Ofcom and tech companies to take action
nspcc.org.uk
-
Social media has made the online world even more dangerous, especially for children, who can be targeted by predators. Our latest blog discusses a recent story of two girls who were manipulated online and coerced into commercial sexual exploitation on Aurora Ave. Read more about what employers can do when cases such as this one occur: https://lnkd.in/gvvwJfvw
Unveiling the Hidden Dangers of Social Media Predators and What Businesses Can do to Help Stop Child Sex Trafficking
bestalliance.org