Editor’s note: Kara Alaimo, associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her book, “This Feed Is on Fire: Why Social Media Is Toxic for Women and Girls — And How We Can Reclaim It,” will be published by Alcove Press in 2024. The opinions expressed in this commentary are her own. Read more opinion on CNN.



CNN

Tech executives could face the prospect of spending time behind bars in Britain if they deliberately ignore rules designed to protect children online, under a proposed amendment to the country’s online safety bill.


As currently drafted, the bill would require social media companies to identify and remove content that promotes self-harm, including content glorifying suicide, and to bar children under 13 from using their platforms. In a written statement to Parliament, Michelle Donelan, the secretary of state for digital, culture, media and sport, said tech leaders who act in “good faith” would not be affected, but those who “consent or connive” in failing to follow the new rules could face prison sentences.

Hopefully, this bill will pass. For too long, technology leaders have shied away from responsibility for the detrimental impact their products can have on the people who use them. And while a law like this amendment to the British bill is unlikely ever to pass in the United States – given its fiercely pro-business climate, its broad constitutional protection of free speech, and regulations that limit internet platforms’ liability for what their users post online – other countries should consider similar penalties for tech leaders.

The tech industry, of course, disagrees. TechUK, a British trade association for the industry, said the prospect of prison sentences would not make social media safer for children but would discourage investment in the country. I think such a law would do the exact opposite: it would serve as a warning to tech leaders that they are responsible for what their products do to users.

Part of the reason tech leaders have evaded personal responsibility for their impact on society for so long is the way we think about social media. We speak of what happens “in real life” as distinct from what happens online. But the effects social networks have on users – especially children – are often felt in “real” life.

For example, in September a UK coroner ruled that the “negative effects of online content” were partly responsible for the suicide of 14-year-old Molly Russell. The Guardian reported that, according to data from Meta, Molly had viewed 2,100 posts related to self-harm, depression and suicide on Instagram in the six months before her death in 2017.

Meta, Instagram’s parent company, admitted that Molly had seen content that violated its community standards, and in 2019 it added new policies against graphic images depicting self-harm. It also began offering links to resources for users viewing content related to depression.

But in 2021, US Sen. Richard Blumenthal’s staff created an account purporting to belong to a 13-year-old girl and followed accounts promoting eating disorders. Instagram then began promoting disordered-eating accounts with names like “eternally starved.” Instagram told CNN it removed the accounts and said they shouldn’t have been allowed in the first place because they violated the platform’s rules against content promoting eating disorders.

And a chilling report released last month by the Center for Countering Digital Hate describes what happened when researchers created TikTok accounts posing as 13-year-olds who briefly paused on and liked content about mental health and body image. Within 2.6 minutes, TikTok was recommending suicide-related content.

Within eight minutes, the platform was recommending content about eating disorders. When an account’s username suggested the user was vulnerable to an eating disorder, TikTok served up even more of this harmful content. TikTok said the content the researchers saw doesn’t reflect what other users see, because of the study’s limited sample size and time constraints, and said it removes content that violates its standards and provides resources to anyone who needs them.

And Frances Haugen, a former Facebook staffer turned whistleblower, revealed in 2021 that Meta was well aware of Instagram’s harmful effects on some younger users. But Haugen said the company chose to prioritize profits over protecting children. Meta said it was developing controls and parental-supervision features to help teens regulate their use of Instagram, and CEO Mark Zuckerberg called Haugen’s characterization of the company false.

In the United States, members of Congress have passed just two laws regulating how companies interact with children online in the past 25 years – one requiring parental consent before sites can collect data about children under the age of 13, and another holding sites responsible for facilitating human trafficking and prostitution.

There’s no reason technology leaders should be absolved of responsibility for what their products can do to users. This UK amendment should also be a wake-up call to parents and other social media users about the dangers we and our children can face online.

If jail seems draconian, it’s nothing compared to the price Molly Russell and her family paid. More than five years after her suicide, social platforms are still serving the same kind of toxic content to vulnerable young people. This needs to stop, even if it means putting tech executives behind bars.
