The past few days have not been kind to Instagram.
First came newly unsealed documents alleging that Meta, the app’s parent company, knowingly allowed millions of users under the age of 13 onto Instagram.
The complaint, filed by 33 states and first reported by The New York Times, also claims Meta concealed that information from the public.
Things only got worse when a Wall Street Journal report detailed how Instagram’s Reels, its short-form video platform, pushed “risqué footage of children as well as overtly sexual adult videos” to accounts following children influencers on Instagram. The content also appeared alongside ads from major brands, according to tests conducted by WSJ.
Business Insider’s Katie Notopoulos has a rundown on how a relatively scandal-free summer for Meta has erupted into a full-blown crisis at Instagram.
The concerns raised over Reels couldn’t come at a worse time. Executives highlighted successful monetization efforts for Reels during Meta’s most recent earnings report, and UBS noted that Reels was even outperforming TikTok.
And the fix for Instagram’s Reels might not be simple, or one Meta is willing to make.
Trying to stop Instagram from recommending harmful content to people interested in it requires “significant changes to the recommendation algorithms that also drive engagement for normal users,” the WSJ reported, citing conversations with current and former Meta employees.
Meanwhile, company documents indicated Instagram’s safety staffers “are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount,” WSJ also reported.