Social media giants are facing increased scrutiny this week as landmark legal cases and regulatory actions highlight concerns over their impact on users, particularly children. Meta, the parent company of Instagram and Facebook, is the target of lawsuits alleging it prioritized profits over children's safety, while the European Union has taken action against the company for anti-competitive practices. Meanwhile, Discord announced new age verification measures to restrict access to adult content.
Opening arguments began this week in a case brought by New Mexico's attorney general alleging that Meta failed to protect children from sexually explicit material, according to Al Jazeera. The case is one of a wave of 40 lawsuits against Meta. Another, as reported by the BBC, accuses social media companies of building "addiction machines" and examines the mental health effects of Instagram and YouTube; Mark Lanier, representing the plaintiff, argued that the companies built these machines "on purpose."
The EU has also taken action against Meta, telling the tech giant it breached the bloc's rules by blocking rival AI firms' chatbots from WhatsApp. The European Commission said WhatsApp is an "important entry point" for AI chatbots and accused Meta of abusing its dominant position, according to the BBC. A Meta spokesperson told the BBC the EU had "no reason" to intervene.
In other tech news, Discord announced it will require all users globally to verify their age with a face scan or ID before accessing adult content, with the rollout beginning in early March, according to the BBC. The online chat service, which has more than 200 million monthly users, already requires age verification in the UK and Australia to comply with online safety laws.
Elsewhere in the industry, the BBC reported on a trend of tech firms embracing extreme work hours, with some companies advertising 70-hour work weeks.