The Supreme Court ruled today in two cases that could have a major impact on how social media platforms operate and how far the government can intervene on behalf of political speech on these platforms.
California will no longer enforce key provisions of a law requiring social media companies to disclose details about their content moderation practices after settling a lawsuit with Elon Musk’s X Corp ...
Two state laws that could upend the way social media companies handle content moderation are still in limbo after a Supreme Court ruling sent the challenges back to lower courts, vacating those courts' previous rulings.
Attorney General Andrew Bailey issued a regulation this week requiring social media platforms to give users in Missouri a choice over their algorithm. Bailey's proposal, modeled after the road map ...
Social media platforms commonly use artificial intelligence for content moderation, relying on algorithms to screen content posted by users. Ultimately, the AI ...
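In practice, that screening step amounts to scoring each post and routing it by threshold: remove, send to human review, or allow. The short Python sketch below is a deliberately simplified, hypothetical illustration of that pattern; the keyword list, scoring function, and thresholds are assumptions for illustration only, not any platform's actual system, which would typically use trained machine-learning classifiers rather than keyword matching.

    # Hypothetical sketch of algorithmic content screening.
    # Keyword list, scoring, and thresholds are illustrative assumptions.

    FLAGGED_TERMS = {"kill", "attack", "bomb"}  # toy list, not a real policy

    def moderation_score(post: str) -> float:
        """Return a crude 0-1 risk score based on flagged keywords."""
        words = [w.strip(".,!?").lower() for w in post.split()]
        if not words:
            return 0.0
        flagged = sum(1 for w in words if w in FLAGGED_TERMS)
        return min(1.0, flagged / len(words) * 10)

    def screen(post: str, threshold: float = 0.5) -> str:
        """Route a post: auto-remove, escalate to human review, or allow."""
        score = moderation_score(post)
        if score >= threshold:
            return "removed"
        if score >= threshold / 2:
            return "human_review"
        return "allowed"

    if __name__ == "__main__":
        for p in ["nice weather today", "I will attack you"]:
            print(p, "->", screen(p))

The design choice worth noting is the middle tier: most real moderation pipelines do not make a binary keep-or-remove decision, but reserve a band of uncertain scores for human reviewers.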
As technology rapidly advances, with AI, crypto, encryption and other innovations, legacy social media platforms and their content moderation policies are in flux. YouTube/Google, Facebook/Meta and Twitter/X ...
Most people want harmful social media content such as physical threats and defamation to be restricted. This holds true in the U.S. as well, where several social media platforms have recently modified their ...