What if YouTube stopped making recommendations?
What if a state official in Texas or Florida required Instagram to not remove vaccine misinformation that’s against the app’s rules?
Or what if TikTok remade its “For You” tab so that content moderators had to approve videos before they appeared?
The Supreme Court this week opened the door to radically different ways of thinking about social media and the internet. The court is poised to hear as many as three cases this term about the legal protections that social media companies have used to become industry behemoths, and about the freewheeling latitude the companies now have over online speech, entertainment and information.
Its rulings could be the start of a new reality on the internet, one where platforms are much more cautious about the content they decide to push out to billions of people each day. Alternatively, the court could create a situation in which tech companies have little power to moderate what users post, rolling back years of efforts to limit the reach of misinformation, abuse and hate speech.
The result could make parts of the internet unrecognizable, as certain voices get louder or quieter and information spreads in different ways.
“The key to the future of the internet is being able to strike that balance between preserving that participatory nature and increasing access to good information,” said Robyn Caplan, a senior researcher at Data & Society, a nonprofit organization that studies the internet.