Above: #Ferguson, on Twitter. Below: Rest of America, on Facebook. pic.twitter.com/C8u2lqK40d
— Anup Kaphle (@AnupKaphle) August 18, 2014
Social media is controlled by algorithms: mathematical formulas that dictate what you see and when. In the past week, people have noticed something curious about the way these algorithms have filtered news about the protests in Ferguson, Mo., over the fatal shooting of unarmed black teenager Michael Brown.
Twitter vs. Facebook: my tweetstream is almost wall-to-wall with news from Ferguson. Only two mentions of it in my Facebook news feed.
— Mark_Hamilton (@gmarkham) August 14, 2014
Again, #Ferguson is dominating my Twitter feed, and is nonexistent on Facebook.
— Brian McComas (@briancmccomas) August 18, 2014
This is so true and troubling “@monteiro: If you want to pretend Ferguson isn’t happening just go to Facebook.”
— Tim Dickinson (@7im) August 19, 2014
The fundamental differences between the two platforms help explain the disparity.
“Because of its brevity, and the ease with which updates can be shared, Twitter is a much more rapid-fire experience than Facebook, and that makes it well suited for quick blasts of information during a breaking-news event like Ferguson,” Mathew Ingram of Gigaom pointed out. Facebook, by contrast, is cluttered with non-newsy content, which makes it ill-suited for following breaking news, he added.
Another huge difference? Algorithms. Your Twitter feed isn’t controlled by an algorithm: you see the tweets of the people you follow in real time, newest first. Facebook, however, uses a complicated algorithm to determine what ends up in your news feed. The company won’t reveal exactly how it works, but it has said the ranking depends in part on what you’ve liked, clicked or shared in the past.
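To make that contrast concrete, here is a minimal sketch in Python. It is not Facebook’s actual ranking code, which is proprietary; it simply caricatures the publicly acknowledged signals (your past interactions with an author, a post’s engagement, and time decay) alongside a plain reverse-chronological timeline. Every name and weight below is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float   # seconds since epoch
    affinity: float    # how often you've liked/clicked this author, 0..1
    engagement: float  # likes/comments/shares the post has drawn, 0..1

def chronological_feed(posts):
    """Twitter-style: every post from everyone you follow, newest first."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts, now):
    """Facebook-style caricature: affinity x engagement x time decay."""
    def score(p):
        age_hours = (now - p.timestamp) / 3600
        return p.affinity * p.engagement / (1 + age_hours)  # older posts sink
    return sorted(posts, key=score, reverse=True)
```

Fed the same posts, the first function surfaces whatever is newest; the second can bury a breaking story from a source you rarely interact with, however recent it is.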
Ars Technica’s Casey Johnston suggested Facebook’s algorithm also weeds out controversial content — racially charged protests, perhaps? — from users’ news feeds: “There is a reason that the content users see tends to be agreeable to a general audience: sites like [BuzzFeed, Elite Daily, Upworthy, and their ilk] are constantly honing their ability to surface stuff with universal appeal. Content that causes dissension and tension can provide short-term rewards to Facebook in the form of heated debates, but content that creates accord and harmony is what keeps people coming back.”
Johnston backed up her theory with a Georgia Institute of Technology study of how political content affects users’ perceptions of Facebook. She summed up the findings: “The study found that, because Facebook friend networks are often composed of ‘weak ties’ where the threshold for friending someone is low, users were often negatively surprised to see their acquaintances express political opinions different from their own. This felt alienating and, overall, made everyone less likely to speak up on political matters (and therefore, create content for Facebook).”
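Johnston’s conjecture can be phrased as a single extra term in the toy scorer above. The sketch below is speculation made runnable, not anything Facebook has confirmed: the `controversy` classifier and the `penalty` weight are inventions of this example.

```python
def harmony_biased_feed(posts, now, controversy, penalty=0.5):
    """Johnston's conjecture as a scoring tweak: divisive posts sink.

    `controversy` is a hypothetical classifier mapping post text to a
    0..1 divisiveness estimate; `penalty` sets how hard it bites.
    """
    def score(p):
        age_hours = (now - p.timestamp) / 3600
        base = p.affinity * p.engagement / (1 + age_hours)
        return base * (1 - penalty * controversy(p.text))
    return sorted(posts, key=score, reverse=True)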
For University of North Carolina sociologist Zeynep Tufekci, this sort of “algorithmic filtering” is more than a matter of technical differences. Last Wednesday, when there was rioting in Ferguson and journalists were being arrested, the events unfolded in real time on her Twitter feed. But on Facebook, where she is connected to a similar mix of people, posts about Ferguson didn’t appear until the next morning. “Would Ferguson be buried in algorithmic censorship?” she wrote on Medium.
If so, that’s bad. “How the internet is run, governed and filtered is a human rights issue,” she wrote.