Facebook expands downvote tests on comments – TechCrunch


Mark Twain had it right: There’s no such thing as a new idea. To wit: Facebook is currently testing arrows that let users upvote or downvote individual comments in some of its international markets. Digg, eat your heart out. Reddit, roll over.

This particular trial of upvoting/downvoting buttons is limited to New Zealand and Australia for now, according to Facebook (via The Guardian).

The latest test is a bit different from a downvote test Facebook ran in the US back in February, when it offered only a downvote option. (If clicked, it hid the comment and gave users additional reporting options such as “Offensive”, “Misleading”, and “Off Topic”.)

The latest international test looks a bit less negative: an overall score is displayed next to the arrows, which could at least reward users with some positive feels if their comment gets lots of upvotes. Negative scores could do the opposite, though.

It’s not certain whether the company will commit to rolling out the feature in this form (a spokesman told us this is an early test, with no decision yet made on a wider rollout to Facebook’s 2.2BN+ user base), but its various tests in this area suggest it’s interested in having another signal for rating or ranking comments.

In a statement attributed to a spokesperson, Facebook told us: “People have told us they would like to see better public discussions on Facebook, and want spaces where people with different opinions can have more constructive dialogue. To that end, we’re running a small test in New Zealand which allows people to upvote or downvote comments on public Page posts. Our hope is that this feature will make it easier for us to create such spaces, by ranking the comments that readers believe deserve to rank highest, rather than the comments that get the strongest emotional reaction.”
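Facebook hasn’t detailed how (or whether) those vote tallies would actually feed into comment ordering. Purely as an illustration of the idea it describes (ranking by reader votes rather than by the strength of emotional reaction), here is a minimal Python sketch; the Comment fields and the net-score rule are assumptions for illustration, not Facebook’s actual system:

from dataclasses import dataclass

@dataclass
class Comment:
    # Hypothetical fields for illustration only; this is not Facebook's data model.
    text: str
    upvotes: int
    downvotes: int
    reactions: int  # total emoji reactions the comment drew

def rank_by_votes(comments):
    # Order comments by net reader votes (upvotes minus downvotes),
    # ignoring how strong an emotional reaction they provoked.
    return sorted(comments, key=lambda c: c.upvotes - c.downvotes, reverse=True)

comments = [
    Comment("Inflammatory hot take", upvotes=3, downvotes=40, reactions=900),
    Comment("Thoughtful, sourced reply", upvotes=55, downvotes=2, reactions=12),
]

for c in rank_by_votes(comments):
    print(f"{c.text}: net score {c.upvotes - c.downvotes}")

A real ranking system would blend many more signals, but even this naive net-score ordering captures the distinction in the statement: a widely upvoted comment outranks one that merely provoked a lot of reactions.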

The test looks to have been going on for at least a couple of weeks at this point: a reader emailed TC on April 14 with screengrabs of the trial on comments for New Zealand Commonwealth Games content…

Facebook emphasized the feature is not an official dislike button. If it does roll out, a spokesman said, it would not replace the suite of emoji reactions the platform already offers so people can record how they feel about an individual Facebook post (and reactions already include thumbs up/thumbs down emoji options).

Rather, its focus is on giving users more granular controls for rating comments on Facebook posts.

The spokesman told us the test is intended to see whether users find the upvote/downvote buttons a productive way to give feedback on how informative or constructive a comment is.

Facebook users with access to the trial who hover over a comment will see a pop-up box that explains how to use the feature, according to The Guardian. Use of the up arrow is encouraged via text telling them to “support better comments” and “press the comment up if you think the comment is helpful or insightful”, while the down arrow is explained as a way to “stop bad comments”, with the further instruction: “Press the down button if a comment has bad intentions or is disrespectful. It’s still ok to disagree in a respectful way.”

It’s likely Facebook is toying with using comment rating buttons to encourage a broader crowdsourcing effort to help it with the myriad, complex content moderation challenges associated with its platform.

Responding quickly enough to hate speech remains a particular problem for the company, and a hugely serious one in certain regions and markets.

In Myanmar, for example, the UN has accused the platform of accelerating ethnic violence by failing to prevent the spread of violent hate speech. Local groups have also blasted Facebook for failing to be responsive enough to the problem.

In a statement responding to a critical letter sent last month by Myanmar civil society organizations, Facebook conceded: “We should have been faster and are working hard to improve our technology and tools to detect and prevent abusive, hateful or false content.”

And while the company has said it will double the headcount of staff working on safety and security issues to 20,000 by the end of this year, that’s still a tiny drop in the ocean of content shared daily across its social platforms. So it’s likely looking at how it can co-opt more of the 2.2BN+ humans who use its platform to help it with the hard problem of sifting good comments from bad: a nuanced task that can baffle AI. Tl;dr, the more human signals you can get, the better.

