How to incent good behavior on social media (avoid spam/trolls)? Human moderation vs. algorithms?

I am working on a social community forum/platform in the politics space. Politics is clearly controversial, so I'd appreciate people's thoughts on how to approach content moderation. The ideal solution incentivizes civil discourse while avoiding inflammatory, spammy, or otherwise unproductive user-generated content.

I'd be especially interested in hearing from anyone on the FounderDating staff. It doesn't appear that there's much (or any) moderation in the Discuss forums here, and for the most part I think the threads are high quality. Perhaps that's simply a testament to the application/voucher/paid-membership process, which sufficiently filters the types of users on the site.

The way I see it, there are three approaches to moderation:
- human moderators (e.g., Reddit, with mods posting guidelines/expectations and the power to remove people)
- algorithmic moderation (textual parsing of comments/language to prevent bad posts)
- user-selection filtering (a rigorous signup/application process to weed out the bad eggs)

Thoughts on what is most effective?

5 Replies

Derick Smith, Entrepreneur • Advisor
CEO @ Chainreactor
Brian,

First, it may be beneficial to think about your own beliefs and the principles of free speech. Does any discussion, especially a political one, benefit from free speech, or is free speech less than desirable?

If you decide free speech is a good thing, then perhaps the perceived problems of "inflammatory, spammy or otherwise unproductive" contributions are best moderated by making people responsible for what they say.

An approach that has been proposed in the cryptocurrency community is to add a cost component: any commenter needs to put down a bond to earn the right to comment. If the community finds their contribution useful (likes it), they get a reward, adding to their bond value. If their comment is deemed undesirable (disliked), they are penalized, reducing their bond value.
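The bond scheme described above can be sketched as a toy ledger. This is a hypothetical illustration only: the minimum bond, reward, and penalty amounts are assumptions, not a real cryptocurrency protocol.

```python
class BondLedger:
    """Toy ledger for the bond-and-reward commenting scheme."""

    def __init__(self, min_bond=10):
        self.min_bond = min_bond   # bond required to keep the right to comment
        self.bonds = {}            # user -> current bond value

    def deposit(self, user, amount):
        """User puts down (or tops up) their bond."""
        self.bonds[user] = self.bonds.get(user, 0) + amount

    def can_comment(self, user):
        """Commenting is allowed only while the bond meets the minimum."""
        return self.bonds.get(user, 0) >= self.min_bond

    def on_like(self, user, reward=1):
        """Community liked the comment: reward grows the bond."""
        self.bonds[user] = self.bonds.get(user, 0) + reward

    def on_dislike(self, user, penalty=2):
        """Community disliked the comment: penalty shrinks the bond."""
        self.bonds[user] = max(0, self.bonds.get(user, 0) - penalty)
```

With an asymmetric penalty (dislikes cost more than likes earn), a commenter whose posts are consistently disliked loses the right to comment until they put down a fresh bond.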
Roger Hector, Advisor
CEO @ TopTrack LLC
Brian, you may want to consider a prominent reminder of this key principle at the very top of the page. If you name it "Civil Discourse," it will remind everyone that civility is required to participate in the discussion. This can be reinforced with a subtitle such as "Only thoughtful reasoning allowed" (or something like that). The effectiveness of other approaches to moderation will depend a bit on the scale you operate at, and some value can be found in each direction. But you should clearly expect bad posts.
David Fridley, Entrepreneur
Founder at Synaccord
Brian, I am working on a similar project: how do we get past political polarization and gridlock to find the solutions that unite us?

I think there is no one solution; you have to do many things. Here is what we've figured out about the methods you mentioned:
- human moderators - even with good intentions they are corruptible, especially over time when large sums of money are involved (e.g., Congress). It's also very hard to be unbiased and remain so over time. And what if you have 160M participants? It doesn't scale with consistency.

- algorithmic moderation - while simple, straightforward checks are appropriate, the algorithms used need to be transparent and well understood by the participants. If people don't understand why their posts get rejected, the system will lose legitimacy. (School teachers in some places are going crazy because their evaluations depend on algorithmic calculations based on student test scores that they have no visibility into, and the algorithms are proprietary.)

- user selection filtering - important, but it only goes so far. You want people to be confident that each person gets only one account, and that appropriate people are participating (e.g., not ones hired on Fiverr to vote up a point). But it's a democracy, and everyone should get to participate.
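The transparency point about algorithmic moderation can be made concrete: if every automated rule is a named, published pattern and each rejection cites the rule that fired, participants can see exactly why a post was blocked. A minimal sketch, where the rules themselves are illustrative placeholders rather than a recommended policy:

```python
import re

# Published, human-readable rule set: (rule name, pattern, explanation).
# These two rules are illustrative examples only.
RULES = [
    ("no-links", re.compile(r"https?://"),
     "Links are not allowed in comments."),
    ("no-shouting", re.compile(r"\b[A-Z]{5,}\b"),
     "Please avoid all-caps shouting."),
]


def check_post(text):
    """Return (accepted, reasons); every rejection names the rule it broke."""
    reasons = [f"[{name}] {msg}" for name, pattern, msg in RULES
               if pattern.search(text)]
    return (not reasons, reasons)
```

Because the rejection message carries the rule name, a rejected author can look up the exact published pattern that blocked them, which addresses the legitimacy problem described above.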

What we're working on is more like Reddit's upvoting process. Each post is shown to a small random group of people, who can vote it up. If no one votes it up, it only reaches that small group, not the entire community. People are also encouraged to give feedback, so the author can learn how what they wrote is perceived by others.
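The small-random-group gate described above can be sketched as follows. The group size, approval threshold, and the modelling of each reviewer as a callable returning a vote are all illustrative assumptions, not the actual Synaccord design.

```python
import random


def review_post(post, reviewers, group_size=5, approvals_needed=2,
                rng=random.Random(0)):
    """Show a post to a small random sample before the whole community sees it.

    Each reviewer is modelled as a callable post -> bool (True = upvote).
    The post is released to everyone only if enough sampled reviewers
    vote it up; otherwise only the small sample ever saw it.
    """
    sample = rng.sample(reviewers, min(group_size, len(reviewers)))
    upvotes = sum(1 for vote in sample if vote(post))
    return upvotes >= approvals_needed
```

The appeal of this design is damage containment: a bad post that gets no upvotes is seen by only a handful of people, while the random sampling makes it hard to stack the initial audience.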
Brad Harkavy, Advisor
General Manager at LiveData, Inc.
One of the reasons FounderDating works is that it is a community in which admission is restricted and members are generally like-minded. Getting a two-sided political discussion going among folks from different political viewpoints will be much more difficult unless, of course, you aggressively enforce a policy of no "inflammatory, spammy or otherwise unproductive" content. Perhaps you can grant group members the right to flag policy violators to a human curator, who has the right to bar specific posts or people that violate the policy.

Pierre-R. Wolff, Advisor
Business and Corporate Development Executive / Professional Connector / Kitesurfer
From my experiences at Tribe and later with Livefyre, politics is perhaps one of the toughest categories in which to hope for civil discourse. Foreign Policy is perhaps one of the better sites at managing this, and they use a combination of human and algorithmic moderation. The way it tends to work is that the algorithmic phase knocks out a good bit of the spam and also does a pretty good job of classifying offensive content (i.e., bullying, racism, profanity, etc.). Comments classified with lower confidence are highlighted for review by the human moderators, who can also review all comments if they choose to. Fox News, for example, has a tougher time simply because of the sheer volume of inflammatory comments. They had taken the approach of allowing all comments onto the site but could quickly remove anything flagged by users or by their moderators. Both Livefyre and DISQUS have spam-detection and moderation tools, so you might want to check whether their services make sense for your site.
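The hybrid flow described here (algorithm handles confident cases, humans handle the uncertain ones) can be sketched as a simple confidence router. The classifier, labels, and threshold below are stand-ins for illustration, not Livefyre's or DISQUS's actual API.

```python
def moderate(comment, classify, confidence_threshold=0.9):
    """Route a comment based on classifier confidence.

    `classify` stands in for any spam/abuse model returning a
    (label, confidence) pair, e.g. ("spam", 0.97). Confident calls
    are auto-published or auto-blocked; everything else goes to a
    human moderation queue.
    """
    label, confidence = classify(comment)
    if confidence >= confidence_threshold:
        return "block" if label in {"spam", "abuse"} else "publish"
    return "human-review"
```

Tuning the threshold trades off the two approaches in the thread: lower it and the algorithm decides more on its own; raise it and more work lands on the human moderators.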