Facebook Turns to the Crowd to Monitor the Crowdies

Facebook has begun testing an approach that's in vogue at the moment: using its own users as a data-crunching system. Nothing terribly new there--except that Facebook is using its crowd to actually moderate the rest of the crowd and stamp out the nasty bits, which is a whole new, ethically intriguing level.

It's called the "Facebook Community Council," and according to the group's motto it exists to "harness the power and intelligence of Facebook users to support us in keeping Facebook a trusted and vibrant community." This all sounds very lofty, very un-dictatorial, and much more hippyish and power-to-the-people than Facebook sometimes seems, given moves like its blanket decisions on user privacy.

The whole point of the FCC (hah! nice acronym) is to check items published on Facebook for offensiveness along the lines of personal attacks, violence, drug abuse, and so on. To that end, FCC members review content inside a special members-only app, and are only allowed to tag each item with one of the following flags: Spam, Acceptable, Not English, Nudity, Drugs, Attacking, Violence. The "Attacking" flag apparently applies only to public figures, which is where we begin to see the complexity and inherent risks of the FCC itself.
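
Purely for illustration, here's a minimal sketch of what the data model behind that members-only app might look like. The seven flag values come straight from the paragraph above; everything else--the "Review" record, its field names, and the choice of Python--is guesswork on my part, not anything Facebook has published.

```python
from dataclasses import dataclass
from enum import Enum

# The seven flags FCC members can reportedly apply to an item.
# The enum itself is hypothetical; only the flag names are sourced.
class Flag(Enum):
    SPAM = "Spam"
    ACCEPTABLE = "Acceptable"
    NOT_ENGLISH = "Not English"
    NUDITY = "Nudity"
    DRUGS = "Drugs"
    ATTACKING = "Attacking"  # reportedly applies only to public figures
    VIOLENCE = "Violence"

# A hypothetical record of one member's verdict on one item.
@dataclass
class Review:
    item_id: str      # invented identifier for the flagged post
    reviewer_id: str  # invented identifier for the FCC member
    flag: Flag
```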

Facebook would seem to be deeming "attacks" on non-public figures as something the FCC can't moderate, for blindingly obvious and sad reasons: Facebook is covering its legal arse, and doesn't want to get embroiled in user-to-user abuse cases for fear of the legal and financial ramifications. And when you think about it, this caution is going to apply to pretty much everything the FCC recommends: Facebook will have to tread very carefully when it deals with items FCC members have deemed offensive. You only have to look at the stupid, inane fiasco stirred up when Facebook's puritanical execs decided that pictures of breastfeeding babies were offensive and began to take them down. And what if U.S.-based FCC members take offense at something that European Facebookers would deem wholly acceptable?

Now, of course, the opinions of FCC members are going to be smoothed and averaged out before any actions are taken, but it's still tricky moral ground for Facebook. How will Council members deal with Holocaust denial on Facebook pages, for example? One could frame the act of denial itself as a "violent" act, but Facebook has until now taken an extremely hands-off approach to the issue, and has stirred up a hornet's nest of controversy on the matter. Will the FCC be empowered to challenge this sort of content on Facebook? Or will it merely be a paper tiger?
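
How that smoothing and averaging might work is anyone's guess, but a toy version, building on the hypothetical "Flag" and "Review" types sketched above, could be as simple as a thresholded majority vote. The 60-percent cutoff and the majority rule here are invented for illustration; Facebook hasn't described its actual aggregation method.

```python
from collections import Counter
from typing import List, Optional

# Toy aggregator: only act on an item when enough reviewers agree on a
# single non-"Acceptable" flag. Both the rule and the threshold are
# assumptions, not Facebook's published method.
def aggregate(reviews: List[Review], threshold: float = 0.6) -> Optional[Flag]:
    if not reviews:
        return None
    counts = Counter(r.flag for r in reviews)
    top_flag, votes = counts.most_common(1)[0]
    if top_flag is not Flag.ACCEPTABLE and votes / len(reviews) >= threshold:
        return top_flag  # clear consensus that something is wrong
    return None          # no consensus, or the crowd deemed it acceptable
```

Run over ten reviews where seven say Nudity, this returns Flag.NUDITY; split opinions return None--which is exactly where the tricky moral ground begins.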

That's the obvious worry here. While the concept of the FCC sounds admirable, and would seem to be a sign that Facebook is addressing user concerns over inappropriate content in a very democratic way, it may not end up being as effective as it sounds. In fact, Facebook may have set up the FCC for a totally different reason: to data-mine the opinions of the moderators themselves, as a way of tapping the moral sensitivities of its user base. Is that good news or bad? Given recent draconian moves by Facebook to redefine what user data it can share with the Internet, I'd tend to think it's more of a bad thing.

[Facebook Community Council via InsideFacebook]

Comments

  • Catherine Fitzpatrick

    I often wonder if basic law and civics are taught in schools anymore when I read analysis like this that doesn't start with a basic awareness: a committee like this *appointed by a company* and not *elected freely* by any recognizable and valid constituency is *not progressive and NOT democratic*. The very nature of this body conditions its output, which will always and everywhere be obstructed by the very company that appointed it.

    It's totally misleading to call something a "crowdsource" when it is in fact *appointed by the company itself* (even if drawn from "enthusiastic volunteers" that it filters and vets). A real "crowdsource" would be a free body that developed independently of the company Facebook.

    Finally, there should be an awareness that if any civil or criminal penalties were to be applied, this would indeed be a matter of the rule of law and the U.S. Constitution and its Supreme Court. Under the landmark decision of Times v. Sullivan, public figures in fact enjoy less protection than private persons, and journalists (and by extension, bloggers or Facebook wall posters) can criticize -- even with ad hominem attacks! -- public figures. Such a case simply wouldn't make it to trial. The burden would be on the injured party to show that there was indeed libel, i.e. that the statement was false; that it was deliberate (i.e. made with "actual malice"); and that the public figure was actually harmed in his ability to make income (it's very hard to mount cases like that -- see Westmoreland v. Time magazine).

    The community is not policing itself when the community's police are appointed by the police themselves, i.e. Facebook. And Facebook itself isn't a democratically elected body in a liberal state, with checks and balances. It's just a private company without any obligation to adhere to First Amendment standards, as its right of association, and its right to set any policy it likes, trump the First Amendment -- and that can be a good thing for citizens' committees, too.

    However, increasingly, these companies play a HUGE role in dominating and determining the framework and rules for all public discourse, not just on entertainment or cultural issues but on political matters. And in that sense, I think if they are to play the game of appearing liberal and progressive, they should apply First Amendment rules to their own press-like function. They should not remove content unless they have a court order from a case ruling that a party was indeed defamed. Obviously, people can venue-shop and get rulings like that in England or Russia (a Russian company owns a lot of shares of Facebook!), but even there, it's not as easy as mob rule or fake astroturfed "crowdsourcing" created by the company itself.

  • GaryFPatton

    Yours is a thought-provoking and helpful article, Kit. Thank you!

    As I read, I wondered how the FCC might handle additional contentious issues you didn't mention, like abortion, sexuality, plus that other biggie besides politics and taxes ...religion. Within the latter lies the currently active and emotion-charged discussion regarding the connection between so-called Islamism and terrorism.

    I've posted your article onto my Facebook Wall for my Friends to read and consider from their own perspective.

    Regards,
    @GaryFPatton
    The People Development Guy in Toronto
    http://is.gd/1DCOm

  • Felix Desroches

    This is very strange - Facebook is first and foremost a community among friends (and friends of friends), and sometimes things get ugly. It would almost be like sending a "moderator" into a schoolyard to make sure that Jimmy doesn't tell Sally she looks stupid in that dress. C'mon.

    On the other hand, friends together can do bad things, like start a Hitler youth group movement, for example. Why doesn't Fb get the entire community to vote on a set of guidelines for "appropriate behavior" that we all must do our best to follow? At least that way the FCC (terrible name choice, by the way, especially given the real FCC's conservative leanings) doesn't get carte blanche to remove pictures of my drunken escapades with friends because they might be offensive.

  • David Molden

    @kit - yes, you're right about NLP. Communities grow and develop: very slowly if it's government, faster if private. I wonder how they will deal with developing capabilities and user demands? Central control goes with ownership of the technology - how many commandments will they monitor?

    www.quadrant1.com

  • David Molden

    I will be interested to see if this works to the community's advantage. It kind of follows the NLP presupposition 'all procedures should increase choice' - this certainly has the potential to do that, but it could also become a set of limitations. Stand back and let the games begin.

    http://www.quadrant1.com

  • Kit Eaton

    @Sandra. This point has come up in regard to Facebook before, and has taxed many a legal mind (and the wallets of those paying for them!). The piece I read this morning seemed to hinge on whether Facebook is "private" property--a walled garden, with free membership--or not, and whether freedom of speech applies in this case.
    @Michael. Bingo! Exactly the sort of complex issue that'll come up.
    @Jim. It's thorny, I'll agree. But look at the mess Wikipedia's gotten into with its user-editable publications in the past. Facebook's got to tread a similar path, but get it right--hundreds of millions of angry users won't be pretty.
    @David. ... as in Neuro Linguistic Programming? The policing policy is, in this case, muddled by being apparently patrolled by the community itself, with central control over content retained at Facebook HQ. How does this affect your thinking?

  • Sandra Pearson

    My gut reaction is that this is a 1st Amendment issue - is the content I post on Facebook owned by me? If yes, then this could violate my 1st Amendment rights. If it is owned by Facebook then it is their right to censor me.

    Hmmm...this will take some pondering. And a revisit of the Facebook TOS.

  • Jim Canto

    I commend Facebook for taking on the issue. I'm not a Facebook junkie. I simply recognize the moderation dilemma and have been searching for the answer so that I might apply it within communities I'm developing. I spend my days thinking about how I can create a valuable destination for those I seek to attract. I'm not simply after page views.

    Facebook, and any community manager, online or offline for that matter, has two choices: 1) put controls in place, or 2) don't put controls in place. The whole topic comes down to the old saying: "You can't please all of the people all of the time."

    To choose not to put any controls in place is to trust that every individual will do the "right" thing (whatever that is) all the time. Yet the moment ANY "rules" are implemented, someone, or some group, becomes alienated.

    Therefore, Facebook has two choices: let 350 MILLION people try to get along in a world with no rules... or... make an attempt at understanding the "moral sensitivities" of its users so that it might come up with community moderation rules that will alienate as few people as possible.

    But, you can't please all of the people all of the time.

    I run a modest group of less than 1,000 people, and I've kept the activity very low for this same reason: I've been trying to figure out how to build in balance so that the group may grow in a healthy fashion, beneficial to as many members as possible. The only answer I continually come up with is to develop some sort of council designed to create and enforce community behavioral guidelines. My "math" leads me to the same conclusion Facebook has come to. I simply have not come up with a framework for that approach yet. I will definitely have a close look at Facebook's approach to draw some inspiration. Kudos to them for trying.

    Anyone have a better idea for a social architecture?

  • Michael Bloomquist

    This is all fine and good, however I'm sure the "FCC" can't decipher my sarcasm in a wall post to one of my friends, e.g.: "I will beat you like a red-headed stepchild when you bring your sorry ass over here for poker night." Not that I would actually post that, but I bet someone will! MB

  • David Molden

    I will follow this with interest as it seems to follow the NLP presupposition that 'all procedures should increase choice'. If the community is policing itself then only the community can take away its choice. I will be interested to see how much positive choice this method adds to the system, or how many limitations are imposed.

    www.quadrant1.com
