Facebook’s Inconsistent Policies or Unmanageable Data?
Recently, documents leaked to The Guardian have raised concerns over Facebook’s content publishing policies. With 2 billion users and 1.3 million posts shared every minute, it is imperative for Facebook to react in a constructive manner.
The Guardian has seen more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm.
This leak suggests that social platforms like Facebook cannot provide a 100% safe environment for brands or for people. For example, a post such as “#Kill Jews” may be recognized by Facebook’s moderation team and removed, while something like “kick Muslims out” may not be traced or treated as a realistic threat of violence.
At scale, user-generated content poses too great a challenge, and that does not necessarily bode well for advertisers.
“Advertisers are demanding more than what these platforms can currently provide,” said Ari Applbaum, vice president of marketing at video advertising platform AnyClip. “Until artificial intelligence solutions are robust enough to provide 100% assurance, manual screening of content is replacing AI, and it’s not sustainable in the long run.”
But while some advertisers may not be happy about the context in which their ads appear, they do have some control over the process.
“Every brand has their specific set of criteria in terms of their own limits and thresholds,” said Marc Goldberg, CEO of Trust Metrics, a publisher verification firm. “I don’t think this leak will impact Facebook’s business, but it will introduce new conversations around specific concerns and whether the company is doing enough for brands.”
Without doubt there are grey areas, but this is a major loophole that Facebook is aware of. Facebook will be hiring thousands of moderators to monitor content, but those moderators have to abide by the rule book the company gives them.
In a Facebook post, Zuckerberg said: “We’re working to make these videos easier to report so we can take the right action sooner—whether that’s responding quickly when someone needs help or taking a post down. This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.”
Source: eMarketer