YouTube Enlists ‘Trusted Flaggers’ to Police Videos
By Alistair Barr and Lisa Fleisher
Google has given roughly 200 people and organizations, including a British police unit, the ability to “flag” up to 20 YouTube videos at once to be reviewed for violating the site’s guidelines.
The Financial Times last week reported that the U.K. Metropolitan Police’s Counter Terrorism Internet Referral Unit has been using its “super flagger” authority to seek reviews – and removal – of videos it considers extremist.
The news sparked concern that Google lets the U.K. government censor videos that it doesn’t like, and prompted Google to disclose more details about the program. Any user can ask for a video to be reviewed. Participants in the super flagger program, begun as a pilot in 2012, can seek reviews of 20 videos at once.
A person familiar with the program said the vast majority of the 200 participants in the super flagger program are individuals who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups, the person added.
In either case, Google said it decides which videos are removed from YouTube. “Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong,” a Google spokesman said.
Google’s guidelines prohibit videos that incite people to commit violence, or that show animal abuse, drug abuse, under-age drinking or bomb making, among other topics. Google maintains a separate system to monitor for copyright infringement.
The news about the super flagger program comes as some governments pressure social-media sites that they blame for civil unrest. In Turkey, Prime Minister Tayyip Erdogan threatened this month to ban Facebook and YouTube because they “encourage every kind of immorality and espionage for their own ends.”
British officials say they use the program to refer videos to YouTube that they believe have violated the U.K.’s Terrorism Act. These are then prioritized by YouTube, according to Sarah Buxton, a spokeswoman at the U.K. Home Office.
“YouTube may choose to remove legal extremist content if it breaches their terms and conditions,” she added.
Google was not pressured to admit the U.K.’s counter-terrorism unit into the program, the person familiar with the matter said. Instead, the agency showed an interest in YouTube’s guidelines and spotted videos that violated the rules, the person added.
More than 90% of the videos identified by super flaggers are either removed for violating guidelines or restricted as not appropriate for younger users, the person familiar with the program said. That’s a far higher rate than for regular users who occasionally flag dubious content, the person said.
http://blogs.wsj.com/digits/2014/03/17/youtube-enlists-trusted-flaggers-to-police-videos/
Folks, it may be time to sign up for Vimeo as an alternative to YouTube; you can sign up for free right now at Vimeo.com.